CN115113793A - Gesture erasing method and device and electronic equipment

Gesture erasing method and device and electronic equipment

Info

Publication number
CN115113793A
Authority
CN
China
Prior art keywords
touch point
function
touch
target range
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210583448.5A
Other languages
Chinese (zh)
Inventor
王敏 (Wang Min)
张振宝 (Zhang Zhenbao)
高萌 (Gao Meng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Qingdao Hisense Commercial Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd, Qingdao Hisense Commercial Display Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202210583448.5A
Publication of CN115113793A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a gesture erasing method and apparatus and an electronic device, relates to the field of touch control technology, and addresses the prior-art problem that a user who cannot hold an erasing gesture steadily throughout the erasing process causes the electronic device to exit the gesture erasing function by mistake. The method comprises the following steps: after a first function is started, if a second touch point other than the first touch point exists within a target range, marking the second touch point as a standby touch point; and after the first touch point disappears, if the standby touch point still exists within the target range, continuing to execute the first function.

Description

Gesture erasing method and device and electronic equipment
Technical Field
The present disclosure relates to the field of touch technologies, and in particular, to a gesture erasing method and apparatus, and an electronic device.
Background
In the prior art, a user writes content on an electronic device through its writing function. To erase the written content, the user must maintain an erasing gesture; once the gesture changes, the electronic device exits the gesture erasing function, resulting in a poor user experience.
Disclosure of Invention
In view of this, the present disclosure provides a gesture erasing method and apparatus and an electronic device, to solve the prior-art problem that a user who cannot hold an erasing gesture for a long time during erasing causes the electronic device to exit the gesture erasing function by mistake.
In order to achieve the above object, the present disclosure provides the following technical solutions:
in a first aspect, the present disclosure provides a gesture erasing method applied to an electronic device that includes a touch screen, comprising: after a first function is started, if a second touch point other than the first touch point exists within a target range, marking the second touch point as a standby touch point, where the first touch point is used to trigger the first function; and after the first touch point disappears, if the standby touch point exists within the target range, continuing to execute the first function.
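The two steps of this first aspect can be sketched as a small touch-point tracker. This is an illustrative sketch only, not the patented implementation; all class and method names are hypothetical:

```python
class GestureEraseTracker:
    """Tracks touch points so the erase function survives a gesture change.

    Minimal sketch of the first-aspect method: any touch appearing inside
    the target range while erasing is marked as a standby point, and the
    erase function keeps running as long as a standby point remains after
    the original (first) touch point lifts.
    """

    def __init__(self):
        self.erasing = False          # whether the first function is active
        self.first_points = set()     # ids of the triggering touch points
        self.standby_points = set()   # ids of standby touch points

    def start_erase(self, first_point_ids):
        self.erasing = True
        self.first_points = set(first_point_ids)
        self.standby_points = set()

    def on_new_touch(self, point_id, in_target_range):
        # A second touch point inside the target range becomes a standby point.
        if self.erasing and in_target_range and point_id not in self.first_points:
            self.standby_points.add(point_id)

    def on_touch_up(self, point_id):
        # When the first touch point disappears, keep erasing only if a
        # standby point is still present in the target range.
        self.first_points.discard(point_id)
        self.standby_points.discard(point_id)
        if not self.first_points and not self.standby_points:
            self.erasing = False
        return self.erasing
```

Usage: the palm triggers erasing, three fingers land inside the target range and become standby points, and lifting the palm then leaves the function running.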
In an implementation manner, before marking the second touch point as a standby touch point, the gesture erasing method provided by the present disclosure further comprises: acquiring a first contact area corresponding to at least one first touch point on the touch screen; and starting the first function if the first contact area is greater than or equal to an area threshold.
In an implementation manner, the gesture erasing method provided by the present disclosure further comprises: when the first contact area is smaller than the area threshold and a newly added third touch point appears within the target range within a first preset time, acquiring a second contact area corresponding to the third touch point; and starting a second function when the second contact area is greater than or equal to the area threshold.
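The area-threshold logic of the two implementation manners above can be illustrated as follows. The threshold and time-window values are hypothetical placeholders, not values taken from the patent:

```python
AREA_THRESHOLD = 2000.0   # mm^2 -- hypothetical value; a palm print exceeds it
FIRST_PRESET_TIME = 0.5   # seconds -- hypothetical window for a follow-up touch

def classify_touch(first_area, third_area=None, third_delay=None):
    """Decide which function a touch triggers based on contact area.

    Returns "first" (e.g. gesture erase) when the initial contact area
    reaches the threshold, "second" when a large-enough newly added touch
    point arrives within the target range inside the preset time window,
    and None otherwise.
    """
    if first_area >= AREA_THRESHOLD:
        return "first"
    if (third_area is not None and third_delay is not None
            and third_delay <= FIRST_PRESET_TIME
            and third_area >= AREA_THRESHOLD):
        return "second"
    return None
```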
In an implementation manner, the gesture erasing method provided by the present disclosure further comprises: deleting a fourth touch point when the fourth touch point exists outside the target range and the electronic device is in a first mode.
In an implementation manner, the gesture erasing method provided by the present disclosure further comprises: when a fourth touch point exists outside the target range and the electronic device is not in the first mode, acquiring a third contact area corresponding to the fourth touch point; and starting a third function when the third contact area is greater than or equal to the area threshold.
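The handling of a fourth touch point outside the target range, in and out of the first mode, can be sketched as a small dispatcher (names and the threshold value are illustrative, not from the patent):

```python
def handle_outside_touch(area, in_first_mode, area_threshold=2000.0):
    """Handle a fourth touch point that lands outside the target range.

    In the first mode (e.g. while gesture erasing is active), stray
    touches outside the target range are simply deleted; otherwise a
    large-enough touch starts a third function, and smaller touches
    are left alone.
    """
    if in_first_mode:
        return "delete"
    if area >= area_threshold:
        return "third_function"
    return "ignore"
```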
In an implementation manner, the gesture erasing method provided by the present disclosure further comprises: after the first touch point disappears, exiting the first function if the standby touch point does not exist within the target range.
In an implementation manner, the gesture erasing method provided by the present disclosure further comprises: after the first touch point disappears, continuing to execute the first function if the standby touch point does not exist within the target range but a newly added fifth touch point appears within the target range within a second preset time.
In an implementation manner, the gesture erasing method provided by the present disclosure further comprises: after the first touch point disappears, exiting the first function if the standby touch point does not exist within the target range and no newly added fifth touch point appears within the target range within the second preset time.
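The three implementation manners above combine into a single decision once the first touch point disappears. A minimal sketch, with the second preset time as a hypothetical 0.5 s value:

```python
def after_first_point_lost(has_standby, new_point_delay=None,
                           second_preset_time=0.5):
    """Decide the fate of the first function after the first touch
    point disappears.

    Continue if a standby touch point exists in the target range, or if
    a newly added (fifth) touch point appears within the second preset
    time; exit otherwise. The 0.5 s window is an illustrative value.
    """
    if has_standby:
        return "continue"
    if new_point_delay is not None and new_point_delay <= second_preset_time:
        return "continue"
    return "exit"
```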
In a second aspect, the present disclosure provides a gesture erasing apparatus, comprising: a processing unit configured to mark a second touch point as a standby touch point if, after the electronic device starts the first function, the second touch point other than the first touch point exists within the target range, where the first touch point is used to trigger the first function; the processing unit is further configured to continue executing the first function if, after the first touch point disappears, the standby touch point exists within the target range.
In one implementation manner, the gesture erasing apparatus further comprises an obtaining unit configured to obtain a first contact area corresponding to at least one first touch point on the touch screen; the processing unit is configured to start the first function when the first contact area obtained by the obtaining unit is greater than or equal to the area threshold.
In an implementation manner, the processing unit is further configured to control the obtaining unit to obtain a second contact area corresponding to the third touch point when the first contact area obtained by the obtaining unit is smaller than the area threshold and a newly added third touch point exists in the target range within a first preset time; and the processing unit is also used for starting the second function under the condition that the second contact area acquired by the acquisition unit is larger than or equal to the area threshold value.
In an implementation manner, the processing unit is further configured to delete the fourth touch point when the fourth touch point exists in a range other than the target range and the electronic device is in the first mode.
In one implementation, the gesture erasing apparatus further includes an obtaining unit; the processing unit is further used for controlling the obtaining unit to obtain a third contact area corresponding to a fourth touch point when the fourth touch point exists in a range except the target range and the electronic device is not in the first mode; and the processing unit is also used for starting the third function under the condition that the third contact area acquired by the acquisition unit is larger than or equal to the area threshold value.
In an implementation manner, the processing unit is further configured to exit the first function if the standby touch point does not exist within the target range after the first touch point disappears.
In an implementation manner, the processing unit is further configured to continue to execute the first function if the standby touch point does not exist in the target range and the newly added fifth touch point exists in the target range within a second preset time after the first touch point disappears.
In an implementation manner, the processing unit is further configured to exit the first function if the standby touch point does not exist in the target range and the newly added fifth touch point does not exist in the target range within a second preset time after the first touch point disappears.
In a third aspect, the present disclosure provides an electronic device comprising: a communication interface, a processor, a memory, and a bus; the memory is configured to store computer-executable instructions, and the processor is connected to the memory through the bus. When the electronic device runs, the processor executes the computer-executable instructions stored in the memory, causing the electronic device to perform the gesture erasing method as provided in the first aspect.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a computing device, causes the computing device to implement the gesture erasing method as provided in the first aspect.
In a fifth aspect, the present disclosure provides a computer program product which, when run on a computer, causes the computer to perform the gesture erasing method as provided in the first aspect.
It should be noted that all or part of the above computer instructions may be stored on the first computer readable storage medium. The first computer readable storage medium may be packaged with the processor of the gesture erasing apparatus or may be packaged separately from the processor of the gesture erasing apparatus, which is not limited in this disclosure.
Reference may be made to the detailed description of the first aspect for the description of the second, third, fourth and fifth aspects of the disclosure; in addition, for the beneficial effects described in the second aspect, the third aspect, the fourth aspect and the fifth aspect, reference may be made to beneficial effect analysis of the first aspect, and details are not repeated here.
In the present disclosure, the names of the above-mentioned gesture erasing apparatuses do not constitute limitations on the devices or functional modules themselves, and in actual implementations, these devices or functional modules may appear by other names. Insofar as the functions of the respective devices or functional modules are similar to those of the present disclosure, they are within the scope of the claims of the present disclosure and their equivalents.
These and other aspects of the disclosure will be more readily apparent from the following description.
Compared with the prior art, the technical scheme provided by the disclosure has the following advantages:
according to the gesture erasing method provided by the present disclosure, a user triggers a first function through at least one first touch point input on the touch screen of an electronic device, where the first function is a gesture erasing function; the electronic device thereby turns on gesture erasing. After the first function is started, the user can erase the written content. During erasing, the electronic device monitors a target range containing the first touch point, and when a second touch point other than the first touch point appears within that range, it marks the second touch point as a standby touch point. Therefore, even if the first touch point disappears, the electronic device can continue to execute the gesture erasing function as long as a standby touch point exists within the target range. This solves the prior-art problem that a user who cannot hold an erasing gesture for a long time causes the electronic device to exit the gesture erasing function by mistake.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below; it will be apparent to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic view of a scene of a gesture erasing method according to an embodiment of the present disclosure;
Fig. 2 is a first schematic structural diagram of a display device in a gesture erasing method according to an embodiment of the present disclosure;
Fig. 3 is a second schematic structural diagram of a display device in a gesture erasing method according to an embodiment of the present disclosure;
Fig. 4 is a first schematic flowchart of a gesture erasing method according to an embodiment of the present disclosure;
Fig. 5 is a second schematic view of a scene of a gesture erasing method according to an embodiment of the present disclosure;
Fig. 6 is a second schematic flowchart of a gesture erasing method according to an embodiment of the present disclosure;
Fig. 7 is a third schematic flowchart of a gesture erasing method according to an embodiment of the present disclosure;
Fig. 8 is a schematic structural diagram of a display device according to an embodiment of the present disclosure;
Fig. 9 is a schematic diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure; however, the present disclosure may be practiced otherwise than as described herein. It should be understood that the embodiments described in this specification are only some, not all, of the embodiments of the present disclosure.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
The gesture erasing function in the embodiments of the present disclosure refers to a function of erasing written content on the touch screen by bringing a palm into contact with the touch screen, with the palm acting as an eraser.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to one or more embodiments of the present application. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100. The control apparatus 100 may be a remote controller that communicates with the display device 200 through infrared protocol communication, Bluetooth protocol communication, or other wireless or wired methods. The user may input a user command through a key on the remote controller, voice input, control panel input, etc., to control the display device 200. In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200.
In some embodiments, the mobile terminal 300 may install a software application associated with the display device 200 and establish connection and communication through a network communication protocol, achieving one-to-one control operation and data communication. Audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200. In addition to synchronous display, the display device 200 can perform data communication with the electronic device 400 in multiple communication modes, and may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The electronic device 400 may provide various content and interactions to the display device 200. The display device 200 may be a liquid crystal display, an OLED display, or a projection display device, and may additionally provide a smart network TV function that offers computer support in addition to the broadcast-receiving TV function.
In some embodiments, the electronic device provided in the embodiments of the present disclosure may be the display device 200 described above. The display device 200 triggers the first function by receiving at least one first touch point input by the user on the touch screen, where the first function is a gesture erasing function; the display device 200 thereby turns on gesture erasing, and the user can erase the written content. During erasing, the display device 200 monitors a target range containing the first touch point, and marks any second touch point other than the first touch point within that range as a standby touch point. In this way, even if the first touch point disappears, the display device 200 can continue to perform the gesture erasing function as long as a standby touch point exists within the target range. This solves the prior-art problem that a user who cannot hold an erasing gesture for a long time causes the electronic device to exit the gesture erasing function by mistake.
Fig. 2 shows a hardware configuration block diagram of a display device 200 according to an exemplary embodiment. The display apparatus 200 as shown in fig. 2 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280. The controller includes a central processor, a video processor, an audio processor, a graphic processor, a RAM, a ROM, and first to nth interfaces for input/output. The display 260 may be a display with a touch function, such as a touch display. The tuner demodulator 210 receives a broadcast television signal through a wired or wireless reception manner, and demodulates an audio/video signal, such as an EPG data signal, from a plurality of wireless or wired broadcast television signals. The detector 230 is used to collect signals of the external environment or interaction with the outside. The controller 250 and the tuner-demodulator 210 may be located in different separate devices, that is, the tuner-demodulator 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200.
In some embodiments, the display device includes a touch sensor. The touch sensor may be disposed on the display 260, and the touch sensor and the display 260 together form what is commonly called a "touch screen". The touch sensor detects touch operations applied on or near it and may pass the detected touch operation to the central processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 260. In other embodiments, the touch sensor may be disposed on a surface of the display device 200 at a location different from the display 260. A pressure sensor senses a pressure signal and converts it into an electrical signal. In some embodiments, the pressure sensor may be disposed on the display 260. There are many types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material; when a force acts on the sensor, the capacitance between the electrodes changes, and the display device 200 determines the pressure intensity from the change in capacitance. When a touch operation is applied to the display 260, the display device 200 detects its intensity through the pressure sensor and may also calculate the touched position from the pressure sensor's detection signal. In some embodiments, touch operations applied to the same position but with different intensities may correspond to different operation instructions.
In some examples, taking as an example the case where the display device 200 of one or more embodiments of the present application is the television set 1 and the operating system of the television set 1 is the Android system, as shown in fig. 3, the television set 1 may be logically divided into an application (Applications) layer 21 (abbreviated as "application layer"), a kernel layer 22, and a hardware layer 23.
As shown in fig. 3, the hardware layer may include the detector 230 and the display 260 shown in fig. 2. The application layer 21 includes one or more applications. The application may be a system application or a third party application. For example, the application layer 21 includes a first application that may provide a writing function, as well as a gesture erasing function. The kernel layer 22 acts as software middleware between the hardware layer and the application layer 21 for managing and controlling hardware and software resources.
In some examples, the kernel layer 22 includes a first driver that sends touch operations collected by the detector 230 to the first application. After the first application in the television 1 starts, the first driver sends the user's touch operations collected by the detector 230 to the first application for recognition. The processing unit 211 of the first application then determines the target function the user intends to execute according to the touch operation acquired by the acquisition unit 210. For example, when the target function is the writing function, the user can write the required content on the touch screen; when the target function is the gesture erasing function, the first application determines whether to start it according to the first contact area, on the touch screen, of the first touch point corresponding to the touch operation. When the processing unit 211 determines that the first contact area is greater than or equal to the area threshold, the gesture erasing function starts and the user can erase the written content. During erasing, the processing unit 211 monitors a target range containing the first touch point; for example, when a second touch point other than the first touch point exists within the target range, the processing unit 211 marks it as a standby touch point. Thus, even after the first touch point disappears, the television 1 can check whether a standby touch point exists within the target range and, if so, continue to execute the gesture erasing function.
Specifically, the storage unit 212 of the first application may be configured to store the program code of the display device 200, and may also be configured to store data generated by the display device 200 during operation, such as data in a write request.
The methods in the following embodiments may be implemented in the display device 200 having the above-described hardware structure. In the following embodiments, the method of the embodiments of the present application is described by taking the electronic device provided in the embodiments of the present disclosure as the display device 200 as an example.
The embodiment of the present application provides a gesture erasing method, and as shown in fig. 4, the gesture erasing method may include S11 and S12.
S11, after the first function is started, if a second touch point other than the first touch point exists in the target range, mark the second touch point as a standby touch point. The first touch point is used to trigger the first function.
In some examples, after the user starts the first application on the display device 200, an interface 501 as shown in (a) of fig. 5 is displayed. Thereafter, after the user writes the content through the writing function provided by the first application on the interface 501, the interface 502 as shown in (b) in fig. 5 is displayed. When erasing the written content, the user can start the first function by clicking a button 5030 in the interface 503 as shown in (c) in fig. 5. Alternatively, the user displays the interface 504 as shown in (d) in fig. 5 by inputting a touch operation on the touch screen of the display device 200 (5040 as shown in (d) in fig. 5, where the dotted line in 5040 corresponds to the target range), such as pressing the palm 1 against the touch screen of the display device 200. And then, the first application determines whether to start the first function according to a first contact area, corresponding to the touch operation, of a first touch point on the touch screen. Thereafter, the first application initiates a first function when the first application determines that the first contact area is greater than or equal to the area threshold.
It should be noted that, in the gesture erasing method provided in the embodiment of the present disclosure, when starting the first function, the first application needs to compare the first contact area of the first touch point on the touch screen, within the target range corresponding to the touch operation, against the area threshold to decide whether to start the first function. The first function is started only when the first contact area is greater than or equal to the area threshold. This avoids the problem that, when several users operate the touch screen at the same time, the sum of the contact areas of their touch points exceeds the area threshold and the first function is started by mistake. Specifically, to better recognize the touch operation, the gesture erasing method provided in the embodiment of the present disclosure sets the size of the target range to the width of the palm multiplied by the length of the palm. In this way, even if the user's erasing gesture changes while the first function is in use (that is, a second touch point other than the first touch point appears within the target range), the display device 200 does not exit the first function by mistake, which preserves the user experience. When the touch operation input by the user includes only one first touch point, the center of the target range equals the pixel coordinates of that first touch point. When the first touch point disappears as the erasing operation proceeds, but one or more second touch points exist within the target range, the center of the target range equals the average of the pixel coordinates of those second touch points.
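The rule for the center of the target range described above — the first touch point's coordinates when only that point exists, otherwise the mean of the remaining second touch points — can be written out directly. This is an illustrative sketch; the (x, y) pixel-tuple representation is an assumption:

```python
def target_range_center(first_point, second_points):
    """Compute the center of the target range in pixel coordinates.

    While the first touch point is present, the center equals that point;
    after it disappears, the center is the average of the pixel
    coordinates of the remaining second (standby) touch points.
    """
    if first_point is not None:
        return first_point
    if not second_points:
        raise ValueError("no touch points in the target range")
    n = len(second_points)
    x = sum(p[0] for p in second_points) / n
    y = sum(p[1] for p in second_points) / n
    return (x, y)
```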
The above example is described by taking the case in which the size of the target range equals the width of the palm multiplied by the length of the palm. In other examples, the size of the target range may be set according to the actual requirements of the user, and is not limited herein.
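The center-of-target-range rule described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name and the representation of touch points as (x, y) pixel-coordinate tuples are assumptions.

```python
def target_range_center(first_touch_point, second_touch_points):
    """Compute the pixel-coordinate center of the target range.

    Per the rule described above: while the first touch point exists, the
    center equals its pixel coordinate; after it disappears, the center is
    the average of the remaining second touch points' coordinates.
    """
    if first_touch_point is not None:
        return first_touch_point
    if not second_touch_points:
        raise ValueError("no touch points within the target range")
    n = len(second_touch_points)
    x = sum(p[0] for p in second_touch_points) / n
    y = sum(p[1] for p in second_touch_points) / n
    return (x, y)
```

For example, with the first touch point at (10, 20) the center is (10, 20); once it disappears and second touch points (0, 0) and (4, 2) remain, the center becomes (2.0, 1.0).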
S12, after the first touch point disappears, if there is a spare touch point in the target range, continuing to execute the first function.
In some examples, a user cannot be expected to maintain a single, unchanged touch operation for the entire time the first function is in use. For example, continuing the example given in S11 above, while erasing the written content the user additionally presses 3 fingers (finger 2, finger 3, and finger 4) on the touch screen, and the interface 505 shown in (e) of fig. 5 is displayed; the touch operation input by the user at this point is the touch operation 5050 shown in (e) of fig. 5. When the display device 200 determines that, besides the first touch point corresponding to the palm 1, second touch points corresponding to finger 2, finger 3, and finger 4 exist within the target range, it marks those second touch points as spare touch points. Thus, even if the user lifts the palm while erasing the written content, so that the touch operation becomes the touch operation 5060 shown in (f) of fig. 5 (that is, the first touch point corresponding to the palm 1 disappears), the first function continues to be executed because spare touch points exist within the target range. This avoids the poor user experience of the display device 200 mistakenly exiting the first function merely because the user changed the touch operation while erasing.
In some examples, continuing to execute the first function if a spare touch point exists in the target range after the first touch point disappears includes: after the first touch point disappears, if spare touch points exist in the target range, selecting the spare touch point that meets a target condition as the new first touch point, and continuing to execute the first function. For example: the target condition is being closest to the first touch point, and there are 3 spare touch points, namely spare touch point 1, spare touch point 2, and spare touch point 3. After the first touch point disappears from the target range, distance 1 between spare touch point 1 and the first touch point, distance 2 between spare touch point 2 and the first touch point, and distance 3 between spare touch point 3 and the first touch point are calculated. When distance 1 is the minimum, spare touch point 1 serves as the first touch point and the first function continues to be executed. If, during erasing, this first touch point (spare touch point 1) also disappears from the target range, distance 4 between spare touch point 2 and it and distance 5 between spare touch point 3 and it are calculated. When distance 4 is the minimum, spare touch point 2 serves as the first touch point (spare touch point 1 is deleted) and the first function continues to be executed. The first function is exited only when neither a first touch point nor any spare touch point remains within the target range.
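The spare-touch-point promotion in the example above (target condition: nearest to the vanished first touch point) can be sketched as below. The function name and the tuple coordinate representation are illustrative assumptions; Euclidean distance is assumed as the distance metric, which the source does not specify.

```python
import math

def promote_nearest_spare(last_first_point, spare_points):
    """When the first touch point disappears, promote the spare touch
    point closest to it to be the new first touch point and remove that
    point from the spare list. Returns (new_first_point, remaining_spares);
    new_first_point is None when no spares are left (caller exits the
    first function in that case)."""
    if not spare_points:
        return None, spare_points
    nearest = min(
        spare_points,
        key=lambda p: math.dist(p, last_first_point),
    )
    remaining = [p for p in spare_points if p is not nearest]
    return nearest, remaining
```

Calling this repeatedly mirrors the example: with spares at (5, 0), (1, 0), (9, 0) and the first touch point gone from (0, 0), the point (1, 0) is promoted first; if it later disappears too, (5, 0) is promoted next.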
It should be noted that, in order to let the user check more intuitively whether the first function is currently started, in the gesture erasing method provided in the embodiment of the present disclosure, after the first application starts the first function, a first icon is displayed on the touch screen. During erasing, the user can drag the first icon to erase the written content. The pixel coordinate corresponding to the center of the first icon is the same as the pixel coordinate corresponding to the first touch point.
As can be seen from the above, in the gesture erasing method provided by the present disclosure, the user triggers the first function through at least one first touch point input on the touch screen of the display device 200; for example, the first function is a gesture erasing function. The display device 200 thus turns on the gesture erasing function, and the user can erase the written content. During erasing, the display device 200 monitors a target range containing the first touch point and, when a second touch point other than the first touch point exists within that range, marks it as a spare touch point. In this way, even if the first touch point disappears, the display device 200 can continue to perform the gesture erasing function as long as a spare touch point exists within the target range.
In some examples, in combination with fig. 4, as shown in fig. 6, the gesture erasing method provided in the embodiment of the present disclosure further includes S13 and S14 before performing S11.
S13, acquiring a first contact area corresponding to at least one first touch point on the touch screen.
In some examples, in the gesture operation method provided by the embodiment of the disclosure, the user triggers the first function of the first application through a gesture operation, which facilitates user operation. For example: in combination with the example given in S11 above, the interface 504 shown in (d) of fig. 5 is displayed. The first application then determines whether to start the first function according to the first contact area, on the touch screen, of the first touch point corresponding to the touch operation.
S14, starting the first function when the first contact area is greater than or equal to the area threshold.
In some examples, by setting an area threshold, the gesture operation method provided by the embodiment of the disclosure avoids the poor user experience caused by the first function being started unintentionally during use. For example: the first function is started only if the first contact area is greater than or equal to the area threshold.
Therefore, in the gesture erasing method provided by the disclosure, setting an area threshold and comparing the first contact area against it prevents the user from starting the first function unintentionally during use, improving the user experience.
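The activation check of S13–S14 can be sketched as follows. The threshold value is purely illustrative; the source does not give a concrete number, only the comparison rule.

```python
AREA_THRESHOLD = 4000.0  # square pixels; illustrative value, not from the source

def should_start_first_function(first_contact_area):
    """S14: start the first function only when the first contact area
    reaches the area threshold, so a small accidental touch (e.g. a
    fingertip or pen tip) does not trigger gesture erasing."""
    return first_contact_area >= AREA_THRESHOLD
```

A palm press whose contact area meets the threshold starts the function; a fingertip well below it is treated as ordinary input.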
In some examples, in conjunction with fig. 4, as shown in fig. 6, the gesture erasing method provided by the embodiments of the present disclosure further includes S15 and S16.
S15, when the first contact area is smaller than the area threshold and a newly added third touch point exists in the target range within a first preset time, acquiring a second contact area corresponding to the first touch point and the third touch point.
In some examples, in connection with the example given at S14 above, to avoid the user unintentionally starting the first function during use, the first application does not start the first function when the first contact area is smaller than the area threshold. However, in some cases the user does need such a function even though the first contact area of the input touch operation is smaller than the area threshold, so the first application continues to monitor the target range. For example: if a newly added third touch point appears within the target range within the first preset time, the user may need the second function. The first touch point and the third touch point together correspond to a second contact area, and the first application determines whether to start the second function according to the relation between the second contact area and the area threshold.
Specifically, the first preset time may be set according to a user requirement, for example, the first preset time is 300 ms.
It should be noted that the above example describes the case in which the first contact area is smaller than the area threshold and a newly added third touch point exists in the target range within the first preset time. In some other examples, when the first contact area is smaller than the area threshold and no newly added third touch point appears within the target range within the first preset time, the first application executes a preset function, for example a writing function, so that the user can continue writing on the touch screen.
S16, starting the second function when the second contact area is greater than or equal to the area threshold.
In some examples, before starting the second function (for example, the second function is a gesture erasing function), the first application may already have started a preset function, for example a writing function. To erase the written content, the user inputs a touch operation whose contact area is greater than or equal to the area threshold, so that the gesture erasing function is started (while the gesture erasing function is active, the user cannot continue to write new content). The user can then erase the written content. After the user exits the gesture erasing function, the first application restarts the writing function (for example, the first application starts the writing function by default when no other function is being executed, or the user starts the writing function by clicking the corresponding button after exiting the gesture erasing function), which makes the operation convenient for the user.
The above example describes the case in which the second contact area is greater than or equal to the area threshold. In some other examples, when the second contact area is smaller than the area threshold, the first application executes the preset function, for example the writing function, so that the user can continue writing on the touch screen.
As can be seen from the above, in the gesture erasing method provided by the present disclosure, to cover the case in which the user needs such a function but the first contact area of the input touch operation is smaller than the area threshold, the target range continues to be monitored after the first touch point is detected. For example: if a newly added third touch point appears within the target range within the first preset time, the user may need the second function, and whether to start it can be determined from the relation between the second contact area and the area threshold. This provides the user with a more convenient operation and guarantees the user experience.
In some examples, in conjunction with fig. 4, as shown in fig. 6, the gesture erasing method provided by the embodiments of the present disclosure further includes S17.
S17, when a fourth touch point exists in a range other than the target range and the electronic device is in the first mode, deleting the fourth touch point.
In some examples, the first application provides multiple operation modes, such as a single-user mode. When the first mode is the single-user mode, touch operations input by other users on the touch screen must be prevented. For example: another user inputs a touch operation in a range other than the target range, that is, a fourth touch point exists outside the target range. Because the first application is in the single-user mode, it can currently recognize the touch operation of only one user; in combination with the example given in S12 above, the first application is erasing the written content according to the touch operation of the current user within the target range. For the touch operation input by another user outside the target range, the first application no longer executes the corresponding function, that is, the first application deletes the fourth touch point.
The above example is described by taking the first mode as the single-user mode. In some other examples, the first mode may also be a two-person mode, that is, the first application needs to recognize touch operations input by two users at the same time, for example touch operations used to trigger the first function. In that case, the first application determines whether to start or exit the first function according to the gesture erasing method provided in the embodiment of the present disclosure, and the details are not repeated here.
Therefore, according to the gesture erasing method provided by the disclosure, when a fourth touch point exists in a range other than the target range, different services can be provided by judging the mode of the electronic device. For example: when the first mode is the single-user mode, the electronic device allows only one user to use the first function, so the user experience is protected from accidental touches or interference by other users.
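The single-user-mode filtering of S17 can be sketched as follows. The representation of the target range as an axis-aligned (x_min, y_min, x_max, y_max) box is an assumption (the text describes a palm-sized rectangle around the first touch point), as are the function and mode names.

```python
def filter_out_of_range_points(points, target_range, mode):
    """S17 sketch: in single-user mode, touch points falling outside the
    target range (fourth touch points) are deleted so that another user's
    touches cannot disturb the current erase. In other modes the points
    are kept for separate handling (S18/S19)."""
    if mode != "single":
        return points
    x0, y0, x1, y1 = target_range
    return [(x, y) for x, y in points if x0 <= x <= x1 and y0 <= y <= y1]
```

With a target range of (0, 0, 10, 10) in single-user mode, a touch at (50, 50) is dropped while one at (1, 1) is kept; in any other mode both survive.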
In some examples, in conjunction with fig. 4, as shown in fig. 6, the gesture erasing method provided by the embodiments of the present disclosure further includes S18 and S19.
S18, when a fourth touch point exists in a range other than the target range and the electronic device is not in the first mode, acquiring a third contact area corresponding to the fourth touch point.
In some examples, in combination with the example given in S17 above, the first application provides multiple operation modes. When the first application is not in the first mode (for example, the first mode is a single-user mode), the first application can also recognize touch operations input by a user in a range other than the target range. For example, when a fourth touch point exists outside the target range and the electronic device is not in the first mode, the first application acquires a third contact area corresponding to the fourth touch point, and then determines whether to start the third function according to that area.
S19, starting a third function when the third contact area is greater than or equal to the area threshold.
In particular, the third function may be a gesture wipe function.
The above example is described by taking the first mode as the single-user mode. In some other examples, the first mode may be any mode other than the single-user mode (e.g., a two-person mode), which is not limited herein.
Therefore, according to the gesture erasing method provided by the disclosure, when a fourth touch point exists in a range other than the target range, different services can be provided by judging the mode of the electronic device. For example: in a two-person mode, the electronic device allows two users to use the first function at the same time, so service can be provided for both users and the user experience is guaranteed.
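Steps S17–S19 together form a small dispatch on the device mode, which can be sketched as one function. The return labels and mode names are illustrative, and so is treating "first mode" as "single"; the comparison rule itself follows the text.

```python
def handle_fourth_touch_point(mode, third_contact_area, area_threshold):
    """S17-S19 sketch: a fourth touch point outside the target range is
    deleted in the first (single-user) mode; in any other mode its contact
    area is compared against the area threshold to decide whether to start
    a third function (e.g. a second user's gesture erase)."""
    if mode == "single":
        return "delete_touch_point"       # S17
    if third_contact_area >= area_threshold:
        return "start_third_function"     # S19
    return "ignore"                       # below threshold: no function started
```

In single-user mode the out-of-range point is simply discarded; in a two-person mode a palm-sized contact from the second user starts the third function.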
In some examples, in conjunction with fig. 4, as shown in fig. 6, the gesture erasing method provided by the embodiments of the present disclosure further includes S20.
S20, after the first touch point disappears, if no standby touch point exists in the target range, exiting the first function.
It should be noted that, in the above example, after the first touch point disappears, if there is no spare touch point in the target range, the first function is exited. In some other examples, when the first touch point does not disappear, the first function is continuously executed.
As can be seen from the above, in the gesture erasing method provided by the present disclosure, the user triggers the first function through at least one first touch point input on the touch screen of the display device 200; for example, the first function is a gesture erasing function. The display device 200 thus turns on the gesture erasing function, and the user can erase the written content. During erasing, the display device 200 monitors the target range containing the first touch point. Therefore, after the first touch point disappears, if no standby touch point exists within the target range, the user no longer needs the first function to be executed, so the first function can be exited and the user experience is guaranteed.
In some examples, in conjunction with fig. 6, as shown in fig. 7, the gesture erasing method provided by the embodiments of the present disclosure further includes S21.
S21, after the first touch point disappears, if there is no spare touch point in the target range and there is a newly added fifth touch point in the target range within a second preset time, continuing to execute the first function.
In some examples, in combination with the example given in S15 above, a user who has not finished erasing the written content may mistakenly lift both the palm and the fingers, so that the first application can detect neither the first touch point nor any spare touch point within the target range. In that case, the first application continues to monitor the target range for a second preset time. For example: if a newly added fifth touch point appears within the target range within the second preset time, the first function continues to be executed.
Specifically, the second preset time may be set according to user requirements; for example, the second preset time is 100 ms.
Therefore, the gesture erasing method provided by the disclosure handles the case in which the user, without having finished erasing, mistakenly lifts both the palm and the fingers so that neither the first touch point nor a spare touch point can be detected within the target range. The first application then continues to monitor the target range for the second preset time; if a newly added fifth touch point appears within that time, the first function continues to be executed, ensuring the user experience.
In some examples, in conjunction with fig. 6, as shown in fig. 7, the gesture erasing method provided by the embodiments of the present disclosure further includes S22.
S22, after the first touch point disappears, if there is no spare touch point in the target range and there is no newly added fifth touch point in the target range within the second preset time, exiting the first function.
In some examples, in combination with the example given in S21 above, the first application continues to monitor the target range for the second preset time when neither the first touch point nor a spare touch point is detected within the target range. For example: if no newly added fifth touch point appears within the target range within the second preset time, the user no longer needs the first function, so the first application exits the first function and resumes the preset function.
It should be noted that the above examples describe how the first function is started and how it is exited. In other examples, when the second function is the same as the first function, or the third function is the same as the first function, the second function or the third function is exited in the same way as the first function, and the details are not repeated here.
Therefore, the gesture erasing method provided by the disclosure also handles the case in which the user, without having finished erasing the written content, mistakenly lifts both the palm and the fingers, so that neither the first touch point nor a spare touch point can be detected within the target range. The first application then continues to monitor the target range for the second preset time. For example: if no newly added fifth touch point appears within the target range within the second preset time, the user no longer needs the first function, so the first function is exited and the user experience is guaranteed.
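Steps S20–S22 can be summarized as one state decision made after the first touch point lifts. This is a sketch: the 100 ms window is the example value from the text, while the function name, the return labels, and passing the new-point arrival delay as a number (or None when no point appears) are assumptions.

```python
SECOND_PRESET_TIME_MS = 100  # example value given in the text

def erase_state_after_lift(spare_points, new_point_delay_ms):
    """S20-S22 sketch: after the first touch point disappears, the first
    function continues if any spare touch point remains (S12). With no
    spares, the target range is still watched for the second preset time;
    a new (fifth) touch point arriving within that window keeps the
    function alive (S21), otherwise the function is exited (S20/S22).

    `new_point_delay_ms` is the arrival time of the fifth touch point
    relative to the lift, or None if no new point appeared."""
    if spare_points:
        return "continue"
    if new_point_delay_ms is not None and new_point_delay_ms <= SECOND_PRESET_TIME_MS:
        return "continue"
    return "exit_first_function"
```

So lifting the palm while fingers remain continues erasing; lifting everything continues only if a new touch lands within the grace window, and exits otherwise.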
The scheme provided by the embodiment of the application is mainly introduced from the perspective of a method. To implement the above functions, it includes hardware structures and/or software modules for performing the respective functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional modules according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
As shown in fig. 8, an embodiment of the present application provides a schematic structural diagram of the display device 200, which includes a processor 101.
The processor 101 is configured to, after the first function is started, mark a second touch point other than the first touch point as a standby touch point if such a second touch point exists within the target range; the first touch point is used for triggering the first function. The processor 101 is further configured to continue executing the first function if a standby touch point exists within the target range after the first touch point disappears.
In some implementable examples, display device 200 also includes communicator 102; the communicator 102 is configured to acquire a first contact area corresponding to at least one first touch point on the touch screen; and the processor 101 is configured to start the first function when the first contact area acquired by the communicator 102 is greater than or equal to the area threshold.
In some practical examples, the processor 101 is further configured to control the communicator 102 to acquire a second contact area corresponding to both the first touch point and the third touch point if the first contact area acquired by the communicator 102 is smaller than the area threshold and a newly added third touch point exists in the target range within the first preset time; the processor 101 is further configured to activate the second function if the second contact area acquired by the communicator 102 is greater than or equal to the area threshold.
In some practical examples, the processor 101 is further configured to delete the fourth touch point if the fourth touch point exists in a range other than the target range and the electronic device is in the first mode.
In some implementable examples, the display device 200 further includes the communicator 102; the processor 101 is further configured to control the communicator 102 to obtain a third contact area corresponding to a fourth touch point when the fourth touch point exists in a range other than the target range and the electronic device is not in the first mode; and the processor 101 is further configured to start a third function if the third contact area acquired by the communicator 102 is greater than or equal to the area threshold.
In some practical examples, the processor 101 is further configured to exit the first function if there is no spare touch point in the target range after the first touch point disappears.
In some practical examples, the processor 101 is further configured to continue to execute the first function after the first touch point disappears if there is no spare touch point in the target range and there is a new fifth touch point in the target range within a second preset time.
In some practical examples, the processor 101 is further configured to exit the first function if there is no spare touch point in the target range and there is no newly added fifth touch point in the target range within a second preset time after the first touch point disappears.
All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and the function thereof is not described herein again.
Of course, the display device 200 provided in the embodiment of the present application includes, but is not limited to, the above modules; for example, the display device 200 may further include the memory 103. The memory 103 may be used for storing the program code of the display device 200, and may also be used for storing data generated by the display device 200 during operation, such as data in a write request.
As an example, in conjunction with fig. 3, the acquisition unit 210 in the display device 200 implements the same function as the communicator 102 in fig. 8, the processing unit 211 implements the same function as the processor 101 in fig. 8, and the storage unit 212 implements the same function as the memory 103 in fig. 8.
An embodiment of the present application further provides a server, where the server may include: a memory and one or more processors. The memory is coupled to the processor. The memory is for storing computer program code comprising computer instructions. The server may perform the various functions or steps performed by electronic device 400 in the above-described method embodiments when the processor executes the computer instructions.
The embodiment of the present application further provides a chip system, which can be applied to the electronic device 400 in the foregoing embodiment. As shown in fig. 9, the system-on-chip includes at least one processor 1501 and at least one interface circuit 1502. The processor 1501 may be a processor in the electronic device 400 described above. The processor 1501 and the interface circuit 1502 may be interconnected by wires. The processor 1501 may receive and execute computer instructions from the memory of the electronic device 400 described above via the interface circuit 1502. The computer instructions, when executed by the processor 1501, may cause the electronic device 400 to perform the various steps performed by the electronic device 400 in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
The embodiment of the present application further provides a computer-readable storage medium for storing the computer instructions executed by the electronic device 400.
Embodiments of the present application also provide a computer program product, which includes computer instructions executed by the electronic device 400.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the foregoing discussion in some embodiments is not intended to be exhaustive or to limit the implementations to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A gesture erasing method, applied to an electronic device comprising a touch screen, the method comprising:
after a first function is started, if a second touch point other than a first touch point exists in a target range, marking the second touch point as a standby touch point, wherein the first touch point is used for triggering the first function; and
after the first touch point disappears, if the standby touch point exists in the target range, continuing to execute the first function.
2. The gesture erasing method according to claim 1, wherein before marking the second touch point as the standby touch point, the method further comprises:
acquiring a first contact area corresponding to at least one first touch point on the touch screen; and
starting the first function if the first contact area is greater than or equal to an area threshold.
3. The gesture erasing method according to claim 2, further comprising:
in a case that the first contact area is smaller than the area threshold and a newly added third touch point exists in the target range within a first preset duration, acquiring a second contact area corresponding to the third touch point; and
starting a second function if the second contact area is greater than or equal to the area threshold.
4. The gesture erasing method according to claim 1, further comprising:
deleting a fourth touch point in a range other than the target range in a case that the electronic device is in a first mode.
5. The gesture erasing method according to claim 1, further comprising:
in a case that a fourth touch point exists in a range other than the target range and the electronic device is not in the first mode, acquiring a third contact area corresponding to the fourth touch point; and
starting a third function if the third contact area is greater than or equal to the area threshold.
6. The gesture erasing method according to any one of claims 1-5, further comprising:
after the first touch point disappears, if the standby touch point does not exist in the target range, exiting the first function.
7. The gesture erasing method according to any one of claims 1-5, further comprising:
after the first touch point disappears, if the standby touch point does not exist in the target range but a newly added fifth touch point exists in the target range within a second preset duration, continuing to execute the first function.
8. A gesture erasing apparatus, comprising:
a processing unit, configured to mark, after a first function is started, a second touch point other than a first touch point as a standby touch point if the second touch point exists in a target range, wherein the first touch point is used for triggering the first function;
wherein the processing unit is further configured to continue to execute the first function if, after the first touch point disappears, the standby touch point exists in the target range.
9. An electronic device, comprising: a communication interface, a processor, a memory, and a bus, wherein the memory is configured to store computer-executable instructions and the processor is connected to the memory through the bus; and when the electronic device runs, the processor executes the computer-executable instructions stored in the memory, so that the electronic device performs the gesture erasing method according to any one of claims 1-7.
10. A computer-readable storage medium, storing a computer program which, when executed by a computing device, causes the computing device to implement the gesture erasing method according to any one of claims 1-7.
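The touch-point logic of claims 1-3 and 6 can be sketched as a small state machine: a large-area contact starts the erase function, an extra touch point inside the target range is marked as a standby point, and when the triggering point lifts, the standby point keeps erasing alive. This is a minimal illustrative sketch, not the patented implementation — the class name `GestureEraser`, the `AREA_THRESHOLD` value, and the event-handler names are assumptions introduced for illustration.

```python
# Illustrative sketch of the claimed touch-point bookkeeping (names and
# the threshold value are assumptions, not taken from the patent text).

AREA_THRESHOLD = 40.0  # example palm-sized contact area, arbitrary units


class GestureEraser:
    def __init__(self):
        self.erasing = False       # whether the first function (erase) is active
        self.first_touch = None    # id of the touch point that triggered erasing
        self.standby_touch = None  # id of a second touch point in the target range

    def touch_down(self, touch_id, contact_area, in_target_range):
        if not self.erasing:
            # Claim 2: start the erase function when a touch point's
            # contact area reaches the area threshold.
            if contact_area >= AREA_THRESHOLD:
                self.erasing = True
                self.first_touch = touch_id
        elif in_target_range and self.standby_touch is None:
            # Claim 1: mark an extra touch point in the target range as standby.
            self.standby_touch = touch_id

    def touch_up(self, touch_id):
        if touch_id != self.first_touch:
            # A non-triggering point lifted; forget it if it was the standby point.
            if touch_id == self.standby_touch:
                self.standby_touch = None
            return
        if self.standby_touch is not None:
            # Claim 1: a standby touch point exists, so keep erasing with it.
            self.first_touch = self.standby_touch
            self.standby_touch = None
        else:
            # Claim 6: no standby touch point remains, so exit the erase function.
            self.erasing = False
            self.first_touch = None
```

In a typical sequence, a palm press starts erasing, a second finger in the target range becomes the standby point, and lifting the palm hands the erase function over to that finger instead of ending it.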
CN202210583448.5A 2022-05-25 2022-05-25 Gesture erasing method and device and electronic equipment Pending CN115113793A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210583448.5A CN115113793A (en) 2022-05-25 2022-05-25 Gesture erasing method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN115113793A (en)

Family

ID=83325530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210583448.5A Pending CN115113793A (en) 2022-05-25 2022-05-25 Gesture erasing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN115113793A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014169597A1 (en) * 2013-09-09 2014-10-23 中兴通讯股份有限公司 Text erasure method and device
CN106126101A (en) * 2016-06-24 2016-11-16 维沃移动通信有限公司 The operational approach of a kind of mobile terminal and mobile terminal
CN108958627A (en) * 2018-07-04 2018-12-07 Oppo广东移动通信有限公司 touch operation method, device, storage medium and electronic equipment
CN112363657A (en) * 2016-10-26 2021-02-12 海信视像科技股份有限公司 Gesture erasing method and device
CN114003145A (en) * 2021-11-01 2022-02-01 深圳市康冠商用科技有限公司 Touch screen writing and erasing method and device, electronic whiteboard and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination