WO2023016076A1 - Control method for terminal touch screen, terminal device and storage medium - Google Patents

Control method for terminal touch screen, terminal device and storage medium

Info

Publication number
WO2023016076A1
Authority
WO
WIPO (PCT)
Prior art keywords
trigger condition
current gesture
gesture information
current
touch
Prior art date
Application number
PCT/CN2022/097915
Other languages
English (en)
Chinese (zh)
Inventor
张光华
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2023016076A1 publication Critical patent/WO2023016076A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • the present disclosure relates to the field of computers, and in particular, to a method for controlling a terminal touch screen, a terminal device, and a storage medium.
  • the touch screen is mainly divided into a resistive touch screen and a capacitive touch screen.
  • the resistive touch screen is controlled by pressure sensing, and the voltage value of the pressed point on the touch screen is converted into a coordinate value in a rectangular area.
  • According to the trigger condition set by the manufacturer, a current electronic terminal judges whether a touch point on the user's touch screen is a valid touch event point that should be reported.
  • the present disclosure provides a method for controlling a terminal touch screen, a terminal device and a storage medium.
  • the present disclosure provides a method for controlling a touch screen of a terminal.
  • The method includes: when detecting a user's current gesture operation on the touch screen, acquiring current gesture information corresponding to the current gesture operation; when the current gesture information meets the initial trigger condition and/or the target trigger condition, determining that the current gesture operation is a valid gesture operation, and triggering the corresponding response operation; wherein the initial trigger condition includes a default trigger condition set at the factory, and the target trigger condition includes a trigger condition generated based on the user's historical gesture operations.
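  • As a rough illustration only, the Python sketch below mirrors this control flow; the function names, the data shapes and the way the two checks are combined are assumptions made for the example, not the claimed implementation.

```python
from typing import Callable, List, Tuple

TouchPoint = Tuple[float, float]    # (x, y) coordinates of one touch point
GestureInfo = List[TouchPoint]      # ordered touch points of one gesture operation

def handle_gesture(gesture: GestureInfo,
                   meets_initial: Callable[[GestureInfo], bool],
                   meets_target: Callable[[GestureInfo], bool],
                   respond: Callable[[GestureInfo], None]) -> bool:
    """Return True and trigger the response when either trigger condition is met."""
    if meets_initial(gesture) or meets_target(gesture):
        respond(gesture)   # valid gesture operation: trigger the corresponding response
        return True
    return False           # neither condition met: treated as a false touch, no response
```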
  • The present disclosure also provides a terminal device, including a touch screen, a processor, and a memory; the touch screen is coupled to the processor; the memory is used to store a program; and the processor is used to execute the program and, when the program is executed, to implement the above method for controlling the touch screen of the terminal.
  • The present disclosure also provides a storage medium for readable storage; the storage medium stores one or more programs, and the one or more programs can be executed by one or more processors to implement the above method for controlling the terminal touch screen.
  • FIG. 1 is a schematic structural diagram of a terminal device provided by an embodiment of the present disclosure;
  • FIG. 2 is a schematic flowchart of a method for controlling a terminal touch screen provided by an embodiment of the present disclosure;
  • FIG. 3 is a schematic flowchart of comparing the current gesture information with the initial trigger condition provided by an embodiment of the present disclosure;
  • FIG. 4 is a schematic flowchart of comparing the current gesture information with the target trigger condition provided by an embodiment of the present disclosure.
  • Embodiments of the present disclosure provide a method for controlling a terminal touch screen, a terminal device, and a storage medium.
  • The control method of the terminal touch screen can be applied to the terminal device. By comparing the current gesture information corresponding to the user's current gesture operation with both the initial trigger condition set at the factory and the target trigger condition generated based on the user's operating habits, the method prevents the current gesture operation from being misjudged as an invalid gesture operation when the current gesture information does not meet the initial trigger condition merely because the user's gesture operation habit differs from the gesture operation set at the factory; the current gesture information is further compared with the target trigger condition to determine whether the current gesture operation is a valid gesture operation, which improves the sensitivity of the touch screen.
  • FIG. 1 is a schematic structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • The terminal device 100 may include a processor 101, a memory 102, and a touch screen 103, where the processor 101 and the memory 102 may be connected through a bus, for example any applicable bus such as an I2C (Inter-Integrated Circuit) bus.
  • The touch screen is coupled with the processor.
  • the memory 102 may include a non-volatile storage medium and an internal memory.
  • Non-volatile storage media can store operating systems and computer programs.
  • the computer program includes program instructions. When the program instructions are executed, the processor 101 may execute any method for controlling the touch screen of the terminal.
  • the processor 101 is used to provide calculation and control capabilities, and support the operation of the entire terminal device 100 .
  • The processor 101 can be a central processing unit (CPU), and can also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • The processor 101 is configured to run a computer program stored in the memory 102, and to implement the following steps when executing the computer program: when a current gesture operation of the user on the touch screen is detected, obtaining current gesture information corresponding to the current gesture operation; when the current gesture information meets the initial trigger condition and/or the target trigger condition, determining that the current gesture operation is a valid gesture operation, and triggering the corresponding response operation; wherein the initial trigger condition includes a default trigger condition set at the factory, and the target trigger condition includes a trigger condition generated based on the user's historical gesture operations.
  • The processor 101 is further configured to implement the following step when executing the computer program: when the current gesture information does not meet the initial trigger condition and the current gesture information does not meet the target trigger condition, determining that the current gesture operation is a false touch gesture operation.
  • After determining that the current gesture operation is a false touch gesture operation when the current gesture information does not meet the initial trigger condition and the current gesture information does not meet the target trigger condition, the processor 101 is further configured to implement: judging whether the current gesture information exists in the false touch gesture information database; if the current gesture information exists in the false touch gesture information database, counting, in the false touch gesture information database, the number of false touches corresponding to the current gesture information, and updating the target trigger condition according to the current gesture information when the number of false touches corresponding to the current gesture information reaches a preset threshold of false touch times.
  • After the processor 101 detects the current gesture operation of the user on the touch screen and acquires the current gesture information corresponding to the current gesture operation, it is further configured to: according to the initial trigger condition, determine at least two first effective touch points among the touch points of the current gesture information, and calculate the first slope of the line between the two first effective touch points; and compare the first slope with the first slope threshold in the initial trigger condition to determine whether the current gesture information meets the initial trigger condition.
  • When the touch screen is a flexible screen, the processor 101, when determining at least two first effective touch points among the touch points of the current gesture information and calculating the first slope of the line between them, is further configured to: determine at least two first effective touch points among the touch points, and calculate the initial slope of the line between the two first effective touch points; and acquire the current bending angle of the touch screen, correct the initial slope according to the current bending angle, and generate the first slope.
  • After the processor 101 detects the current gesture operation of the user on the touch screen and obtains the current gesture information corresponding to the current gesture operation, it is further configured to: according to the target trigger condition, determine at least two second effective touch points among the touch points of the current gesture information, and calculate the second slope of the line between the two second effective touch points; and compare the second slope with the second slope threshold in the target trigger condition to determine whether the current gesture information meets the target trigger condition.
  • After the processor 101 detects the current gesture operation of the user on the touch screen and obtains the current gesture information corresponding to the current gesture operation, it is further configured to: according to each touch point in the current gesture information, determine the current sliding trajectory corresponding to the current gesture information; and compare the current sliding trajectory with the first preset sliding trajectory in the initial trigger condition and/or the second preset sliding trajectory in the target trigger condition, to determine whether the current gesture information meets the initial trigger condition and/or the target trigger condition.
  • Before determining that the current gesture operation is a valid gesture operation when the current gesture information meets the initial trigger condition and/or the target trigger condition, the processor 101 is further configured to: when a trigger condition setting instruction is received or a false touch instruction is detected, acquire the user's historical gesture operations, and generate the target trigger condition based on the historical gesture information corresponding to the historical gesture operations.
  • FIG. 2 is a schematic flowchart of a method for controlling a terminal touch screen provided by an embodiment of the present disclosure.
  • The method for controlling the terminal touch screen is applied to a terminal device. The terminal device includes a touch screen event collection component, a gesture learning component, and a gesture determination and trigger component.
  • The touch screen event collection component is used to collect the coordinate information of the event points, that is, the touch points, triggered when the user touches the capacitive or resistive touch screen.
  • The gesture learning component is used to generate a trigger condition that matches the user's gesture operation habits, based on the historical touch points collected by the touch screen event collection component and the corresponding judgment results of whether they are valid.
  • The gesture determination and trigger component is used to compare the current gesture information collected by the touch screen event collection component with the trigger condition generated by the gesture learning component or with the initial trigger condition set when the terminal leaves the factory, and to determine accordingly whether to trigger and respond to the corresponding gesture instruction. The method for controlling the touch screen of the terminal includes step S10 to step S30.
  • Step S10: when a current gesture operation by the user on the touch screen is detected, current gesture information corresponding to the current gesture operation is acquired.
  • In addition to comparing the current gesture information corresponding to the user's current gesture operation with the initial trigger condition set at the factory, the current gesture information is further compared with the target trigger condition corresponding to the user. Through this comparison, misjudgments caused by the user's operating habits are reduced, and the sensitivity of the touch screen is improved.
  • the gesture determining and triggering component acquires current gesture information corresponding to the current gesture operation when detecting the current gesture operation generated by the user touching the touch screen.
  • the current gesture information may be each touch point of the user on the touch screen (for example, may be the coordinates of each touch point), or may be a sliding track of the user on the touch screen.
  • The capacitive touch screen judges the touch point by sensing the capacitance change generated by the touch of a human body. It has two sets of signal lines, the driving lines and the sensing lines: the driving lines emit signals, and the sensing lines detect the change in capacitance value.
  • When a finger touches the metal layer, the electric field of the human body causes the finger and the surface of the touch screen to form a coupling capacitance.
  • Because this capacitance acts as a direct conductor, the finger draws a small current away from the contact point; this affects the coupling between the two electrodes near the touch point, thereby changing the capacitance between these two electrodes.
  • The electrodes along the driving lines send excitation signals in sequence, and all the electrodes along the sensing lines receive signals at the same time, so that the change in capacitance value at the intersections of all the horizontal and vertical electrodes, that is, the two-dimensional capacitance of the entire touch screen plane, can be obtained.
  • According to the two-dimensional capacitance change data of the touch screen, the coordinates of each touch point can be calculated, so even if there are multiple touch points on the screen, the real coordinates of each touch point can be computed. Therefore, the coordinates of each touch point of the current gesture operation can be acquired and calculated as the current gesture information.
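  • Purely as an illustration of how touch point coordinates might be derived from the two-dimensional capacitance change data, the sketch below keeps cells whose capacitance change is a local maximum above a noise threshold; the grid shape, threshold value and function name are assumptions, and real touch controllers use far more elaborate signal processing.

```python
from typing import List, Tuple

def find_touch_points(delta_cap: List[List[float]],
                      noise_threshold: float = 5.0) -> List[Tuple[int, int]]:
    """Return (row, col) electrode intersections whose capacitance change is a
    local maximum above the noise threshold; supports multiple simultaneous touches."""
    if not delta_cap:
        return []
    points = []
    rows, cols = len(delta_cap), len(delta_cap[0])
    for r in range(rows):
        for c in range(cols):
            v = delta_cap[r][c]
            if v < noise_threshold:
                continue
            neighbours = [delta_cap[rr][cc]
                          for rr in range(max(0, r - 1), min(rows, r + 2))
                          for cc in range(max(0, c - 1), min(cols, c + 2))
                          if (rr, cc) != (r, c)]
            if all(v >= n for n in neighbours):
                points.append((r, c))   # intersection of a driving and a sensing electrode
    return points
```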
  • FIG. 3 is a schematic flowchart of comparing the current gesture information with the initial trigger condition provided by an embodiment of the present disclosure, which may include the following steps S101 and S102 .
  • Step S101: determine at least two first effective touch points among the touch points of the current gesture information, and calculate a first slope of the line connecting the two first effective touch points.
  • The gesture determining and triggering component determines, according to the touch points set in the initial trigger condition, such as the first touch point and the second touch point, the corresponding touch points among the touch points of the current gesture information; that is, it obtains the first touch point and the second touch point among the touch points of the current gesture information as the first valid touch points, and then calculates the slope of the line between the first valid touch points as the first slope.
  • The touch screen event collection component sequentially adds each touch point corresponding to the current gesture operation to the touch point queue according to the user's touch sequence, so as to record each touch point and its corresponding order; the touch screen event collection component then reports the collected touch points and their corresponding order to the gesture determination and triggering component.
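  • This collection-and-report step could look roughly as follows, with the touch point queue modelled as a deque; the component interface is an assumption for the sake of the sketch.

```python
from collections import deque
from typing import Deque, List, Tuple

TouchPoint = Tuple[float, float]

def collect_and_report(raw_points: List[TouchPoint]) -> List[Tuple[int, TouchPoint]]:
    """Queue touch points in the order the user produced them, then report
    (order, point) pairs to the gesture determination and trigger component."""
    queue: Deque[TouchPoint] = deque(raw_points)   # touch point queue, in touch sequence
    return list(enumerate(queue))
```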
  • Step S102: compare the first slope with the first slope threshold in the initial trigger condition to determine whether the current gesture information meets the initial trigger condition.
  • After the gesture determination and triggering component calculates the first slope, it compares the first slope with the first slope threshold in the initial trigger condition; that is, when the first slope reaches the first slope threshold, it is determined that the current gesture information meets the initial trigger condition, and when the first slope does not reach the first slope threshold, it is determined that the current gesture information does not meet the initial trigger condition.
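  • A minimal sketch of steps S101 and S102 follows, assuming the initial trigger condition selects the first and second touch points and that "reaching" the threshold means the absolute slope is at least the threshold value (the exact comparison is not fixed by the text above).

```python
from typing import List, Tuple

TouchPoint = Tuple[float, float]

def slope(p1: TouchPoint, p2: TouchPoint) -> float:
    """Slope of the line through two touch points; a vertical line maps to infinity."""
    dx = p2[0] - p1[0]
    return float("inf") if dx == 0 else (p2[1] - p1[1]) / dx

def meets_initial_condition(points: List[TouchPoint],
                            first_slope_threshold: float) -> bool:
    """Steps S101/S102: first slope from the first two touch points vs. the threshold."""
    if len(points) < 2:
        return False
    first_slope = slope(points[0], points[1])   # first effective touch points
    return abs(first_slope) >= first_slope_threshold
```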
  • When the touch screen is a flexible screen, the step of determining at least two first effective touch points among the touch points of the current gesture information and calculating the first slope of the line connecting the two first effective touch points includes: determining at least two first effective touch points among the touch points of the current gesture information, and calculating an initial slope of the line connecting the two first effective touch points; and acquiring the current bending angle of the touch screen, correcting the initial slope according to the current bending angle, and generating the first slope.
  • Since the touch screen in this embodiment is a flexible screen, the bending angle of the touch screen is used as a gesture information parameter of the gesture operation, and the slope thresholds corresponding to gesture information at different bending angles are different.
  • The curvature is denoted as C, the slope is denoted as K, and the slope between the touch points corrected according to the bending angle is denoted as K*C.
  • The process of judging whether the current gesture information meets the initial trigger condition is as follows: the gesture judging and triggering component obtains the slope of the line between the first valid touch points, for example the slope K1 of the line between the first touch point and the second touch point, and obtains the current bending angle C1 of the touch screen when the user generates the current gesture operation; the slope K1*C1, corrected according to the bending angle, is then compared with the first slope threshold K0*C0 in the initial trigger condition. If K1*C1 reaches K0*C0, it is determined that the current gesture information meets the initial trigger condition; if K1*C1 does not reach K0*C0, it is determined that the current gesture information does not meet the initial trigger condition.
  • In some embodiments, an error deviation rate is denoted as V, and the first slope threshold is a threshold range.
  • The threshold range corresponding to the first slope threshold is K*(1-V) to K*(1+V); when the slope is corrected by the bending angle, the first slope threshold corresponds to a threshold range of K*C*(1-V) to K*C*(1+V). That is, when the first slope falls within the threshold range corresponding to the first slope threshold, the first slope is considered to reach the first slope threshold.
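  • The bending correction and threshold range described above could be expressed as in the sketch below; here C is treated as a multiplicative correction factor derived from the current bending angle and V as the error deviation rate, following the notation of this passage, while the function names are assumptions.

```python
def corrected_slope(k: float, c: float) -> float:
    """Slope between touch points corrected by the bending factor: K*C."""
    return k * c

def reaches_first_slope_threshold(k1: float, c1: float,
                                  k0: float, c0: float, v: float) -> bool:
    """True when K1*C1 falls inside the range K0*C0*(1-V) .. K0*C0*(1+V)."""
    centre = corrected_slope(k0, c0)
    low, high = sorted((centre * (1 - v), centre * (1 + v)))  # handles negative slopes too
    return low <= corrected_slope(k1, c1) <= high
```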
  • FIG. 4 is a schematic flowchart of steps of comparing the current gesture information with the target trigger condition according to an embodiment of the present disclosure, which may include the following steps S103 and S104 .
  • Step S103: determine at least two second valid touch points among the touch points of the current gesture information, and calculate a second slope of the line connecting the two second valid touch points.
  • The gesture determination and trigger component determines, according to the order set in the target trigger condition, at least two touch points, such as the first touch point and the last touch point; it then determines the corresponding touch points among the touch points of the current gesture information, that is, it obtains the first touch point and the last touch point among the touch points of the current gesture information as the second valid touch points, and then calculates the slope of the line between the second valid touch points as the second slope.
  • When the target trigger condition of the gesture determination and triggering component is that the slope between any two touch points reaches the second slope threshold, the component calculates the slope between each pair of touch points, taking any two touch points in turn as the second valid touch points, and then sequentially calculates the slopes of the lines between the second valid touch points as the second slopes.
  • The gesture determination and triggering component can compare the current gesture information with the initial trigger condition and the target trigger condition at the same time; it can also first compare the current gesture information with the initial trigger condition and, when the current gesture information does not meet the initial trigger condition, then compare the current gesture information with the target trigger condition.
  • the initial trigger condition may also be replaced by a target trigger condition conforming to the user's gesture operation habits, so as to directly compare the current gesture information with the target trigger condition.
  • When the current gesture information meets either of the initial trigger condition and the target trigger condition, it is determined that the current gesture operation is a valid gesture operation.
  • When the current gesture information meets neither the initial trigger condition nor the target trigger condition, it is determined that the current gesture operation is an invalid gesture operation.
  • In some embodiments, before judging the validity of the current gesture operation based on the initial trigger condition and the target trigger condition, the method further includes: when a trigger condition setting instruction is received or a false touch instruction is detected, acquiring the user's historical gesture operations, and generating the target trigger condition based on the historical gesture information corresponding to the historical gesture operations.
  • The user can set the trigger condition based on actual needs; that is, when the gesture learning component receives a trigger condition setting instruction triggered by a user setting operation, or when the gesture learning component detects a false touch instruction (or when the number of false touch instructions reaches a preset number of times), it can generate, based on the historical gesture information corresponding to each user's historical gesture operations, a trigger condition corresponding to each user as the target trigger condition.
  • The difference between the target trigger condition and the initial trigger condition may be that the effective touch points are changed from preset touch points to any two touch points: in the initial trigger condition, the slope between the first touch point and the second touch point must reach the first slope threshold, whereas in the target trigger condition, the slope between any two touch points among the touch points corresponding to the current gesture information must reach the second slope threshold. The second slope threshold may be the same as the first slope threshold, or the first slope threshold may be increased or decreased according to the user's actual operation data to obtain the second slope threshold.
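  • As one possible (assumed) way of adjusting the threshold from the user's actual operation data, the gesture learning component could average the slopes observed in the user's historical gesture operations and fall back to the factory threshold when no usable history exists; the function name and the averaging rule are illustrative only.

```python
from statistics import mean
from typing import List, Tuple

TouchPoint = Tuple[float, float]

def learn_second_slope_threshold(historical_gestures: List[List[TouchPoint]],
                                 first_slope_threshold: float) -> float:
    """Average the observed slopes of the user's historical gestures; fall back to
    the factory (first) slope threshold when there is no usable history."""
    observed = []
    for pts in historical_gestures:
        if len(pts) >= 2 and pts[1][0] != pts[0][0]:
            observed.append(abs((pts[1][1] - pts[0][1]) / (pts[1][0] - pts[0][0])))
    return mean(observed) if observed else first_slope_threshold
```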
  • Step S104: compare the second slope with the second slope threshold in the target trigger condition to determine whether the current gesture information meets the target trigger condition.
  • After the gesture determination and triggering component calculates the second slope, it compares the second slope with the second slope threshold in the target trigger condition; that is, when the second slope reaches the second slope threshold, it is determined that the current gesture information meets the target trigger condition, and when the second slope does not reach the second slope threshold, it is determined that the current gesture information does not meet the target trigger condition.
  • When the target trigger condition is that the slope between any two touch points reaches the second slope threshold, and the slope between any two of the touch points does reach the second slope threshold, it is determined that the current gesture information meets the target trigger condition; otherwise, it is determined that the current gesture information does not meet the target trigger condition.
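  • The sketch below takes the strict all-pairs reading of this "any two touch points" variant, i.e. every pairwise slope must reach the second slope threshold; a looser reading (any single pair suffices) would replace the early return accordingly. The function name and the use of absolute slopes are assumptions.

```python
from itertools import combinations
from typing import List, Tuple

TouchPoint = Tuple[float, float]

def meets_target_condition_all_pairs(points: List[TouchPoint],
                                     second_slope_threshold: float) -> bool:
    """Every pairwise slope must reach the second slope threshold."""
    if len(points) < 2:
        return False
    for p1, p2 in combinations(points, 2):
        dx = p2[0] - p1[0]
        pair_slope = float("inf") if dx == 0 else abs((p2[1] - p1[1]) / dx)
        if pair_slope < second_slope_threshold:
            return False
    return True
```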
  • The step of comparing the current gesture information with the initial trigger condition and/or the target trigger condition may further include: according to each touch point in the current gesture information, determining the current sliding track corresponding to the current gesture information; and comparing the current sliding track with the first preset sliding track in the initial trigger condition and/or the second preset sliding track in the target trigger condition, so as to judge accordingly whether the current gesture information meets the initial trigger condition and/or the target trigger condition.
  • The gesture determination and triggering component acquires each touch point corresponding to the current gesture information and, according to the touch points and their order, fits the current sliding trajectory corresponding to the current gesture operation; the current sliding trajectory is then compared with the preset trajectory in the initial trigger condition. If the current sliding track matches the preset track, it is determined that the current gesture information meets the initial trigger condition; if the current sliding track does not match the preset track, it is determined that the current gesture information does not meet the initial trigger condition.
  • the gesture learning component may adjust the preset trajectories in the initial trigger conditions to generate the target trigger conditions. If the current sliding trajectory matches the adjusted trajectory in the target trigger condition, it is determined that the current gesture information meets the target trigger condition. If the current sliding trajectory does not match the adjusted trajectory in the target trigger condition, it is determined that the current gesture information does not meet the target trigger condition.
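  • The passage does not specify how sliding trajectories are matched; one simple assumed approach, sketched below, is to resample both trajectories to the same number of points and accept the match when the average point-to-point distance stays under a tolerance. The resampling scheme and tolerance value are illustrative assumptions.

```python
from math import dist
from typing import List, Tuple

Point = Tuple[float, float]

def resample(track: List[Point], n: int = 16) -> List[Point]:
    """Pick n roughly evenly spaced points along the recorded track (index based)."""
    if len(track) <= n:
        return list(track)
    step = (len(track) - 1) / (n - 1)
    return [track[round(i * step)] for i in range(n)]

def tracks_match(current: List[Point], preset: List[Point],
                 tolerance: float = 20.0) -> bool:
    """Match when the average distance between corresponding resampled points
    stays under the tolerance (in touch screen coordinate units)."""
    a, b = resample(current), resample(preset)
    m = min(len(a), len(b))
    if m == 0:
        return False
    return sum(dist(a[i], b[i]) for i in range(m)) / m <= tolerance
```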
  • Step S20: when the current gesture information meets the initial trigger condition and/or the target trigger condition, determine that the current gesture operation is a valid gesture operation, and trigger a corresponding response operation; wherein the initial trigger condition includes a predetermined default trigger condition, and the target trigger condition includes a trigger condition generated based on the user's historical gesture operations.
  • When the gesture determining and triggering component determines that the current gesture information meets either the initial trigger condition or the target trigger condition, it determines that the current gesture operation is a valid gesture operation and triggers the corresponding response operation, such as an unlock operation or a return operation.
  • In some embodiments, after comparing the current gesture information with the initial trigger condition and/or the target trigger condition, the method further includes: when the current gesture information does not meet the initial trigger condition and the current gesture information does not meet the target trigger condition, determining that the current gesture operation is a false touch gesture operation.
  • When the gesture determining and triggering component determines that the current gesture information meets neither the initial trigger condition nor the target trigger condition, it determines that the current gesture operation is an invalid gesture operation and does not respond to the current gesture operation.
  • A gesture error reminder message may also be generated to inform the user why the terminal is currently not responding.
  • In some embodiments, the gesture information corresponding to the false touch gesture operation, that is, the gesture information corresponding to the invalid gesture operation, is counted, and the corresponding target trigger condition is generated according to the false touch gesture information.
  • After the step of determining that the current gesture operation is a false touch gesture operation when the current gesture information does not meet the initial trigger condition and the current gesture information does not meet the target trigger condition, the method further includes: judging whether the current gesture information exists in the false touch gesture information database; if the current gesture information does not exist in the false touch gesture information database, adding the current gesture information to the false touch gesture information database.
  • Otherwise, the target trigger condition is updated according to the current gesture information once the corresponding number of false touches reaches the preset threshold.
  • When the gesture determining and triggering component determines that the current gesture operation is a false touch gesture operation, the current gesture information is compared with the false touch gesture information database to determine whether the gesture information corresponding to the false touch gesture operation, that is, the current gesture information, has already been recorded in the database. If the current gesture information is not recorded in the false touch gesture information database, the current gesture information is added to the false touch gesture information database.
  • If the current gesture information has already been recorded in the false touch gesture information database, the number of false touches corresponding to the current gesture information is updated, for example increased by 1, and counted. When the number of false touches corresponding to the current gesture information reaches the preset threshold of false touch times, this means that the current gesture information reflects the user's gesture operation habit, and the target trigger condition can be updated according to the current gesture information; that is, the effective touch points in the current gesture information and the slope values between the effective touch points are used as the target trigger condition, or the sliding track corresponding to each touch point in the current gesture information can be used as the target trigger condition.
  • In this way, the target trigger condition is generated or updated according to the gesture information with a large number of false touches in the false touch gesture information database, which not only improves the sensitivity of the touch screen but also reduces the amount of gesture information to be analyzed, improving the generation efficiency of the target trigger condition.
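  • A sketch of this false touch bookkeeping follows, with the database modelled as a simple in-memory dictionary keyed by a coarsened representation of the gesture; the key scheme, cell size and threshold value are assumptions, not the claimed data structure.

```python
from typing import Dict, List, Tuple

TouchPoint = Tuple[float, float]
GestureKey = Tuple[Tuple[int, int], ...]

def gesture_key(points: List[TouchPoint], cell: float = 10.0) -> GestureKey:
    """Coarse key so that near-identical false touches are counted together."""
    return tuple((round(x / cell), round(y / cell)) for x, y in points)

def record_false_touch(db: Dict[GestureKey, int], points: List[TouchPoint],
                       false_touch_threshold: int = 5) -> bool:
    """Count the false touch; return True when the count reaches the preset
    threshold, i.e. the target trigger condition should be updated from this gesture."""
    key = gesture_key(points)
    db[key] = db.get(key, 0) + 1
    return db[key] >= false_touch_threshold
```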
  • Embodiments of the present disclosure also provide a storage medium for readable storage. The storage medium stores a program, the program includes program instructions, and a processor executes the program instructions to implement any method for controlling the terminal touch screen provided by the embodiments of the present disclosure.
  • When the program is loaded by the processor, the following steps may be performed: when the current gesture operation of the user on the touch screen is detected, obtaining the current gesture information corresponding to the current gesture operation; when the current gesture information meets the initial trigger condition and/or the target trigger condition, determining that the current gesture operation is a valid gesture operation, and triggering a corresponding response operation; wherein the initial trigger condition includes a default trigger condition set at the factory, and the target trigger condition includes a trigger condition generated based on the user's historical gesture operations.
  • the storage medium may be an internal storage unit of the terminal device in the foregoing embodiments, such as a hard disk or a memory of the terminal device.
  • the storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk equipped on the terminal device, a smart memory card (Smart Media Card, SMC), a secure digital card (Secure Digital Card, SD Card ), Flash Card (Flash Card), etc.
  • This embodiment discloses a terminal touch screen control method, a terminal device and a storage medium.
  • When a current gesture operation by the user on the touch screen is detected, current gesture information corresponding to the current gesture operation is acquired; when the current gesture information meets the initial trigger condition and/or the target trigger condition, it is determined that the current gesture operation is a valid gesture operation, and a corresponding response operation is triggered; wherein the initial trigger condition includes a default trigger condition set at the factory, and the target trigger condition includes a trigger condition generated based on the user's historical gesture operations.
  • In this way, not only is the current gesture information corresponding to the user compared with the default trigger condition set when the terminal leaves the factory, but the current gesture information is also further compared with the target trigger condition generated based on the user's gesture operation habits. This prevents the current gesture operation from being misjudged as an invalid gesture operation when the current gesture information does not meet the initial trigger condition merely because the user's gesture operation habit differs from the gesture operation set at the factory, and improves the sensitivity of the touch screen.
  • The division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be executed by several physical components in cooperation.
  • Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit .
  • Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media).
  • The term computer storage medium includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • communication media typically embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media .

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the field of computers, and in particular to a control method for a terminal touch screen, a terminal device and a storage medium. The control method for a terminal touch screen includes: when a current gesture operation is detected, obtaining current gesture information corresponding to the current gesture operation; and when the current gesture information meets an initial trigger condition and/or a target trigger condition, determining that the current gesture operation is a valid gesture operation and triggering a corresponding response operation, the initial trigger condition including a default trigger condition set at the factory, and the target trigger condition including a trigger condition generated on the basis of a historical gesture operation of a user.
PCT/CN2022/097915 2021-08-09 2022-06-09 Control method for terminal touch screen, terminal device and storage medium WO2023016076A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110910544.1A CN115904103A (zh) 2021-08-09 2021-08-09 终端触摸屏的控制方法、终端设备和存储介质
CN202110910544.1 2021-08-09

Publications (1)

Publication Number Publication Date
WO2023016076A1 true WO2023016076A1 (fr) 2023-02-16

Family

ID=85199801

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/097915 WO2023016076A1 (fr) 2022-06-09 Control method for terminal touch screen, terminal device and storage medium

Country Status (2)

Country Link
CN (1) CN115904103A (fr)
WO (1) WO2023016076A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116540918B (zh) * 2023-06-30 2023-12-01 深圳市欧度利方科技有限公司 一种平板电脑分屏控制***与方法


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101614552A (zh) * 2009-07-24 2009-12-30 深圳市凯立德计算机***技术有限公司 导航***手势命令输入判断方法及装置
CN103365578A (zh) * 2012-03-29 2013-10-23 百度在线网络技术(北京)有限公司 一种移动终端的解锁方法及移动终端
CN105117132A (zh) * 2015-08-31 2015-12-02 广州视源电子科技股份有限公司 一种触摸控制方法及装置
WO2017199356A1 (fr) * 2016-05-18 2017-11-23 日立マクセル株式会社 Dispositif de reconnaissance des gestes, procédé d'étalonnage des informations se rapportant à des modèles de reconnaissance des gestes et terminal de communication l'utilisant
CN107085469A (zh) * 2017-04-21 2017-08-22 深圳市茁壮网络股份有限公司 一种手势的识别方法及装置
CN108038412A (zh) * 2017-10-30 2018-05-15 捷开通讯(深圳)有限公司 终端及其基于自训练手势的控制方法、存储装置

Also Published As

Publication number Publication date
CN115904103A (zh) 2023-04-04

Similar Documents

Publication Publication Date Title
US11157107B2 (en) Method and apparatus for providing touch interface
EP2579130B1 (fr) Procédé et dispositif adaptatifs destinés au mode de fonctionnement par le toucher de l'utilisateur
CN106125984B (zh) 一种移动终端的触控处理方法及移动终端
US20170123590A1 (en) Touch Point Recognition Method and Apparatus
US11079873B2 (en) Touch panel device
CN106886766B (zh) 一种指纹识别方法、指纹识别电路及移动终端
WO2015131675A1 (fr) Procédé de compensation destiné à des trajets de glissement interrompus, dispositif électronique, et support d'informations pour ordinateur
US9330249B2 (en) Information terminal
US9141246B2 (en) Touch pad
US10712870B2 (en) Method for improving fault tolerance of touchscreen and touchscreen terminal
CN108920066B (zh) 触摸屏滑动调整方法、调整装置及触控设备
WO2014109262A1 (fr) Système d'écran tactile
WO2023016076A1 (fr) Procédé de commande pour écran tactile de terminal, dispositif terminal et support de stockage
CN104881174A (zh) 一种动态调整触摸屏灵敏度的方法及装置
CN108845747A (zh) 一种防误触控方法、装置和终端设备
CN113760123A (zh) 屏幕触控的优化方法、装置、终端设备及存储介质
CN109154879B (zh) 电子设备及其输入处理方法
US20130088428A1 (en) Display control apparatus and display control method
CN105809071A (zh) 误触纠正的方法和终端
CN110737341B (zh) 变更接触物件的识别种类的方法
CN113934312A (zh) 一种基于红外触摸屏的触摸物识别方法和终端设备
US9904402B2 (en) Mobile terminal and method for input control
US20230070059A1 (en) False touch rejection method, terminal device, and storage medium
US20190180127A1 (en) Character recognition method of handwriting input device, handwriting input device, and vehicle including the handwriting input device
CN107980116B (zh) 悬浮触控感测方法、悬浮触控感测***及悬浮触控电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22855061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE