WO2023051411A1 - Method for identifying a touch operation, and electronic device - Google Patents

Method for identifying a touch operation, and electronic device

Info

Publication number
WO2023051411A1
WO2023051411A1 PCT/CN2022/120972
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch operation
data
knuckle
electronic device
Prior art date
Application number
PCT/CN2022/120972
Other languages
English (en)
French (fr)
Inventor
Ma Tengxiao
Wang Tusheng
Yao Jianjiang
Lan Jiamei
Gong Jun
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023051411A1 publication Critical patent/WO2023051411A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/08 Feature extraction
    • G06F 2218/10 Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/12 Classification; Matching

Definitions

  • the present application relates to the field of electronic technology, and in particular to a method for identifying a touch operation and an electronic device.
  • the touchpad is an important means for the user to interact with the electronic device when using the electronic device (such as a notebook computer).
  • The user mainly clicks, presses, and slides on the touchpad with one or more fingers to achieve different functions. For example, clicking the touchpad with a single finger simulates a left mouse button click; clicking with two fingers simulates a right mouse button click; sliding up and down with two fingers realizes the mouse wheel function; and pinching or spreading two fingers on the touchpad zooms the screen in and out.
  • Most gestures are performed by clicking and sliding with one or two fingers, but when users want to achieve more functions, more complex gestures are needed, such as operating the touchpad with three or four fingers. These gestures are harder to perform and more troublesome to remember, so the user experience is poor.
  • The use of the touchpad can be simplified by introducing new touch methods.
  • One touch method is to operate the touchpad with the knuckles to realize different gesture functions; another is for the user to slide on a specific area at the edge of the touchpad to trigger the corresponding function. However, these new touch methods have shortcomings in use: the recognition rate of the knuckle touch method is low, and the user cannot tell whether the edge sliding function has been triggered when performing an edge sliding touch operation. The user experience is therefore poor.
  • Embodiments of the present application provide a method for identifying a touch operation and an electronic device, which help to improve user experience in identifying touch operations.
  • an embodiment of the present application provides a method for identifying a touch operation, which is applied to an electronic device, and the electronic device includes a touchpad and at least one deformation sensor.
  • The method includes: receiving a first touch operation acting on the touchpad; obtaining deformation data collected by the at least one deformation sensor and touch data collected by the touchpad; and, after determining from the deformation data and touch data that the first touch operation is a first knuckle touch operation, triggering the response function corresponding to the first knuckle touch operation.
  • the touch data may be, for example, coordinate data of a touch position corresponding to the first touch operation and touch image data corresponding to the first touch operation.
  • In this way, the knuckle touch operation can be recognized while avoiding various interferences (such as the state of the electronic device itself and interference from the surrounding environment), ensuring the validity of the data required to recognize the knuckle touch operation, thereby improving the recognition rate, reducing the false-touch rate, and improving user experience.
  • In a possible implementation, the deformation data is deformation waveform data of an incomplete cycle. The collected incomplete-cycle deformation waveform data and touch data can be used without waiting for the entire life cycle of the first touch operation to end (at least 10 seconds) before recognizing the touch operation. This speeds up the process from the moment the touch occurs to identifying whether the first touch operation is a knuckle touch operation, so that the function corresponding to the first touch operation can be triggered quickly.
  • In a possible implementation, the method may further include: determining whether the first knuckle touch operation is a single-knuckle operation or a double-knuckle operation. Triggering the response function corresponding to the first knuckle touch operation may include: triggering the response function corresponding to the single-knuckle operation when the first knuckle touch operation is judged to be a single-knuckle operation; or triggering the response function corresponding to the double-knuckle operation when the first knuckle touch operation is judged to be a double-knuckle operation.
  • In a possible implementation, determining that the first touch operation is the first knuckle touch operation according to the deformation data and the touch data includes: generating at least one piece of deformation waveform data from the deformation data collected by the at least one deformation sensor; determining target waveform data from the at least one piece of deformation waveform data, where the difference between the peak and the trough of the target waveform data is greater than or equal to a preset threshold; and determining that the first touch operation is the first knuckle touch operation according to the target waveform data and the touch data.
  • The determined target waveform data is close to the actual deformation of the touchpad caused by the first touch operation, so performing knuckle touch recognition based on the target waveform data and touch data improves the effectiveness and consistency of knuckle touch recognition in different areas of the touchpad, improving user experience.
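The target-waveform selection described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name and the threshold value are assumptions, and real deformation samples would come from the sensor driver.

```python
# Keep only the deformation waveforms whose peak-to-trough span reaches a
# preset threshold; these are the "target waveform data" closest to the
# actual deformation caused by the touch. Threshold units are arbitrary.
def select_target_waveforms(waveforms, threshold=50.0):
    """waveforms: one list of deformation samples per sensor."""
    targets = []
    for samples in waveforms:
        if not samples:
            continue
        if max(samples) - min(samples) >= threshold:
            targets.append(samples)
    return targets
```

For example, `select_target_waveforms([[0, 80, -10], [0, 3, 1]], threshold=50.0)` keeps only the first waveform, whose peak-to-trough span is 90.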
  • In a possible implementation, determining that the first touch operation is the first knuckle touch operation according to the deformation data and the touch data may include: determining that the first touch operation is the first knuckle touch operation according to features of the deformation data, features of the touch data, and a pre-trained recognition model, where the pre-trained recognition model is obtained by training on features of deformation data and features of touch data corresponding to pre-collected knuckle touch operations.
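As an illustration of this feature-plus-model step, the sketch below extracts simple features from a deformation waveform and a touch image and scores them with a linear model. The chosen features, weights, and bias are hypothetical; the patent only states that the model is pre-trained on features of deformation and touch data from collected knuckle touch operations.

```python
# Hypothetical feature extraction: waveform amplitude plus touch-image
# contact area and peak intensity. A knuckle tap typically produces a
# large deformation spike with a small, sharp contact area.
def extract_features(waveform, touch_image):
    peak_to_trough = max(waveform) - min(waveform)
    contact_area = sum(1 for v in touch_image if v > 0)
    peak_intensity = max(touch_image)
    return [peak_to_trough, contact_area, peak_intensity]

# Stand-in for the pre-trained recognition model: a linear score with
# made-up weights; a real system would load trained parameters instead.
def is_knuckle(features, weights=(0.02, -0.5, 0.01), bias=-1.0):
    score = sum(w * f for w, f in zip(weights, features)) + bias
    return score > 0.0
```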
  • In a possible implementation, the method further includes: receiving a second touch operation acting on the touchpad; if the first touch operation and the second touch operation meet preset conditions, determining that the first touch operation and the second touch operation constitute a knuckle double-click operation; and triggering the response function corresponding to the knuckle double-click operation. The preset conditions include: the second touch operation is a knuckle touch operation; the time interval between the moment the first touch operation acts on the touchpad and the moment the second touch operation acts on the touchpad is less than a time threshold; and the distance between the position of the first touch operation and the position of the second touch operation is less than a distance threshold.
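The preset conditions for the knuckle double-click can be checked directly. The sketch below assumes each operation is represented as `(is_knuckle, timestamp_s, (x, y))`; the threshold values are illustrative, not taken from the patent.

```python
def is_knuckle_double_click(op1, op2, time_threshold=0.5, dist_threshold=30.0):
    """op: (is_knuckle, timestamp in seconds, (x, y) position)."""
    k1, t1, (x1, y1) = op1
    k2, t2, (x2, y2) = op2
    if not (k1 and k2):                  # both must be knuckle touches
        return False
    if abs(t2 - t1) >= time_threshold:   # taps must be close in time
        return False
    dist = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    return dist < dist_threshold         # and close in position
```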
  • In a possible implementation, the method further includes: displaying a user interface, and setting response functions corresponding to different knuckle touch operations through the user interface. In this way, the user can define the functions corresponding to different knuckle touch operations on the user interface.
  • the embodiment of the present application provides a method for identifying a touch operation, which is applied to an electronic device, and the electronic device includes a touchpad and a vibrating device.
  • The method includes: receiving a third touch operation acting on the touchpad; identifying the third touch operation as an edge sliding touch operation; triggering the response function corresponding to the edge sliding touch operation; and outputting a vibration prompt through the vibrating device.
  • a vibration prompt can be output, so that the user perceives that the function has been triggered, thereby improving user experience.
  • outputting a vibration prompt through a vibration device includes: outputting a vibration prompt through k vibration devices closest to each touch point corresponding to the edge sliding touch operation, where k is a positive integer.
  • In a possible implementation, outputting a vibration prompt through a vibration device includes: determining, according to the distances between a first touch point and its k nearest vibration devices and a first mapping relationship, the vibration signal values corresponding to those k vibration devices, and outputting vibration prompts according to the vibration signal values corresponding to the k vibration devices, where the first mapping relationship includes a mapping between distance and vibration signal value, and the first touch point is any one of the touch points corresponding to the third touch operation.
  • The k vibration devices output vibration signal values corresponding to their own distances to the first touch point, which ensures a consistent vibration intensity across the different areas touched as the finger slides along the edge. That is, the vibrations output by the k vibration devices closest to the first touch point have the same intensity when they reach the first touch point, so the vibration intensity felt by the user's finger at the touch point is the same.
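One way to read the first mapping relationship is as a lookup from device-to-touch-point distance to drive amplitude, with farther devices driven harder so the intensity arriving at the touch point is roughly equal. The table values and distance units below are hypothetical.

```python
# Hypothetical "first mapping relationship": distance (mm) -> amplitude.
DISTANCE_TO_AMPLITUDE = [
    (10.0, 0.4),   # a nearby device is driven gently
    (30.0, 0.7),
    (60.0, 1.0),   # a far device is driven harder to offset attenuation
]

def vibration_amplitude(distance_mm):
    for max_dist, amp in DISTANCE_TO_AMPLITUDE:
        if distance_mm <= max_dist:
            return amp
    return 1.0  # beyond the table, drive at full amplitude

def drive_nearest(touch_point, devices, k=2):
    """Return (device_index, amplitude) for the k devices nearest the touch point."""
    tx, ty = touch_point
    dists = sorted(
        (((dx - tx) ** 2 + (dy - ty) ** 2) ** 0.5, i)
        for i, (dx, dy) in enumerate(devices)
    )
    return [(i, vibration_amplitude(d)) for d, i in dists[:k]]
```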
  • In a possible implementation, identifying the third touch operation as an edge sliding touch operation includes: acquiring touch data collected by the touchpad in response to the third touch operation; after determining from the touch data that the third touch operation is a sliding touch operation acting on the edge area of the touchpad, calculating, from the coordinate data of each touch point, the angle between the fitted line of the touch point coordinates and a first direction, and the distance difference between each touch point's projection in a second direction and the projection of the third touch operation's starting point in the second direction. The first direction is the direction along a first side of the touchpad, the first side being the side where the touch position of the third touch operation is located, and the second direction is perpendicular to the first side. If the angle and the distance difference corresponding to each touch point meet preset conditions, the third touch operation is identified as an edge sliding touch operation.
  • the recognition rate of the edge sliding touch operation can be improved.
  • In a possible implementation, the preset conditions include: the angle is less than or equal to an angle threshold, and the maximum of the distance differences corresponding to the touch points is less than or equal to a difference threshold.
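The geometric test above (fitted-line angle plus perpendicular drift) can be sketched as follows, assuming the touched side is the bottom edge of the touchpad, so the first direction is the x axis and the second direction is the y axis. The threshold values are illustrative.

```python
import math

def is_edge_slide(points, angle_threshold_deg=10.0, diff_threshold=5.0):
    """points: (x, y) touch coordinates of a slide along the bottom edge."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    n = len(points)
    # Least-squares fit y = slope * x + b; the angle between the fitted
    # line and the x axis (the first direction) must stay small.
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in points)
    slope = sxy / sxx if sxx else float("inf")
    angle = abs(math.degrees(math.atan(slope)))
    # Drift of each point's projection in the second direction (y) away
    # from the starting point's projection must also stay small.
    max_diff = max(abs(y - ys[0]) for y in ys)
    return angle <= angle_threshold_deg and max_diff <= diff_threshold
```

A nearly horizontal track passes; a track that veers away from the edge fails either the angle or the drift condition.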
  • In a possible implementation, the method further includes: displaying a user interface, and setting a response function corresponding to the edge sliding touch operation through the user interface. In this way, the user can define the function corresponding to the edge sliding touch operation on the user interface.
  • An embodiment of the present application provides an electronic device, including a processor, a memory, a touchpad, and at least one deformation sensor; the touchpad is used to receive touch operations and collect touch data; the deformation sensor is used to collect deformation data; the memory is used to store one or more computer programs, and when the one or more computer programs are executed by the processor, the electronic device executes the method according to the above-mentioned first aspect and any possible implementation of the first aspect.
  • an embodiment of the present application provides an electronic device, including a processor, a memory, a touch panel, and at least one vibration device; the touch panel is used to receive touch operations and collect touch data; the vibration device is used to output vibration signals;
  • the memory is used to store one or more computer programs, and when the one or more computer programs are executed by the processor, the electronic device executes the method according to the above-mentioned first aspect and any possible implementation manner of the first aspect.
  • The embodiment of the present application also provides an electronic device, which may include: a receiving module, a first collection module, a second collection module, an identification module, and a response module; the receiving module is used to receive the first touch operation; the first collection module is used to collect deformation data; the second collection module is used to collect touch data; the identification module is used to determine, according to the deformation data and the touch data, that the first touch operation is the first knuckle touch operation; and the response module is used to trigger the response function corresponding to the first knuckle touch operation.
  • The embodiment of the present application also provides an electronic device, which may include: a receiving module, an identification module, and a response module; the receiving module is used to receive the third touch operation; the identification module is used to identify the third touch operation as an edge sliding touch operation; and the response module is used to trigger the response function corresponding to the edge sliding touch operation and output a vibration prompt through a vibration device.
  • An embodiment of the present application provides a computer-readable storage medium, the computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device executes the method of the above-mentioned first aspect and any possible implementation of the first aspect, or executes the method of the above-mentioned second aspect and any possible implementation of the second aspect.
  • The embodiment of the present application further provides a chip, the chip is coupled with the memory in the electronic device, and is used to call the computer program stored in the memory and execute the method of the above-mentioned first aspect of the embodiment of the present application and any possible implementation of the first aspect.
  • "Coupling" means that two components are directly or indirectly connected to each other.
  • The embodiment of the present application further provides a computer program product which, when run on a terminal, causes the electronic device to execute any possible design of any of the above aspects.
  • FIG. 1A is a schematic diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 1B is a schematic diagram of knuckle touch operation provided by the embodiment of the present application.
  • FIG. 1C is a schematic diagram of an edge sliding touch operation provided by an embodiment of the present application.
  • FIG. 1D is a schematic diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a system architecture for knuckle touch operation recognition provided by an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a method for identifying a touch operation provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a deformation waveform provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of image feature data provided by an embodiment of the present application.
  • FIG. 6 is another schematic flowchart of a method for identifying a touch operation provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the architecture for edge sliding operation recognition provided by an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a method for identifying a touch operation provided by an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of a method for identifying a touch operation provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of sliding touch provided by the embodiment of the present application.
  • FIG. 11 is a schematic diagram of a sliding touch scene provided by an embodiment of the present application.
  • FIG. 12 is another schematic flow chart of a method for identifying a touch operation provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • The terms "first" and "second" are used for description purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • A feature defined as "first" or "second" may explicitly or implicitly include one or more of these features.
  • the touchpad is an important way for the user to interact with the electronic device when the user uses the electronic device (such as a notebook computer).
  • With the electronic device shown in FIG. 1A, the user can perform touch operations on the touchpad, and the electronic device displays the response result corresponding to the touch operation. FIG. 1A is only an example, and does not limit the specific position of the touchpad.
  • a new touch method (or interaction method) has been introduced to simplify the use of the touchpad.
  • One touch method is to operate the touchpad through knuckles to realize different gesture functions.
  • the vibro-acoustic sensor can be used to obtain data for knuckle touch recognition, but this method is easily interfered by the vibration of the device itself or the sound data generated by the surrounding environment, and the false positive rate is high.
  • In another approach, the data collected by the device's own acceleration sensor is used to recognize the knuckle touch, but the recognition rate is low.
  • Another touch method is for the user to perform a sliding operation on a specific area on the edge of the touchpad to trigger the corresponding function. However, the user cannot judge whether the edge sliding function is triggered when the edge sliding touch operation is performed, and the user experience is not good.
  • an embodiment of the present application provides a method for recognizing a touch operation, which helps to improve user experience in recognizing a touch operation.
  • The electronic device may be a portable electronic device that includes functions such as a personal digital assistant and/or a music player, such as a mobile phone, a tablet computer, a wearable device with a wireless communication function (such as a smart watch), a vehicle-mounted device, etc.
  • Portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • the aforementioned portable electronic device may also be, for example, a laptop computer (Laptop) with a touch-sensitive surface (such as a touch panel).
  • the above-mentioned electronic device may also be a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the electronic device provided by the embodiment of the present application is introduced below.
  • FIG. 1D exemplarily shows a schematic structural diagram of an electronic device 100 .
  • The illustrated electronic device 100 is only one example; the electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different component configuration.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, and a battery 142 , antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone jack 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193 , a display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU) wait.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • The controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of instruction fetching and instruction execution.
  • the processor 110 may run the method for identifying a touch operation provided in the embodiment of the present application, and the processor may respond to a touch operation on the touch panel and trigger a response function corresponding to the first touch operation.
  • If the processor 110 integrates different devices, such as a CPU and a GPU, the CPU and GPU can cooperate to execute the method for recognizing touch operations provided by the embodiment of the present application; for example, part of the algorithm is executed by the CPU and another part by the GPU, for faster processing.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transceiver ( Universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface , and/or a universal serial bus (universal serial bus, USB) interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger. Wherein, the charger may be a wireless charger or a wired charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into a standard RGB, YUV or other format image signal.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • there are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be composed of at least two parallel plates of conductive material.
  • the touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K can be disposed under the display screen 194; the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a position different from that of the display screen 194 , for example, disposed on a touch panel.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key, or it may be a touch key.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the electronic device 100 may also include a touch panel, a Bluetooth device, a positioning device, a flashlight, a micro projection device, a near field communication (near field communication, NFC) device, etc., which will not be repeated here.
  • the method for recognizing touch operations provided by the embodiment of the present application can be applied to various application scenarios, such as the knuckle touch operation recognition scenario shown in FIG. 1B , and the edge sliding operation recognition scenario shown in FIG. 1C .
  • the method of recognizing touch operation is described in detail.
  • Application scenario 1 knuckle touch operation recognition scenario.
  • FIG. 2 is a schematic diagram of a system architecture for recognizing knuckle touch operations provided by an embodiment of the present application.
  • the system architecture includes a hardware layer, a kernel layer, and an application layer from bottom to top.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes the operating system (OS) driver, and the OS driver is a special program that enables the system to communicate with the hardware layer, and provides an interface for the communication between the hardware layer and the operating system.
  • OS operating system
  • the hardware layer may include a touchpad, at least one deformation sensor connected to the touchpad, and a microcontroller unit (MCU), and the MCU is respectively connected to the touchpad and at least one deformation sensor.
  • a deformation sensor is added to identify knuckle touch.
  • the deformation sensor involved in the present application may be a piezoelectric sensor, a piezoelectric sheet, a strain gauge, an acceleration sensor, or other sensors capable of collecting deformation data, which is not limited in the present application. In some other embodiments, the deformation sensor may also directly collect deformation waveform data.
  • when the electronic device is a notebook computer, the notebook computer includes a host module and a display module; the display module is used to provide visual output, the host module has a touchpad (also called a touch panel), and the touchpad has a touch sensor that collects touch data.
  • the touch sensor and the display screen form a touch screen, also called a "touch screen", which can not only collect touch data, but also provide visual output.
  • the following takes the case where the electronic device is a notebook computer as an example. If the solution of this application is applied to a tablet computer or a mobile phone, the implementation described for the touchpad can also be applied to the touch screen; for the specific implementation, refer to the implementation of the notebook computer.
  • the deformation sensor is a newly added device in the embodiment of the present application, and is used to acquire deformation data of the touchpad when the user operates the touchpad.
  • the deformation sensor can be deployed under the touchpad.
  • multiple deformation sensors can be evenly spaced under the touchpad, deployed under the touchpad in an axisymmetric manner, or deployed under the touchpad in a centrosymmetric manner; the present application does not limit the layout of multiple deformation sensors or of a single deformation sensor.
  • the MCU can be used to collect and process the touch data and the deformation data of the touch panel, for example, processing the deformation data into deformation waveform data.
  • the MCU can also be used to identify knuckle touch operations based on the touch data and the deformation data. After the MCU completes the recognition, if the touch operation is a knuckle touch operation, the MCU reports the knuckle event to the OS-side application through the OS driver.
  • the application layer can include a series of OS-side applications and OS.
  • the OS includes various system application program interfaces.
  • the OS-side applications are pre-installed applications of the OS.
  • the OS-side applications can trigger the functions corresponding to knuckle touch operations by calling the system application programming interface (API), and the functions corresponding to knuckle touch operations can be customized through the OS-side applications.
  • the application on the OS side can provide a user interface to the user, so that the user can define the function corresponding to the knuckle touch operation on the user interface.
  • the touchpad of the electronic device can receive at least one touch operation. For example, in some scenarios, when the knuckle of a user's finger touches the touchpad twice within a short period of time, two touch operations are generated on the touchpad.
  • the method for identifying touch operations in the following embodiments is used to identify each touch operation. The following embodiments only take recognizing one touch operation as an example, and do not limit that only one touch operation is received on the touch panel. It can be understood that a touch operation here means that a single knuckle touches the screen once, or two knuckles touch the screen at the same time, or more knuckles touch the screen at the same time.
  • FIG. 3 is a schematic flowchart of a method for identifying a touch operation provided by an embodiment of the present application.
  • the method can be applied to an electronic device with a touch panel, and the touch panel is provided with a deformation sensor. As shown in Figure 3, the method includes the following steps:
  • the electronic device receives a first touch operation acting on a touchpad.
  • the electronic device acquires deformation data of the touchpad collected by at least one deformation sensor and touch data collected by the touchpad.
  • Each deformation sensor in the electronic device can collect deformation data periodically or continuously, and store it in the data buffer.
  • when the electronic device receives the first touch operation, it can obtain the deformation data from the data buffer; for example, the deformation data collected within the first period can be used to identify the knuckle touch operation.
  • at each collection moment, each deformation sensor collects one piece of deformation data, so at least one deformation sensor can collect at least one piece of deformation data. Taking an electronic device with five deformation sensors as an example, five pieces of deformation data can be collected.
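The periodic collection and buffering described above can be sketched as follows. This is a minimal illustration in Python under assumed names (`DeformationBuffer`, a 10 ms sampling period); none of these identifiers come from the application itself:

```python
from collections import deque

class DeformationBuffer:
    """Hypothetical ring buffer holding the most recent deformation
    samples reported periodically by one deformation sensor."""

    def __init__(self, capacity):
        self.samples = deque(maxlen=capacity)  # oldest samples fall off

    def report(self, timestamp, value):
        self.samples.append((timestamp, value))

    def window(self, start, end):
        # Return the samples whose timestamps fall within [start, end].
        return [(t, v) for t, v in self.samples if start <= t <= end]

# Example: one sensor sampling every 10 ms (timestamps in milliseconds)
buf = DeformationBuffer(capacity=1000)
for i in range(100):
    buf.report(i * 10, 0.0)          # flat baseline for illustration
print(len(buf.window(200, 290)))     # -> 10 samples in that window
```

On receiving a touch, the recognizer would query `window()` for the first period rather than waiting for new samples, which is what makes the early identification described below possible.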
  • the electronic device can generate deformation waveform data according to the deformation data.
  • in some embodiments, the knuckle touch operation is identified using the deformation data collected within a first period.
  • the first period may include a second period and a third period, wherein the second period is the period from the waveform-selection start time to the touch occurrence time of the first touch operation, and the third period is the period from that touch occurrence time to the waveform-selection end time.
  • when selecting the deformation waveform data, a waveform that does not cover the entire life cycle of the first touch operation may be selected.
  • for example, assume that the entire life cycle of the first touch operation is 10 seconds, the second period is 1 second, and the third period is 5 seconds; then the deformation waveform within 1 second before the touch occurrence time of the first touch operation and the deformation waveform within 5 seconds after the touch occurrence time are selected to identify the touch operation.
  • in this way, the touch operation can be identified according to the selected deformation waveform instead of waiting for the entire life cycle of the first touch operation to end (at least 10 seconds), thereby speeding up the process from the touch occurrence time to determining whether the first touch operation is a knuckle touch operation, so that the function corresponding to the first touch operation can be responded to quickly.
  • a method for selecting a deformation waveform collected by a deformation sensor will be exemplarily described below with reference to FIG. 4 .
  • FIG. 4 is a schematic diagram of a deformation waveform provided by an embodiment of the present application. As shown in FIG. 4, the time obtained by shifting forward by t1 from the touch occurrence time is used as the waveform-selection start time, a time after the touch occurrence time is used as the waveform-selection end time, and a deformation waveform of time length t2 is selected.
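The window selection of Fig. 4 can be sketched as a simple slice over timestamped samples. This is a hypothetical illustration (the function name and the sample representation are assumptions, not from the application):

```python
def select_waveform(samples, touch_time, t1, t2):
    """Select the deformation samples starting at (touch_time - t1),
    covering a total window of length t2, as described for Fig. 4.
    `samples` is a list of (timestamp, value) pairs."""
    start = touch_time - t1
    end = start + t2
    return [(t, v) for t, v in samples if start <= t <= end]

# Example with hypothetical timestamps in milliseconds:
samples = [(t, 0.0) for t in range(0, 1000, 10)]
window = select_waveform(samples, touch_time=500, t1=100, t2=400)
print(window[0][0], window[-1][0])   # -> 400 800
```

Because `end` can precede the end of the touch's life cycle, the selected window need not cover the whole operation, matching the early-response rationale above.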
  • the knuckle touch operation is identified using the deformation data collected within the first period.
  • the first period may include a fourth period corresponding to the life cycle of the first touch operation, a fifth period before the fourth period, and a sixth period after the fourth period; the life cycle of the first touch operation is the process from the first touch operation starting to touch the touchpad to leaving the touchpad.
  • during the first touch operation, the touchpad deforms, the deformation changes continuously to form a waveform, and each deformation sensor can collect the deformation waveform data formed by this change.
  • the entire time period corresponding to the life cycle of the first touch operation is referred to as the fourth time period.
  • when the electronic device obtains the deformation waveform data, it can first determine the waveform-selection start time and the waveform-selection end time.
  • the waveform-selection start time can be a time before the life cycle of the first touch operation starts, and the period before the life cycle can be called the fifth period.
  • the waveform-selection end time may be a time after the end of the life cycle of the first touch operation, and the period after the life cycle may be referred to as the sixth period. That is to say, the period from the waveform-selection start time to the waveform-selection end time is called the first period, which includes, in sequence, the fifth period, the fourth period and the sixth period.
  • the touchpad can collect the touch data of the first touch operation on the touchpad, such as coordinate data of the touch position and touch image data. As for the touch image data, as shown in (a) and (b) of Figure 5, the user's finger touching the touchpad forms a touch area, that is, a black area.
  • the black area in FIG. 5 is only used as an example of the touch area. In some embodiments, a black area may be displayed when a finger touches the touchpad; of course, the touch area may also be displayed in other colors, or no black area may be displayed at all, that is, the user sees no change in the touched area.
  • the electronic device identifies whether the first touch operation is a knuckle touch operation according to the at least one deformation amount data and the touch data. If yes, execute S304; if not, execute S305.
  • in the embodiment of the present application, the deformation data of the touchpad when a finger touches it is collected by the added deformation sensor, so as to prevent various interferences (such as the state of the electronic device itself and interference from the surrounding environment) from affecting the recognition of knuckle touches.
  • in some embodiments, the electronic device may determine at least one better deformation waveform data from the deformation waveform data generated from the multiple pieces of deformation data collected by the multiple deformation sensors, and then identify, according to the better deformation waveform data, whether the first touch operation is a knuckle touch operation. Alternatively, the electronic device identifies whether the first touch operation is a knuckle touch operation according to the optimal deformation waveform data.
  • the electronic device determines the better deformation waveform data from the at least one deformation waveform data, which may be achieved in the following manner: the electronic device calculates, for each deformation waveform data in the at least one deformation waveform data, the deformation difference between the peak and the trough of its waveform, and determines the at least one deformation waveform data whose peak-to-trough deformation difference is greater than a preset threshold as the better deformation waveform data.
  • the deformation waveform data having the largest deformation difference between the peak and the trough of its waveform is used as the optimal deformation waveform data.
  • the deformation waveform data with the largest deformation difference is closest to the actual deformation generated by the first touch operation on the touchpad; identifying based on the deformation waveform data with the largest deformation difference can improve the effectiveness and consistency of knuckle touch operation recognition across different areas of the touchpad, improving the user experience.
  • the waveform of the deformation waveform data within the first period as shown in Figure 4 is not a waveform of a complete cycle; that is, for the waveform within the time length t2 in Figure 4, the peak is L1 and the trough is L2, and the deformation difference between the peak and the trough of the deformation waveform data within the first period is L1-L2.
  • the above-mentioned identification of whether the first touch operation is a knuckle touch operation based on the better (or optimal) deformation waveform data and the touch data may be implemented in the following manner: the electronic device extracts waveform features from the better (or optimal) deformation waveform data; for example, the waveform features may include but are not limited to features such as the waveform amplitude and waveform frequency shown in (c) and (d) of Figure 5, where the deformation waveform in (c) of Figure 5 has a larger amplitude and a narrower waveform, and the deformation waveform in (d) of Figure 5 has a smaller amplitude and a wider waveform.
  • the narrower the deformation waveform, the greater the vibration frequency within one vibration period of the waveform, the vibration frequency being the reciprocal of the vibration period; the wider the deformation waveform, the smaller the vibration frequency within one vibration period of the waveform.
  • the electronic device extracts touch image features from the touch data.
  • the touch image features may include but not limited to the coordinates of the touch position and the area of the touch area.
  • based on the waveform features and the touch image features, the electronic device can use a knuckle touch recognition model to identify whether the first touch operation is a knuckle touch operation; the knuckle touch recognition model is trained on pre-collected waveform features and touch image features corresponding to knuckle touch operations.
  • the knuckle touch recognition model is any model that can realize the recognition of the knuckle touch operation in this application, for example, it can be a neural network model, which is not limited in this application.
  • if the amplitude in the waveform features is greater than or equal to the amplitude threshold, the vibration frequency is greater than or equal to the frequency threshold, and the area of the touch area in the touch image features is less than or equal to the area threshold, the electronic device recognizes the first touch operation as a knuckle touch operation;
  • if the amplitude in the waveform features is less than the amplitude threshold, the vibration frequency is less than the frequency threshold, and the area of the touch area in the touch image features is greater than the area threshold, the electronic device recognizes the first touch operation as a non-knuckle touch operation.
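The threshold rule above can be sketched as a single predicate. All threshold values and names below are hypothetical; the application does not specify concrete numbers:

```python
def is_knuckle_touch(amplitude, frequency, touch_area,
                     amp_th, freq_th, area_th):
    """Threshold rule from the text: a touch is recognized as a knuckle
    touch when the waveform amplitude and vibration frequency are at
    least their thresholds and the touch area is at most its threshold."""
    return (amplitude >= amp_th
            and frequency >= freq_th
            and touch_area <= area_th)

# Knuckle taps produce a large, narrow (high-frequency) waveform and a
# small contact area; finger-pad touches do the opposite.
print(is_knuckle_touch(1.2, 80.0, 15.0,
                       amp_th=1.0, freq_th=50.0, area_th=20.0))  # -> True
print(is_knuckle_touch(0.4, 20.0, 60.0,
                       amp_th=1.0, freq_th=50.0, area_th=20.0))  # -> False
```

In practice this rule could serve as the decision stage after the trained knuckle touch recognition model extracts the amplitude, frequency, and area features.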
  • FIG. 5 is a schematic diagram of feature differences between a knuckle touch operation and a non-knuckle touch operation according to an embodiment of the present application.
  • for a knuckle touch operation, the amplitude of the corresponding deformation waveform is greater than or equal to the amplitude threshold, the vibration frequency is greater than or equal to the frequency threshold, and the area of the touch area is small; for a non-knuckle touch operation, as shown in Figure 5, the amplitude of the corresponding deformation waveform is smaller than the amplitude threshold, the vibration frequency is smaller than the frequency threshold, and the area of the touch area is large.
  • the electronic device identifies the first touch operation as a knuckle touch operation, and triggers a first response function corresponding to the first touch operation.
  • the first response function may be system-preset or user-defined; for example, after the knuckle touch operation is recognized, a setting interface pops up for setting the screen brightness or volume, etc.
  • for example, the electronic device can trigger the volume setting function after recognizing the first touch operation as a knuckle touch operation; that is, whether the knuckle touch operation is realized by one finger or by two fingers, the corresponding response functions are all set to the same function, and in this case it is not necessary to determine the number of fingers used for the first touch operation.
  • the above-mentioned embodiment shown in FIG. 3 can be applied to a single knuckle touch operation of a single finger, and is also applicable to a single knuckle touch operation of multiple fingers (such as two fingers).
  • taking a two-finger knuckle touch operation as an example, the knuckles of the two fingers touch the touchpad at the same time, or the time difference between the knuckles of the two fingers touching the touchpad is so short that it can be ignored.
  • the specific value of the time difference may be set, for example, 1 ms, and the embodiment of the present application does not limit the specific value of the time difference.
  • the index finger and middle finger tap the touchpad together, and the time difference between the index finger and middle finger tapping the touchpad is less than 1 ms, it is considered that the index finger and middle finger have implemented a double-knuckle click operation.
  • in some other embodiments, different response functions can be set for the single-finger knuckle click operation and the two-finger knuckle click operation, and each time the electronic device recognizes a touch operation as a knuckle touch operation, it determines from the touch image data the number of fingers used to realize the knuckle touch operation. Taking the case where the first touch operation is identified as a knuckle touch operation as an example, if the first touch operation is performed by a single finger, it is a single-knuckle click operation; if the first touch operation is performed by two fingers, it is a double-knuckle click operation.
  • the number of fingers used for the knuckle touch operation may be determined.
  • the present application does not limit the timing of judging the number of fingers that realize the knuckle touch operation.
  • a specific implementation method for identifying whether a touch operation (that is, the first touch operation) is a knuckle touch operation is provided.
  • other touch operations may also be received on the touchpad within a preset time period after the first touch operation.
  • the touchpad receives a second touch operation.
  • a specific value of the preset duration may be set, for example, 10 ms, and the embodiment of the present application does not limit the specific value of the preset duration.
  • the following describes how to determine whether the first touch operation and the second touch operation are knuckle double-click operations.
  • if the first touch operation and the second touch operation satisfy a first preset condition, they are determined to be a knuckle double-click operation. The first preset condition includes the following three items: first, the first touch operation and the second touch operation are both knuckle touch operations; second, the interval between the time when the first touch operation acts on the touchpad and the time when the second touch operation acts on the touchpad is less than a time threshold; third, the distance between the position of the first touch operation and the position of the second touch operation is less than a distance threshold.
  • if the first touch operation and the second touch operation do not meet the first preset condition, that is, do not meet any one or more items of the first preset condition, then it is determined that the first touch operation and the second touch operation are not a knuckle double-click operation.
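The three items of the first preset condition can be sketched as one predicate. Field names, the dictionary representation, and the threshold values are all hypothetical:

```python
import math

def is_knuckle_double_click(op1, op2, time_th_ms, dist_th):
    """First preset condition from the text: both operations are knuckle
    touches, their touch times are within the time threshold, and their
    positions are within the distance threshold."""
    if not (op1["is_knuckle"] and op2["is_knuckle"]):
        return False                                  # item 1 fails
    if abs(op2["time_ms"] - op1["time_ms"]) >= time_th_ms:
        return False                                  # item 2 fails
    distance = math.dist(op1["pos"], op2["pos"])      # Euclidean distance
    return distance < dist_th                         # item 3

op1 = {"is_knuckle": True, "time_ms": 0,   "pos": (100, 100)}
op2 = {"is_knuckle": True, "time_ms": 150, "pos": (104, 103)}
print(is_knuckle_double_click(op1, op2, time_th_ms=300, dist_th=10.0))  # -> True
```

Failing any single item makes the pair a non-knuckle double-click, matching the converse case above.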
  • in some embodiments, when it is determined that each touch operation is a knuckle touch operation, the number of fingers involved in the touch operation is directly determined; that is, the present application does not specifically limit the timing of judging whether it is a two-finger knuckle or a single-finger knuckle operation.
  • the following is an example to illustrate the response function of the knuckle double-click operation.
  • the electronic device recognizes the first touch operation and the second touch operation as a single-knuckle double-click operation, and triggers the system screenshot function; the electronic device recognizes the first touch operation and the second touch operation as a double-knuckle double-click operation, triggers System screen recording function.
  • the electronic device recognizes the first touch operation and the second touch operation as a single-knuckle double-click operation, and triggers the system screen recording function; the electronic device recognizes the first touch operation and the second touch operation as double-knuckle double-click Operation to trigger the system screenshot function.
  • a detailed example of a method for recognizing a touch operation is provided below in conjunction with FIG. 2.
  • FIG. 6 is another schematic flowchart of the method for recognizing a touch operation provided by the embodiment of the present application. Specifically, the process includes:
  • Multiple deformation sensors in the electronic device periodically report collected deformation data to the MCU, and the MCU stores the multiple deformation data in a data buffer.
  • in response to the first touch operation, the MCU acquires the touch data corresponding to the first touch operation, acquires a plurality of deformation data within the first period from the data buffer, and generates the deformation waveform data according to the deformation data.
  • the first period is the period between the waveform-selection start time and the waveform-selection end time, wherein a certain time before the occurrence of the touch operation is used as the waveform-selection start time, and a certain time after it is used as the waveform-selection end time.
  • the MCU selects the deformation amount waveform data having the largest difference between the peak and the trough in the waveform as the optimal deformation amount waveform data from the plurality of deformation amount waveform data.
  • the MCU performs feature extraction based on the built-in knuckle touch recognition model according to the touch data and the optimal deformation waveform data, extracts touch image features from the touch data, and extracts waveform features from the optimal deformation waveform data.
  • the MCU classifies and recognizes based on the touch image features and waveform features, and determines whether the first touch operation is a knuckle touch operation. If yes, execute S606; otherwise, execute S618.
  • the MCU determines the number of fingers used for the first touch operation; if it is a single finger or two fingers, execute S607; if it is multiple fingers, execute S618. It should be noted that the present application does not limit the timing of identifying the number of fingers.
  • the MCU determines whether the touchpad receives a second knuckle touch operation. If yes, perform S612; otherwise, perform S608 when the first touch operation uses one finger, and perform S610 when the first touch operation uses two fingers.
  • the MCU reports the single knuckle click event to the OS side application through the OS driver.
  • the OS side application invokes the OS API to trigger the volume setting function.
  • it should be noted that in the embodiment provided by this application, a single-knuckle click triggers the volume setting function; in other implementations, a single-knuckle click can also correspond to different functions, which is not limited in this application.
  • the MCU reports a two-knuckle click event to the OS side application through the OS driver.
  • the application on the OS side invokes the OS API to trigger the brightness setting function. It should be noted that in the embodiment provided by this application, a two-knuckle click triggers the brightness setting function; in other implementations, a two-knuckle click can also correspond to different functions, which is not limited in this application.
  • the MCU determines whether the touch positions of the two touch operations are smaller than the distance threshold, and whether the interval time is smaller than the time threshold; if yes, execute S613; otherwise, execute S618.
  • S613, the MCU, in combination with S606, determines the number of fingers used for the two knuckle touch operations; if both used a single finger, execute S614; if both used two fingers, execute S616; if more fingers were used (for example, three fingers) or the numbers of fingers used by the two touch operations differ, execute S618. It can be understood that here the number of fingers may be determined while determining whether each touch operation is a knuckle touch operation (e.g., determined in step S605).
  • the MCU reports a single-knuckle double-click event to the OS side application through the OS driver.
  • the OS side application calls the OS API to trigger the screenshot function. It should be noted that in the embodiment provided by this application, double-tapping with a single knuckle triggers the screenshot function. In other implementations, double-tapping with a single knuckle may also correspond to different functions, which is not limited in this application.
  • the MCU reports the double-knuckle double-click event to the OS side application through the OS driver.
  • the OS side application calls the OS API to trigger the screen recording function.
  • It should be noted that in the embodiment provided by this application, double-tapping with two knuckles triggers the screen recording function. In other implementations, double-tapping with two knuckles may also correspond to a different function, which is not limited in this application.
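The single/double knuckle click dispatch walked through in S607–S618 above can be sketched as a small classifier. The following Python sketch is illustrative only — the threshold values, the event names, and the `KnuckleTap` structure are assumptions for the example, not values from this application:

```python
from dataclasses import dataclass
from typing import Optional

TIME_THRESHOLD_S = 0.5   # assumed max interval between the two taps of a double-click
DIST_THRESHOLD = 50.0    # assumed max distance between the two tap positions

@dataclass(frozen=True)
class KnuckleTap:
    t: float        # timestamp of the tap
    x: float
    y: float
    fingers: int    # number of knuckles used (1 or 2)

def classify(first: KnuckleTap, second: Optional[KnuckleTap]) -> str:
    """Map one or two knuckle taps to a hypothetical event name."""
    if second is None:
        # A lone tap: single-knuckle click (e.g. volume) or two-knuckle click
        # (e.g. brightness) depending on the finger count.
        return {1: "single_knuckle_click",
                2: "double_knuckle_click"}.get(first.fingers, "ignored")
    close_in_time = (second.t - first.t) < TIME_THRESHOLD_S
    close_in_space = ((second.x - first.x) ** 2 +
                      (second.y - first.y) ** 2) ** 0.5 < DIST_THRESHOLD
    # Both taps must be close in time and position and use the same finger count.
    if close_in_time and close_in_space and first.fingers == second.fingers:
        return {1: "single_knuckle_double_click",   # e.g. screenshot
                2: "double_knuckle_double_click"}.get(first.fingers, "ignored")
    return "ignored"
```

Taps that fail either threshold, or that mix finger counts, fall through to "ignored", mirroring the branch to S618 above.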
  • FIG. 7 is a schematic diagram of an architecture of edge sliding operation recognition provided by an embodiment of the present application.
  • the system architecture includes a hardware layer, a kernel layer, and an application layer sequentially from bottom to top.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least an OS driver.
  • the OS driver is a special program that enables the system to communicate with the hardware device, and provides an interface for the hardware device to the operating system.
  • the hardware layer may include a touchpad, a vibrating device connected to the touchpad, and an MCU, and the MCU is respectively connected to the touchpad and at least one vibrating device.
  • when the electronic device is a notebook computer, the notebook computer includes a host module and a display module, and the touchpad is a sensor on the host module that collects touch data.
  • the touch sensor and the display screen form a touch screen, also called a "touch-control screen", which can not only collect touch data but also provide visual output.
  • This application takes a notebook computer as an example of the electronic device. If the solution of this application is applied to a tablet computer or a mobile phone, the implementation described for the touchpad can also be applied to the touch screen; for specific implementations, refer to the implementation on the notebook computer.
  • the user experience is improved by adding a vibrating device to make the user feel that the function has been triggered.
  • At least one vibration device can be arranged in multiple areas of the touchpad. For example, the at least one vibration device is evenly distributed under the touchpad, or is distributed under the touchpad in a centrally symmetric manner. This application does not limit the layout of the at least one vibration device.
  • the vibrating device is used to output vibration prompts.
  • the MCU can be used to collect touch data of each touch point when the user performs a sliding operation on a specific edge area of the touchpad.
  • the MCU can also be used to identify edge sliding operations. After the MCU completes recognition of a touch operation, if the touch operation is an edge sliding touch operation, the MCU reports an edge touch event to the OS side application through the OS driver.
  • the application layer can include a series of OS-side applications and OS.
  • the OS includes various system application program interfaces.
  • OS-side applications are pre-installed applications of the OS.
  • OS-side applications can trigger functions related to edge sliding operations by calling system APIs, and at the same time trigger the vibration device to output a vibration prompt; the functions corresponding to the edge sliding operation can be customized through the OS side application.
  • the application on the OS side can provide a user interface to the user, so that the user can define the function corresponding to the edge sliding operation on the user interface.
  • FIG. 8 is a schematic flowchart of a method for identifying a touch operation provided by an embodiment of the present application.
  • the method can be applied to an electronic device with a touch panel, and the touch panel is provided with a vibrating device. As shown in Figure 8, the process of the method includes:
  • the electronic device receives a third touch operation acting on a touchpad.
  • the electronic device identifies whether the third touch operation is an edge sliding touch operation. If yes, execute S803; if not, execute S804.
  • In step S802, the electronic device collects touch data in response to the third touch operation, where the touch data may include the coordinate data of each touch point on the touchpad and the interval time between any two adjacent touch points; the electronic device then determines according to the touch data whether the third touch operation is a sliding touch operation. For example, if the interval time between any two adjacent touch points corresponding to the third touch operation is greater than the time threshold, the electronic device recognizes that the third touch operation is not a sliding touch operation, judges that it is not an edge sliding touch operation but a false touch operation, and does nothing.
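The interval-time check in S802 amounts to a one-line predicate; a minimal sketch, where the threshold value is an assumed figure for illustration:

```python
def is_sliding(intervals, time_threshold=0.05):
    """A slide requires consecutive touch points to arrive close together.

    intervals: time gaps (seconds) between adjacent touch points.
    A single gap above the threshold means the points belong to separate
    touches, so the operation is not a continuous slide.
    """
    return all(dt <= time_threshold for dt in intervals)
```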
  • A possible implementation for identifying whether the third touch operation is an edge sliding touch operation is as follows: according to the coordinate data of the touch points, the electronic device calculates the angle between a straight line fitted to the coordinate data of the touch points and the first direction, and the distance difference between the projection of each touch point in the second direction and the projection of the starting point of the third touch operation in the second direction; it then identifies, according to the angle and the distance differences corresponding to the touch points, whether the third touch operation is an edge sliding touch operation. Here, the first direction is the direction along the first side of the touchpad, the first side is the side where the touch position of the third touch operation is located, and the second direction is the direction perpendicular to the first side.
  • the user's finger slides on a specific edge area of the first side of the touchpad (for example, the upper edge sliding area shown in Figure 1C), forming a sliding track from the sliding start point to the sliding end point, and the sliding direction is a direction along the first side.
  • the electronic device collects the coordinate data of the 5 touch points on the touch panel in the plane coordinate system where the touch panel is located. The coordinate data of these five touch points are fitted with a straight line, and then the angle between the fitted straight line and the first direction is calculated.
  • the first direction is the sliding direction in Figure 9, which is parallel to the first side of the touchpad and perpendicular to the second side of the touchpad.
  • the maximum value of the distance difference is the maximum distance t corresponding to the second touch point.
  • If the angle and the distance differences corresponding to the touch points satisfy the preset conditions, the third touch operation is identified as an edge sliding touch operation; the preset conditions include: the angle is less than or equal to the angle threshold, and the maximum value of the distance differences is less than or equal to the difference threshold. Otherwise, the third touch operation is not an edge sliding touch operation.
  • the edge sliding touch operation is recognized based on the coordinate data of each touch point of the third touch operation, which can reduce the false touch rate and improve the recognition rate.
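The recognition rule described above — fit a line through the touch points, compare its angle against the first direction, and bound each point's offset in the second direction — can be sketched as follows. The coordinate convention (first direction = x axis, second direction = y axis) and the threshold values are assumptions for illustration, not figures from this application:

```python
import math

def is_edge_slide(points, angle_threshold_deg=10.0, diff_threshold=3.0):
    """Geometric test for an edge slide along the x axis.

    points: (x, y) coordinates of the touch points, start point first.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    n = len(points)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    denom = sum((x - mean_x) ** 2 for x in xs)
    if denom == 0:
        return False  # track is perpendicular to the side: not an edge slide
    # Least-squares fit y = slope*x + b through the touch points; the slope
    # measures deviation from the first (x) direction.
    slope = sum((x - mean_x) * (y - mean_y) for x, y in points) / denom
    angle_deg = abs(math.degrees(math.atan(slope)))
    # Offset of each point's second-direction projection from the start point's.
    max_diff = max(abs(y - ys[0]) for y in ys)
    return angle_deg <= angle_threshold_deg and max_diff <= diff_threshold
```

A nearly horizontal track with small wobble passes both tests; a diagonal track fails the angle test even if it starts in the edge area.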
  • the above touch data may also include the touch area.
  • the MCU can also first determine whether the touch area corresponding to the third touch operation is less than or equal to the preset threshold. If so, it continues to identify the third touch operation according to the angle and the distance differences; if not, it identifies the third touch operation as a false touch operation. For example, when the user's palm touches the touchpad, the palm touch area is relatively large; in this case, the operation can be identified as a false touch by calculating the touch area of the touch operation.
  • the electronic device triggers the second response function corresponding to the edge sliding touch operation, and outputs a vibration prompt through the vibration device.
  • the second response function corresponding to the edge sliding touch operation in S803 above may be executed in the following manner: after recognizing that the third touch operation is an edge sliding touch operation, the electronic device may trigger the corresponding response function based on the sliding movement distance m or the moving speed of the edge sliding touch operation in the first direction. As shown in FIG. 10, the sliding distance may be the distance m between the projection of the sliding end point and the projection of the sliding start point of the edge sliding touch operation in the first direction.
  • For example, when the sliding distance corresponding to the edge sliding touch operation is 0.5 cm, one edge touch event is triggered; when the sliding distance corresponding to the edge sliding touch operation is 1 cm, two edge touch events are triggered.
  • For another example, for two edge sliding touch operations with the same sliding distance of 0.5 cm, the edge sliding touch operation with a fast moving speed increases the volume by 2 dB, and the edge sliding touch operation with a slow moving speed increases the volume by 1 dB.
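The distance and speed examples above can be condensed into a small mapping. Treating speed as a fast/slow flag is a simplification of the example; the 0.5 cm step and the 1 dB / 2 dB increments follow the text:

```python
def edge_slide_response(slide_distance_cm, fast=False,
                        step_cm=0.5, slow_db=1, fast_db=2):
    """Map an edge slide to (number of edge touch events, volume change in dB).

    One edge touch event is generated per step_cm of travel in the first
    direction; each event changes the volume by fast_db or slow_db depending
    on the moving speed.
    """
    events = int(slide_distance_cm / step_cm)   # 0.5 cm -> 1 event, 1 cm -> 2
    return events, events * (fast_db if fast else slow_db)
```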
  • outputting a vibration prompt through a vibration device in S803 above may be implemented in the following manner: output a vibration prompt through all vibration devices whose distance to each touch point corresponding to the third touch operation is smaller than a distance threshold. For example, taking the distance threshold as 1 cm as an example, all vibration devices within a distance of 1 cm from each touch point may output vibration prompts.
  • the output of the vibration reminder through the vibration device in S803 above may be implemented in the following manner: the electronic device outputs the vibration reminder through the k vibration devices closest to each touch point corresponding to the third touch operation, k is a positive integer.
  • the electronic device can also determine the vibration signal values corresponding to the k vibration devices closest to the first touch point according to the distances between the first touch point and the k vibration devices and a first mapping relationship, where the first mapping relationship includes a mapping between distance and vibration signal value, and the first touch point is any one of the touch points corresponding to the third touch operation. In this way, the vibration intensities of the vibration signals output by the k vibration devices closest to the first touch point are consistent when the signals reach the first touch point.
  • the mapping relationship may be in the form of a list, or may be a function formula.
  • For example, there are two vibration devices around a touch point O, where the distance between vibration device 1 and touch point O is s1, and the distance between vibration device 2 and touch point O is s2; these two vibration devices are then the vibration devices closest to the touch point O.
  • According to the first mapping relationship, the vibration signal value x1 corresponding to s1 and the vibration signal value x2 corresponding to s2 can be determined; the vibration signal value x1 is then sent to vibration device 1, and the vibration signal value x2 is sent to vibration device 2.
  • the vibration device 1 outputs a vibration prompt according to the vibration signal value x1, and the vibration device 2 outputs a vibration prompt according to the vibration signal value x2. In this way, the vibration intensity of the two vibration devices felt by the user at the touch point O is consistent.
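The k-nearest-device scheme above can be sketched as follows. The linear attenuation model `intensity = signal / (1 + distance)` stands in for the first mapping relationship, which in practice could be a lookup table or a calibrated function; the model and the target intensity are assumptions for the example:

```python
def vibration_signals(touch_point, devices, k=2, target_intensity=1.0):
    """Choose the k vibration devices nearest the touch point and assign each a
    signal value so the felt intensity at the touch point is the same for all.

    touch_point: (x, y) of the touch point.
    devices: list of (x, y) positions of the vibration devices.
    Returns {device_position: signal_value}.
    """
    tx, ty = touch_point

    def dist(dev):
        return ((dev[0] - tx) ** 2 + (dev[1] - ty) ** 2) ** 0.5

    nearest = sorted(devices, key=dist)[:k]
    # Farther devices get larger signal values to compensate for attenuation,
    # so signal / (1 + distance) is the same target intensity for each device.
    return {dev: target_intensity * (1.0 + dist(dev)) for dev in nearest}
```

In the two-device example above, the device at distance s2 > s1 receives the larger signal value, so both vibrations feel equally strong at point O.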
  • When the edge sliding touch operation is recognized, the vibration device provided on the touchpad outputs vibration feedback to indicate to the user that the sliding function has been triggered, which can improve user experience.
  • the electronic device recognizes that the third touch operation is a false touch operation, and does not perform any processing.
  • the following provides a detailed example of another method for recognizing a touch operation with reference to FIG. 7 .
  • FIG. 12 is another schematic flowchart of the method for recognizing a touch operation provided by the embodiment of the present application. Specifically, the process includes:
  • the MCU acquires the touch data corresponding to the third touch operation, where the touch data includes the coordinate data of each touch point, the time interval between any two adjacent touch points, and the touch area.
  • the MCU determines whether the touch area corresponding to the third touch operation is less than or equal to the preset threshold, whether the time interval between any two adjacent touch points corresponding to the third touch operation is less than the time threshold, and whether the touch position is within the specified edge area; if yes, go to step S1203; otherwise, go to step S1211.
  • the MCU fits a straight line according to the coordinate data of each touch point corresponding to the third touch operation.
  • the MCU calculates an angle θ between the fitted line and the first direction.
  • the first direction is a direction along the first side of the touch panel, and the first side is the side where the touch position of the third touch operation is located.
  • the MCU calculates a distance difference between the projection point of each touch point in the second direction and the projection point of the start point of the third touch operation in the second direction, and determines a maximum value t of the distance difference.
  • the second direction is a direction perpendicular to the first side.
  • the MCU determines whether the angle θ is less than or equal to the angle threshold and the maximum distance difference t is less than or equal to the difference threshold. If yes, execute step S1207; otherwise, execute S1211.
  • the MCU recognizes that the third touch operation is an edge sliding touch operation, and reports an edge sliding touch event to the OS side application based on the sliding movement distance m or moving speed of the edge sliding touch operation in the first direction.
  • the OS side application calls the OS API to trigger a response function corresponding to the edge sliding touch operation.
  • Based on the distances between the current touch point and the vibration devices, the MCU selects the k vibration devices with the smallest distances and calculates the vibration signal values corresponding to these k vibration devices.
  • the MCU sends respective vibration signal values to the k vibration devices, so that the k vibration devices perform vibration prompts according to the received vibration signal values.
  • the MCU recognizes that the third touch operation is a false touch operation, and does not process it.
  • the methods provided in the embodiments of the present application are introduced from the perspective of an electronic device (such as a tablet computer) as an execution subject.
  • the electronic device may include a hardware structure and/or a software module, and realize the above-mentioned functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether one of the above-mentioned functions is executed in the form of a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraints of the technical solution.
  • FIG. 13 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 1300 can implement the steps performed by the electronic device in the above method embodiment.
  • the electronic device may include a receiving module 1301 , a first collection module 1302 , a second collection module 1303 , an identification module 1304 , and a response module 1305 .
  • the first collection module 1302 is used to collect deformation data
  • the second collection module 1303 is used to collect touch data
  • An identification module 1304, configured to determine that the first touch operation is a first knuckle touch operation according to the deformation data and the touch data;
  • the response module 1305 triggers a response function corresponding to the first knuckle touch operation.
  • the functions of the receiving module 1301 and the second acquisition module 1303 can be realized by the touch panel in the above method embodiment, and the function of the first acquisition module 1302 can be realized by at least one deformation sensor in the above method embodiment
  • the function of the identification module 1304 may be implemented by the MCU in the above method embodiment, and the function of the response module 1305 may be implemented by an application on the OS side.
  • the deformation data is non-complete period deformation waveform data.
  • After the identification module 1304 determines that the first touch operation is a first knuckle touch operation, and before the response module 1305 triggers the response function corresponding to the first knuckle touch operation, the identification module 1304 is further configured to determine whether the first knuckle touch operation is a single-knuckle operation or a double-knuckle operation. The response module 1305 is specifically configured to: trigger the response function corresponding to the single-knuckle operation when the first knuckle touch operation is determined to be a single-knuckle operation, or trigger the response function corresponding to the double-knuckle operation when the first knuckle touch operation is determined to be a double-knuckle operation.
  • the identification module 1304 is specifically configured to: generate at least one deformation amount waveform data according to the deformation amount data collected by at least one deformation amount sensor; determine the target waveform data from the at least one deformation amount waveform data, The difference between the peak and the trough of the target waveform data is greater than or equal to a preset threshold; according to the target waveform data and the touch data, it is determined that the first touch operation is the first knuckle touch operation.
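The target-waveform selection described here — keep only waveforms whose peak-to-trough difference meets the preset threshold — can be sketched in a few lines. Returning the strongest surviving waveform and the threshold value are assumptions for the example:

```python
def select_target_waveform(waveforms, preset_threshold=0.2):
    """From per-sensor deformation waveforms, discard those whose peak-to-trough
    difference is below the preset threshold and return the strongest remaining
    one, or None if no sensor recorded a knuckle-like deformation.

    waveforms: one list of deformation samples per deformation sensor.
    """
    candidates = [w for w in waveforms if max(w) - min(w) >= preset_threshold]
    if not candidates:
        return None
    return max(candidates, key=lambda w: max(w) - min(w))
```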
  • the recognition module 1304 is specifically configured to: determine the first touch operation as the first knuckle touch operation according to the characteristics of the deformation data, the characteristics of the touch data, and the pre-trained recognition model, and pre-train The recognition model is trained according to the characteristics of the deformation data corresponding to the knuckle touch operation and the characteristics of the touch data collected in advance.
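The model-based variant can be illustrated with a toy linear scorer. The features, weights, bias, and decision rule below are placeholders standing in for the pre-trained recognition model, not the application's actual model; a real system would load parameters learned from pre-collected knuckle-touch data:

```python
def extract_features(deformation, touch_area, num_points):
    """Toy feature vector: peak-to-trough deformation, touch area, touch point
    count. The feature choice is an assumption for illustration only."""
    return [max(deformation) - min(deformation), touch_area, float(num_points)]

def knuckle_score(features, weights=(4.0, -0.5, 0.1), bias=-0.5):
    """Linear scorer standing in for the pre-trained recognition model.
    Knuckle taps show sharp deformation and a small contact area, so the
    deformation weight is positive and the area weight negative."""
    return sum(w * f for w, f in zip(weights, features)) + bias

def is_knuckle(deformation, touch_area, num_points):
    """Decide knuckle vs. pad touch from deformation and touch data features."""
    return knuckle_score(extract_features(deformation, touch_area, num_points)) > 0
```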
  • the receiving module 1301 is further configured to receive a second touch operation acting on the touch panel;
  • the identification module 1304 is further configured to: if the first touch operation and the second touch operation satisfy preset conditions, determine that the first touch operation and the second touch operation constitute a knuckle double-click operation, and trigger the response function corresponding to the knuckle double-click operation. The preset conditions include: the second touch operation is a knuckle touch operation; the interval between the time when the first touch operation acts on the touchpad and the time when the second touch operation acts on the touchpad is less than the time threshold; and the distance between the position of the first touch operation and the position of the second touch operation is less than the distance threshold.
  • the electronic device 1300 further includes a display module 1306, configured to display a user interface, so that the user can set a response function corresponding to the first knuckle touch operation through the user interface.
  • the function of the display module 1306 can be implemented by a display screen.
  • FIG. 14 is a schematic diagram of another electronic device provided by the embodiment of the present application.
  • the electronic device 1400 can implement the steps performed by the electronic device in the above method embodiment.
  • the electronic device may include a receiving module 1401 , an identification module 1402 and a response module 1403 .
  • a receiving module 1401, configured to receive a third touch operation
  • An identification module 1402 configured to identify the third touch operation as an edge sliding touch operation
  • the response module 1403 is configured to trigger a response function corresponding to an edge sliding touch operation, and output a vibration prompt through a vibration device.
  • the functions of the receiving module 1401 and the collection module 1405 can be implemented by the touchpad in the above method embodiment; the functions of the identification module 1402 and the processing module 1404 can be implemented by the MCU in the above method embodiment; and the function of the response module 1403 can be implemented by the OS side application in the above method embodiment.
  • the response module 1403 is specifically configured to output vibration prompts through k vibration devices closest to each touch point corresponding to the edge sliding touch operation, where k is a positive integer.
  • the electronic device 1400 further includes a processing module 1404, specifically configured to determine, according to the distances between the first touch point and the k vibration devices and the first mapping relationship, the vibration signal values corresponding to the k vibration devices closest to the first touch point, where the first mapping relationship includes the mapping between distance and vibration signal value, and the first touch point is any one of the touch points corresponding to the third touch operation; and a response module 1403, specifically configured to output a vibration prompt according to the vibration signal values corresponding to the k vibration devices.
  • the vibration signals output by the k vibration devices closest to the first touch point have the same vibration intensity when they reach the first touch point.
  • the electronic device 1400 further includes a collection module 1405, configured to collect touch data, and a processing module 1404, specifically configured to: in response to the third touch operation, obtain the touch data collected by the touchpad; after determining according to the touch data that the third touch operation is a sliding touch operation and that the sliding touch operation acts on the edge area of the touchpad, calculate, according to the coordinate data of the touch points, the angle between a straight line fitted to the coordinate data of the touch points and the first direction, and the distance difference between the projection of each touch point in the second direction and the projection of the starting point of the third touch operation in the second direction, where the first direction is the direction along the first side of the touchpad, the first side is the side where the touch position of the third touch operation is located, and the second direction is the direction perpendicular to the first side. The identification module 1402 is specifically configured to identify the third touch operation as an edge sliding touch operation if the angle and the distance differences corresponding to the touch points satisfy the preset conditions.
  • the preset condition includes: the included angle is less than or equal to the included angle threshold, and the maximum value of the distance difference among the distance differences corresponding to each touch point is less than or equal to the difference threshold.
  • the electronic device 1400 further includes a display module 1406, configured to display a user interface, so that the user can set a response function corresponding to an edge sliding touch operation through the user interface.
  • the function of the display module 1406 may be implemented by a display screen.
  • When implemented by hardware, reference may be made to FIG. 1D and the related descriptions for the hardware structure of the electronic device.
  • the electronic device includes a processor, a memory, a touchpad, and at least one deformation sensor; the touchpad is used to receive touch operations and collect touch data; the deformation sensor is used to collect deformation data; and the memory is used to store one or more computer programs which, when executed by the processor, cause the electronic device to execute the method in any of the above embodiments.
  • the electronic device includes a processor, a memory, a touch panel, and at least one vibration device; the touch panel is used to receive touch operations and collect touch data; the vibration device is used to output vibration signals;
  • the memory is used to store one or more computer programs, and when the one or more computer programs are executed by the processor, the electronic device executes the method in any one of the above embodiments.
  • the embodiment of the present application also provides a computer storage medium, the computer storage medium stores computer instructions, and when the computer instructions are run on the electronic device, the electronic device executes the steps of the above-mentioned related methods to implement the methods in the above-mentioned embodiments.
  • An embodiment of the present application further provides a computer program product, which, when running on a computer, causes the computer to execute the above-mentioned related steps, so as to implement the methods in the above-mentioned embodiments.
  • an embodiment of the present application also provides a device, which may specifically be a chip, a component or a module, and the device may include a connected processor and a memory; wherein the memory is used to store computer-executable instructions, and when the device is running, The processor can execute the computer-executable instructions stored in the memory, so that the chip executes the methods in the foregoing method embodiments.
  • the electronic device, computer storage medium, computer program product, or chip provided in the embodiments of the present application is used to execute the corresponding method provided above; therefore, for the beneficial effects that it can achieve, refer to the beneficial effects of the corresponding method provided above, and details are not repeated here.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the terms “when” or “after” may be interpreted to mean “if” or “after” or “in response to determining" or “in response to detecting ".
  • the phrases “in determining” or “if detected (a stated condition or event)” may be interpreted to mean “if determining" or “in response to determining" or “on detecting (a stated condition or event)” or “in response to detecting (a stated condition or event)”.
  • relational terms such as first and second are used to distinguish one entity from another, without limiting any actual relationship and order between these entities.
  • references to "one embodiment” or “some embodiments” or the like in this specification means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, all or part of the processes or functions according to the embodiments of the present invention will be generated.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device including a server, a data center, and the like integrated with one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, DVD), or a semiconductor medium (for example, a Solid State Disk (SSD)).

Abstract

Embodiments of the present application provide a method for recognizing a touch operation and an electronic device. The method is applied to an electronic device, and the electronic device includes a touchpad and at least one deformation sensor. The method includes: receiving a first touch operation acting on the touchpad; obtaining deformation data collected by the at least one deformation sensor and touch data collected by the touchpad; and, after determining according to the deformation data and the touch data that the first touch operation is a first knuckle touch operation, triggering a response function corresponding to the first knuckle touch operation. By recognizing a knuckle touch operation based on the touch data collected by the touchpad and the deformation data collected by the newly added deformation sensor, various interferences (for example, interference from the state of the electronic device itself and the surrounding environment) can be prevented from affecting the validity of the data required for recognizing knuckle touch operations, thereby improving the recognition rate, reducing the false touch rate, and improving user experience.

Description

一种识别触摸操作的方法及电子设备
相关申请的交叉引用
本申请要求在2021年09月29日提交中国专利局、申请号为202111153203.0、申请名称为“一种识别触摸操作的方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子技术领域,尤其涉及一种识别触摸操作的方法及电子设备。
背景技术
触摸板是用户使用电子设备(例如笔记本电脑)时与电子设备进行交互的一种重要手段,现阶段用户主要以单个手指或者多个手指对触摸板进行点击、按压、滑动等操作来实现不同的功能,例如,用单个手指单击触摸板模拟鼠标左键的点击操作,又例如,用两个手指点击触摸板模拟鼠标右键的点击操作,又例如,用两个手指在触摸板上下滑动以实现鼠标滚轮功能,又例如,用两个手指在触摸板上双指捏合或者张开实现放大和缩小屏幕功能。目前,越来越多的电子设备或者三方软件支持多种手势功能的自定义,但是这些手势都是通过单个手指或者两个手指的点击、滑动等来进行操作,用户想实现更多的功能时,就需要通过更复杂的手势来实现,如三指或者四指操作触摸板。实现这些手势操作时,用户操作比较困难,手势操作记忆起来也比较麻烦,用户体验并不好。
目前,通过引入新的触摸方式来简化触摸板的使用,一种触摸方式为通过指关节操作触摸板,来实现不同的手势功能,又一种触摸方式为用户在触摸板边缘的特定区域进行滑动操作触发对应的功能,但是这些新的触摸方式在使用过程中存在很多缺点,例如指关节触摸方式的识别率低,又例如,用户在边缘滑动触摸操作时无法判断边缘滑动功能是否触发,所以,用户体验并不好。
发明内容
本申请实施例提供一种识别触摸操作的方法及电子设备,有助于提升触摸操作识别的用户体验。
第一方面,本申请实施例提供一种识别触摸操作的方法,应用于电子设备,且电子设备包括触摸板和至少一个形变量传感器,该方法包括:接收作用于触摸板的第一触摸操作,获取至少一个形变量传感器采集的形变量数据以及触摸板采集的触摸数据,根据形变量数据以及触摸数据确定第一触摸操作为第一指关节触摸操作后,触发第一指关节触摸操作对应的响应功能。其中,触摸数据例如可以为第一触摸操作对应的触摸位置的坐标数据以及第一触摸操作对应的触摸图像数据。
本申请实施例中,根据触摸板采集的触摸数据,以及新增的形变量传感器采集形变量数据,识别指关节触摸操作,可以避免各种干扰(例如电子设备自身状态以及周围环境的干扰)影响到用于识别指关节触摸操作所需数据的有效性,从而提升识别率,降低误触率, 提升用户体验。
在一种可能的实现方式中,形变量数据为非完整周期的形变量波形数据,如此,可以根据采集到的非完整周期的形变量波形数据和触摸数据,而不用等待第一触摸操作的整个生命周期结束(至少要等10秒)之后才识别触摸操作,可以加快从触摸操作触摸发生时间至识别出第一触摸操作是否为指关节触摸操作的过程,从而快速响应第一触摸操作对应的功能。
在一种可能的实现方式中,在确定第一触摸操作为第一指关节触摸操作后,触发第一指关节触摸操作对应的响应功能之前,该方法还可以包括:判断第一指关节触摸操作为单指关节操作还是双指关节操作;触发第一指关节触摸操作对应的响应功能,可以包括:当判断第一指关节触摸操作为单指关节操作时,触发单指关节操作对应的响应功能;或者,当判断第一指关节触摸操作为双指关节操作时,触发双指关节操作对应的响应功能。通过该实现方式,可以实现根据第一指关节触摸操作所使用的手指数量,为单指关节操作和双指关节操作设置不同的响应功能,从而为用户提供更多的手势功能,可以提升用户体验。
在一种可能的实现方式中,根据形变量数据以及触摸数据确定第一触摸操作为第一指关节触摸操作,包括:根据至少一个形变量传感器采集的形变量数据生成至少一个形变量波形数据;从至少一个形变量波形数据中确定出目标波形数据,目标波形数据的波峰与波谷的差值大于或等于预设阈值;根据目标波形数据与触摸数据确定第一触摸操作为第一指关节触摸操作。通过该实现方式,确定出的目标波形数据较为接近第一触摸操作对触摸板产生的实际形变量,所以根据该目标波形数据和触摸数据进行指关节触摸操作识别,可以提升触摸板上不同区域的指关节触摸操作识别的有效性和一致性,提升用户体验。
在一种可能的实现方式中,根据形变量数据以及触摸数据确定第一触摸操作为第一指关节触摸操作,可以包括:根据形变量数据的特征、触摸数据的特征、以及预训练的识别模型确定第一触摸操作为第一指关节触摸操作,预训练的识别模型为根据预先采集的指关节触摸操作对应的形变量数据的特征和触摸数据的特征训练得到的。
在一种可能的实现方式中,该方法还包括:接收作用于触摸板的第二触摸操作;若第一触摸操作与第二触摸操作满足预设条件,则确定第一触摸操作与第二触摸操作为指关节双击操作;触发指关节双击操作对应的响应功能;预设条件包括以下内容:第二触摸操作为指关节触摸操作;第一触摸操作作用于触摸板的时间与第二触摸操作作用于触摸板的时间之间的间隔小于时间阈值;第一触摸操作的位置与第二触摸操作的位置之间的距离小于距离阈值。通过该实现方式,可以准确识别出指关节双击操作。
在一种可能的实现方式中,该方法还包括:显示用户界面,通过用户界面设置不同指关节触摸操作对应的响应功能。如此,有助于用户在用户界面上自行定义不同指关节触摸操作对应的功能。
第二方面,本申请实施例提供一种识别触摸操作的方法,应用于电子设备,且电子设备包括触摸板和震动器件,该方法包括:接收作用于触摸板的第三触摸操作;识别第三触摸操作为边缘滑动触摸操作;触发边缘滑动触摸操作对应的响应功能,并通过震动器件输出震动提示。
本申请实施例中,通过添加震动器件,可以在电子设备识别出边缘滑动触摸操作后,输出震动提示,使得用户感知到功能已被触发,从而提升用户体验。
在一种可能的实现方式中,通过震动器件输出震动提示,包括:通过与边缘滑动触摸操作对应的每个触摸点距离最近的k个震动器件输出震动提示,k为正整数。
在一种可能的实现方式中,通过震动器件输出震动提示,包括:根据第一触摸点分别与距离最近的k个震动器件之间的距离、以及第一映射关系,确定与第一触摸点距离最近的k个震动器件对应的震动信号值,根据k个震动器件对应的震动信号值输出震动提示,其中第一映射关系包括距离与震动信号值之间的映射关系,第一触摸点为第三触摸操作对应的各个触摸点中的任意一个触摸点。通过该实现方式,k个震动器件都以自身到第一触摸点的距离远近输出对应的震动信号值,可以保证实现边缘滑动的手指触摸不同区域的震动强度感受的一致性。
在一种可能的实现方式中,与第一触摸点距离最近的k个震动器件输出的震动信号到达第一触摸点时的震动强度一致,这样用户手指在触摸点处感受到的k个震动器件的震动强度一致。
在一种可能的实现方式中,识别第三触摸操作为边缘滑动触摸操作,包括:响应于第三触摸操作,获取触摸板采集的触摸数据;在根据触摸数据确定第三触摸操作为滑动触摸操作、且滑动触摸操作作用于触摸板的边缘区域后,根据各触摸点的坐标数据,计算各触摸点的坐标数据的拟合直线与第一方向的夹角,以及每个触摸点在第二方向上的投影点与第三触摸操作的起始点在第二方向上的投影点之间的距离差值;第一方向为沿着触摸板第一侧边的方向,第一侧边为第三触摸操作的触摸位置所在的侧边,第二方向为与第一侧边垂直的方向;若夹角以及各个触摸点对应的距离差值满足预设条件,识别第三触摸操作为边缘滑动触摸操作。通过该实现方式,可以提升边缘滑动触摸操作的识别率。
在一种可能的实现方式中,预设条件包括:夹角小于或等于夹角阈值、且各个触摸点对应的距离差值中的距离差值最大值小于或等于差值阈值。
在一种可能的实现方式中,该方法还包括:显示用户界面,通过用户界面设置边缘滑动触摸操作对应的响应功能。如此,有助于用户在用户界面上自行定义边缘滑动触摸操作对应的功能。
第三方面,本申请实施例提供一种电子设备,包括处理器、存储器、触摸板、以及至少一个形变量传感器;触摸板用于接收触摸操作、以及采集触摸数据;形变量传感器用于采集形变量数据;存储器用于存储一个或多个计算机程序,一个或多个计算机程序被处理器执行时,使得电子设备执行如上述第一方面、以及第一方面的任意一种可能的实现方式的方法。
第四方面,本申请实施例提供一种电子设备,包括处理器、存储器、触摸板、以及至少一个震动器件;触摸板用于接收触摸操作、以及采集触摸数据;震动器件用于输出震动信号;存储器用于存储一个或多个计算机程序,一个或多个计算机程序被处理器执行时,使得电子设备执行如上述第一方面、以及第一方面的任意一种可能的实现方式的方法。
第五方面,本申请实施例还提供一种电子设备,该电子设备可以包括:接收模块、第一采集模块、第二采集模块、识别模块和响应模块;其中,接收模块,用于接收第一触摸操作;第一采集模块,用于采集形变量数据;第二采集模块,用于采集触摸数据;识别模块,用于根据形变量数据以及触摸数据确定第一触摸操作为第一指关节触摸操作;响应模块,用于触发第一指关节触摸操作对应的响应功能。
第六方面,本申请实施例还提供一种电子设备,该电子设备可以包括:接收模块、识别模块和响应模块;其中,接收模块,用于接收第三触摸操作;识别模块,用于识别第三触摸操作为边缘滑动触摸操作;响应模块,用于触发边缘滑动触摸操作对应的响应功能,并通过震动器件输出震动提示。
第七方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得所述电子设备执行如上述第一方面、以及第一方面的任意一种可能的实现方式的方法,或者执行如上述第二方面、以及第二方面的任意一种可能的实现方式的方法。
第八方面,本申请实施例还提供一种芯片,所述芯片与电子设备中的存储器耦合,用于调用存储器中存储的计算机程序并执行本申请实施例第一方面及其第一方面任一可能设计的技术方案,或者,用于调用存储器中存储的计算机程序并执行本申请实施例第二方面及其第二方面任一可能设计的技术方案,本申请实施例中“耦合”是指两个部件彼此直接或间接地结合。
第九方面,本申请实施例还提供一种计算机程序产品,当所述计算机程序产品在终端上运行时,使得所述终端执行上述任一方面的任意一种可能的设计的方法。
附图说明
图1A为本申请实施例提供的电子设备示意图;
图1B为本申请实施例提供的指关节触摸操作示意图;
图1C为本申请实施例提供的边缘滑动触摸操作示意图;
图1D为本申请实施例提供的电子设备的示意图;
图2为本申请实施例提供的指关节触摸操作识别的系统架构示意图;
图3为本申请实施例提供的识别触摸操作的方法的流程示意图;
图4为本申请实施例提供的形变量波形示意图;
图5为本申请实施例提供的图像特征数据;
图6为本申请实施例提供的识别触摸操作的方法的另一种流程示意图;
图7为本申请实施例提供的边缘滑动操作识别的架构示意图;
图8为本申请实施例提供的识别触摸操作的方法的流程示意图;
图9为本申请实施例提供的识别触摸操作的方法的流程示意图;
图10为本申请实施例提供的滑动触摸示意图;
图11为本申请实施例提供的滑动触摸场景示意图;
图12为本申请实施例提供的识别触摸操作的方法的另一种流程示意图;
图13为本申请实施例提供的电子设备的示意图;
图14为本申请实施例提供的电子设备的示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
触摸板是用户使用电子设备(例如笔记本电脑)时与电子设备进行交互的一种重要途径,以图1A所示的电子设备为例,用户可以在触摸板上进行触摸操作,并在显示屏上显示触摸操作对应的响应结果,图1A中仅为示例,并不限定触摸板的具体位置。目前引入了新的触摸方式(或称为交互方式)来简化触摸板的使用。一种触摸方式为通过指关节操作触摸板,来实现不同的手势功能。在这种触摸方式下,可以使用震动声学传感器获取数据进行指关节触摸识别,但是该方法容易受到设备自身震动或者周围环境产生的声音数据干扰,误报率较高。又或者,使用设备自身的加速度传感器采集的数据识别指关节触摸,识别率低。另一种触摸方式为用户在触摸板边缘的特定区域进行滑动操作触发对应的功能,但是,用户在边缘滑动触摸操作时无法判断边缘滑动功能是否触发,用户体验并不好。
基于上述问题,本申请实施例提供一种识别触摸操作的方法,该方法有助于提升触摸操作识别的用户体验。本申请公开的各个实施例可以应用于设置有触摸板的电子设备中。在本申请一些实施例中,电子设备可以是包含诸如个人数字助理和/或音乐播放器等功能的便携式电子设备,诸如手机、平板电脑、具备无线通讯功能的可穿戴设备(如智能手表)、车载设备等。便携式电子设备的示例性实施例包括但不限于搭载各类操作系统(原文此处以图片形式给出若干操作系统的商标)的便携式电子设备。上述便携式电子设备也可以是诸如具有触敏表面(例如触控面板)的膝上型计算机(Laptop)等。还应当理解的是,在本申请其他一些实施例中,上述电子设备也可以是具有触敏表面(例如触控面板)的台式计算机。
下面介绍本申请实施例提供的电子设备。
图1D示例性示出了一种电子设备100的结构示意图。
应理解,图示电子设备100仅是一个范例,并且电子设备100可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
如图1D所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
下面结合图1D对电子设备100的各个部件进行具体的介绍:
处理器110可以包括一个或多个处理单元,例如,处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。处理器110 中还可以设置存储器,用于存储指令和数据。
处理器110可以运行本申请实施例提供的识别触摸操作的方法,处理器可以响应于对触摸板的触摸操作,并触发第一触摸操作对应的响应功能。当处理器110集成不同的器件,比如集成CPU和GPU时,CPU和GPU可以配合执行本申请实施例提供的识别触摸操作的方法,比如识别触摸操作的方法中部分算法由CPU执行,另一部分算法由GPU执行,以得到较快的处理效率。
在一些实施例中,处理器110可以包括一个或多个接口。比如,接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与***设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193,和无线通信模块160等供电。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星***(global navigation satellite system,GNSS),调频(frequency modulation, FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194之下,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同,例如设置于触摸板。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如,时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
尽管图1D中未示出,电子设备100还可以包括触摸板、蓝牙装置、定位装置、闪光灯、微型投影装置、近场通信(near field communication,NFC)装置等,在此不予赘述。
下面结合附图介绍本申请实施例提供的技术方案。
本申请实施例提供的识别触摸操作的方法可以应用于多种应用场景,例如图1B所示的指关节触摸操作识别场景,又例如图1C所示的边缘滑动操作识别场景,下面结合应用场景对识别触摸操作的方法进行详细介绍。
应用场景一,指关节触摸操作识别场景。
图2为本申请实施例提供的识别指关节触摸操作的***架构示意图。
如图2所示,该系统架构由下向上依次包括硬件层、内核层以及应用层。
内核层是硬件和软件之间的层。内核层包含操作系统(operating system,OS)驱动,OS驱动为可以使系统和硬件层通信的特殊程序,为硬件层与操作系统的通信提供了接口。
硬件层可以包括触摸板、与触摸板连接的至少一个形变量传感器和微控制单元(microcontroller unit,MCU),MCU分别与触摸板、至少一个形变量传感器连接。在本申请实施例中,新增了形变量传感器用于识别指关节触摸。本申请涉及的形变量传感器可以为压电传感器、压电片、应变计、加速度传感器,也可以为能够实现采集形变量数据的其他传感器,本申请对此不作限制。在其它一些实施例中,形变量传感器也可以直接采集形变量波形数据。
其中,当电子设备为笔记本电脑时,笔记本电脑包括主机模组和显示模组,显示模组用于提供视觉输出,主机模组上具有触摸板(又称为触控板),触摸板具有触摸传感器,可以采集触摸数据。当电子设备为平板电脑或者手机时,触摸传感器与显示屏组成触摸屏,也称“触控屏”,既可以采集触摸数据,也可以提供视觉输出。在以下的实施例中,是以电子设备为笔记本电脑为例进行说明的,如果将本申请方案应用在平板电脑或者手机,触摸板的实现方式也可以应用于触摸屏上,具体实现方式参考笔记本电脑的实现方式。
形变量传感器为本申请实施例中的新增器件,用于获取用户对触摸板进行操作时采集触摸板形变量数据。形变量传感器可以部署于触摸板的下方,当部署多个形变量传感器时,可以将多个形变量传感器等间距均匀部署于触摸板下方,又或者采用轴对称方式部署于触摸板下方,又或者采用中心对称方式部署于触摸板下方,本申请对于多个形变量传感器或者单个形变量传感器的布局方式不作限制。
MCU可以用于对触摸数据和触摸板形变量数据进行采集和处理,例如将形变量数据处理为形变量波形数据。MCU还可以用于根据触摸数据和形变量数据进行指关节触摸操作的识别。在MCU完成对触摸操作的识别后,且触摸操作为指关节触摸操作,则将指关节事件通过OS驱动上报给OS侧应用。
应用层可以包括一系列OS侧应用和OS,OS包括各种系统应用程序接口,OS侧应用为OS的预装应用,OS侧应用可以通过调用系统应用程序接口(application programming interface,API)来触发指关节触摸操作对应的相关功能,指关节触摸操作对应的功能可通过OS侧应用进行自定义。OS侧应用可以提供用户界面给用户,以便于用户在用户界面上自行定义指关节触摸操作对应的功能。
下面结合实施例对本申请实施例提供的识别触摸操作的方法进行详细说明。
本申请实施例中,电子设备的触摸板可以接收至少一个触摸操作,比如在某些场景中,用户手指的指关节在短时间内两次接触触摸板时,会在触摸板上产生两个触摸操作,为了分别识别这两个触摸操作是否为指关节触摸操作,采用以下实施例中的识别触摸操作的方法分别针对这每个触摸操作进行识别。下面的实施例仅以识别一个触摸操作为例,并不限定触摸板上只接收到一个触摸操作。可以理解的是,这里的一个触摸操作指的是单指关节触摸屏幕一次,或者是双指关节一起同时触摸屏幕一次,又或者更多指关节同时触摸屏幕一次。
图3为本申请实施例提供的识别触摸操作的方法的流程示意图。该方法可以适用于具有触摸板的电子设备,触摸板设有形变量传感器。如图3所示,该方法包括以下步骤:
S301,电子设备接收作用于触摸板的第一触摸操作。
S302,电子设备响应于第一触摸操作,获取至少一个形变量传感器采集触摸板的形变量数据、以及触摸板采集的触摸数据。
电子设备中的每个形变量传感器可以周期性采集形变量数据,也可以持续采集形变量数据,存放于数据缓冲区,电子设备可以在接收到第一触摸操作时,从数据缓冲区中获取形变量数据,例如,可以使用第一时段内采集的形变量数据来识别指关节触摸操作。上述S302中,每个形变量传感器采集一个形变量数据,至少一个形变量传感器可以采集至少一个形变量数据,以电子设备设有5个形变量传感器为例,那么可以采集到5个形变量数据。电子设备可以根据形变量数据,生成形变量波形数据。
在一种实现方式中,使用第一时段内采集的形变量数据来识别指关节触摸操作,第一时段可以包括第二时段和第三时段,其中,第二时段包括从波形选取开始时间到第一触摸操作的触摸发生时间之间的时段,第三时段包括从第一触摸操作的触摸发生时间到波形选取结束时间之间的这段时间。如此,在选取形变量波形数据时,可能会选取到不包括第一触摸操作的整个生命周期的波形,例如,第一触摸操作的整个生命周期包括10秒,以第二时段为1秒,第三时段为5秒为例,本申请实施例选取第一触摸操作的触摸发生时间之前的1秒内的形变量波形以及触摸发生时间之后的5秒内的形变量波形识别触摸操作,可以在第一触摸操作开始后达到5秒时,就可以根据选取的形变量波形识别触摸操作,而不用等待第一触摸操作的整个生命周期结束(至少要等10秒)之后才能识别触摸操作,从而加快从触摸操作触摸发生时间至识别出第一触摸操作是否为指关节触摸操作的过程,从而快速响应第一触摸操作对应的功能。
下面具体地结合图4,示例性说明一种形变量传感器采集的形变量波形选取方式。
图4为本申请实施例提供的形变量波形示意图。如图4所示,从触摸发生时间向前偏移t1对应的时间作为波形选取开始时间,触摸发生时间之后的一个时间作为波形选取结束时间,选取时间长度t2的形变量波形。
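上述按时间窗截取形变量波形的过程,可以用如下Python代码示意(其中的函数名、采样率及各时间参数均为说明用的假设取值,并非本申请的实际实现):

```python
def select_window(samples, sample_rate, touch_time, t1, t2):
    """从缓冲区样本中截取形变量波形:
    以触摸发生时间向前偏移 t1 秒作为波形选取开始时间,共截取 t2 秒。
    samples 为按时间顺序排列的形变量采样值(假设从 0 时刻开始采集)。"""
    start = max(int((touch_time - t1) * sample_rate), 0)
    return samples[start:start + int(t2 * sample_rate)]

# 假设采样率 1000Hz,触摸发生在第 2 秒,向前偏移 1 秒、共截取 6 秒波形
buf = list(range(10_000))            # 模拟 10 秒的采样数据
win = select_window(buf, 1000, 2.0, 1.0, 6.0)
```

如此截取的窗口覆盖触摸发生前后的波形,而无需等待整个触摸生命周期结束。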
在另一种实现方式中,使用第一时段内采集的形变量数据来识别指关节触摸操作,第一时段可以包括第一触摸操作的生命周期对应的第四时段、第四时段之前的第五时段、以及第四时段之后的第六时段,第一触摸操作的生命周期为第一触摸操作从开始接触触摸板至离开触摸板的过程。在第一触摸操作的生命周期内,触摸板会产生形变量,而且形变量会持续变化,形成波形,每个形变量传感器都可以采集到形变量变化形成的形变量波形数据。本申请实施例中将第一触摸操作的生命周期对应的整个时段称为第四时段。
电子设备在获取形变量波形数据时,可以先确定波形选取开始时间和波形选取结束时间,波形选取开始时间可以为第一触摸操作的生命周期开始前的一个时间,这段时间可以称为第五时段,波形选取结束时间可以为第一触摸操作的生命周期结束后的一段时间,这段时间可以称为第六时段。也就是说,从波形选取开始时间至波形选取结束时间之间的时段称为第一时段,依次包括第五时段、第四时段和第六时段。
第一触摸操作作用于触摸板上时,触摸板可以采集到第一触摸操作在触摸板上的触摸数据,例如包括触摸位置的坐标数据和触摸图像数据,其中触摸图像数据如图5中(a)和(b)所示,用户手指触摸到触摸板上形成一个触摸区域,即黑色区域,通过计算该黑色区域的面积,即可知触摸操作形成的触摸区域的面积。应理解,图5中的黑色区域仅用于示例触摸区域,在现实操作过程中,手指接触触摸板时可以显示黑色区域,当然也可以为显示为其它颜色的触摸区域,也可以不显示黑色区域,即用户看不到触摸区域的变化。
S303,电子设备根据至少一个形变量数据和触摸数据,识别第一触摸操作是否为指关节触摸操作。若是,则执行S304;若否,则执行S305。
本申请实施例中,通过新增设置的形变量传感器,采集手指触摸时触摸板的形变量数据,以避免各种干扰(例如电子设备自身状态以及周围环境的干扰)影响到用于识别指关节触摸操作所需数据的有效性,从而提升识别率,降低误触率,提升用户体验。
在一种可能的实施方式中,电子设备可以从多个形变量传感器采集的多个形变量数据生成的形变量波形数据中,确定出至少一个较优的形变量波形数据,然后根据较优的形变量波形数据,识别第一触摸操作是否为指关节触摸操作。又或者,电子设备根据最优的形变量波形数据来识别第一触摸操作是否为指关节触摸操作。
在一种可能的实施方式中,电子设备从至少一个形变量波形数据中确定出较优的形变量波形数据,可以通过以下方式实现:电子设备计算至少一个形变量波形数据中每个形变量波形数据的波形中的波峰与波谷之间的形变量差值,电子设备将至少一个形变量波形数据中波形的波峰与波谷之间的形变量差值大于预设阈值的形变量波形数据,确定为较优的形变量波形数据。在另一种可能的实现方式中,将波形的波峰与波谷之间形变量差值最大的波形数据作为最优波形数据。形变量差值最大的形变量波形数据,最接近第一触摸操作对触摸板产生的实际形变量,根据该形变量差值最大的形变量波形数据进行识别,可以提升触摸板上不同区域的指关节触摸操作识别的有效性和一致性,提升用户体验。
下面结合图4,对计算任一个形变量波形数据的波形中的波峰与波谷之间的形变量差值进行介绍。
如图4所示的第一时段的形变量波形数据的波形,并不是一个完整周期的波形,即图4中的时间长度t2内的波形,其波峰为L1,波谷为L2,该形变量波形数据在第一时段的波形中的波峰与波谷之间的形变量差值为L1-L2。
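从多路形变量波形中按"波峰与波谷差值"选取最优波形的逻辑,可以用如下Python代码示意(数据与阈值均为假设取值,仅为说明计算方式):

```python
def peak_trough_diff(waveform):
    """计算一段(可以是非完整周期的)波形的波峰与波谷之间的形变量差值。"""
    return max(waveform) - min(waveform)

def pick_best_waveform(waveforms, threshold):
    """返回差值最大且不低于预设阈值的波形;若都低于阈值则返回 None。"""
    best = max(waveforms, key=peak_trough_diff)
    return best if peak_trough_diff(best) >= threshold else None

w1 = [0.0, 0.3, -0.2, 0.1]   # 波峰与波谷差值 0.5
w2 = [0.0, 1.2, -0.8, 0.2]   # 波峰与波谷差值 2.0,最接近实际形变量
best = pick_best_waveform([w1, w2], threshold=1.0)
```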
在一种可能的实施方式中,上述根据较优的(或最优的)形变量波形数据和触摸数据,识别第一触摸操作是否为指关节触摸操作,可以通过以下方式实现:电子设备从较优的(或最优的)形变量波形数据中提取波形特征,例如,该波形特征可以包括但不限于如图5中(c)和(d)所示的波形振幅和波形的频率等特征,其中,图5中(c)的形变量波形的振幅较大、波形较窄,图5中(d)所示的形变量波形振幅较小、波形较宽,本申请实施例中,形变量波形越窄的波形,在波形的一个振动周期内,其振动频率越大,振动频率为振动周期的倒数;形变量波形越宽,在波形的一个振动周期内,其振动频率越小。电子设备从触摸数据中提取触摸图像特征,触摸图像特征可以包括但不限于触摸位置的坐标、触摸区域的面积,然后,电子设备可以根据波形特征、触摸图像特征以及指关节触摸识别模型,识别第一触摸操作是否为指关节触摸操作,指关节触摸识别模型为根据预先采集的指关节触摸操作对应的波形特征和触摸图像特征训练得到的。其中,指关节触摸识别模型为可以实现本申请识别指关节触摸操作的任意模型,例如可以为神经网络模型,本申请对此不作限定。
在一个示例中,波形特征中振幅大于或等于振幅阈值、振动频率大于或等于频率阈值、且触摸图像特征中触摸区域的面积小于或等于面积阈值,电子设备识别第一触摸操作为指关节触摸操作;波形特征中振幅小于振幅阈值、振动频率小于频率阈值、且触摸图像特征中触摸区域的面积大于面积阈值,则电子设备识别第一触摸操作为非指关节触摸操作。
图5为本申请实施例提供的指关节触摸操作与非指关节触摸操作的特征差异的示意图。如图5中(a)所示的指关节触摸操作的触摸区域,其对应的形变量波形的振幅大于或等于振幅阈值、振动频率大于或等于频率阈值,触摸区域的面积小,如图5中(b)所示的非指关节触摸操作的触摸区域,其对应的形变量波形的振幅小于振幅阈值、振动频率小于频率阈值,触摸区域的面积大。
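上文给出的阈值判决示例可以写成如下示意函数(各阈值数值均为假设取值;实际实现中该分类由预训练的指关节触摸识别模型完成,此处仅示意判决规则本身):

```python
def is_knuckle_touch(amplitude, frequency, touch_area,
                     amp_th=1.0, freq_th=50.0, area_th=0.8):
    """按示例规则判断:波形振幅、振动频率均不低于阈值,
    且触摸区域面积不超过阈值时,识别为指关节触摸操作。"""
    return amplitude >= amp_th and frequency >= freq_th and touch_area <= area_th

knuckle = is_knuckle_touch(amplitude=1.5, frequency=80.0, touch_area=0.5)
pad = is_knuckle_touch(amplitude=0.4, frequency=20.0, touch_area=1.6)
```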
S304,电子设备识别第一触摸操作为指关节触摸操作,触发第一触摸操作对应的第一响应功能。
其中,第一响应功能可以为***预设或用户自定义。例如,识别出来是指关节触摸操作以后,就弹出设置界面,用于设置屏幕亮度或者音量大小等等。以第一响应功能为设置音量功能为例,电子设备可以在识别第一触摸操作为指关节触摸操作后,触发设置音量功能,也就是说,无论是单指还是双指实现的指关节触摸操作,该指关节触摸操作对应的响应功能都设置为同一功能,这种情况下可以不用判断第一触摸操作使用的手指数量。
上述图3所示的实施例,可以适用于单指的单次指关节触摸操作,也适用于多指(例如两个手指)的单次指关节触摸操作,以两个手指的单次指关节触摸操作为例,这两个手指的指关节同时触摸到触摸板,或者,这两个手指的指关节触摸到触摸板的时间差很短,可以忽略不计。在实际应用中,可以设置该时间差的具体数值,例如设置为1ms,本申请实施例对于时间差的具体数值不作限定。举个例子,用户食指和中指一起敲击触摸板,食指和中指敲击触摸板的时间差小于1ms,即认为食指和中指实现了一次双指关节单击操作。
在其它一些实施例中,可以为单指指关节单击操作和双指指关节单击操作设置不同的响应功能,电子设备可以在每一次识别出触摸操作为指关节触摸操作时,根据触摸图像数据判断用于实现指关节触摸操作所使用的手指数量,以第一触摸操作识别为指关节触摸操作为例,如果是单个手指实现该第一触摸操作,即为单指关节单击操作,如果是两个手指实现该第一触摸操作,即为双指关节单击操作。在其它实施例中,也可以是识别出两次甚至多次连续的指关节触摸操作之后,判断指关节触摸操作使用的手指数量。本申请对于实现指关节触摸操作的手指数量的判断时机不作限定。
S305,不作任何处理。
上述实施例中,提供了识别一个触摸操作(即第一触摸操作)是否为指关节触摸操作的具体实现方式,在其它一些实施例中,在电子设备的触摸板接收第一触摸操作之后,在预设时长内可能会接收到其它的触摸操作,例如触摸板接收到第二触摸操作,此时,可以判断这两个触摸操作是否为指关节双击操作,即使用指关节连续快速地敲击两下触摸板。在实际应用中,可以设置该预设时长的具体数值,例如设置为10ms,本申请实施例对于预设时长的具体数值不作限定。
下面介绍如何确定第一触摸操作和第二触摸操作是否为指关节双击操作。
一种可能的实施方式中,若第一触摸操作与第二触摸操作满足第一预设条件,则确定第一触摸操作与第二触摸操作为指关节双击操作,其中,第一预设条件包括以下三项内容:第一项,第一触摸操作与第二触摸操作均为指关节触摸操作;第二项,第一触摸操作作用于触摸板的时间与第二触摸操作作用于触摸板的时间之间的间隔小于时间阈值;第三项,第一触摸操作的位置与第二触摸操作的位置之间的距离小于距离阈值。
若第一触摸操作与第二触摸操作不满足第一预设条件,即不满足第一预设条件中的任一项或多项内容,则确定第一触摸操作与第二触摸操作为非指关节双击操作。在一种可能的实现方式中,先分别识别第一触摸操作与第二触摸操作是否为指关节触摸操作,在识别出第一触摸操作与第二触摸操作均为指关节触摸操作之后,然后再判断每个触摸操作使用的手指数量,例如,若每个触摸操作都使用单个手指,确定为单指关节双击操作,若每个触摸操作使用两个手指,确定为双指关节双击操作。在另一种可能的实现方式中,在判断每次触摸操作是否为指关节触摸操作的同时就直接判断该次触摸操作的手指数量。即本申请对于判断是双指关节还是单指关节的时机不做具体限定。下面对指关节双击操作的响应功能进行举例说明。在一个示例中,电子设备识别第一触摸操作与第二触摸操作为单指关节双击操作,触发系统截屏功能;电子设备识别第一触摸操作与第二触摸操作为双指关节双击操作,触发系统录屏功能。在其它一些实施例中,电子设备识别第一触摸操作与第二触摸操作为单指关节双击操作,触发系统录屏功能;电子设备识别第一触摸操作与第二触摸操作为双指关节双击操作,触发系统截屏功能。
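上述指关节双击的判定条件(两次触摸的时间间隔与位置距离均小于阈值)可以用如下代码示意(位置间距采用欧氏距离,阈值为假设取值;前提是两次操作均已被识别为指关节触摸):

```python
import math

def is_knuckle_double_tap(t1, pos1, t2, pos2,
                          time_th=0.3, dist_th=2.0):
    """两次触摸的时间间隔小于时间阈值、且位置之间的距离小于距离阈值时,
    判定为指关节双击操作。"""
    dt = abs(t2 - t1)
    dist = math.dist(pos1, pos2)
    return dt < time_th and dist < dist_th

ok = is_knuckle_double_tap(0.00, (1.0, 1.0), 0.15, (1.5, 1.0))
too_far = is_knuckle_double_tap(0.00, (1.0, 1.0), 0.15, (9.0, 9.0))
```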
下面结合图2,提供一个识别触摸操作的方法的详细示例。
请参见图6,为本申请实施例提供的识别触摸操作的方法的另一种流程示意图。具体的,所述流程包括:
S601,电子设备中的多个形变量传感器周期性向MCU上报采集的形变量数据,MCU将多个形变量数据存放于数据缓冲区。
S602,当触摸板接收到第一触摸操作时,MCU获取第一触摸操作对应的触摸数据,并从数据缓冲区中获取第一时段内的多个形变量数据,并根据形变量数据生成形变量波形数据。
其中,第一时段为波形选取开始时间与波形选取结束时间之间的时段,其中,触摸操作发生时间往前一段时间作为波形选取开始时间,再往后一定时间作为波形选取结束时间。
S603,MCU从多个形变量波形数据中选取波形中的波峰与波谷的差值最大的形变量波形数据作为最优的形变量波形数据。
S604,MCU根据触摸数据和最优的形变量波形数据,基于内置的指关节触摸识别模型进行特征提取,从触摸数据中提取触摸图像特征,从最优的形变量波形数据中提取波形特征。
S605,MCU基于触摸图像特征和波形特征进行分类识别,判断第一触摸操作是否为指关节触摸操作。若是,执行S606;否则,执行S618。
S606,MCU判断第一触摸操作使用的手指数量;若为单手指或双手指,执行S607;若为多手指,执行S618。需要说明的是,本申请并不限定识别手指数量的时机。
在S606中,若MCU判断第一触摸操作使用的手指数量为多手指(指的是至少为三个手指),则认为第一触摸操作是误触操作,不作任何处理。
S607,MCU确定触摸板是否接收到第二次指关节触摸操作。若是,执行S612;否则,在第一触摸操作使用单手指时执行S608,在第一触摸操作使用双手指时执行S610。
S607的具体实现原理请参见前文的识别第一触摸操作是否为指关节触摸操作的实现方式,此处不重复赘述。
S608,MCU通过OS驱动向OS侧应用上报单指关节单击事件。
S609,OS侧应用调用OS API触发设置音量功能。需要说明的是,在本申请提供的实施例中单指关节单击对应触发的是设置音量功能,在其他的实现方式中,单指关节单击也可以对应不同的功能,本申请对此不做限定。
S610,MCU通过OS驱动向OS侧应用上报双指关节单击事件。
S611,OS侧应用调用OS API触发设置亮度功能。需要说明的是,在本申请提供的实施例中双指关节单击对应触发的是设置亮度功能,在其他的实现方式中,双指关节单击也可以对应不同的功能,本申请对此不做限定。
S612,MCU确定两次触摸操作的触摸位置之间的距离是否小于距离阈值,且间隔时间是否小于时间阈值;若是,执行S613;否则,执行S618。
S613,MCU结合S606判断两次指关节触摸操作使用的手指数量;若均为单手指,执行S614;若均为双手指,执行S616;若为其它情况,例如均为多手指(指的是至少为三个手指)或者两个触摸操作使用的手指数量不同,执行S618。可以理解的是,这里手指的数量可以在判断每次触摸操作是否为指关节触摸操作的同时进行判断(例如在步骤S605中判断)。
S614,MCU通过OS驱动向OS侧应用上报单指关节双击事件。
S615,OS侧应用调用OS API触发截屏功能。需要说明的是,在本申请提供的实施例中单指关节双击对应触发的是截屏功能,在其他的实现方式中,单指关节双击也可以对应不同的功能,本申请对此不做限定。
S616,MCU通过OS驱动向OS侧应用上报双指关节双击事件。
S617,OS侧应用调用OS API触发录屏功能。需要说明的是,在本申请提供的实施例中双指关节双击对应触发的是录屏功能,在其他的实现方式中,双指关节双击也可以对应不同的功能,本申请对此不做限定。
S618,不作任何处理。
应用场景二,边缘滑动操作识别场景。
图7为本申请实施例提供的边缘滑动操作识别的架构示意图。
如图7所示,该系统架构由下向上依次包括硬件层、内核层以及应用层。
内核层是硬件和软件之间的层。内核层至少包含OS驱动,OS驱动为可以使系统和硬件设备通信的特殊程序,为硬件设备与操作系统的通信提供了接口。
硬件层可以包括触摸板、与触摸板连接的震动器件和MCU,MCU分别与触摸板、至少一个震动器件连接。
其中,当电子设备为笔记本电脑时,笔记本电脑包括主机模组和显示模组,显示模组用于提供视觉输出,主机模组上有触摸板(又称为触控板),触摸板具有触摸传感器,可以采集触摸数据。当电子设备为平板电脑或者移动手机时,触摸传感器与显示屏组成触摸屏,也称“触控屏”,既可以采集触摸数据,也可以提供视觉输出。在以下的实施例中,是以电子设备为笔记本电脑为例进行说明的,如果将本申请方案应用在平板电脑或者手机,触摸板的实现方式也可以应用于触摸屏上,具体实现方式参考笔记本电脑的实现方式。
在本申请提供的实施例中,通过添加震动器件使得用户感知到功能已被触发,提升用户体验。至少一个震动器件可以布局于触摸板的多个区域,例如,至少一个震动器件等间距均匀分布于触摸板下方,又例如至少一个震动器件采用轴对称方式分布于触摸板下方,又例如至少一个震动器件采用中心对称方式分布于触摸板下方,本申请对于至少一个震动器件的布局方式不作限制。震动器件用于输出震动提示。
MCU可以用于在用户对触摸板的边缘特定区域进行滑动操作时采集滑动时各个触摸点的触摸数据。MCU还可以用于进行边缘滑动操作的识别。在MCU完成对触摸操作的识别后,且触摸操作为边缘滑动触摸操作,则将边缘触控事件通过OS驱动上报给OS侧应用。
应用层可以包括一系列OS侧应用和OS,OS包括各种系统应用程序接口,OS侧应用为OS的预装应用,OS侧应用可以通过调用系统API来触发边缘滑动操作相关的功能,同时触发震动器件输出震动提示,边缘滑动操作对应的功能可通过OS侧应用进行自定义。OS侧应用可以提供用户界面给用户,以便于用户在用户界面上自行定义边缘滑动操作对应的功能。
下面结合实施例对本申请实施例提供的识别触摸操作的方法进行详细说明。
图8为本申请实施例提供的识别触摸操作的方法的流程示意图。该方法可以适用于具有触摸板的电子设备,触摸板设有震动器件。如图8所示,所述方法的流程包括:
S801,电子设备接收作用于触摸板的第三触摸操作。
S802,电子设备识别第三触摸操作是否为边缘滑动触摸操作。若是,则执行S803;若否,则执行S804。
为了提高边缘滑动触摸操作识别的成功率,本申请提供了一种识别方法,具体的,步骤S802的实现方式如下:电子设备响应于第三触摸操作,采集触摸数据,此处触摸数据可以包括触摸板上的各触摸点的坐标数据和任意两个相邻的触摸点之间的间隔时间,然后根据触摸数据确定第三触摸操作是否为滑动触摸操作;例如,第三触摸操作对应的各触摸点中任意两个相邻的触摸点之间的间隔时间大于时间阈值,则识别第三触摸操作不是滑动触摸操作,那么电子设备就判断第三触摸操作不是边缘滑动触摸操作,而是误触操作,不作任何处理。若任意两个相邻的触摸点之间的间隔时间小于或等于时间阈值,识别出第三触摸操作为滑动触摸操作,则继续识别第三触摸操作是否为边缘滑动触摸操作。一种可能实现识别第三触摸操作是否为边缘滑动触摸操作的实现方式为:电子设备根据各触摸点的坐标数据,计算各触摸点的坐标数据的拟合直线与第一方向的夹角、以及每个触摸点在第二方向上的投影点与第三触摸操作的起始点在第二方向上的投影点之间的距离差值,根据夹角以及各个触摸点对应的距离差值,识别第三触摸操作是否为边缘滑动触摸操作;其中,第一方向为沿着触摸板第一侧边的方向,第一侧边为第三触摸操作的触摸位置所在的侧边,第二方向为与第一侧边垂直的方向。
下面结合图9和图10,说明如何计算夹角和距离差值。
如图9所示,用户手指在触摸板的第一侧边的边缘特定区域(边缘特定区域例如图1C所示的上边缘滑动区域)滑动,形成从滑动起点到滑动终点的滑动轨迹,其滑动方向为沿着第一侧边的方向。以触摸轨迹包括5个触摸点(可以通过采样得到)为例,电子设备采集触摸板上的5个触摸点在触摸板所在的平面坐标系的坐标数据。对这5个触摸点的坐标数据进行直线拟合,然后计算拟合直线与第一方向的夹角,第一方向为图9中的滑动方向,与触摸板的第一侧边平行,与触摸板的第二侧边垂直。
然后,计算每个触摸点在第二方向上的投影点与第三触摸操作的起始点在第二方向上的投影点之间的距离差值,并从5个触摸点对应的距离差值中选择距离差值最大的一个距离差值,即为距离差值最大值。如图10中所示,距离差值最大值为第2个触摸点对应的最大距离t。
在一种可能的实施方式中,若夹角以及距离差值最大值满足预设条件,则确定第三触摸操作为边缘滑动触摸操作,其中,预设条件包括:夹角小于或等于夹角阈值、且距离差值最大值小于或等于差值阈值。
在一个示例中,若夹角以及距离差值最大值不满足预设条件,即夹角大于夹角阈值或者距离差值最大值大于差值阈值,则确定第三触摸操作不是边缘滑动触摸操作。
通过上述实施方式,基于第三触摸操作的各触摸点的坐标数据识别边缘滑动触摸操作,可以降低误触率,提升识别率。
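上述基于各触摸点坐标拟合直线并校验夹角与最大偏移的判定逻辑,可以用如下Python代码示意(假设第一方向为x轴方向、第二方向为y轴方向,各阈值为假设取值):

```python
import math

def is_edge_swipe(points, angle_th_deg=10.0, dev_th=0.5):
    """points: [(x, y), ...],x 沿触摸板第一侧边方向,y 与其垂直。
    用最小二乘法拟合直线并计算其与 x 轴的夹角;再计算各触摸点的 y 坐标
    相对起始点 y 坐标的最大偏移。两者均不超过阈值时识别为边缘滑动。"""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    slope = sxy / sxx                       # 假设滑动沿 x 方向,sxx 不为 0
    angle = math.degrees(math.atan(abs(slope)))
    max_dev = max(abs(p[1] - points[0][1]) for p in points)
    return angle <= angle_th_deg and max_dev <= dev_th

swipe = is_edge_swipe([(0, 0.0), (1, 0.05), (2, 0.1), (3, 0.02), (4, 0.0)])
diagonal = is_edge_swipe([(0, 0.0), (1, 1.0), (2, 2.0)])
```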
在其它一些实施例中,上述触摸数据还可以包括触摸面积,为了进一步降低误触率,在根据夹角以及各个触摸点对应的距离差值,识别第三触摸操作是否为边缘滑动触摸操作之前,MCU还可以先确定第三触摸操作对应的触摸面积是否小于或等于预设阈值,若是,则继续根据夹角和距离差值识别第三触摸操作,若否,则识别第三触摸操作为误触操作,例如,用户手掌触摸到触摸板,手掌的触摸面积比较大,这种情况下通过计算触摸操作的触摸面积可以识别出来为误触操作。
S803,电子设备触发边缘滑动触摸操作对应的第二响应功能,并通过震动器件输出震动提示。
上述S803中的执行边缘滑动触摸操作对应的第二响应功能可以通过以下方式实现:在识别出第三触摸操作为边缘滑动触摸操作之后,电子设备还可以基于边缘滑动触摸操作在第一方向上的滑动距离m或者移动速度,触发相应的响应功能。如图10所示,滑动距离可以为边缘滑动触摸操作的滑动终点在第一方向上的投影点与滑动起点在第一方向上的投影点之间的距离m。
例如,边缘滑动触摸操作对应的滑动距离为0.5cm,触发一次边缘触控事件,边缘滑动触摸操作对应的滑动距离为1cm,触发两次边缘触控事件。又例如,滑动距离均为0.5cm的两个边缘滑动触摸操作,移动速度快的边缘滑动触摸操作调节音量增大2dB,移动速度慢的边缘滑动触摸操作调节音量增大1dB。
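按滑动距离触发边缘触控事件次数的逻辑,可以示意如下(0.5cm的步长取自上文示例,仅为说明计算方式):

```python
def edge_events(slide_distance_cm, step_cm=0.5):
    """滑动距离每累计 step_cm 触发一次边缘触控事件(步长为示例取值)。"""
    return int(slide_distance_cm // step_cm)

one = edge_events(0.5)    # 滑动 0.5cm,触发一次边缘触控事件
two = edge_events(1.0)    # 滑动 1cm,触发两次边缘触控事件
```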
一种可能的实现方式中,上述S803中的通过震动器件输出震动提示可以通过以下方式实现:通过与第三触摸操作对应的每个触摸点距离小于距离阈值的所有震动器件输出震动提示。举个例子,以距离阈值为1cm为例,可以通过与每个触摸点距离1cm范围内的所有震动器件输出震动提示。
另一种可能的实现方式中,上述S803中的通过震动器件输出震动提示可以通过以下方式实现:电子设备通过与第三触摸操作对应的每个触摸点距离最近的k个震动器件输出震动提示,k为正整数。
以第一触摸点为例,电子设备还可以根据第一触摸点分别与k个震动器件之间的距离、以及第一映射关系,确定与第一触摸点距离最近的k个震动器件对应的震动信号值,第一映射关系包括距离与震动信号值之间的映射关系,第一触摸点为第三触摸操作对应的各个触摸点中的任意一个触摸点。从而使得与所述第一触摸点距离最近的k个震动器件输出的震动信号到达所述第一触摸点时的震动强度一致。其中,该映射关系可以为列表的形式,也可以为函数公式。
以k为2为例,如图11所示,一个触摸点O周围有两个震动器件,其中震动器件1与触摸点O的距离为s1、震动器件2与触摸点O的距离为s2,那么这两个震动器件就是与触摸点O最近的震动器件,基于距离与震动信号值之间的映射关系,可以确定出与s1对应的震动信号值x1、以及与s2对应的震动信号值x2,然后将震动信号值x1发送至震动器件1,将震动信号值x2发送至震动器件2,震动器件1根据震动信号值x1输出震动提示,震动器件2根据震动信号值x2输出震动提示。这样用户在触摸点O处感受到的两个震动器件的震动强度一致。
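选取距离最近的k个震动器件并按距离映射震动信号值的过程,可以用如下代码示意(其中的线性映射关系仅为假设,实际的第一映射关系可以是查找表或其它函数,取值也均为示意):

```python
import math

def nearest_vibrators(touch, vibrators, k):
    """按与触摸点的欧氏距离升序,返回最近的 k 个震动器件的下标。"""
    order = sorted(range(len(vibrators)),
                   key=lambda i: math.dist(touch, vibrators[i]))
    return order[:k]

def signal_value(distance, base=100.0, gain=10.0):
    """假设的第一映射关系:距离越远,发出的震动信号值越大,
    以补偿传播衰减,使震动到达触摸点时的强度一致。"""
    return base + gain * distance

vibs = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0), (8.0, 8.0)]   # 震动器件坐标
touch = (1.0, 1.0)
idx = nearest_vibrators(touch, vibs, k=2)
signals = [signal_value(math.dist(touch, vibs[i])) for i in idx]
```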
本申请实施例中,在识别出边缘滑动触摸操作时,通过在触摸板上设置的震动器件输出震动反馈,以指示用户滑动功能已触发,可以提升用户体验。
S804,电子设备识别出第三触摸操作为误触操作,不作任何处理。
下面结合图7,提供另一个识别触摸操作的方法的详细示例。
请参见图12,为本申请实施例提供的识别触摸操作的方法的另一种流程示意图。具体的,所述流程包括:
S1201,当触摸板接收到第三触摸操作,MCU获取第三触摸操作对应的触摸数据,其中,触摸数据包括各触摸点的坐标数据、各触摸点中任意相邻两个触摸点之间的时间间隔以及触摸面积。
S1202,MCU确定是否第三触摸操作对应的触摸面积小于或等于预设阈值、且第三触摸操作对应的各触摸点中任意相邻的两个触摸点之间的时间间隔小于时间阈值、且触摸位置在边缘特定区域内,若是,则执行步骤S1203;否则,执行S1211。
S1203,MCU根据第三触摸操作对应的各触摸点的坐标数据拟合直线。
S1204,MCU计算拟合直线与第一方向的夹角α。
其中,第一方向为沿着触摸板第一侧边的方向,第一侧边为第三触摸操作的触摸位置所在的侧边。
S1205,MCU计算每个触摸点在第二方向上的投影点与第三触摸操作的起始点在第二方向上的投影点之间的距离差值,确定距离差值最大值t。
其中,第二方向为与第一侧边垂直的方向。
S1206,MCU确定是否夹角α小于或等于夹角阈值且距离差值最大值t小于或等于差值阈值,若是,则执行步骤S1207;否则,执行S1211。
S1207,MCU识别第三触摸操作为边缘滑动触摸操作,基于边缘滑动触摸操作在第一方向上的滑动距离m或者移动速度,向OS侧应用上报边缘滑动触摸事件。
S1208,OS侧应用调用OS API触发边缘滑动触摸操作对应的响应功能。
S1209,MCU基于当前的触摸点与震动器件的距离,选取距离最小的k个震动器件,计算所述k个震动器件分别对应的震动信号值。
S1210,MCU向k个震动器件分别发送各自的震动信号值,使得k个震动器件根据接收到的震动信号值进行震动提示。
S1211,MCU识别第三触摸操作为误触操作,不作处理。
上述本申请提供的实施例中,从电子设备(例如平板电脑)作为执行主体的角度对本申请实施例提供的方法进行了介绍。为了实现上述本申请实施例提供的方法中的各功能,电子设备可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
基于以上实施例以及相同构思,图13为本申请实施例提供的电子设备的示意图,如图13所示,该电子设备1300可以实现上述方法实施例中的电子设备所执行的步骤。该电子设备可以包括接收模块1301、第一采集模块1302、第二采集模块1303、识别模块1304、响应模块1305。
接收模块1301,用于接收第一触摸操作;
第一采集模块1302,用于采集形变量数据;
第二采集模块1303,用于采集触摸数据;
识别模块1304,用于根据形变量数据以及触摸数据确定第一触摸操作为第一指关节触摸操作;
响应模块1305,用于触发第一指关节触摸操作对应的响应功能。
本申请实施例中,接收模块1301以及第二采集模块1303的功能可以由上述方法实施例中的触摸板实现,第一采集模块1302的功能可以由上述方法实施例中的至少一个形变量传感器实现,识别模块1304的功能可以由上述方法实施例中的MCU实现,响应模块1305的功能可以由OS侧应用实现。
在一种可能的实现方式中,形变量数据为非完整周期的形变量波形数据。
在一种可能的实现方式中,在识别模块1304确定第一触摸操作为第一指关节触摸操作后,响应模块1305触发第一指关节触摸操作对应的响应功能之前,识别模块1304还用于:判断第一指关节触摸操作为单指关节操作还是双指关节操作;响应模块1305具体用于:当判断第一指关节触摸操作为单指关节操作时,触发单指关节操作对应的响应功能,或者,当判断第一指关节触摸操作为双指关节操作时,触发双指关节操作对应的响应功能。
在一种可能的实现方式中,识别模块1304,具体用于:根据至少一个形变量传感器采集的形变量数据生成至少一个形变量波形数据;从至少一个形变量波形数据中确定出目标波形数据,目标波形数据的波峰与波谷的差值大于或等于预设阈值;根据目标波形数据与触摸数据确定第一触摸操作为第一指关节触摸操作。
在一种可能的实现方式中,识别模块1304,具体用于:根据形变量数据的特征、触摸数据的特征、以及预训练的识别模型确定第一触摸操作为第一指关节触摸操作,预训练的识别模型为根据预先采集的指关节触摸操作对应的形变量数据的特征和触摸数据的特征训练得到的。
在一种可能的实现方式中,接收模块1301还用于接收作用于触摸板的第二触摸操作;识别模块1304,还用于:若第一触摸操作与第二触摸操作满足预设条件,则确定第一触摸操作与第二触摸操作为指关节双击操作;触发指关节双击操作对应的响应功能;预设条件包括以下内容:第二触摸操作为指关节触摸操作;第一触摸操作作用于触摸板的时间与第二触摸操作作用于触摸板的时间之间的间隔小于时间阈值;第一触摸操作的位置与第二触摸操作的位置之间的距离小于距离阈值。
在一种可能的实现方式中,电子设备1300还包括显示模块1306,用于显示用户界面,以使用户通过用户界面设置第一指关节触摸操作对应的响应功能。该显示模块1306的功能可以由显示屏实现。
基于以上实施例以及相同构思,图14为本申请实施例提供的另一电子设备的示意图,如图14所示,该电子设备1400可以实现上述方法实施例中的电子设备所执行的步骤。该电子设备可以包括接收模块1401、识别模块1402、响应模块1403。
接收模块1401,用于接收第三触摸操作;
识别模块1402,用于识别第三触摸操作为边缘滑动触摸操作;
响应模块1403,用于触发边缘滑动触摸操作对应的响应功能,并通过震动器件输出震动提示。
本申请实施例中,接收模块1401和采集模块1405的功能可以由上述方法实施例中的触摸板实现,识别模块1402和处理模块1404的功能可以由上述方法实施例中的MCU实现,响应模块1403的功能可以由上述方法实施例中的OS侧应用实现。
在一种可能的实现方式中,响应模块1403,具体用于通过与边缘滑动触摸操作对应的每个触摸点距离最近的k个震动器件输出震动提示,k为正整数。
在一种可能的实现方式中,该电子设备1400还包括处理模块1404,处理模块1404具体用于:根据第一触摸点分别与k个震动器件之间的距离、以及第一映射关系,确定与第一触摸点距离最近的k个震动器件对应的震动信号值,第一映射关系包括距离与震动信号值之间的映射关系,第一触摸点为第三触摸操作对应的各个触摸点中的任意一个触摸点;响应模块1403,具体用于根据所述k个震动器件对应的震动信号值输出震动提示。
在一种可能的实现方式中,与第一触摸点距离最近的k个震动器件输出的震动信号到达第一触摸点时的震动强度一致。
在一种可能的实现方式中,该电子设备1400还包括采集模块1405,用于采集触摸数据;处理模块1404,具体用于响应于第三触摸操作,获取触摸板采集的触摸数据;在根据触摸数据确定第三触摸操作为滑动触摸操作、且滑动触摸操作作用于触摸板的边缘区域后,根据各触摸点的坐标数据,计算各触摸点的坐标数据的拟合直线与第一方向的夹角,以及每个触摸点在第二方向上的投影点与第三触摸操作的起始点在第二方向上的投影点之间的距离差值;第一方向为沿着触摸板第一侧边的方向,第一侧边为第三触摸操作的触摸位置所在的侧边,第二方向为与第一侧边垂直的方向;识别模块1402,具体用于若夹角以及各个触摸点对应的距离差值满足预设条件,识别第三触摸操作为边缘滑动触摸操作。
在一种可能的实现方式中,预设条件包括:夹角小于或等于夹角阈值、且各个触摸点对应的距离差值中的距离差值最大值小于或等于差值阈值。
在一种可能的实现方式中,电子设备1400还包括显示模块1406,用于显示用户界面,以使用户通过用户界面设置边缘滑动触摸操作对应的响应功能。该显示模块1406的功能可以由显示屏实现。
采用硬件实现时,该电子设备的硬件结构实现可参考图1D及其相关描述。
一种实现方式中,电子设备包括:处理器、存储器、触摸板、以及至少一个形变量传感器;所述触摸板用于接收触摸操作、以及采集触摸数据;所述形变量传感器用于采集形变量数据;所述存储器用于存储一个或多个计算机程序,所述一个或多个计算机程序被所述处理器执行时,使得所述电子设备执行上述任一实施例中的方法。
另一种实现方式中,电子设备包括处理器、存储器、触摸板、以及至少一个震动器件;所述触摸板用于接收触摸操作、以及采集触摸数据;所述震动器件用于输出震动信号;所述存储器用于存储一个或多个计算机程序,所述一个或多个计算机程序被所述处理器执行时,使得所述电子设备执行上述任一实施例中的方法。
本申请实施例还提供一种计算机存储介质,该计算机存储介质中存储有计算机指令,当该计算机指令在电子设备上运行时,使得电子设备执行上述相关方法步骤实现上述实施例中的方法。
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述相关步骤,以实现上述实施例中的方法。
另外,本申请的实施例还提供一种装置,这个装置具体可以是芯片,组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使芯片执行上述各方法实施例中的方法。
其中,本申请实施例提供的电子设备、计算机存储介质、计算机程序产品或芯片均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上实施方式的描述,所属领域的技术人员可以了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其他的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其他的形式。
以上实施例中所用,根据上下文,术语“当…时”或“当…后”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。另外,在上述实施例中,使用诸如第一、第二之类的关系术语来区份一个实体和另一个实体,而并不限制这些实体之间的任何实际的关系和顺序。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本发明实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。在不冲突的情况下,以上各实施例的方案都可以组合使用。
需要指出的是,本专利申请文件的一部分包含受著作权保护的内容。除了对专利局的专利文件或记录的专利文档内容制作副本以外,著作权人保留著作权。

Claims (17)

  1. 一种识别触摸操作的方法,其特征在于,应用于电子设备,且所述电子设备包括触摸板和至少一个形变量传感器,所述方法包括:
    接收作用于所述触摸板的第一触摸操作;
    获取所述至少一个形变量传感器采集的形变量数据以及所述触摸板采集的触摸数据;
    根据所述形变量数据以及所述触摸数据确定所述第一触摸操作为第一指关节触摸操作后,触发所述第一指关节触摸操作对应的响应功能。
  2. 根据权利要求1所述的方法,其特征在于,所述形变量数据为非完整周期的形变量波形数据。
  3. 根据权利要求1或2所述的方法,其特征在于,在所述确定所述第一触摸操作为第一指关节触摸操作后,触发所述第一指关节触摸操作对应的响应功能之前,所述方法还包括:
    判断所述第一指关节触摸操作为单指关节操作还是双指关节操作;
    所述触发所述第一指关节触摸操作对应的响应功能,包括:
    当判断所述第一指关节触摸操作为单指关节操作时,触发所述单指关节操作对应的响应功能;或者,
    当判断所述第一指关节触摸操作为双指关节操作时,触发所述双指关节操作对应的响应功能。
  4. 根据权利要求1-3任一所述的方法,其特征在于,所述根据所述形变量数据以及所述触摸数据确定所述第一触摸操作为第一指关节触摸操作,包括:
    根据所述至少一个形变量传感器采集的形变量数据生成至少一个形变量波形数据;
    从所述至少一个形变量波形数据中确定出目标波形数据,所述目标波形数据的波峰与波谷的差值大于或等于预设阈值;
    根据所述目标波形数据与所述触摸数据确定所述第一触摸操作为第一指关节触摸操作。
  5. 根据权利要求1-4任一所述的方法,其特征在于,所述根据所述形变量数据以及所述触摸数据确定所述第一触摸操作为第一指关节触摸操作,包括:
    根据所述形变量数据的特征、所述触摸数据的特征、以及预训练的识别模型确定所述第一触摸操作为第一指关节触摸操作,所述预训练的识别模型为根据预先采集的指关节触摸操作对应的形变量数据的特征和触摸数据的特征训练得到的。
  6. 根据权利要求1-5任一所述的方法,其特征在于,所述方法还包括:
    接收作用于所述触摸板的第二触摸操作;
    若所述第一触摸操作与所述第二触摸操作满足预设条件,则确定所述第一触摸操作与所述第二触摸操作为指关节双击操作;
    触发所述指关节双击操作对应的响应功能;
    所述预设条件包括以下内容:
    所述第二触摸操作为指关节触摸操作;
    所述第一触摸操作作用于所述触摸板的时间与所述第二触摸操作作用于所述触摸板的时间之间的间隔小于时间阈值;
    所述第一触摸操作的位置与所述第二触摸操作的位置之间的距离小于距离阈值。
  7. 根据权利要求1-6任一所述的方法,其特征在于,所述方法还包括:
    显示用户界面,以使用户通过所述用户界面设置不同触摸操作对应的响应功能。
  8. 一种识别触摸操作的方法,其特征在于,应用于电子设备,且所述电子设备包括触摸板和震动器件,所述方法包括:
    接收作用于所述触摸板的第三触摸操作;
    识别所述第三触摸操作为边缘滑动触摸操作;
    触发所述边缘滑动触摸操作对应的响应功能,并通过所述震动器件输出震动提示。
  9. 根据权利要求8所述的方法,其特征在于,所述通过所述震动器件输出震动提示,包括:
    通过与所述边缘滑动触摸操作对应的每个触摸点距离最近的k个震动器件输出震动提示,所述k为正整数。
  10. 根据权利要求8或9所述的方法,其特征在于,所述通过所述震动器件输出震动提示,包括:
    根据第一触摸点分别与距离最近的k个震动器件之间的距离、以及第一映射关系,确定与所述第一触摸点距离最近的k个震动器件对应的震动信号值,所述第一映射关系包括距离与震动信号值之间的映射关系,所述第一触摸点为所述第三触摸操作对应的各个触摸点中的任意一个触摸点;
    根据所述k个震动器件对应的震动信号值输出震动提示。
  11. 根据权利要求10所述的方法,其特征在于,与所述第一触摸点距离最近的k个震动器件输出的震动信号到达所述第一触摸点时的震动强度一致。
  12. 根据权利要求8-11任一所述的方法,其特征在于,所述识别所述第三触摸操作为边缘滑动触摸操作,包括:
    响应于所述第三触摸操作,获取所述触摸板采集的触摸数据;
    在根据所述触摸数据确定所述第三触摸操作为滑动触摸操作、且所述滑动触摸操作作用于所述触摸板的边缘区域后,根据所述各触摸点的坐标数据,计算所述各触摸点的坐标数据的拟合直线与第一方向的夹角,以及每个触摸点在第二方向上的投影点与所述第三触摸操作的起始点在所述第二方向上的投影点之间的距离差值;所述第一方向为沿着触摸板第一侧边的方向,所述第一侧边为所述第三触摸操作的触摸位置所在的侧边,所述第二方向为与所述第一侧边垂直的方向;
    若所述夹角以及各个触摸点对应的距离差值满足预设条件,识别所述第三触摸操作为边缘滑动触摸操作。
  13. 根据权利要求12所述的方法,其特征在于,所述预设条件包括:
    所述夹角小于或等于夹角阈值、且各个触摸点对应的距离差值中的距离差值最大值小于或等于差值阈值。
  14. 根据权利要求8-13中任一项所述的方法,其特征在于,所述方法还包括:
    显示用户界面,以使用户通过所述用户界面设置所述边缘滑动触摸操作对应的响应功能。
  15. 一种电子设备,其特征在于,包括处理器、存储器、触摸板、以及至少一个形变量传感器;
    所述触摸板用于接收触摸操作、以及采集触摸数据;
    所述形变量传感器用于采集形变量数据;
    所述存储器用于存储一个或多个计算机程序,所述一个或多个计算机程序被所述处理器执行时,使得所述电子设备执行如权利要求1至7任一所述的方法。
  16. 一种电子设备,其特征在于,包括处理器、存储器、触摸板、以及至少一个震动器件;
    所述触摸板用于接收触摸操作、以及采集触摸数据;
    所述震动器件用于输出震动信号;
    所述存储器用于存储一个或多个计算机程序,所述一个或多个计算机程序被所述处理器执行时,使得所述电子设备执行如权利要求8至14任一所述的方法。
  17. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得所述电子设备执行如权利要求1至7任一所述的方法;或者,当计算机程序在电子设备上运行时,使得所述电子设备执行如权利要求8至14任一所述的方法。
PCT/CN2022/120972 2021-09-29 2022-09-23 一种识别触摸操作的方法及电子设备 WO2023051411A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111153203.0A CN113919390A (zh) 2021-09-29 2021-09-29 一种识别触摸操作的方法及电子设备
CN202111153203.0 2021-09-29

Publications (1)

Publication Number Publication Date
WO2023051411A1 true WO2023051411A1 (zh) 2023-04-06

Family

ID=79237241

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/120972 WO2023051411A1 (zh) 2021-09-29 2022-09-23 一种识别触摸操作的方法及电子设备

Country Status (2)

Country Link
CN (1) CN113919390A (zh)
WO (1) WO2023051411A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116450026A (zh) * 2023-06-16 2023-07-18 荣耀终端有限公司 用于识别触控操作的方法和***

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113919390A (zh) * 2021-09-29 2022-01-11 Huawei Technologies Co., Ltd. Method for recognizing touch operation, and electronic device
CN117389454A (zh) * 2022-07-01 2024-01-12 Honor Device Co., Ltd. Knuckle operation recognition method and electronic device
CN116088745A (zh) * 2022-08-05 2023-05-09 Honor Device Co., Ltd. Application opening method and related apparatus
CN117827023A (zh) * 2022-09-29 2024-04-05 Honor Device Co., Ltd. Touch control method and electronic device
CN116894210B (zh) * 2023-09-11 2023-12-05 深圳市力准传感技术有限公司 Electronic device including a force sensor, and data processing method
CN117572985B (zh) * 2024-01-16 2024-04-19 深圳市亚米拉电子科技有限公司 Intelligent false-touch recognition system and method for a computer touchpad

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406587A (zh) * 2015-07-16 2017-02-15 Xiaomi Inc. Terminal touch recognition method and apparatus
CN106445120A (zh) * 2016-09-05 2017-02-22 Huawei Technologies Co., Ltd. Touch operation recognition method and apparatus
CN106485126A (zh) * 2016-10-31 2017-03-08 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Knuckle-based information processing method, apparatus, and terminal
CN106527933A (zh) * 2016-10-31 2017-03-22 Nubia Technology Co., Ltd. Control method and apparatus for edge gestures of a mobile terminal
WO2017088694A1 (zh) * 2015-11-27 2017-06-01 Nubia Technology Co., Ltd. Gesture calibration method and apparatus, gesture input processing method, and computer storage medium
CN112433612A (zh) * 2020-11-25 2021-03-02 江西欧迈斯微电子有限公司 Method for controlling touchpad vibration alert, storage medium, touch apparatus, and device
CN112650405A (zh) * 2019-10-10 2021-04-13 Huawei Technologies Co., Ltd. Interaction method for an electronic device, and electronic device
CN113919390A (zh) * 2021-09-29 2022-01-11 Huawei Technologies Co., Ltd. Method for recognizing touch operation, and electronic device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116450026A (zh) * 2023-06-16 2023-07-18 Honor Device Co., Ltd. Method and system for recognizing touch operations
CN116450026B (zh) * 2023-06-16 2023-10-20 Honor Device Co., Ltd. Method and system for recognizing touch operations

Also Published As

Publication number Publication date
CN113919390A (zh) 2022-01-11

Similar Documents

Publication Publication Date Title
WO2023051411A1 (zh) Method for recognizing touch operation, and electronic device
WO2020177585A1 (zh) Gesture processing method and device
CN111782102A (zh) Window display method and related apparatus
WO2021063098A1 (zh) Touchscreen response method and electronic device
WO2021115210A1 (zh) Touch region adjustment method and apparatus
CN112740152B (zh) Stylus detection method, system, and related apparatus
WO2021068627A1 (zh) Interaction method for an electronic device, and electronic device
WO2021000943A1 (zh) Fingerprint switch management method and apparatus
WO2021057699A1 (zh) Control method for an electronic device having a flexible screen, and electronic device
CN113805487A (zh) Control instruction generation method and apparatus, terminal device, and readable storage medium
US11893302B2 (en) Content transmission method and terminal device
CN110968252B (zh) Display method of interaction system, interaction system, and electronic device
WO2017143575A1 (zh) Method for retrieving image content, portable electronic device, and graphical user interface
EP4390639A1 (en) Handwriting processing method, and terminal device and chip system
WO2024032124A1 (zh) Rollable screen opening and closing method and related product
CN113391775A (zh) Human-computer interaction method and device
WO2023029916A1 (zh) Annotation display method and apparatus, terminal device, and readable storage medium
EP4321978A1 (en) Display method, electronic device, storage medium and program product
WO2022002213A1 (zh) Translation result display method and apparatus, and electronic device
CN113821129A (zh) Display window control method and electronic device
CN116521018B (zh) False-touch prompt method, terminal device, and storage medium
CN115639905B (zh) Gesture control method and electronic device
WO2023088093A1 (zh) Display method and electronic device
WO2022143094A1 (zh) Window page interaction method and apparatus, electronic device, and readable storage medium
WO2022194007A1 (zh) Screenshot method, electronic device, and storage medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE