CN115480666A - Touch detection method and device and storage medium - Google Patents

Touch detection method and device and storage medium

Info

Publication number
CN115480666A
Authority
CN
China
Prior art keywords
mobile terminal
attitude
historical
target
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110576615.9A
Other languages
Chinese (zh)
Inventor
李金龙
房美琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110576615.9A priority Critical patent/CN115480666A/en
Publication of CN115480666A publication Critical patent/CN115480666A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a touch detection method, a touch detection apparatus, and a storage medium. The touch detection method is applied to a mobile terminal and includes: acquiring a sensing signal collected by at least one inertial sensor in the mobile terminal; determining a current posture of the mobile terminal; and determining, according to the current posture and the sensing signal, whether a tapping action is detected on the mobile terminal. By jointly considering the current posture of the mobile terminal and the sensing signal when deciding whether a tapping action has occurred, the recognition accuracy of the tapping action is improved.

Description

Touch detection method and device and storage medium
Technical Field
The present disclosure relates to the field of electronic devices, and in particular, to a touch detection method and apparatus, and a storage medium.
Background
Mobile phones integrate an ever wider range of functions, and new functional requirements keep emerging. Back tapping refers to tapping the back of the phone; the corresponding tapping signal is recognized using an inertial sensor of the phone to trigger shortcut functions such as taking a screenshot, turning on the flashlight, or launching the camera. Existing back-tap schemes recognize the sensing signal acquired by the inertial sensor along a fixed axis to determine whether a tap has occurred. However, as usage scenarios diversify, some scenarios affect the sensing signal acquired by the inertial sensor, so that whether a tap has occurred cannot be reliably determined from the sensing signal alone, which impairs the use of the shortcut gesture.
Disclosure of Invention
The disclosure provides a touch detection method, a touch detection device and a storage medium.
According to a first aspect of the embodiments of the present disclosure, a touch detection method is provided, which is applied to a mobile terminal, and includes:
acquiring a sensing signal acquired by at least one inertial sensor in the mobile terminal;
determining a current posture of the mobile terminal;
and determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal.
Optionally, the current posture comprises: a first posture indicating a screen orientation and a second posture indicating a screen facing direction;
the determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal includes:
determining whether a signal waveform on a target axis in the sensing signal conforms to a preset waveform or not based on the first posture;
if the signal waveform conforms to the preset waveform, determining the variation of the target attitude angle according to the first posture;
and determining whether the mobile terminal detects a knocking action according to the variable quantity of the target attitude angle determined based on the first attitude and a preset angle threshold determined based on the second attitude.
Optionally, the determining whether a tapping action is detected on the mobile terminal according to the variation of the target attitude angle determined based on the first attitude and a preset angle threshold determined based on the second attitude includes:
and when the variation of the target attitude angle determined based on the first attitude is larger than a preset angle threshold determined based on the second attitude, determining that the knocking action is detected on the mobile terminal.
Optionally, the method further comprises:
when the second posture indicates that the screen of the mobile terminal faces the ground, determining that the preset angle threshold is a first angle threshold;
when the second posture indicates that the screen of the mobile terminal faces away from the ground, determining the preset angle threshold as a second angle threshold; wherein the second angle threshold is greater than the first angle threshold.
Optionally, the first posture comprises: a horizontal screen or a vertical screen;
the method further comprises the following steps:
when the mobile terminal is in landscape orientation, the target axis determined based on the first posture is the axis along the long side of the mobile terminal, and the target attitude angle determined based on the first posture is the roll angle;
when the mobile terminal is in portrait orientation, the target axis determined based on the first posture is the axis along the short side of the mobile terminal, and the target attitude angle determined based on the first posture is the pitch angle.
Optionally, the determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal includes:
and processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model, and determining whether a knocking action is detected on the mobile terminal.
Optionally, the method further comprises:
acquiring historical sample data acquired on the mobile terminal based on historical touch actions, wherein the historical sample data comprises: historical processing data and corresponding historical recognition result data; the historical processing data includes: historical sensing signals and historical attitude data;
inputting the historical sample data into an initial recognition model to be trained for iterative processing until the difference value between the output recognition result data and the historical recognition result data in the historical sample data meets a preset convergence condition, and obtaining the target recognition model.
Optionally, the processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model to determine whether a tapping action is detected on the mobile terminal includes:
inputting the attitude data and the sensing signal into the target recognition model to obtain recognition result data;
and comparing the identification result data with a preset probability threshold value, and determining whether the mobile terminal detects a knocking action.
According to a second aspect of the embodiments of the present disclosure, there is provided a touch detection apparatus applied to a mobile terminal, including:
the signal acquisition module is used for acquiring sensing signals acquired by at least one inertial sensor in the mobile terminal;
the posture determining module is used for determining the current posture of the mobile terminal;
and the action detection module is used for determining whether a knocking action is detected on the mobile terminal according to the current posture and the sensing signal.
Optionally, the current posture comprises: a first posture indicating a screen orientation and a second posture indicating a screen facing direction;
the motion detection module comprises:
the waveform determining module is used for determining whether a signal waveform on a target axis in the sensing signal conforms to a preset waveform or not based on the first posture;
the variation determining module is used for determining the variation of the target attitude angle according to the first posture if the signal waveform conforms to the preset waveform;
and the action detection submodule is used for determining whether the mobile terminal detects a knocking action according to the variable quantity of the target attitude angle determined based on the first attitude and a preset angle threshold determined based on the second attitude.
Optionally, the action detection sub-module is further configured to:
and when the variation of the target attitude angle determined based on the first attitude is larger than a preset angle threshold determined based on the second attitude, determining that the knocking action is detected on the mobile terminal.
Optionally, the apparatus further comprises:
a first determining module, configured to determine that the preset angle threshold is a first angle threshold when the second posture indicates that the screen of the mobile terminal faces the ground;
the second determining module is used for determining that the preset angle threshold is a second angle threshold when the second posture indicates that the screen of the mobile terminal faces away from the ground; wherein the second angle threshold is greater than the first angle threshold.
Optionally, the first pose comprises: a horizontal screen or a vertical screen;
the device further comprises:
a third determining module, configured to determine, when the mobile terminal is in landscape orientation, that the target axis based on the first posture is the axis along the long side of the mobile terminal, and that the target attitude angle determined based on the first posture is the roll angle;
a fourth determining module, configured to determine, when the mobile terminal is in portrait orientation, that the target axis based on the first posture is the axis along the short side of the mobile terminal, and that the target attitude angle determined based on the first posture is the pitch angle.
Optionally, the motion detection module includes:
and the model processing module is used for processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model and determining whether a knocking action is detected on the mobile terminal.
Optionally, the apparatus further comprises:
the data acquisition module is used for acquiring historical sample data acquired on the mobile terminal based on historical touch actions, wherein the historical sample data comprises: historical processing data and corresponding historical recognition result data; the historical processing data includes: historical sensing signals and historical attitude data;
and the iteration processing module is used for inputting the historical sample data into an initial recognition model to be trained for iteration processing until the difference value between the output recognition result data and the historical recognition result data in the historical sample data meets a preset convergence condition, so as to obtain the target recognition model.
Optionally, the model processing module is further configured to:
inputting the attitude data and the sensing signal into the target recognition model to obtain recognition result data;
and comparing the identification result data with a preset probability threshold value, and determining whether the mobile terminal detects a knocking action.
According to a third aspect of the embodiments of the present disclosure, there is provided a touch detection apparatus, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of the above first aspects when executing the executable instructions stored in the memory.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the steps in the method provided by any one of the above first aspects.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the touch detection method provided by the embodiment of the disclosure, after the sensing signal acquired by at least one inertial sensor is acquired, the current posture of the mobile terminal is further determined, and whether a tapping action is detected on the mobile terminal is further determined according to the current posture and the sensing signal. Because the mobile terminal can generate different influences on the sensing signals acquired by the inertial sensor under different postures, in the process of identifying the knocking action, the sensing signals and the current posture of the mobile terminal can be acquired, and the knocking action is determined whether to be detected on the mobile terminal or not by combining the sensing signals and the current posture of the mobile terminal, so that the identification accuracy of the knocking action is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart illustrating a touch detection method according to an exemplary embodiment.
Fig. 2 is a diagram illustrating pitch, roll and heading angles on a mobile terminal according to an example embodiment.
Fig. 3 is a schematic diagram of two screen orientations of a mobile terminal provided in accordance with an example embodiment.
Fig. 4 is a flowchart illustrating a touch detection method according to an exemplary embodiment.
Fig. 5 is a flowchart illustrating a touch detection method according to an exemplary embodiment.
Fig. 6 is a schematic structural diagram of a touch detection device according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating a touch detection device according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
An embodiment of the present disclosure provides a touch detection method, and fig. 1 is a schematic flowchart illustrating a touch detection method according to an exemplary embodiment, where as shown in fig. 1, the touch detection method includes the following steps:
step 101, acquiring a sensing signal acquired by at least one inertial sensor in a mobile terminal;
step 102, determining the current posture of the mobile terminal;
and 103, determining whether the mobile terminal detects a tapping action according to the current posture and the sensing signal.
It should be noted that the touch detection method may be applied to any mobile terminal, and the mobile terminal may be: a smart phone, a tablet computer, or a wearable electronic device, etc.
The mobile terminal includes: a screen, a back shell, which is the portion of the housing facing away from the screen, and at least one inertial measurement unit (IMU) located inside the housing.
In some embodiments, the inertial sensor includes sensors that measure inertial forces, such as an accelerometer and/or a gyroscope, rather than devices that detect touch operations, such as a touch screen or a touch panel. The gyroscope can be used to detect rotation data of the mobile terminal, and the accelerometer can be used to detect the current posture of the mobile terminal.
In some embodiments, the accelerometer may be a single-axis accelerometer or a three-axis accelerometer; a three-axis accelerometer can detect richer attitude information of the mobile terminal.
Illustratively, when a three-axis accelerometer is used, the attitude angles of the mobile terminal may be detected. The attitude angles include the pitch angle, the roll angle, and the heading angle, each of which is the swing angle of the mobile terminal relative to a corresponding plane of an inertial coordinate system. The inertial coordinate system is also referred to as an earth-surface inertial reference frame and may be determined based on the right-hand rule.
For example, the inertial coordinate system may be determined as follows: with the mobile terminal defined to be in a portrait state perpendicular to the ground, the right, front, and up directions of the screen form a right-handed system. The front direction is the direction the screen is facing; the right direction is the direction to the right of the screen of the mobile terminal; and the up direction is perpendicular to the front and right directions and opposite to the direction of gravity. Thus, the front direction may be taken as the Z axis, the right direction as the X axis, and the up direction as the Y axis.
FIG. 2 is a schematic diagram illustrating pitch, roll and heading angles on a mobile terminal according to an exemplary embodiment, as shown in FIG. 2:
pitch angle 201 is the angle at which mobile terminal 200 "pitches" with respect to the XOY plane of the inertial coordinate system; the pitch angle can also be regarded as an included angle between an X axis of a mobile terminal coordinate system and an XOY plane; the pitch angle represents the rotation of the mobile terminal around the X axis, and when the X axis of the mobile terminal coordinate system is above the XOY plane of the inertial coordinate system, the pitch angle is a positive value, otherwise, the pitch angle is a negative value.
Roll angle 202 is the angle at which mobile terminal 200 "swings" relative to the YOZ plane of the inertial coordinate system; it can also be considered as the angle between the Y-axis of the mobile terminal coordinate system and the YOZ plane. The roll angle represents the rotation of the mobile terminal around the Y axis, and the roll angle is a positive value when the mobile terminal rotates clockwise around the Y axis.
Heading angle 203 is the angle at which mobile terminal 200 "wobbles" with respect to the ZOX plane of the inertial coordinate system; it can also be considered as the angle between the Z-axis of the mobile terminal coordinate system and the ZOX plane. The course angle represents the rotation of the mobile terminal around the Z axis, and the course angle is a positive value when the mobile terminal rotates clockwise around the Z axis.
In the embodiments of the present disclosure, the current posture of the mobile terminal may be determined based on the data detected by the accelerometer: the attitude angles of the mobile terminal are determined from the accelerometer data, and the current posture is then determined from those attitude angles. In other embodiments, the current posture of the mobile terminal may also be inferred from images captured by a camera; the present disclosure does not limit how the current posture is determined.
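As an illustration only, and not as part of the claimed method, the following Python sketch shows one common way of deriving pitch and roll angles and the screen facing direction from a static three-axis accelerometer reading. The helper names, the axis convention, and the sign convention are assumptions and may differ from the frame defined above.

```python
import math

def estimate_pitch_roll(ax: float, ay: float, az: float):
    """Estimate pitch and roll (degrees) from a static three-axis
    accelerometer reading (ax, ay, az in units of g). Assumed convention:
    X along the short side, Y along the long side, Z out of the screen.
    Only valid while the device is not otherwise accelerating."""
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))  # elevation of the long (Y) axis
    roll = math.degrees(math.atan2(-ax, az))                            # rotation about the long (Y) axis
    return pitch, roll

def estimate_screen_facing(az: float) -> str:
    """The screen faces away from the ground when the screen normal (Z)
    points against gravity, i.e. the accelerometer reads az > 0 at rest."""
    return "up" if az > 0 else "down"
```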
When the mobile terminal is tapped, the sensing signal collected by the inertial sensor inside the housing changes. For example, when the back shell of the mobile terminal is tapped, the terminal rotates slightly about an axis, so that a sinusoidal or spike-like feature appears in the signal waveform on a target axis of the sensing signal detected by the gyroscope inside the mobile terminal. In the related art, whether a tapping action is currently detected can therefore be determined by monitoring the sensing signal. However, in some application scenarios the current posture of the mobile terminal affects the sensing signal acquired by the inertial sensor; for example, the sinusoidal or spike-like feature may not appear on the target axis, so that tap detection cannot be performed accurately based on the signal feature alone. Therefore, in the embodiments of the present disclosure, when determining whether a tapping action is detected on the mobile terminal, the current posture of the mobile terminal and the sensing signal acquired by the inertial sensor are combined for a comprehensive judgment, which improves detection accuracy.
The rotation about an axis refers to the body of the mobile terminal rotating around a coordinate axis after the terminal is tapped. The coordinate axis of this rotation differs with the screen orientation: for example, when the mobile terminal is in landscape orientation, the axis of rotation after a tap is the Y axis; when the mobile terminal is in portrait orientation, the axis of rotation is the X axis. In some embodiments, the current posture of the mobile terminal includes a first posture indicating the screen orientation and a second posture indicating the screen facing direction. The screen orientation and the screen facing direction together characterize how the screen is being used.
In some embodiments, the first posture includes: landscape or portrait. Fig. 3 is a schematic diagram of the two screen orientations of a mobile terminal according to an exemplary embodiment; as shown in fig. 3, the mobile terminal may currently be in a landscape use state 301 or a portrait use state 302.
The second posture includes: screen facing up or screen facing down, where screen facing up means the screen faces away from the ground and screen facing down means the screen faces toward the ground. Illustratively, when the mobile terminal is placed face down on a desktop that stands on the ground, the screen of the mobile terminal can be considered to face downward.
When the mobile terminal is in a different first posture or a different second posture, the success rate of tap detection is affected differently. Taking a three-axis inertial sensor as an example, when the mobile terminal is in different screen orientations, whether a tapping action is detected should be judged from the sensing signal on different target axes. If a tapping action is performed while the mobile terminal is in portrait orientation, a sinusoidal or spike-like feature appears in the signal on the X axis of the three-axis inertial sensor; if a tapping action is performed while the mobile terminal is in landscape orientation, the sinusoidal or spike-like feature appears in the signal on the Y axis. Therefore, if only the sensing signal on a fixed axis of the three-axis inertial sensor is examined to determine whether a tapping action is detected, errors occur.
For another example, when the screen of the mobile terminal faces in different directions, the amount of change in the attitude angle used to determine whether a tapping action is detected also differs. If the back shell is tapped while the screen of the mobile terminal faces downward, the inertial sensor inside the mobile terminal can easily detect the corresponding sensing signal with the aid of gravity. When the screen of the mobile terminal faces upward, a relatively larger tapping force is needed for the inertial sensor to acquire a corresponding sensing signal when the back shell is tapped. If the same signal strength criterion is used in both cases to determine whether a tapping action is detected, errors occur, which is unfavorable for detection.
According to the method and the device, whether the knocking action occurs or not is comprehensively judged by combining the current posture and the sensing signal, so that the detection accuracy can be improved.
In some embodiments, the determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal includes:
determining whether a signal waveform on a target axis in the sensing signal conforms to a preset waveform or not based on the first posture;
if the signal waveform conforms to the preset waveform, determining the variation of the target attitude angle according to the first posture;
and determining whether the knocking action is detected on the mobile terminal according to the variable quantity of the target attitude angle determined based on the first attitude and a preset angle threshold determined based on the second attitude.
Here, the inertial sensor is taken as a three-axis inertial sensor for detecting sensing signals of the mobile terminal in three detection directions. In other words, the sensed sensor signals include signals in three detection directions.
By way of example, the mobile terminal is rectangular, and the screen of the rectangular mobile terminal has a long side and a short side. The three detection directions corresponding to the three-axis inertial sensor include: the X-axis direction along the short side, the Y-axis direction along the long side, and the Z-axis direction along the thickness of the mobile terminal body.
The current posture of the mobile terminal includes a first posture indicating the screen orientation and a second posture indicating the screen facing direction. In some embodiments, the first posture includes: landscape or portrait; the second posture includes: screen facing up, i.e. the screen faces away from the ground, or screen facing down, i.e. the screen faces toward the ground.
The target axis and the target attitude angle may be determined based on a currently detected first attitude, different first attitudes corresponding to different target axes and different target attitude angles.
In some embodiments, when the mobile terminal is in landscape orientation, the target axis determined based on the first posture is the axis along the long side of the mobile terminal, and the target attitude angle determined based on the first posture is the roll angle.
In other embodiments, when the mobile terminal is in portrait orientation, the target axis determined based on the first posture is the axis along the short side of the mobile terminal, and the target attitude angle determined based on the first posture is the pitch angle.
The preset waveform may be: a waveform that conforms to a sinusoidal or spike-like characteristic.
Here, if it is detected that the mobile terminal is currently in the portrait state, a sinusoidal or spike-like feature appears in the X-axis sensing signal among the three-axis sensing signals acquired by the three-axis inertial sensor when a tapping action is performed on the mobile terminal. If it is detected that the mobile terminal is in the landscape state, the sinusoidal or spike-like feature appears in the Y-axis sensing signal when a tapping action is performed. Therefore, after the current posture of the mobile terminal is detected, the target axis to be examined can be determined based on the current posture, and the signal waveform of the sensing signal on that target axis can then be checked, thereby realizing the preliminary detection of the tapping action.
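The disclosure does not specify how the waveform test itself is carried out. Purely as a hedged illustration, one simple spike-over-baseline check on the target-axis gyroscope window could look like the following sketch; the function name and the `spike_factor` tuning parameter are assumptions, not values from the patent.

```python
import numpy as np

def has_tap_waveform(gyro_window: np.ndarray, spike_factor: float = 6.0) -> bool:
    """Return True if the target-axis gyroscope window contains a short
    spike well above the recent baseline. `gyro_window` is a 1-D array of
    angular-rate samples on the target axis."""
    baseline = np.median(np.abs(gyro_window))
    peak = np.max(np.abs(gyro_window))
    # Guard against a completely still signal (baseline close to zero).
    return peak > spike_factor * max(baseline, 1e-3)
```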
Further, if the screen of the mobile terminal is in the portrait state, the pitch angle of the mobile terminal changes noticeably when the mobile terminal is tapped; if the screen is in the landscape state, the roll angle changes noticeably when the mobile terminal is tapped. Moreover, when the mobile terminal is in different second postures, the required variation of the pitch angle or roll angle differs. For example, when the screen of the mobile terminal faces up, a tapping action may be determined to be detected if the variation of the pitch angle or roll angle is greater than a second angle threshold; when the screen faces down, a tapping action may be determined to be detected if the variation is greater than a first angle threshold. Thus, after the preliminary detection of the tapping action is completed, whether a tapping action is detected on the mobile terminal is further determined according to the variation of the target attitude angle determined based on the first posture and the preset angle threshold determined based on the second posture, which improves detection accuracy. The preset angle threshold includes: the first angle threshold or the second angle threshold.
In some embodiments, the determining whether a tapping action is detected on the mobile terminal according to the variation of the target attitude angle determined based on the first attitude and a preset angle threshold determined based on the second attitude includes:
and when the variation of the target attitude angle determined based on the first attitude is larger than a preset angle threshold determined based on the second attitude, determining that the knocking action is detected on the mobile terminal.
If the screen of the mobile terminal is in the portrait state, the pitch angle of the mobile terminal changes noticeably when the mobile terminal is tapped. In this case, it can be set that a tapping action is considered detected when the detected variation of the pitch angle of the mobile terminal is greater than the preset angle threshold determined based on the second posture.
If the mobile terminal is in the landscape state, the roll angle of the mobile terminal changes noticeably when the mobile terminal is tapped. In this case, it can be set that a tapping action is considered detected when the detected variation of the roll angle of the mobile terminal is greater than the preset angle threshold determined based on the second posture.
Here, the preset angle threshold includes: a first angle threshold or a second angle threshold.
In some embodiments, when the second posture indicates that the screen of the mobile terminal faces the ground, the preset angle threshold is determined to be a first angle threshold; when the second posture indicates that the screen of the mobile terminal faces away from the ground, the preset angle threshold is determined to be a second angle threshold, where the second angle threshold is greater than the first angle threshold.
Because the mobile terminal is in different orientations, the detected tap strength is different. That is, when the orientations of the mobile terminals are different, the corresponding preset angle thresholds take different values.
Here, the screen facing away from the ground is equivalent to the mobile terminal facing upward, and the screen facing the ground is equivalent to the mobile terminal facing downward.
For example, when the mobile terminal faces upward, the back shell must be tapped with greater force for the tapping action to be detected, so the preset angle threshold needs to be relatively large, and a tapping action is detected only when a variation greater than the second angle threshold is detected. The second angle threshold is the angle threshold used when the mobile terminal faces upward.
For another example, when the mobile terminal faces downward, the tapping action can be detected even when the back shell is tapped with a smaller force, so the preset angle threshold can be relatively small, and a tapping action is detected when a variation greater than the first angle threshold is detected. The first angle threshold is the angle threshold used when the mobile terminal faces downward.
The second angle threshold is greater than the first angle threshold.
As such, as shown in fig. 4, the touch detection method of the present disclosure may determine the current posture of the mobile terminal based on the sensing signal acquired by the accelerometer in the mobile terminal, for example, determine the screen orientation (first posture) and the screen facing direction (second posture) during movement. The preset angle threshold may then be updated based on the screen facing direction, where different facing directions correspond to different preset angle thresholds. Finally, the variation of the target attitude angle determined based on the first posture is judged against the updated preset angle threshold, thereby achieving accurate detection of the tapping action.
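Purely as an illustrative sketch of the flow described above (and reusing the hypothetical `has_tap_waveform` helper from the earlier sketch), the rule-based decision could be organized as follows. The threshold values, parameter names, and posture labels are assumptions, not values disclosed in this application.

```python
import numpy as np

# Illustrative thresholds; the patent does not disclose concrete values.
FIRST_ANGLE_THRESHOLD = 2.0    # degrees, used when the screen faces the ground
SECOND_ANGLE_THRESHOLD = 5.0   # degrees, used when the screen faces away from the ground

def detect_tap(first_posture: str, second_posture: str,
               gyro_x: np.ndarray, gyro_y: np.ndarray,
               pitch_change: float, roll_change: float) -> bool:
    """Rule-based sketch of the flow summarized above.
    first_posture: 'portrait' or 'landscape' (screen orientation).
    second_posture: 'up' (screen away from the ground) or 'down'.
    pitch_change / roll_change: measured change of the attitude angle
    over the candidate tap window, in degrees."""
    # 1. The first posture selects the target axis and the target attitude angle.
    if first_posture == "portrait":
        target_signal, angle_change = gyro_x, pitch_change
    else:  # landscape
        target_signal, angle_change = gyro_y, roll_change

    # 2. Preliminary check: the target-axis waveform must match the preset waveform
    #    (has_tap_waveform is the spike check sketched earlier).
    if not has_tap_waveform(target_signal):
        return False

    # 3. The second posture selects the preset angle threshold.
    threshold = FIRST_ANGLE_THRESHOLD if second_posture == "down" else SECOND_ANGLE_THRESHOLD

    # 4. A tap is detected only if the attitude-angle change exceeds the threshold.
    return angle_change > threshold
```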
For example, in some embodiments, determining whether a tapping action is detected on the mobile terminal may be:
when the mobile terminal is in a vertical screen state, determining whether a signal waveform on an X axis in a sensing signal is a preset waveform or not based on a gyroscope in the mobile terminal;
determining the direction of the mobile terminal when the signal waveform corresponding to the X axis is a preset waveform;
when the orientation is upward, determining the variation of the pitch angle of the mobile terminal based on the sensing signal;
and when the variation of the pitch angle is larger than a second angle threshold value, determining that the mobile terminal detects a knocking action.
In some embodiments, determining whether a tapping action is detected on the mobile terminal may be:
when the mobile terminal is in a vertical screen state, determining whether a signal waveform on an X axis in a sensing signal is a preset waveform;
determining the orientation of the mobile terminal when the signal waveform corresponding to the X-axis signal is a preset waveform;
when the orientation is downward, determining the variation of the pitch angle of the mobile terminal based on the sensing signal;
when the variation of the pitch angle is larger than a first threshold value, determining that the mobile terminal detects a knocking action; wherein the second threshold is greater than the first threshold.
In some embodiments, determining whether a tapping action is detected on the mobile terminal may be:
when the mobile terminal is in a horizontal screen state, determining whether a signal waveform on a Y axis in a sensing signal is a preset waveform;
determining the orientation of the mobile terminal when the signal waveform corresponding to the Y-axis signal is a preset waveform;
when the orientation is upward, determining the change amount of the roll angle of the mobile terminal based on the sensing signal;
and when the variation of the roll angle is larger than a second threshold value, determining that the mobile terminal detects a knocking action.
In some embodiments, determining whether a tapping action is detected on the mobile terminal may be:
when the mobile terminal is in a horizontal screen state, determining whether a signal waveform on a Y axis in a sensing signal is a preset waveform;
determining the orientation of the mobile terminal when the signal waveform corresponding to the Y-axis signal is a preset waveform;
when the orientation is downward, determining the variation of the roll angle of the mobile terminal based on the sensing signal;
when the variation of the roll angle is larger than a first threshold value, determining that a knocking action is detected on a back shell of the mobile terminal; wherein the second threshold is greater than the first threshold.
Therefore, because the mobile terminal affects the sensing signal acquired by the inertial sensor differently in different postures, during tap recognition both the sensing signal and the current posture of the mobile terminal are acquired, and the two are combined to determine whether a tapping action is detected on the mobile terminal, thereby improving the recognition accuracy of the tapping action.
In some embodiments, the step 103 of determining whether a tapping motion is detected on the mobile terminal according to the current posture and the sensing signal may further include:
step 1032, processing the attitude data corresponding to the current attitude and the sensing signal based on a preset target recognition model, and determining whether a tapping action is detected on the mobile terminal.
Here, in the embodiments of the present disclosure, the posture data and the sensing signal may also be processed based on a preset target recognition model to obtain a determination result of whether a tapping action is detected on the mobile terminal; that is, the result is output directly by the model.
The mobile terminal is preset with a target recognition model. After the sensing signal is obtained, the corresponding posture data is also obtained; the posture data and the sensing signal are then processed based on the preset target recognition model, and a determination result of whether a tapping action has occurred is output.
The target recognition model may be based on any neural network model capable of prediction, for example a back-propagation (BP) model or a long short-term memory (LSTM) model.
The target recognition model can be obtained by training on historical processing data and corresponding historical recognition result data, where the historical processing data includes historical sensing signals and historical attitude data; the historical processing data and the corresponding historical recognition result data are input to the model during training, and the trained model outputs a determination of whether a tapping action has occurred.
In some embodiments, the method further comprises:
step 301, obtaining history sample data acquired based on a history touch action on the mobile terminal, wherein the history sample data includes: historical processing data and corresponding historical recognition result data; the historical processing data includes: historical sensing signals and historical attitude data;
step 302, inputting the historical sample data into an initial recognition model to be trained for iterative processing until the difference between the output recognition result data and the historical recognition result data in the historical sample data meets a preset convergence condition, so as to obtain the target recognition model.
It should be noted that the steps 301 to 302 occur before the step 1032.
After the target recognition model is obtained, the sensing data and the posture data acquired in steps 101 and 102 may be processed based on the target recognition model, and a determination result of whether a tapping motion occurs is output.
It should be noted that the historical processing data and the historical recognition result data acquired at one time are a set of corresponding data.
The initial recognition model to be trained may be any neural network model that enables prediction. Such as a Back Propagation (BP) model or a Long Short-Term Memory (LSTM) model. And training the initial recognition model to continuously optimize the parameters of the model through historical sample data acquired in the historical use of the mobile terminal, so as to obtain the target recognition model capable of accurately detecting the knocking action.
Here, the history sample data includes: historical sensory signals, historical attitude data, and corresponding historical recognition result data.
The historical sample data can be laboratory data, namely data obtained by collecting historical sensing signals, historical attitude data and corresponding historical recognition result data through multiple experiments before the knock action is predicted.
The historical sensing signals are the sensing signals detected, during use of the mobile terminal, by at least two inertial sensors arranged in the mobile terminal.
The historical attitude data is the posture data obtained by detecting the current posture of the mobile terminal at the moment when the historical sensing signal was detected.
The historical recognition result data is the result of recognizing whether a tapping action was detected on the mobile terminal at that detection moment. The recognition result may be characterized by a label, e.g., "1" indicating that a tapping action was detected and "0" indicating that no tapping action was detected; accordingly, the historical recognition result data corresponding to the historical sensing signal and historical posture data collected at the same detection moment is either "1" or "0".
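As a purely illustrative sketch of how such a labeled set might be assembled (the record field names and the fixed-length window format are assumptions, not from the patent):

```python
import numpy as np

def build_historical_samples(records):
    """Assemble historical sample data from recorded windows. Each record
    is assumed to be a dict with a fixed-length 'sensing_signal' array
    (historical sensing signal), an 'attitude' array (historical attitude
    data at the same detection moment), and a 'label' of 1 or 0
    (historical recognition result data)."""
    features = np.stack([
        np.concatenate([r["sensing_signal"].ravel(), r["attitude"].ravel()])
        for r in records
    ])
    labels = np.array([r["label"] for r in records], dtype=float)
    return features, labels
```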
Here, after the acquired sensing signal and the attitude data are input to the initial recognition model, if the output recognition result data is extremely close to or identical to the historical recognition result data in the historical sample data, the recognition model at that time is regarded as the target recognition model.
The condition that the output recognition result data is exactly the same as the historical recognition result data in the historical sample data takes a long time to reach and rarely occurs. Therefore, in the embodiments of the present disclosure, the target recognition model is considered to have been obtained when the output recognition result data is extremely close to the historical recognition result data in the historical sample data.
Considering the complexity of model training and experimental efficiency, whether the output recognition result data is extremely close to the historical recognition result data in the historical sample data may be determined by setting a convergence condition.
In some embodiments, the convergence condition may be that the difference between the output recognition result data and the historical recognition result data is less than a preset value. That is, a preset value is set, and the difference between the output recognition result data and the historical recognition result data in the historical sample data is compared with the preset value; if the difference is smaller than the preset value, the convergence condition is considered to be met, the model parameters at that point are the model parameters of the final target recognition model, and the target recognition model is considered to have been obtained.
If the difference between the output recognition result data and the historical recognition result data in the corresponding historical sample data is not yet smaller than the preset value, the model parameters continue to be adjusted; the historical processing data is processed by the adjusted neural network model, recognition result data is output again, and the output is again compared with the historical recognition result data in the corresponding historical sample data, until the difference is smaller than the preset value and the target recognition model is obtained.
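As a rough illustration of the training procedure described above, the following sketch trains a tiny one-hidden-layer network (standing in for the BP model) on features and labels such as those assembled in the earlier sketch, stopping once the stated convergence condition is met. The function name, network size, learning rate, and convergence threshold are all assumptions, not values from the patent.

```python
import numpy as np

def train_target_model(features: np.ndarray, labels: np.ndarray,
                       hidden: int = 16, lr: float = 0.05,
                       max_iters: int = 10000, eps: float = 1e-2):
    """Train a tiny one-hidden-layer network on historical sample data.
    `features` stacks each historical sensing signal with its historical
    attitude data (one row per sample); `labels` holds the historical
    recognition result data (1 = tap, 0 = no tap). Training stops once the
    mean absolute difference between the model output and the historical
    labels drops below `eps` (the preset convergence condition)."""
    rng = np.random.default_rng(0)
    n, d = features.shape
    w1 = rng.normal(scale=0.1, size=(d, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.1, size=(hidden, 1)); b2 = np.zeros(1)
    y = labels.reshape(-1, 1).astype(float)

    for _ in range(max_iters):
        # Forward pass: hidden layer + sigmoid output (tap probability).
        h = np.tanh(features @ w1 + b1)
        out = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

        if np.mean(np.abs(out - y)) < eps:   # convergence condition met
            break

        # Backward pass for a squared-error loss (error back-propagation).
        grad_out = (out - y) * out * (1.0 - out) / n
        grad_w2 = h.T @ grad_out
        grad_b2 = grad_out.sum(axis=0)
        grad_h = grad_out @ w2.T * (1.0 - h ** 2)
        grad_w1 = features.T @ grad_h
        grad_b1 = grad_h.sum(axis=0)

        w1 -= lr * grad_w1; b1 -= lr * grad_b1
        w2 -= lr * grad_w2; b2 -= lr * grad_b2

    return (w1, b1, w2, b2)
```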
In some embodiments, the processing, based on a preset target recognition model, the posture data corresponding to the current posture and the sensing signal to determine whether a tapping action is detected on the mobile terminal includes:
inputting the attitude data and the sensing signal into the preset target recognition model to obtain output recognition result data;
and comparing the identification result data with a preset probability threshold value, and determining whether the mobile terminal detects a knocking action.
Here, after the target recognition model has been obtained, when the sensing signal and the posture data corresponding to the current posture are processed by the target recognition model, the output is a probability that approximates "0" or "1" but may be, for example, "0.96" or "0.02". To increase detection accuracy and reduce workload, the present disclosure sets a probability threshold to further classify the output into "0" or "1", where "0" means that no tapping action is detected and "1" means that a tapping action is detected.
The probability threshold may be determined based on empirical values or may be determined based on accuracy in detection, which is not limited by this disclosure.
For example, assuming that the probability threshold is 0.1, if the output recognition result data is 0.95, since the difference between 0.95 and the recognition result data 1 is less than 0.1, the recognition result of this time is considered to be: a tapping action is detected. Correspondingly, if the output recognition result data is 0.05, since the difference between 0.05 and the recognition result data 0 is also less than 0.1, the recognition result of this time is considered as: no tapping action is detected.
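Assuming the toy model format from the training sketch above, the inference step described here reduces to running the model and comparing its output against the probability threshold. The helper name is hypothetical, and the 0.1 margin simply mirrors the example in the preceding paragraph.

```python
import numpy as np

def recognize_tap(model, attitude_data: np.ndarray, sensing_signal: np.ndarray,
                  prob_threshold: float = 0.1) -> bool:
    """Run the target recognition model on the current posture data and
    sensing signal, then classify the output against the probability
    threshold as in the 0.95 / 0.05 example above. `model` is the tuple
    returned by train_target_model (an assumed format)."""
    w1, b1, w2, b2 = model
    x = np.concatenate([sensing_signal.ravel(), attitude_data.ravel()])[None, :]
    h = np.tanh(x @ w1 + b1)
    prob = (1.0 / (1.0 + np.exp(-(h @ w2 + b2)))).item()
    if abs(prob - 1.0) < prob_threshold:
        return True    # recognition result "1": a tapping action is detected
    if abs(prob - 0.0) < prob_threshold:
        return False   # recognition result "0": no tapping action is detected
    return False       # ambiguous output treated as "no tap" in this sketch
```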
The present disclosure also provides the following embodiment. Fig. 5 is a schematic flowchart illustrating a touch detection method according to an exemplary embodiment; as shown in fig. 5, the touch detection method may be divided into three steps: first, establishing the model; second, training the model; and third, using the model.
In the first step, model establishment, historical attitude data can be acquired by the accelerometer and historical sensing signals by the gyroscope during historical use of the mobile terminal; the historical sensing signals, the historical attitude data, and the corresponding historical recognition result data form the historical sample data.
In the second step, model training, the selected initial recognition model is trained based on the historical sample data from the first step; for example, the selected initial recognition model may be a BP model. During training, the model parameters are continuously optimized based on the output results to obtain the target recognition model. Here, under different posture data, the judgment parameters of the corresponding tapping signal may differ; similarly, the detected tapping signal varies with the sensing data.
In the third step, using the model, the sensing signal and the attitude data are read in real time and input to the target recognition model, which outputs the recognition result data, thereby improving the tap recognition effect.
Therefore, a target recognition model capable of accurately detecting the knocking action can be obtained by carrying out iteration processing on the initial recognition model to be trained through historical sample data acquired in historical use, and a basis is provided for subsequent accurate detection of the knocking action and corresponding functional response.
The present disclosure further provides a touch detection device, and fig. 6 is a schematic structural diagram of a touch detection device according to an exemplary embodiment, as shown in fig. 6, the touch detection device 500, applied to a mobile terminal, includes:
a signal acquiring module 501, configured to acquire a sensing signal acquired by at least one inertial sensor in the mobile terminal;
a pose determination module 502 for determining a current pose of the mobile terminal;
and an action detection module 503, configured to determine whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal.
In some embodiments, the current posture comprises: a first posture indicating a screen orientation and a second posture indicating a screen facing direction;
the motion detection module includes:
the waveform determining module is used for determining whether the signal waveform on the target axis in the sensing signal conforms to a preset waveform or not based on the first posture;
the variation determining module is used for determining the variation of the target attitude angle according to the first posture if the signal waveform conforms to the preset waveform;
and the action detection submodule is used for determining whether a knocking action is detected on the mobile terminal according to the variable quantity of the target attitude angle determined based on the first attitude and a preset angle threshold determined based on the second attitude.
In some embodiments, the action detection sub-module is further configured to:
and when the variation of the target attitude angle determined based on the first attitude is larger than a preset angle threshold determined based on the second attitude, determining that the knocking action is detected on the mobile terminal.
In some embodiments, the apparatus further comprises:
a first determining module, configured to determine that the preset angle threshold is a first angle threshold when the second posture indicates that the screen of the mobile terminal faces the ground;
the second determining module is used for determining that the preset angle threshold is a second angle threshold when the second posture indicates that the screen of the mobile terminal faces away from the ground; wherein the second angle threshold is greater than the first angle threshold.
In some embodiments, the first pose comprises: a horizontal screen or a vertical screen;
the device further comprises:
a third determining module, configured to determine, when the mobile terminal is in landscape orientation, that the target axis based on the first posture is the axis along the long side of the mobile terminal, and that the target attitude angle determined based on the first posture is the roll angle;
a fourth determining module, configured to determine, when the mobile terminal is in portrait orientation, that the target axis based on the first posture is the axis along the short side of the mobile terminal, and that the target attitude angle determined based on the first posture is the pitch angle.
In some embodiments, the motion detection module comprises:
and the model processing module is used for processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model and determining whether a knocking action is detected on the mobile terminal.
In some embodiments, the apparatus further comprises:
the data acquisition module is used for acquiring historical sample data acquired on the mobile terminal based on historical touch actions, wherein the historical sample data comprises: historical processing data and corresponding historical recognition result data; the historical processing data includes: historical sensing signals and historical attitude data;
and the iteration processing module is used for inputting the historical sample data into an initial recognition model to be trained for iteration processing until the difference value between the output recognition result data and the historical recognition result data in the historical sample data meets a preset convergence condition, so as to obtain the target recognition model.
In some embodiments, the model processing module is further configured to:
inputting the attitude data and the sensing signal into the target recognition model to obtain recognition result data;
and comparing the identification result data with a preset probability threshold value, and determining whether the mobile terminal detects a knocking action.
With regard to the apparatus in the above embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be described in detail here.
Fig. 7 is a block diagram illustrating a touch detection device 1800 according to an exemplary embodiment. For example, the apparatus 1800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to fig. 7, the apparatus 1800 may include one or more of the following components: a processing component 1802, a memory 1804, a power component 1806, a multimedia component 1808, an audio component 1810, an input/output (I/O) interface 1812, a sensor component 1814, and a communications component 1816.
The processing component 1802 generally controls overall operation of the device 1800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1802 may include one or more processors 1820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1802 may also include one or more modules that facilitate interaction between the processing component 1802 and other components. For example, the processing component 1802 can include a multimedia module to facilitate interaction between the multimedia component 1808 and the processing component 1802.
The memory 1804 is configured to store various types of data to support operation at the apparatus 1800. Examples of such data include instructions for any application or method operating on the device 1800, contact data, phonebook data, messages, images, videos, and so forth. The memory 1804 may be implemented by any type or combination of volatile or non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 1806 provide power to various components of the device 1800. The power components 1806 may include: a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 1800.
The multimedia component 1808 includes a screen that provides an output interface between the device 1800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1808 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 1800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and/or rear camera may be a fixed optical lens system or have focus and optical zoom capability.
Audio component 1810 is configured to output and/or input audio signals. For example, audio component 1810 may include a Microphone (MIC) configured to receive external audio signals when apparatus 1800 is in an operational mode, such as a call mode, a record mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1804 or transmitted via the communication component 1816. In some embodiments, audio component 1810 may further comprise a speaker for outputting audio signals.
I/O interface 1812 provides an interface between processing component 1802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 1814 includes one or more sensors to provide status assessments of various aspects of the apparatus 1800. For example, the sensor assembly 1814 can detect an open/closed state of the device 1800 and the relative positioning of components, such as the display and keypad of the device 1800; the sensor assembly 1814 can also detect a change in position of the device 1800 or a component of the device 1800, the presence or absence of user contact with the device 1800, the orientation or acceleration/deceleration of the device 1800, and a change in temperature of the device 1800. The sensor assembly 1814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1816 is configured to facilitate communications between the apparatus 1800 and other devices in a wired or wireless manner. The device 1800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1816 receives a broadcast signal or broadcast-associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In an exemplary embodiment, the apparatus 1800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided that includes instructions, such as the memory 1804 that includes instructions executable by the processor 1820 of the device 1800 to perform the above-described method. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium, wherein instructions, when executed by a processor, enable performance of the above-described method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (18)

1. A touch detection method is applied to a mobile terminal and comprises the following steps:
acquiring a sensing signal acquired by at least one inertial sensor in the mobile terminal;
determining a current posture of the mobile terminal;
and determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal.
2. The method of claim 1, wherein the current posture comprises: a first posture indicating a screen orientation and a second posture indicating a facing direction of the screen;
the determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal comprises:
determining, based on the first posture, whether a signal waveform on a target axis in the sensing signal conforms to a preset waveform;
if the signal waveform conforms to the preset waveform, determining a variation of the target attitude angle according to the first posture;
and determining whether a tapping action is detected on the mobile terminal according to the variation of the target attitude angle determined based on the first posture and a preset angle threshold determined based on the second posture.
3. The method according to claim 2, wherein determining whether a tapping action is detected on the mobile terminal according to the variation of the target attitude angle determined based on the first posture and the preset angle threshold determined based on the second posture comprises:
when the variation of the target attitude angle determined based on the first posture is greater than the preset angle threshold determined based on the second posture, determining that a tapping action is detected on the mobile terminal.
4. The method of claim 2, further comprising:
when the second posture indicates that the screen of the mobile terminal faces the ground, determining that the preset angle threshold is a first angle threshold;
when the second posture indicates that the screen of the mobile terminal faces away from the ground, determining that the preset angle threshold is a second angle threshold; wherein the second angle threshold is greater than the first angle threshold.
5. The method of claim 2, wherein the first posture comprises: a landscape orientation or a portrait orientation;
the method further comprises the following steps:
when the mobile terminal is in landscape orientation, the target axis determined based on the first posture is the axis along the long side of the mobile terminal, and the target attitude angle determined based on the first posture is a roll angle;
when the mobile terminal is in portrait orientation, the target axis determined based on the first posture is the axis along the short side of the mobile terminal, and the target attitude angle determined based on the first posture is a pitch angle.
6. The method of claim 1, wherein determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal comprises:
processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model, and determining whether a tapping action is detected on the mobile terminal.
7. The method of claim 6, further comprising:
acquiring historical sample data collected on the mobile terminal based on historical touch actions, wherein the historical sample data comprises: historical processing data and corresponding historical recognition result data; and the historical processing data comprises: historical sensing signals and historical attitude data;
inputting the historical sample data into an initial recognition model to be trained for iterative processing until a difference value between the output recognition result data and the historical recognition result data in the historical sample data meets a preset convergence condition, and obtaining the target recognition model.
8. The method according to claim 6, wherein processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model to determine whether a tapping action is detected on the mobile terminal comprises:
inputting the posture data and the sensing signal into the target recognition model to obtain recognition result data;
and comparing the recognition result data with a preset probability threshold to determine whether a tapping action is detected on the mobile terminal.
9. A touch detection device is applied to a mobile terminal and comprises:
the signal acquisition module is used for acquiring sensing signals acquired by at least one inertial sensor in the mobile terminal;
the posture determining module is used for determining the current posture of the mobile terminal;
and the action detection module is used for determining whether a tapping action is detected on the mobile terminal according to the current posture and the sensing signal.
10. The apparatus of claim 9, wherein the current posture comprises: a first posture indicating a screen orientation and a second posture indicating a facing direction of the screen;
the motion detection module includes:
the waveform determining module is used for determining, based on the first posture, whether a signal waveform on a target axis in the sensing signal conforms to a preset waveform;
the variable quantity determining module is used for determining a variable quantity of the target attitude angle according to the first posture if the signal waveform conforms to the preset waveform;
and the action detection submodule is used for determining whether a tapping action is detected on the mobile terminal according to the variable quantity of the target attitude angle determined based on the first posture and a preset angle threshold determined based on the second posture.
11. The apparatus of claim 10, wherein the motion detection sub-module is further configured to:
when the variable quantity of the target attitude angle determined based on the first posture is greater than the preset angle threshold determined based on the second posture, determining that a tapping action is detected on the mobile terminal.
12. The apparatus of claim 10, further comprising:
a first determining module, configured to determine, when the second posture indicates that the screen of the mobile terminal faces the ground, that the preset angle threshold is a first angle threshold;
a second determining module, configured to determine, when the second posture indicates that the screen of the mobile terminal faces away from the ground, that the preset angle threshold is a second angle threshold; wherein the second angle threshold is greater than the first angle threshold.
13. The apparatus of claim 10, wherein the first posture comprises: a landscape orientation or a portrait orientation;
the device further comprises:
a third determining module, configured to determine, when the mobile terminal is in landscape orientation, that the target axis determined based on the first posture is the axis along the long side of the mobile terminal, and that the target attitude angle determined based on the first posture is a roll angle;
a fourth determining module, configured to determine, when the mobile terminal is in portrait orientation, that the target axis determined based on the first posture is the axis along the short side of the mobile terminal, and that the target attitude angle determined based on the first posture is a pitch angle.
14. The apparatus of claim 9, wherein the motion detection module comprises:
the model processing module is used for processing the sensing signal and the posture data corresponding to the current posture based on a preset target recognition model, and determining whether a tapping action is detected on the mobile terminal.
15. The apparatus of claim 14, further comprising:
the data acquisition module is used for acquiring historical sample data collected on the mobile terminal based on historical touch actions, wherein the historical sample data comprises: historical processing data and corresponding historical recognition result data; and the historical processing data comprises: historical sensing signals and historical attitude data;
and the iteration processing module is used for inputting the historical sample data into an initial recognition model to be trained for iterative processing until a difference value between the output recognition result data and the historical recognition result data in the historical sample data meets a preset convergence condition, so as to obtain the target recognition model.
16. The apparatus of claim 14, wherein the model processing module is further configured to:
inputting the posture data and the sensing signal into the target recognition model to obtain recognition result data;
and comparing the recognition result data with a preset probability threshold to determine whether a tapping action is detected on the mobile terminal.
17. A touch detection device, comprising:
a processor and a memory for storing executable instructions operable on the processor, wherein:
the processor is configured to execute the executable instructions, which when executed perform the steps of the method as provided in any one of the preceding claims 1 to 8.
18. A non-transitory computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, perform steps in a method as provided by any one of claims 1 to 8.
CN202110576615.9A 2021-05-26 2021-05-26 Touch detection method and device and storage medium Pending CN115480666A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110576615.9A CN115480666A (en) 2021-05-26 2021-05-26 Touch detection method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110576615.9A CN115480666A (en) 2021-05-26 2021-05-26 Touch detection method and device and storage medium

Publications (1)

Publication Number Publication Date
CN115480666A true CN115480666A (en) 2022-12-16

Family

ID=84419870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110576615.9A Pending CN115480666A (en) 2021-05-26 2021-05-26 Touch detection method and device and storage medium

Country Status (1)

Country Link
CN (1) CN115480666A (en)

Similar Documents

Publication Publication Date Title
US10191564B2 (en) Screen control method and device
CN106572299B (en) Camera opening method and device
CN105488527B (en) Image classification method and device
KR101712301B1 (en) Method and device for shooting a picture
EP3038345B1 (en) Auto-focusing method and auto-focusing device
EP3173970A1 (en) Image processing method and apparatus
CN107656682B (en) Mobile terminal and bending angle calculation method
CN111105454A (en) Method, device and medium for acquiring positioning information
CN112414400B (en) Information processing method and device, electronic equipment and storage medium
CN112202962B (en) Screen brightness adjusting method and device and storage medium
EP4040332A1 (en) Method and apparatus for upgrading an intelligent model and non-transitory computer readable storage medium
CN110930351A (en) Light spot detection method and device and electronic equipment
CN104123075A (en) Method and device for controlling terminal
CN105678296A (en) Method and apparatus for determining angle of inclination of characters
CN112525224B (en) Magnetic field calibration method, magnetic field calibration device, and storage medium
CN114290338A (en) Two-dimensional hand-eye calibration method, device, storage medium, and program product
CN112187995A (en) Illumination compensation method, illumination compensation device, and storage medium
CN113642551A (en) Nail key point detection method and device, electronic equipment and storage medium
CN115480666A (en) Touch detection method and device and storage medium
CN108595930B (en) Terminal device control method and device
CN113315904A (en) Imaging method, imaging device, and storage medium
CN105510939B (en) Obtain the method and device of motion path
EP3889637A1 (en) Method and device for gesture detection, mobile terminal and storage medium
CN115268677A (en) Touch method, touch device and computer readable storage medium
CN109670432B (en) Action recognition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination