CN114415850A - Control method and device, touch control pen and computer readable storage medium - Google Patents

Control method and device, touch control pen and computer readable storage medium

Info

Publication number
CN114415850A
Authority
CN
China
Prior art keywords
mode
stylus
gesture
sensor
touch pen
Prior art date
Legal status
Pending
Application number
CN202111639336.9A
Other languages
Chinese (zh)
Inventor
李常富
张丹
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202111639336.9A
Publication of CN114415850A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 Pens or stylus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/04162 Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application discloses a control method and apparatus, a stylus, and a computer-readable storage medium. The control method is applied to a stylus that includes an attitude sensor and comprises the following steps: controlling the stylus to enter a first mode in response to an acquired predetermined instruction, wherein in the first mode the stylus can recognize its own motion trajectory as a gesture command based on the attitude sensor; and controlling the stylus to switch from the first mode to a second mode when the attitude sensor recognizes a first gesture command in the first mode, wherein in the second mode the stylus can recognize its own motion trajectory as position coordinates based on the attitude sensor.

Description

Control method and device, touch control pen and computer readable storage medium
Technical Field
The present application relates to the field of automatic control technology, and in particular, but not exclusively, to a control method and apparatus, a stylus, and a computer-readable storage medium.
Background
With the continuous development of science and technology, electronic devices are used in more and more scenarios, such as online education, online medical treatment, lectures, and meetings. In addition, for convenience of operation and display, an electronic device is often used together with a capacitive or electromagnetic stylus, whose high linearity, precision, and pressure sensitivity provide a better experience.
In the related art, a capacitive stylus is provided with a physical key or a touch film to implement auxiliary functions of the stylus, for example, wake-up and function switching. However, as the number of uses increases, the sensitivity of the stylus often decreases.
Disclosure of Invention
In view of the above, embodiments of the present application provide a control method, an apparatus, a stylus pen and a computer-readable storage medium.
The technical scheme of the embodiment of the application is realized as follows:
an embodiment of the present application provides a control method, including:
controlling a stylus to enter a first mode in response to an acquired predetermined instruction;
wherein in the first mode, the stylus can recognize its own motion trajectory as a gesture command based on an attitude sensor;
controlling the stylus to switch from the first mode to a second mode when the attitude sensor recognizes a first gesture command in the first mode;
wherein in the second mode, the stylus can recognize its own motion trajectory as position coordinates based on the attitude sensor.
An embodiment of the present application provides a stylus, including:
an attitude sensor, configured to recognize the motion trajectory of the stylus as a gesture command or as position coordinates; and
a controller, configured to control the stylus to enter a first mode in response to an acquired predetermined instruction;
wherein, when the attitude sensor recognizes a first gesture command in the first mode, the controller controls the stylus to switch from the first mode to a second mode;
in the first mode, the stylus can recognize its own motion trajectory as a gesture command based on the attitude sensor;
in the second mode, the stylus can recognize its own motion trajectory as position coordinates based on the attitude sensor.
In some embodiments, the stylus further comprises a transceiver configured to transmit one of the motion trajectory of the stylus, packaged trajectory information obtained by packaging the motion trajectory, the gesture command, and the position coordinates to the electronic device manipulated by the stylus.
An embodiment of the present application provides a control device, including:
the response module is used for responding to the acquired preset instruction and controlling the touch pen to enter a first mode; in the first mode, the touch pen can identify the motion track of the touch pen as a gesture command based on the gesture sensor;
the control module is used for controlling the stylus to be switched from a first mode to a second mode when the gesture sensor identifies a first gesture command in the first mode; in the second mode, the stylus can recognize a motion trajectory thereof as a position coordinate based on the posture sensor.
An embodiment of the present application provides a computer-readable storage medium, in which computer-executable instructions are stored, and the computer-executable instructions are configured to execute the control method.
The embodiment of the application provides a control method, a device, a touch pen and a computer readable storage medium, wherein the control method is applied to the touch pen comprising an attitude sensor, and the control method comprises the following steps: under the condition that the stylus receives a preset command of being held, picked up or moved, the preset command is responded by controlling the stylus to enter a first mode, wherein the stylus can recognize the motion track of the stylus as a gesture command based on a gesture sensor of the stylus under the first mode; then, if the gesture sensor identifies a first gesture command in the first mode, controlling the stylus to switch from the first mode to a second mode, wherein in the second mode, the stylus can identify the motion track of the stylus as the position coordinate based on the gesture sensor; finally, the stylus also sends the position coordinates to the manipulated electronic device, so that the electronic device moves the cursor in its display screen based on the position coordinates. Therefore, mode switching can be realized based on the motion track detected by the attitude sensor, misoperation is reduced, and convenience in operation is improved. In addition, the gesture sensor is less interfered by the external environment, so that the touch pen can keep high sensitivity.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed herein.
Fig. 1 is a schematic flow chart of an implementation of a control method provided in the related art;
FIG. 2 is a schematic flow chart of another implementation of a control method provided in the related art;
fig. 3 is a schematic flow chart of an implementation of the control method according to the embodiment of the present application;
fig. 4 is a schematic flowchart of an implementation of a method for entering a first mode according to an embodiment of the present application;
fig. 5 is a schematic flowchart of another implementation of the control method according to the embodiment of the present application;
fig. 6A is a schematic flowchart of an implementation of a method for entering an initialization state according to an embodiment of the present application;
fig. 6B is a schematic flowchart of an implementation of a method for entering a sleep mode according to an embodiment of the present application;
fig. 7 is a schematic diagram of a function implementation and switching block diagram of a stylus pen according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a control device according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a stylus according to an embodiment of the present disclosure.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application will be described in further detail with reference to the attached drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first", "second", and "third" are used only to distinguish similar objects and do not denote a particular order. It should be understood that, where permitted, the specific order or sequence may be interchanged so that the embodiments of the application described herein can be implemented in an order other than that shown or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
In the related art, the auxiliary functions of a stylus can be realized in the following two ways:
The first way is a physical-key design scheme. Referring to fig. 1, through steps S101 to S105, key values may be uploaded to a host system (i.e., the electronic device operated by the stylus) through a pen protocol or Bluetooth, so as to implement various key functions.
However, this way has the following disadvantages: a hole must be cut in the pen body shell, so a closed, integrated pen body design cannot be achieved and the appearance suffers; keycaps, key supports, and key materials are added, which increases assembly steps and cost; the PCBA board must reserve space for the key, so a larger PCBA board frame is needed; without a waterproof rubber gasket the pen is not waterproof, while adding one further increases cost; the key is located near the holding position, so accidental touches easily occur; and the key feel degrades as the number of uses increases.
The second way is a touch-film design scheme in which no hole is cut in the pen body shell. Referring to fig. 2, through steps S201 to S204, key values are uploaded to the host system through Bluetooth, so as to implement functions such as double-tap wake-up or eraser switching.
However, this way has the following disadvantages: a whole set of touch-film materials is added, which increases assembly steps and cost; the touch film must be tightly attached to the inner wall of the pen body, so the pen body material is limited and cannot be a metal with a signal-shielding effect; because the touch film must be tightly attached to the inner wall of the pen body, it covers a large area and accidental touches easily occur, and to avoid accidental touches only a few key values can be output; and because touch reporting must be handled while the pen is held, the touch film brings high power consumption.
Based on the problems in the related art, the embodiments of the present application provide a control method, which can be applied to a stylus pen, and the method provided in the embodiments can be implemented by a computer program, and when the computer program is executed, each step in the control method provided in the embodiments is completed. In some embodiments, the computer program may control a processor in the stylus to execute. Fig. 3 is a schematic flow chart of an implementation of a control method provided in an embodiment of the present application, and as shown in fig. 3, the control method includes:
step S301, in response to the acquired predetermined instruction, controlling the stylus to enter a first mode.
Here, the stylus may be a capacitive stylus or an electromagnetic stylus. The stylus is also communicatively connected to an electronic device and can operate that electronic device, for example by writing on it or by remotely controlling its display interface, such as turning pages or powering off.
In the embodiment of the present application, an attitude sensor is disposed in the stylus and can recognize a motion trajectory in any direction. For example, the attitude sensor may be a six-axis sensor composed of three accelerometer axes and three gyroscope axes.
In practical implementation, the starting state of the stylus is an initialization state, and in the initialization state, if pose change data is detected by the pose sensor, a predetermined instruction is obtained, where the pose change data is used to represent data indicating that the pose of the stylus changes, for example, when the stylus is held up or moved, the pose sensor generates pose change data; and then, responding to the acquired preset instruction by controlling the stylus to switch from the initialization state to the first mode, wherein when the stylus is in the first mode, the stylus can identify the motion track of the stylus as a gesture command based on the gesture sensor.
In the embodiment of the present application, the gesture command is a command for switching an operating mode of the stylus, where the operating mode of the stylus may include a first mode, a second mode, and a third mode, and based on this, the stylus is capable of switching the stylus from the first mode to the second mode or the third mode based on the gesture command.
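By way of illustration only, the mode bookkeeping described above can be sketched as a small state holder. This is a minimal Python sketch, not part of the original disclosure; the mode names, the pose-change threshold, and the helper method names are assumptions.

    from enum import Enum, auto

    class StylusMode(Enum):
        INIT = auto()        # initialization state
        GESTURE = auto()     # first mode: trajectory recognized as gesture commands
        AIR_MOUSE = auto()   # second mode: trajectory recognized as position coordinates
        WRITING = auto()     # third mode: writing / gravity mode
        SLEEP = auto()

    class StylusController:
        def __init__(self):
            self.mode = StylusMode.INIT

        def on_pose_change(self, pose_delta, threshold=0.05):
            # Pose-change data from the attitude sensor while in the
            # initialization state acts as the predetermined instruction.
            if self.mode is StylusMode.INIT and abs(pose_delta) > threshold:
                self.mode = StylusMode.GESTURE       # enter the first mode

        def on_gesture(self, gesture):
            # Gesture commands recognized in the first mode switch the operating mode.
            if self.mode is not StylusMode.GESTURE:
                return
            if gesture == "first_gesture":           # e.g. a clockwise circle
                self.mode = StylusMode.AIR_MOUSE     # second mode
            elif gesture == "second_gesture":        # e.g. a counterclockwise circle
                self.mode = StylusMode.WRITING       # third mode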
In step S302, when the gesture sensor recognizes a first gesture command in the first mode, the stylus is controlled to switch from the first mode to the second mode.
Here, the first gesture command may be a default gesture command, and the gesture command may also be a custom gesture command; for example, the first gesture command may be a command formed by a stylus pen clockwise scribing a first specific track, the first specific track may be a circle with a radius larger than a first radius threshold, and the first specific track may also be a square with a side length larger than a first side length threshold.
In the embodiment of the application, if the stylus is in the first mode and the gesture sensor further recognizes the first gesture command, the stylus is controlled to be switched from the first mode to the second mode, so that mode switching of the stylus is realized. When the touch pen is in the second mode, the touch pen can identify the motion track of the touch pen as the position coordinate based on the attitude sensor.
In practical implementation, the second mode may be an air mouse mode, that is, in the second mode, the stylus pen may move or manipulate a cursor in the display screen of the electronic device through its own motion track.
In some embodiments, when the stylus is in the second mode, the stylus constructs a motion coordinate system, obtains a motion track of the stylus through the attitude sensor, and converts the motion track into position coordinates by combining the motion coordinate system; then, the touch control pen sends the position coordinates to the manipulated electronic equipment through the established communication connection, so that the electronic equipment moves a cursor in the display screen of the electronic equipment based on the position coordinates, and the function of the air mouse is realized.
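A minimal sketch of how the second (air-mouse) mode might turn sensor motion into cursor coordinates. The integration scheme, the gain factor, and the send_to_host callback are illustrative assumptions; a real implementation would typically fuse accelerometer and gyroscope data in the constructed motion coordinate system.

    class AirMouse:
        def __init__(self, send_to_host, gain=400.0):
            self.send_to_host = send_to_host   # e.g. a Bluetooth transmit function
            self.gain = gain
            self.x, self.y = 0.0, 0.0          # cursor position in the motion coordinate system

        def on_sample(self, gyro_yaw_rate, gyro_pitch_rate, dt):
            # Map angular rates (rad/s) from the attitude sensor to cursor deltas.
            self.x += gyro_yaw_rate * dt * self.gain
            self.y += gyro_pitch_rate * dt * self.gain
            # Report the position coordinates so the host can move its cursor.
            self.send_to_host({"type": "position", "x": round(self.x), "y": round(self.y)})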
In the embodiment of the present application, through the above steps S301 and S302, when the stylus receives a predetermined instruction of being held, picked up, or moved, the stylus is controlled to enter the first mode to respond to the predetermined instruction, where in the first mode, the stylus can recognize its own motion trajectory as a gesture command based on its own posture sensor; then, if the gesture sensor identifies a first gesture command in the first mode, controlling the stylus to switch from the first mode to a second mode, wherein in the second mode, the stylus can identify the motion track of the stylus as the position coordinate based on the gesture sensor; finally, the stylus also sends the position coordinates to the manipulated electronic device, so that the electronic device moves the cursor in its display screen based on the position coordinates. Therefore, mode switching can be realized based on the motion track detected by the attitude sensor, misoperation is reduced, and convenience in operation is improved. In addition, because the gesture sensor is slightly interfered by the external environment, the touch pen can always keep high sensitivity in the using process.
In some embodiments, as shown in fig. 4, the step S301 "controlling the stylus to enter the first mode in response to the obtained predetermined instruction" may be implemented by:
in step S3011, it is determined whether the stylus is in an initialization state.
Here, the initialization state refers to a state in which all setting data in the stylus have been restored to the factory configuration. In actual implementation, step S3011 may be implemented in three ways:
the first method is as follows: and acquiring firmware information of the stylus, and determining that the stylus is in an initialization state if the firmware information represents that the firmware of the stylus is in an initialization mode.
Here, the firmware information of the stylus may be periodically or in real time acquired from the memory, and if the firmware information is factory information set by the firmware when the firmware is shipped, the firmware information may represent that the firmware of the stylus is in an initialization mode. Meanwhile, if the firmware information is not factory information set when the firmware is factory, the firmware information represents that the firmware of the stylus is not in the initialization mode, and based on the firmware information, the stylus is not in the initialization state.
The second method comprises the following steps: and acquiring the state information of a charging module in the stylus, and determining that the stylus is in an initialization state if the state information represents that the stylus is in a charging state.
For example, the state information of the charging module may be 0 and 1, where 0 is the corresponding state information when the charging module is not connected to the external power source, and 1 is the corresponding state information when the charging module is connected to the external power source. Based on this, the state information of the charging module can be acquired periodically or in real time, if the state information of the charging module is 1, the charging module is characterized to be connected with an external power supply, the touch pen can be charged through the external power supply, and at this time, the touch pen is determined to be in an initialization state. And if the state information of the charging module is 0, the charging module is not connected with an external power supply, the touch pen is not charged, and at the moment, the touch pen is determined not to be in an initialization state.
The third method comprises the following steps: acquiring sensing information from a sensing element in the stylus, and determining that the stylus is in an initialization state if the sensing information indicates that the positional relationship between the stylus and a matching device satisfies a predetermined relationship.
Here, the matching device may be a pen slot, a charging seat, or a charging device. When the stylus contacts the matching device, the sensing element generates sensing information; at this time, the positional relationship between the stylus and the matching device is deemed to satisfy the predetermined relationship, and the stylus is determined to be in the initialization state. When the stylus is not in contact with the matching device, the sensing element generates no sensing information, the positional relationship is deemed not to satisfy the predetermined relationship, and the stylus is determined not to be in the initialization state.
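The three checks can be combined as below. This is only a sketch: the helper callables (read_firmware_flag, charger_connected, dock_sensor_triggered) are hypothetical stand-ins for the firmware information, the charging-module state, and the sensing element described above.

    def stylus_is_initialized(read_firmware_flag, charger_connected, dock_sensor_triggered):
        # Return True if any of the three conditions of step S3011 holds.
        # Way 1: firmware information says the firmware is still in its factory/initialization mode.
        if read_firmware_flag() == "factory":
            return True
        # Way 2: the charging module reports state 1, i.e. an external power source is connected.
        if charger_connected():
            return True
        # Way 3: the sensing element reports that the stylus sits in its matching device
        # (pen slot / charging seat), i.e. the predetermined positional relationship is met.
        if dock_sensor_triggered():
            return True
        return False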
In the embodiment of the present application, if it is determined that the stylus pen is in the initialization state, step S3012 is performed; if the stylus pen is not in the initialization state, the process proceeds to step S3014.
Step S3012, a predetermined instruction is generated based on the detected posture change data of the posture sensor.
At this time, the stylus is in an initialized state, the attitude sensor of the stylus detects pose change data in real time, and when the pose change data is detected, a predetermined instruction is generated based on the pose change data, wherein the predetermined instruction is a state switching instruction, and the stylus can be switched from the initialized state to other states based on the predetermined instruction.
In step S3013, control switches the stylus from the initialization state to the first mode.
Here, the stylus may be switched from the initialization state to the first mode based on a predetermined instruction, so as to achieve the purpose of changing the state of the stylus by the predetermined instruction.
Step S3014, determine the current mode of the stylus, and generate a mode prompt message.
Here, the current mode of the stylus may be obtained through a state-reading instruction, and a mode prompt message may be generated based on the current mode. The mode prompt message may be a text-type message or a sound-type message.
Step S3015, send the mode prompting message to the electronic device operated by the stylus, so that the electronic device obtains the current mode of the stylus.
Here, the stylus transmits a mode prompting message to the electronic device through the established communication connection, so that the electronic device can know a current mode of the stylus, wherein the electronic device is in communication with the stylus, and meanwhile, the stylus can operate the electronic device through the established communication connection, for example, the stylus writes, clicks and the like on the electronic device; then, the electronic device can also output the mode prompt message, and when the mode prompt message is actually output, the mode prompt message of a character type can be output through a display screen of the electronic device, and the mode prompt message of a sound type can also be output through a buzzer of the electronic device.
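One possible shape for the mode prompt message sent to the manipulated electronic device is sketched below; the payload fields and the transport call are assumptions for illustration, not a defined protocol.

    import json

    def send_mode_prompt(transport, current_mode, as_sound=False):
        # Build a text-type or sound-type mode prompt and hand it to the link layer.
        prompt = {
            "msg": "mode_prompt",
            "mode": current_mode,                     # e.g. "first", "second", "third"
            "kind": "sound" if as_sound else "text",
        }
        transport.write(json.dumps(prompt).encode("utf-8"))   # e.g. over the Bluetooth link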
In this embodiment of the application, through the steps S3011 to S3015, when the stylus is determined to be in the initialization state, if the gesture sensor further detects pose change data, a predetermined instruction is generated based on the pose change data to control the stylus to switch from the initialization state to the first mode; and if the attitude sensor does not detect the pose change data, keeping the current initialization state. Then, the current mode of the touch pen can be obtained, and a mode prompt message is generated; and finally, sending the mode prompt message to the electronic equipment controlled by the touch pen, so that the electronic equipment obtains the current mode of the touch pen, the information intercommunication between the touch pen and the electronic equipment is enhanced, and the cooperation of the touch pen and the electronic equipment is improved. Therefore, mode switching can be achieved through the touch posture change, the mode switching process is simplified, and convenience of mode switching is improved. In addition, whether the touch pen is in an initialization state can be judged in multiple modes, and robustness of the touch pen is improved.
In some embodiments, as shown in fig. 5, after the step S301 "controlling the stylus to enter the first mode in response to the obtained predetermined instruction" is performed, the following steps may be further performed:
step S302' judges whether the attitude sensor recognizes the first gesture command.
Here, the motion trajectory recognized by the attitude sensor may be obtained first, and then whether the motion trajectory is the same as the first specific trajectory corresponding to the first gesture command is determined through comparison. If they are the same, this indicates that the attitude sensor has recognized the first gesture command, and the process proceeds to step S303'. If they are not the same, the attitude sensor has not recognized the first gesture command, and the process proceeds to step S304'.
In this embodiment of the application, the first gesture command may be a default gesture command, and the gesture instruction may also be a custom gesture command; for example, the first gesture command may be a command formed by a stylus pen clockwise scribing a first specific track, the first specific track may be a circle with a radius larger than a first radius threshold, and the first specific track may also be a square with a side length larger than a first side length threshold.
In step S303', the stylus is controlled to switch from the first mode to the second mode.
At this time, the motion track is the same as the first specific track corresponding to the first gesture command, and the gesture sensor recognizes the first gesture command, and then the stylus is controlled to switch from the first mode to the second mode based on the first gesture command. The second mode may be an air mouse mode, that is, in the second mode, the stylus may move or manipulate a cursor in the display screen of the electronic device through its own motion track.
And step S304', judging whether the motion trail meets the preset condition.
At this time, the motion track is not the same as the first specific track corresponding to the first gesture command, and if the first gesture command is not recognized by the representation attitude sensor, whether the motion track meets the predetermined condition is continuously judged.
Here, the predetermined condition may be a specific character track, and exemplarily, the predetermined condition may be a C track, a W track, an M track, or the like. In the embodiment of the present application, whether the motion trajectory meets the predetermined condition may be determined by a comparison method, and if the motion trajectory meets the predetermined condition, the step S305' is performed; and if the motion trajectory does not satisfy the predetermined condition, the process proceeds to step S307'.
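The comparisons in steps S302' and S304' can be realized by reducing a trajectory to coarse features and matching them against templates. The patent gives the clockwise circle with a radius threshold and the C/W/M character trajectories only as examples and does not prescribe a matching algorithm, so the test below, including its thresholds, is one simplified assumption; character trajectories could be matched with the same template-comparison idea.

    import math

    def is_clockwise_circle(points, min_radius=0.03):
        # Rough test for the first specific trajectory: a clockwise circle whose
        # radius exceeds a threshold (units depend on the sensor scale).
        if len(points) < 8:
            return False
        cx = sum(p[0] for p in points) / len(points)
        cy = sum(p[1] for p in points) / len(points)
        radii = [math.hypot(x - cx, y - cy) for x, y in points]
        mean_r = sum(radii) / len(radii)
        round_enough = max(abs(r - mean_r) for r in radii) < 0.3 * mean_r
        # Shoelace signed area: negative means clockwise in a y-up coordinate system.
        area = sum(points[i][0] * points[i + 1][1] - points[i + 1][0] * points[i][1]
                   for i in range(len(points) - 1))
        return round_enough and mean_r > min_radius and area < 0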
In step S305', the motion trajectory is encapsulated to obtain encapsulation trajectory information.
At the moment, if the motion trail meets the preset condition, the motion trail is cut and edited to realize the packet processing of the motion trail, and packet trail information is obtained.
Step S306', the packet track information is sent to the electronic device operated by the stylus pen, so that the electronic device determines and responds to the gesture command corresponding to the packet track information.
Here, the stylus pen transmits the packet track information to the manipulated electronic device, which may be a computer, a mobile phone, a television, etc., through the established communication connection, so that the electronic device determines and responds to the gesture command corresponding to the packet track information.
In actual implementation, after receiving the packet track information, the electronic device determines a gesture instruction corresponding to the packet track information, for example, a screen clearing instruction corresponding to the C track; and then responds to the gesture command. For example, when the electronic device determines that the gesture command corresponding to the packet information is a screen clearing command, the electronic device may clear the content displayed on the display screen of the electronic device based on the screen clearing command.
In some embodiments, when it is determined that the motion trajectory satisfies the predetermined condition, the stylus may further directly generate a corresponding gesture instruction based on the motion trajectory, for example, generate a screen clearing instruction based on the C trajectory; then, the gesture instruction is sent to the electronic device, so that the electronic device executes a corresponding operation based on the gesture instruction, for example, after the electronic device receives the screen clearing instruction, the electronic device executes an operation of clearing the display content of the display screen based on the screen clearing instruction.
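A sketch of the packaging step and of the host-side lookup that maps a packaged trajectory (or a gesture instruction generated on the pen) to an action. The C-trajectory-to-screen-clearing mapping follows the example in the text; the field names, the down-sampling, and the W/M actions are assumptions.

    def package_trajectory(points, label):
        # Step S305': trim/edit the raw trajectory and wrap it with metadata.
        trimmed = points[::4]                      # crude down-sampling as the "cut and edit" step
        return {"msg": "trajectory", "label": label, "points": trimmed}

    # Host side: determine and respond to the gesture instruction (step S306').
    GESTURE_ACTIONS = {
        "C": "clear_screen",      # C trajectory -> screen clearing instruction (from the text)
        "W": "wake_whiteboard",   # assumed mapping
        "M": "mute",              # assumed mapping
    }

    def handle_packaged_trajectory(packet, execute):
        action = GESTURE_ACTIONS.get(packet["label"])
        if action is not None:
            execute(action)       # e.g. clear the content shown on the display screen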
In this application embodiment, no matter determine the gesture instruction through electronic equipment or determine the gesture instruction through the touch-control pen, the touch-control pen all can realize the control to electronic equipment through the movement track, promotes the convenience of control, improves control efficiency.
Step S307' determines whether the gesture sensor recognizes the second gesture command.
At this time, the attitude sensor has not recognized the first gesture command and the motion trajectory does not meet the predetermined condition, so it is further judged by comparison whether the motion trajectory is the second specific trajectory corresponding to the second gesture command. If it is, this indicates that the attitude sensor has recognized the second gesture command, and the process proceeds to step S308'; if it is not, the attitude sensor has not recognized the second gesture command, and the process returns to step S302'.
In this embodiment of the application, the second gesture command may be a default gesture command, and the gesture instruction may also be a custom gesture command; for example, the second gesture command may be a command formed by the stylus drawing a second specific track counterclockwise, the second specific track may be a circle with a radius larger than a second radius threshold, and the second specific track may also be a square with a side length larger than a second side length threshold.
In step S308', the stylus is controlled to switch from the first mode to the third mode.
At this time, the motion trajectory is the second specific trajectory corresponding to the second gesture command, which indicates that the attitude sensor has recognized the second gesture command, so the stylus is controlled to switch from the first mode to the third mode based on the second gesture command. The third mode may be a writing mode of the stylus, in which the stylus can recognize its own gravity-center change information.
Step S309', in response to the motion trajectory generated by the stylus touch on the electronic device manipulated by the stylus, generating a corresponding motion trajectory on the display screen of the electronic device.
Here, since the stylus pen can recognize the gravity center change information of itself in the third mode without recognizing the gesture command, the power consumption of the third mode is less than that of the second mode.
In this application embodiment, the stylus can generate a corresponding movement track by acting on the display screen of the electronic device, and also can send the movement track to the electronic device, so that the movement track is displayed through the display screen of the electronic device, and the writing function of the stylus is realized.
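An illustrative step for the third (writing) mode: only gravity-center (acceleration) change information is derived from the attitude sensor, which is consistent with this mode consuming less power than the air-mouse mode, and pen-contact points are simply forwarded for display. All names and thresholds here are assumptions.

    def writing_mode_step(accel_sample, prev_accel, pen_contact_points, send_to_host, wobble=0.02):
        # One iteration of the third mode: no gesture recognition runs here,
        # only a cheap centre-of-gravity change check.
        moved = any(abs(a - b) > wobble for a, b in zip(accel_sample, prev_accel))
        if pen_contact_points:
            # Forward the written trajectory so it can be rendered on the
            # display screen of the manipulated electronic device.
            send_to_host({"msg": "ink", "points": pen_contact_points})
        return moved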
In the embodiment of the present application, through the steps S302 'to S309', the motion trajectory of the stylus is first obtained through the attitude sensor, and then whether to perform mode switching is determined based on the motion trajectory, and if the mode switching is performed, which mode the stylus is switched from the initialization state to is also determined; in addition, after the mode switching is completed, the touch pen realizes the function corresponding to the switched mode, so that convenient and efficient mode switching is realized, misoperation can be avoided, and the switching accuracy is improved.
In some embodiments, to achieve flexibility of mode switching and energy saving, as shown in fig. 6A and 6B, after "determining the current mode of the stylus pen" in step S3014, the following steps may be further performed:
step S3015', it is determined whether the current mode is one of the first mode, the second mode, and the third mode.
Here, it is determined whether the current mode is one of the first mode, the second mode and the third mode by the comparison method, referring to fig. 6A, if the current mode is one of the first mode, the second mode and the third mode, step S3016A' is entered. If the current mode is not one of the first mode, the second mode, and the third mode, the process returns to step S3014.
Step S3016A', obtain the status information of the charging module in the stylus pen.
At this time, the current mode is one of the first mode, the second mode, and the third mode, so the state information of the charging module is obtained. The implementation of step S3016A' is similar to that of the second way described in step S3011, and reference may therefore be made to that description.
Step S3017A', determine whether the status information indicates that the stylus is in the charging status.
Here, the implementation of step S3017A' is similar to that of the second way described in step S3011, and reference may therefore be made to that description.
In this embodiment of the application, if the state information indicates that the stylus is in the charging state, the process proceeds to step S3018A'; if the state information indicates that the stylus is not in the charging state, the process returns to step S3016A'.
Step S3018A', the current mode is switched to the initialization state.
At this time, the state information represents that the stylus is in the charging state, and the current mode of the stylus is switched to the initialization state.
In this embodiment of the application, through the above steps S3015' to S3018A', when the current mode is one of the first mode, the second mode, and the third mode, the state information of the charging module is continuously obtained, and if the state information indicates that the stylus is in the charging state, the stylus is switched to the initialization state. Initialization of the stylus is thus achieved through a simple operation, in preparation for subsequent work.
In other embodiments, referring to fig. 6B, when step S3015' (determining whether the current mode is one of the first mode, the second mode, and the third mode) is implemented, if the current mode is one of the first mode, the second mode, and the third mode, the process may instead proceed to step S3016B'. If the current mode is not one of the first mode, the second mode, and the third mode, the process returns to step S3014.
Step S3016B', obtaining the standing time of the stylus by using the attitude sensor.
The attitude sensor also has a timing function: if it detects that the stylus is stationary, timing starts and continues until the stylus moves or shakes, and the timing result is the standing duration, i.e., how long the stylus has remained still.
Step S3017B', it is determined whether the standing time period is greater than the time period threshold value.
Here, the duration threshold may be a default value or a custom value, and may be, for example, 10 seconds, 11 seconds, 12 seconds, or the like.
In this embodiment of the application, the size relationship between the standing time and the time threshold may be determined by a size comparison method, and if the standing time is greater than the time threshold, it is indicated that the stylus does not move or shake for a long time, that is, the stylus does not operate the electronic device for a long time, the process proceeds to step S3018B'; and if the standing time length is less than or equal to the time length threshold value, which indicates that the stylus normally operates the electronic device, returning to step S3016B'.
In step S3018B', the mode of the stylus is switched to the sleep mode.
At this time, the standing time is longer than the time threshold, which indicates that the stylus does not move or shake for a long time, that is, the stylus does not operate the electronic device for a long time, and in order to save energy consumption, the mode of the stylus is switched to the sleep mode with lower power consumption.
In the embodiment of the application, the attitude sensor can identify the gravity center change information of the attitude sensor in the sleep mode. That is, whether the stylus is moved or shaken can be detected through the gravity center change information by the posture sensor.
Step S3019B' determines whether the attitude sensor recognizes the center of gravity change information.
Here, the information obtained by the sensor may be analyzed to obtain an analysis result. If the analysis result includes the gravity center change information, it is determined that the attitude sensor recognizes the gravity center change information, and the process proceeds to step S3020B'; if the center of gravity change information is not included in the analysis result, it is determined that the attitude sensor does not recognize the center of gravity change information, and the process proceeds to step S3021B'.
Step S3020B' returns the stylus from the sleep mode to the current mode.
At this time, the attitude sensor recognizes the gravity center change information, and represents that the stylus is moving or jittering, and then returns the stylus from the sleep mode to the current mode. The current mode is the current mode in step S3015', that is, the current mode is the mode of the stylus before switching to the sleep mode.
Step S3021B' maintains the sleep mode of the stylus.
At this time, the gravity center change information is not recognized by the attitude sensor, and the stylus is still kept still, so that the stylus is kept in the sleep mode.
In this embodiment, through the steps S3016B 'to S3021B', when the current mode of the stylus is one of the first mode, the second mode, and the third mode, the standing duration of the stylus is obtained, and if the standing duration is greater than a threshold value, the stylus is switched to the sleep mode; then, if the gravity center change information of the stylus is detected in the sleep mode, the stylus is returned to the current mode from the sleep mode, and the corresponding operation is executed again. Therefore, the power consumption of the touch control pen can be reduced on the premise of not influencing the normal work of the touch control pen, the energy is saved, and the service life of the touch control pen is prolonged.
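The idle-timeout and wake-up behaviour of steps S3016B' to S3021B' can be sketched as below, continuing the StylusMode/StylusController sketch given earlier. The 10-second value is only the example threshold mentioned in the text, and the "moving" input is an assumed verdict derived from the attitude sensor's gravity-center change information.

    import time

    class SleepManager:
        def __init__(self, idle_threshold_s=10.0):
            self.idle_threshold_s = idle_threshold_s
            self.still_since = None
            self.saved_mode = None

        def update(self, controller, moving):
            # Call periodically with the attitude sensor's stillness/motion verdict.
            now = time.monotonic()
            if controller.mode is StylusMode.SLEEP:
                if moving:                              # gravity-center change recognized
                    controller.mode = self.saved_mode   # return to the previous current mode
                    self.still_since = None
                return
            if moving:
                self.still_since = None
                return
            if self.still_since is None:
                self.still_since = now
            elif now - self.still_since > self.idle_threshold_s:
                self.saved_mode = controller.mode       # remember the current mode
                controller.mode = StylusMode.SLEEP      # switch to the low-power sleep mode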
Based on the foregoing embodiments, the present application further provides a control method applied to a stylus provided with a 6-axis sensor, where the 6-axis sensor is equivalent to the attitude sensor in the foregoing embodiments. Fig. 7 is a block diagram of the function implementation and mode switching of the stylus provided in this embodiment. As shown in fig. 7, the 6-axis sensor is combined with magnetic wireless charging technology: while the sleep/wake-up function and the air-mouse function of a gravity sensor are implemented, the gesture-recognition capability of the 6-axis sensor is fully utilized to replace physical keys and touch-film keys, thereby realizing a low-cost, integrated, highly water-resistant active stylus with rich gesture "key functions".
In the embodiment of the application, when the stylus is magnetically attached to the corresponding charging position, the 6-axis sensor is initialized and enters a posture mode, which corresponds to the initialization state in the foregoing embodiments. Various one-stroke gesture definitions can be set, and packaged data of the corresponding trajectories are uploaded to a host system through Bluetooth, where the host system corresponds to the electronic device in the foregoing embodiments; the host system then invokes or implements the corresponding shortcut function. Meanwhile, set specific gestures switch the stylus into a low-power gravity mode or an air-mouse mode, so as to realize a basic sleep/wake-up function and an air-mouse function, where the gravity mode corresponds to the third mode and the air-mouse mode corresponds to the second mode in the foregoing embodiments.
Based on the control method provided by the embodiment of the application, the function of gesture recognition of the 6-axis sensor is fully utilized, the mode switching of the touch pen is quickly and simply realized, meanwhile, the working efficiency of the touch pen is improved, and the manufacturing cost is also reduced.
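Pulling the pieces together, the switching diagram of fig. 7 amounts to a small transition table; this is an illustrative summary continuing the StylusMode sketch above, with event names chosen from the examples in this description. The wake-from-sleep return to the previous mode relies on the saved mode (as in the SleepManager sketch) and is not captured by a flat table.

    # (event, required current mode) -> next mode; None means "any mode".
    TRANSITIONS = {
        ("magnetic_charge", None):                       StylusMode.INIT,       # posture / initialization
        ("pose_change", StylusMode.INIT):                StylusMode.GESTURE,    # predetermined instruction
        ("clockwise_circle", StylusMode.GESTURE):        StylusMode.AIR_MOUSE,  # first gesture command
        ("counterclockwise_circle", StylusMode.GESTURE): StylusMode.WRITING,    # second gesture command
        ("idle_timeout", None):                          StylusMode.SLEEP,
    }

    def next_mode(event, current):
        for (ev, mode), target in TRANSITIONS.items():
            if ev == event and (mode is None or mode is current):
                return target
        return current      # unrecognized events leave the mode unchanged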
Based on the foregoing embodiments, the present application provides a control apparatus, where each module included in the apparatus and each unit included in each module may be implemented by a processor in a computer device; of course, the implementation can also be realized through a specific logic circuit; in the implementation process, the processor may be a CPU, a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 8 is a schematic structural diagram of a control device provided in an embodiment of the present application, and as shown in fig. 8, the control device 800 includes:
a response module 801, configured to control the stylus to enter a first mode in response to an acquired predetermined instruction; in the first mode, the touch pen can identify the motion track of the touch pen as a gesture command based on the gesture sensor;
a control module 802 configured to control the stylus to switch from the first mode to a second mode when the gesture sensor recognizes a first gesture command in the first mode; in the second mode, the stylus can recognize a motion trajectory thereof as a position coordinate based on the posture sensor.
In some embodiments, the response module 801 includes:
and the control sub-module is used for generating the preset instruction based on the detected pose change data of the attitude sensor when the stylus is in an initialization state, and controlling the stylus to be switched to the first mode from the initialization state.
In some embodiments, the control device 800 further comprises:
the first acquisition module is used for acquiring firmware information of the stylus, and if the firmware information represents that the firmware of the stylus is in an initialization mode, determining that the stylus is in the initialization state; or,
the second obtaining module is used for obtaining the state information of the charging module in the stylus, and if the state information represents that the stylus is in the charging state, the stylus is determined to be in the initialization state; or,
and the third acquisition module is used for acquiring induction information of an induction control in the stylus, and determining that the stylus is in the initialization state if the position relation between the stylus and a matching device represented by the induction information meets a preset relation.
In some embodiments, the control device 800 further comprises:
the first generation module is used for determining the current mode of the touch pen and generating a mode prompt message;
the first sending module is used for sending the mode prompt message to the electronic equipment controlled by the touch pen, so that the electronic equipment obtains the current mode of the touch pen.
In some embodiments, the control device 800 further comprises:
the packaging module is used for determining that the motion trail meets a preset condition in the first mode and packaging the motion trail to obtain packaging trail information;
and the second sending module is used for sending the packet track information to the electronic equipment controlled by the stylus so that the electronic equipment determines and responds to the gesture instruction corresponding to the packet track information.
In some embodiments, the control device 800 further comprises:
the second generation module is used for determining that the motion trail meets a preset condition and generating a corresponding gesture instruction based on the motion trail in the first mode;
and the third sending module is used for sending the gesture instruction to the electronic equipment controlled by the stylus so that the electronic equipment executes corresponding operation based on the gesture instruction.
In some embodiments, the control module 802 is further configured to control the stylus to switch from the first mode to a third mode when the gesture sensor recognizes a second gesture command in the first mode;
the response module 801 is further configured to, in the third mode, respond to a motion trajectory generated by the stylus touch acting on the electronic device operated by the stylus to generate a corresponding motion trajectory on a display screen of the electronic device, where power consumption in the third mode is smaller than power consumption in the second mode.
In some embodiments, the control device 800 further comprises:
the fourth acquisition module is used for acquiring the current mode of the stylus and the state information of the charging module in the stylus;
the first switching module is configured to determine that the current mode is one of the first mode, the second mode, and the third mode, and the state information indicates that the stylus is in a charging state, and switch the current mode to an initialization state.
In some embodiments, the control device 800 further comprises:
a fifth obtaining module, configured to determine that the current mode is one of the first mode, the second mode, and the third mode, and obtain a standing duration of the stylus by using the attitude sensor;
the second switching module is used for determining that the standing time is greater than a time threshold value, and switching the mode of the stylus to a sleep mode, wherein the attitude sensor can identify gravity center change information of the attitude sensor in the sleep mode;
a return module for returning the stylus from the sleep mode to the current mode in response to the center of gravity change information identified with the attitude sensor.
It should be noted that the description of the control device in the embodiment of the present application is similar to the description of the embodiment of the method described above, and has similar beneficial effects to the embodiment of the method. For technical details not disclosed in the embodiments of the apparatus, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be noted that, in the embodiment of the present application, if the control method is implemented in the form of a software functional module and is sold or used as a standalone product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented, or the portions contributing to the related art may be embodied, in the form of a software product stored in a storage medium and including several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
Accordingly, embodiments of the present application provide a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the control method provided in the above embodiments.
An embodiment of the present application provides a stylus, where the stylus includes an attitude sensor, and fig. 9 is a schematic structural diagram of the stylus provided in the embodiment of the present application, and as shown in fig. 9, the stylus 900 further includes: a controller 901, at least one communication bus 902, a user interface 903, at least one external communication interface 904 and a memory 905. Wherein the communication bus 902 is configured to enable connective communication between these components. Where the user interface 903 comprises a display screen, the external communication interface 904 may comprise a standard wired interface and a wireless interface. Wherein the controller 901 is configured to execute a program of a control method stored in the memory to implement the control method provided in the above-described embodiments.
The above description of the stylus and storage medium embodiments is similar to the description of the method embodiments above, with similar beneficial effects as the method embodiments. For technical details not disclosed in the embodiments of the stylus and the storage medium of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is merely a logical functional division, and there may be other ways of dividing in actual implementation, for example: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between the apparatuses or units may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may all be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by program instructions directing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The aforementioned storage medium includes a removable storage device, a ROM, a magnetic disk, an optical disk, or various other media capable of storing program code.
Alternatively, if the above integrated units of the present application are implemented in the form of software functional modules and sold or used as independent products, they may also be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application may essentially be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes a removable storage device, a ROM, a magnetic disk, an optical disk, or various other media capable of storing program code.
The above description covers only the embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A control method applied to a stylus comprising an attitude sensor, the method comprising:
controlling the stylus to enter a first mode in response to an acquired predetermined instruction;
wherein, in the first mode, the stylus is capable of recognizing its own motion trajectory as a gesture command based on the attitude sensor; and
controlling the stylus to switch from the first mode to a second mode when the attitude sensor recognizes a first gesture command in the first mode;
wherein, in the second mode, the stylus is capable of recognizing its own motion trajectory as position coordinates based on the attitude sensor.
2. The method according to claim 1, wherein controlling the stylus to enter the first mode in response to the acquired predetermined instruction comprises:
when the stylus is in an initialization state, generating the predetermined instruction based on pose change data detected by the attitude sensor, and controlling the stylus to switch from the initialization state to the first mode.
3. The method according to claim 2, further comprising:
acquiring firmware information of the stylus, and determining that the stylus is in the initialization state if the firmware information indicates that firmware of the stylus is in an initialization mode; or
acquiring state information of a charging module in the stylus, and determining that the stylus is in the initialization state if the state information indicates that the stylus is in a charging state; or
acquiring sensing information of a sensing element in the stylus, and determining that the stylus is in the initialization state if the sensing information indicates that a positional relationship between the stylus and a matching device satisfies a preset relationship.
4. The method according to claim 1, further comprising:
determining a current mode of the stylus and generating a mode prompt message; and
sending the mode prompt message to an electronic device controlled by the stylus, so that the electronic device obtains the current mode of the stylus.
5. The method according to claim 1, further comprising:
in the first mode, determining that the motion trajectory satisfies a preset condition, and packaging the motion trajectory to obtain packaged trajectory information; and
sending the packaged trajectory information to an electronic device controlled by the stylus, so that the electronic device determines and responds to a gesture command corresponding to the packaged trajectory information.
6. The method according to claim 1, further comprising:
in the first mode, determining that the motion trajectory satisfies a preset condition, and generating a corresponding gesture command based on the motion trajectory; and
sending the gesture command to an electronic device controlled by the stylus, so that the electronic device performs a corresponding operation based on the gesture command.
7. The method according to claim 1, further comprising:
controlling the stylus to switch from the first mode to a third mode when the attitude sensor recognizes a second gesture command in the first mode;
wherein, in the third mode, a corresponding motion trajectory is generated on a display screen of an electronic device controlled by the stylus in response to a motion trajectory produced by the stylus touching the electronic device, and power consumption in the third mode is lower than power consumption in the second mode.
8. The method according to claim 7, further comprising:
acquiring a current mode of the stylus and state information of a charging module in the stylus; and
switching the current mode to an initialization state when it is determined that the current mode is one of the first mode, the second mode, and the third mode and the state information indicates that the stylus is in a charging state.
9. The method according to claim 7, further comprising:
when it is determined that the current mode is one of the first mode, the second mode, and the third mode, obtaining a standing duration of the stylus by using the attitude sensor;
switching the mode of the stylus to a sleep mode when it is determined that the standing duration is greater than a duration threshold, wherein, in the sleep mode, the attitude sensor is capable of recognizing change information of its own center of gravity; and
returning the stylus from the sleep mode to the current mode in response to the center-of-gravity change information recognized by the attitude sensor.
10. A stylus, comprising:
an attitude sensor configured to recognize a motion trajectory as a gesture command or as position coordinates; and
a controller configured to control the stylus to enter a first mode in response to an acquired predetermined instruction, and to control the stylus to switch from the first mode to a second mode when the attitude sensor recognizes a first gesture command in the first mode;
wherein, in the first mode, the stylus is capable of recognizing its own motion trajectory as a gesture command based on the attitude sensor; and
in the second mode, the stylus is capable of recognizing its own motion trajectory as position coordinates based on the attitude sensor.
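For illustration only, the following C sketch outlines the mode state machine described by the claims above: the stylus moves from an initialization state into a first mode on a predetermined instruction, from the first mode into a second or third mode on a first or second gesture command, back to the initialization state when charging is detected, and into and out of a sleep mode based on the standing duration and center-of-gravity changes. The type names, event names, and transition guards are assumptions made for this sketch and are not taken from the application.

/* Illustrative sketch only (not the claimed implementation): a possible
 * mode state machine for the stylus of claims 1-10. Event names and
 * transition guards are hypothetical. */
#include <stdbool.h>
#include <stdint.h>

typedef enum {
    MODE_INIT,    /* initialization state                                  */
    MODE_FIRST,   /* motion trajectory recognized as gesture commands      */
    MODE_SECOND,  /* motion trajectory recognized as position coordinates  */
    MODE_THIRD,   /* on-screen touch mode, lower power than MODE_SECOND    */
    MODE_SLEEP    /* only center-of-gravity changes are monitored          */
} stylus_mode_t;

typedef enum {
    EV_PREDETERMINED_INSTRUCTION, /* e.g. pose-change data while initializing (claim 2) */
    EV_FIRST_GESTURE_COMMAND,     /* claim 1: first mode -> second mode                 */
    EV_SECOND_GESTURE_COMMAND,    /* claim 7: first mode -> third mode                  */
    EV_CHARGING_DETECTED,         /* claim 8: any working mode -> initialization        */
    EV_STANDING_TIMEOUT,          /* claim 9: standing duration > threshold -> sleep    */
    EV_GRAVITY_CENTER_CHANGED     /* claim 9: sleep -> previous working mode            */
} stylus_event_t;

typedef struct {
    stylus_mode_t mode;
    stylus_mode_t mode_before_sleep; /* restored on wake-up */
} stylus_state_t;

static void stylus_state_init(stylus_state_t *s)
{
    s->mode = MODE_INIT;
    s->mode_before_sleep = MODE_FIRST;
}

static bool is_working_mode(stylus_mode_t m)
{
    return m == MODE_FIRST || m == MODE_SECOND || m == MODE_THIRD;
}

static void stylus_handle_event(stylus_state_t *s, stylus_event_t ev)
{
    switch (ev) {
    case EV_PREDETERMINED_INSTRUCTION:
        if (s->mode == MODE_INIT)
            s->mode = MODE_FIRST;                  /* claims 1 and 2 */
        break;
    case EV_FIRST_GESTURE_COMMAND:
        if (s->mode == MODE_FIRST)
            s->mode = MODE_SECOND;                 /* claim 1 */
        break;
    case EV_SECOND_GESTURE_COMMAND:
        if (s->mode == MODE_FIRST)
            s->mode = MODE_THIRD;                  /* claim 7 */
        break;
    case EV_CHARGING_DETECTED:
        if (is_working_mode(s->mode))
            s->mode = MODE_INIT;                   /* claim 8 */
        break;
    case EV_STANDING_TIMEOUT:
        if (is_working_mode(s->mode)) {
            s->mode_before_sleep = s->mode;
            s->mode = MODE_SLEEP;                  /* claim 9 */
        }
        break;
    case EV_GRAVITY_CENTER_CHANGED:
        if (s->mode == MODE_SLEEP)
            s->mode = s->mode_before_sleep;        /* claim 9: wake-up */
        break;
    }
}

Under these assumptions, a firmware main loop would translate attitude-sensor readings into events, feed them to stylus_handle_event, and gate the sensor processing (gesture recognition, coordinate reporting, or center-of-gravity monitoring) on the resulting mode.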
CN202111639336.9A 2021-12-29 2021-12-29 Control method and device, touch control pen and computer readable storage medium Pending CN114415850A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111639336.9A CN114415850A (en) 2021-12-29 2021-12-29 Control method and device, touch control pen and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111639336.9A CN114415850A (en) 2021-12-29 2021-12-29 Control method and device, touch control pen and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114415850A true CN114415850A (en) 2022-04-29

Family

ID=81270178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111639336.9A Pending CN114415850A (en) 2021-12-29 2021-12-29 Control method and device, touch control pen and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114415850A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023082768A1 (en) * 2021-11-11 2023-05-19 荣耀终端有限公司 Functional mode switching method, electronic device, and system
CN117032485A (en) * 2022-08-09 2023-11-10 荣耀终端有限公司 Touch pen-based use method and device
CN116736991A (en) * 2022-09-23 2023-09-12 荣耀终端有限公司 Control method of touch pen, touch pen and storage medium
CN116736991B (en) * 2022-09-23 2024-05-14 荣耀终端有限公司 Control method of touch pen, touch pen and storage medium

Similar Documents

Publication Publication Date Title
CN114415850A (en) Control method and device, touch control pen and computer readable storage medium
AU2018282404B2 (en) Touch-sensitive button
CN103870028B (en) The user terminal and method of interaction are provided using pen
KR102160767B1 (en) Mobile terminal and method for detecting a gesture to control functions
US20150058651A1 (en) Method and apparatus for saving battery of portable terminal
CN105760019B (en) Touch operation method and system based on interactive electronic whiteboard
CN104115099A (en) Engagement-dependent gesture recognition
CN107407980A (en) Stylus with the multiple operation parts for being configured to transmit synchronized signal
KR101815720B1 (en) Method and apparatus for controlling for vibration
KR20150003626A (en) Method for controlling digitizer mode
TWM361061U (en) Touch type mouse wake-up device
US11328469B2 (en) Electronic device and method for providing drawing environment
KR20150005020A (en) coordinate measuring apparaturs which measures input position of coordinate indicating apparatus and method for controlling thereof
CN108769299B (en) Screen control method and device and mobile terminal
CN102411439A (en) Stylus modes
CN109844702B (en) Control method for electronic equipment and input equipment
WO2011163601A1 (en) Activation objects for interactive systems
US20140340336A1 (en) Portable terminal and method for controlling touch screen and system thereof
CN112164608A (en) Electronic device, control method thereof and control device
KR20150007862A (en) Mobile terminal and method for controlling data providing
CN115291786A (en) False touch judgment method and device based on machine learning and storage medium
CN106020673B (en) Control method and electronic equipment
CN104536556B (en) Information processing method and electronic equipment
CN112181181A (en) Stylus, control method and control device of stylus and storage medium
CN108600823B (en) Video data processing method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination