CN106325702B - Method and device for recognizing mouse gestures - Google Patents


Info

Publication number
CN106325702B
CN106325702B (application CN201510397490.8A)
Authority
CN
China
Prior art keywords
mouse
cursor
identification
moving
moving direction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510397490.8A
Other languages
Chinese (zh)
Other versions
CN106325702A (en)
Inventor
徐华荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201510397490.8A
Publication of CN106325702A
Application granted
Publication of CN106325702B

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method for recognizing mouse gestures, comprising the following steps: receiving a mouse gesture trigger instruction; collecting the cursor position according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, and taking the movement-trend direction at a sampling point as one movement direction of the cursor; converting each movement direction into a corresponding identification code and generating an identification sequence from all the identification codes in the movement period; and searching a database for the operation corresponding to the identification sequence and executing it. The invention also discloses a device for recognizing mouse gestures. By converting a mouse gesture into a simple identification sequence, the terminal only needs to recognize the movement directions of the cursor corresponding to the mouse to obtain the sequence for the gesture, which effectively improves the accuracy of mouse gesture recognition; and because the association between a mouse gesture and its operation function is more intuitive, the user need not memorize a complicated mapping relation, making operation more convenient.

Description

Method and device for recognizing mouse gestures
Technical Field
The invention relates to the technical field of internet, in particular to a method and a device for recognizing mouse gestures.
Background
When a user interacts with a terminal, the most commonly used input devices are the keyboard and the mouse. In particular, within applications most operations are performed by clicking menus and controls on the application interface with the mouse. To make mouse operation more convenient and versatile, mouse gestures have been added to terminal applications to extend the mouse's operating functions: the user holds down the right mouse button and draws a specific trajectory on the terminal screen to invoke a corresponding function. However, because current terminals do not recognize mouse trajectories very accurately, existing mouse gestures are generally simple, such as left, right, up, down, left-then-down, or up-then-left. Although such gestures are simple, their association with the operating functions is not intuitive, so the user must memorize which function each gesture maps to in order to operate correctly. This burdens the user and limits the use of mouse gestures.
Disclosure of Invention
The embodiments of the invention provide a method and a device for recognizing mouse gestures that can recognize complex mouse gestures, make the association between a mouse gesture and its operation function more intuitive, and facilitate user operation.
The embodiment of the invention provides a method for recognizing mouse gestures, which comprises the following steps:
receiving a mouse gesture trigger instruction, collecting the cursor position according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, and taking the movement-trend direction at a sampling point as one movement direction of the cursor;
converting the moving direction into a corresponding identification code, and generating an identification sequence from all the identification codes in the moving period;
and searching the operation corresponding to the identification sequence in a database, and executing the operation.
The embodiment of the present invention further provides a device for mouse gesture recognition, including:
the command receiving module is used for receiving a mouse gesture triggering command;
the tracking module is used for collecting the cursor position according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, taking the movement-trend direction at a sampling point as one movement direction of the cursor;
the identification module is used for converting the moving direction into a corresponding identification code and generating an identification sequence from all the identification codes in the moving period;
and the operation module is used for searching the operation corresponding to the identification sequence in a database and executing the operation.
According to the embodiments of the invention, a mouse gesture is converted into a simple identification sequence; the terminal only needs to recognize the movement directions of the cursor corresponding to the mouse to obtain the identification sequence for the gesture, and it can do so no matter how complex the gesture is, which effectively improves the accuracy of mouse gesture recognition. Meanwhile, the association between a mouse gesture and its operation function is more intuitive, so the user need not memorize a complicated mapping relation and operation is more convenient.
Drawings
FIG. 1 is a bus diagram of a terminal where a mouse gesture recognition device is located in an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for mouse gesture recognition according to a first embodiment of the present invention;
FIG. 3 is a flowchart illustrating a method for mouse gesture recognition according to a second embodiment of the present invention;
FIG. 4 is a flowchart illustrating a method for mouse gesture recognition according to a third embodiment of the present invention;
FIG. 5 is a flowchart illustrating a fourth exemplary embodiment of a mouse gesture recognition method according to the present invention;
FIG. 6 is a flowchart illustrating a fifth exemplary embodiment of a mouse gesture recognition method according to the present invention;
FIG. 7 is a flowchart illustrating a sixth exemplary embodiment of a mouse gesture recognition method according to the present invention;
FIG. 8 is a block diagram illustrating a mouse gesture recognition apparatus according to a first embodiment of the present invention;
FIG. 9 is a block diagram illustrating a mouse gesture recognition apparatus according to a second embodiment of the present invention;
FIG. 10a is a diagram illustrating a first embodiment of a mouse gesture movement trajectory according to an embodiment of the present invention;
FIG. 10b is a diagram illustrating a movement trajectory of a mouse gesture according to a second embodiment of the present invention;
FIG. 10c is a diagram illustrating a movement trajectory of a mouse gesture according to a third embodiment of the present invention;
FIG. 10d is a diagram illustrating a fourth embodiment of a mouse gesture movement trajectory according to the embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a bus diagram of a terminal where a mouse gesture recognition device is located in an embodiment of the present invention. The terminal may include: at least one processor 101, e.g., a CPU, at least one network interface 104, a user interface 103, a memory 105, at least one communication bus 102. Wherein the communication bus 102 is used for enabling connection communication between these components. The user interface 103 may include a Display (Display), a Keyboard (Keyboard), a standard wired interface, and a standard wireless interface. The network interface 104 may include a standard wired interface, a wireless interface (e.g., a WIFI interface). The memory 105 may be a high-speed RAM memory or a non-volatile memory (e.g., at least one disk memory). The memory 105 may also be at least one storage device located remotely from the aforementioned processor 101. The memory 105, which is a type of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a mouse gesture recognition program.
In the terminal where the mouse gesture recognition apparatus shown in fig. 1 is located, the network interface 104 is mainly used for connecting a server or other terminals and performing data communication with the server or other terminals; the user interface 103 is mainly used for receiving user instructions and interacting with users; and the processor 101 may be configured to invoke a mouse gesture recognition program stored in the memory 105 and perform the following operations:
receiving a mouse gesture trigger instruction through the user interface 103;
collecting the cursor position according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, and taking the movement-trend direction at a sampling point as one movement direction of the cursor;
converting the moving direction into a corresponding identification code, and generating an identification sequence from all the identification codes in the moving period;
and searching the operation corresponding to the identification sequence in a database, and executing the operation.
In one embodiment, the processor 101 invoking the mouse gesture recognition program stored in the memory 105 may further perform the following operations:
sequentially searching identification codes corresponding to the moving directions each time from the database;
judging whether the identification codes corresponding to the two adjacent moving directions are the same or not;
if yes, keeping any one of the two same identification codes;
if not, the identification codes corresponding to the two adjacent moving directions are reserved;
and sequentially arranging the reserved identification codes according to the sequence of the corresponding moving direction to generate the identification sequence.
In one embodiment, the processor 101 invoking the mouse gesture recognition program stored in the memory 105 may further perform the following operations:
acquiring the cursor position according to the set sampling time or sampling distance in a moving period to obtain a plurality of sampling points;
and taking the vector direction between two adjacent sampling points as one movement direction of the cursor.
In one embodiment, the processor 101 invoking the mouse gesture recognition program stored in the memory 105 may further perform the following operations:
respectively taking two adjacent sampling points as a moving front sampling point and a moving rear sampling point of one movement;
acquiring coordinates (x1, y1) of a sampling point of the cursor before the current movement and coordinates (x2, y2) of the sampling point after the current movement;
judging whether the sliding distance |x2-x1| of this movement of the cursor on the X axis is greater than or equal to the sliding distance |y2-y1| on the Y axis;
if yes, when x2-x1>0, determining that the current movement direction of the cursor is rightward; when x2-x1<0, determining that the current movement direction of the cursor is leftward;
if not, when y2-y1>0, determining that the current movement direction of the cursor is upward; when y2-y1<0, the current direction of movement of the cursor is determined to be downward.
In one embodiment, the processor 101 invoking the mouse gesture recognition program stored in the memory 105 may further perform the following operations:
when a mouse gesture trigger key is pressed down, acquiring coordinates (x01, y01) of a sampling point of the cursor before the cursor moves for the first time and coordinates (x02, y02) of the sampling point after the cursor moves for the first time;
when the sliding distance |x02-x01| on the X axis or |y02-y01| on the Y axis of the cursor's first movement is greater than or equal to a sliding threshold, generating a mouse gesture trigger instruction and counting the first movement into the movement period.
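The trigger condition in the two steps above can be sketched as follows; the default `slide_threshold` value is illustrative, not one specified by the patent:

```python
def gesture_triggered(x01, y01, x02, y02, slide_threshold=10):
    """Return True when the first movement after the trigger key is pressed
    exceeds the sliding threshold on either axis, i.e., when a mouse-gesture
    trigger instruction should be generated."""
    return abs(x02 - x01) >= slide_threshold or abs(y02 - y01) >= slide_threshold
```

When the condition holds, the first movement is also counted into the movement period rather than discarded.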
In one embodiment, the processor 101 invoking the mouse gesture recognition program stored in the memory 105 may further perform the following operations:
receiving a mouse gesture customization instruction through the user interface 103;
collecting the cursor position according to the movement trend of the cursor corresponding to the mouse within a user-defined movement period to obtain at least one sampling point, and taking the movement-trend direction at a sampling point as one movement direction of the cursor;
converting the moving direction into a corresponding identification code, and generating an identification sequence from all the identification codes in the user-defined moving period;
receiving an operation option entered via the operation option menu;
and associating the identification sequence with the operation corresponding to the operation option and storing the identification sequence in a database.
In the device for mouse gesture recognition and its terminal described with reference to fig. 1, a mouse gesture is converted into a simple identification sequence; the terminal only needs to recognize the movement directions of the cursor corresponding to the mouse to obtain the identification sequence for the gesture, no matter how complex the gesture is, which effectively improves the accuracy of mouse gesture recognition. Meanwhile, the association between a mouse gesture and its operation function is more intuitive, so the user need not memorize a complicated mapping relation and operation is more convenient.
Fig. 2 is a flowchart illustrating a method for mouse gesture recognition according to a first embodiment of the present invention. The method for recognizing mouse gestures provided by the embodiment comprises the following steps:
step S10, receiving a mouse gesture trigger instruction;
in this embodiment, a user starts a mouse gesture on an application interface on a terminal to generate a mouse gesture trigger instruction. For example, a user presses a gesture trigger key of the mouse, such as a right key or a middle key, or presses a specific key on a keyboard, and slides the mouse while keeping the pressed state, and then the terminal determines that the mouse gesture is triggered at the moment and enters a mouse gesture recognition program. Or sliding the mouse over a specific area, or clicking a specific operation control on the application interface, or sliding a specific track of the mouse can be used as a starting condition for triggering a mouse gesture.
Step S20, collecting the cursor position according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, and taking the movement-trend direction at a sampling point as one movement direction of the cursor;
A movement period starts when the user presses the mouse gesture trigger key. As long as the user keeps the trigger key pressed while moving the mouse, the movement period continues; it ends when the user releases the trigger key or the cursor leaves the terminal screen. During continuous movement the mouse need not move in a single direction: the user can draw all kinds of trajectories on the screen, such as letters, digits, circles, and polygons, and even Chinese characters. Of course, since the cursor cannot leave the screen, a trajectory must be drawn in one continuous stroke, so the stroke order of some letters or characters may differ from the conventional one; the user simply completes the movement in a stroke order of his or her own choosing. For letters, digits, characters, and shapes made of many line segments, the cursor moves in many directions, and a single movement period contains a trajectory that changes direction many times.
To recognize a figure drawn with the mouse, the movement direction of the corresponding cursor must first be recognized. While the cursor moves it has a movement trend, and all coordinate points on a stretch of the trajectory with the same trend share the same trend direction, so a single sampling point on that stretch suffices to obtain the direction. When the cursor changes direction repeatedly, the movement trend at the cursor's current coordinate point changes; a sampling point is collected at each change of trend, the trend direction at that point is obtained, and that direction is taken as one movement direction of the cursor. Combining the successive movement directions reproduces the complete figure, i.e., the content the user drew on the terminal screen with the mouse. For example, the movement directions of the digit "3" in fig. 10a are, in order, right, down, left, right, down, left; those of the letter "G" in fig. 10b are left, down, right, up, right; those of the letter "A" in fig. 10c are up-right, down-right, up-left, right. All movement directions may also be restricted to up, down, left, and right only; for example, the directions of the letter "A" in fig. 10d then become, in order, up, down, left, right, or up, down, up, left, right.
Step S30, converting the moving direction into corresponding identification codes, and generating an identification sequence from all the identification codes in the moving period;
Identification codes corresponding one-to-one to each movement direction are preset in the terminal, for example: left is L, right is R, up is U, down is D, up-right is RU, down-right is RD, up-left is LU, and down-left is LD. The identification codes for the movement directions of a mouse gesture are ordered according to the sequence of the trajectory to form an identification sequence representing the gesture. For example, the identification sequence for the digit "3" in fig. 10a is RDLRDL, that for the letter "G" in fig. 10b is LDRUR, that for the letter "A" in fig. 10c is [RU][RD][LU]R, and that for the letter "A" in fig. 10d is UDLR or UDULR.
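The code table and sequence generation described above can be sketched in Python; the direction names and the `CODES` dictionary are illustrative stand-ins for the table preset in the terminal:

```python
# Preset one-to-one mapping from movement direction to identification code.
CODES = {
    "left": "L", "right": "R", "up": "U", "down": "D",
    "up-right": "RU", "down-right": "RD",
    "up-left": "LU", "down-left": "LD",
}

def to_identification_sequence(directions):
    """Convert each movement direction to its identification code and
    concatenate the codes in trajectory order."""
    return "".join(CODES[d] for d in directions)

# The digit "3" of fig. 10a: right, down, left, right, down, left
print(to_identification_sequence(
    ["right", "down", "left", "right", "down", "left"]))  # RDLRDL
```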
And step S40, searching the database for the operation corresponding to the identification sequence and executing the operation.
A one-to-one correspondence among mouse gestures, identification sequences, and operations is preset in the terminal database. When presetting this correspondence, a more intuitive mouse gesture can be chosen from the characteristics of the operation. For example, the characteristic of the operation "launch the *** search engine" is "***", so the letter G can be preset as the mouse gesture for that operation, establishing an intuitive correspondence between the identification sequence LDRUR of the letter G and the operation of launching the *** search engine. When the gesture the user slides on the screen is G, the terminal converts it into the corresponding identification sequence LDRUR, finds the corresponding operation "*** search engine" in the database, and executes it, whereupon the browser on the terminal screen opens the *** search interface.
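The lookup step can be sketched as follows; the gesture database contents and operation names are hypothetical placeholders for the correspondence preset in the terminal:

```python
# Hypothetical preset database: identification sequence -> operation name.
GESTURE_DB = {
    "LDRUR": "launch_search_engine",  # letter "G"
    "UDLR": "operation_for_A",        # letter "A", simplified directions
}

def find_operation(sequence, db=GESTURE_DB):
    """Search the database for the operation bound to an identification
    sequence; return None (gesture ignored) when no binding exists."""
    return db.get(sequence)
```

In a real terminal the returned operation name would then be dispatched to the code that performs it, e.g., opening the browser.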
In this embodiment, a mouse gesture is converted into a simple identification sequence; the terminal only needs to recognize the movement directions of the cursor corresponding to the mouse to obtain the identification sequence for the gesture, no matter how complex the gesture is, which effectively improves the accuracy of mouse gesture recognition. Meanwhile, the association between a mouse gesture and its operation function is more intuitive, so the user need not memorize a complicated mapping relation and operation is more convenient.
Referring to fig. 3, fig. 3 is a flowchart illustrating a mouse gesture recognition method according to a second embodiment of the present invention. This embodiment includes all the steps of the embodiment shown in fig. 2 and describes the generation of the identification sequence in further detail; step S30 includes:
step S31, sequentially searching the identification code corresponding to each moving direction from the database;
step S32, judging whether the identification codes corresponding to the two adjacent moving directions are the same; if so, go to step S33; if not, go to step S34;
step S33, retaining any one of the two identical identification codes;
step S34, reserving identification codes corresponding to the two adjacent moving directions;
and step S35, arranging the reserved identification codes in sequence according to the sequence of the corresponding moving direction to generate an identification sequence.
In this embodiment, when the identification sequence is generated, the corresponding identification codes are sequentially ordered according to the sequence of the moving directions of the cursor corresponding to the mouse, so as to generate the identification sequence. The identification sequence can be generated by identifying all moving directions, converting into corresponding identification codes at the same time, and then sequencing in sequence; or the first moving direction is identified to obtain the first identification code, the first identification code is put into the identification sequence, then the second moving direction is identified to obtain the second identification code, the second identification code is added into the identification sequence, and so on. For the second way of generating the identification sequence, taking fig. 10b as an example, the first moving direction of the letter "G" is left, the first identification code L is obtained and added to the identification sequence, and the identification sequence is L at this time; if the second moving direction is downward, obtaining a second identification code D, and adding the second identification code D into the identification sequence, wherein the identification sequence is LD; if the third moving direction is right, obtaining a third identification code R, and adding the third identification code R into the identification sequence, wherein the identification sequence is LDR; and repeating the steps until the user releases the mouse or the mouse leaves the terminal screen, wherein the finally obtained identification sequence is LDRUR.
When the user slides the mouse, the cursor may move unsteadily and the trajectory may jitter, so identification codes that repeat are filtered out. With the first way of generating the identification sequence, the identification codes for each pair of adjacent movement directions are compared; if they are the same, the two corresponding movement directions are the same or nearly so, possibly because a shaky grip made the movement direction drift slightly, and only one of the two adjacent identification codes is kept in the sequence. With the second way, since codes are appended one by one, each newly obtained identification code only needs to be compared with the last code already in the sequence, i.e., the code for the previous movement direction: if the two are the same, the new code is discarded; otherwise it is appended to the sequence.
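The jitter filter for the second way of generating the sequence can be sketched as follows:

```python
def build_sequence(codes):
    """Append identification codes in trajectory order, dropping any code
    that repeats the immediately preceding one (jitter filtering)."""
    kept = []
    for code in codes:
        if not kept or kept[-1] != code:
            kept.append(code)
    return "".join(kept)

# A shaky "G": repeated L and R codes collapse to LDRUR.
print(build_sequence(["L", "L", "D", "R", "R", "U", "R"]))  # LDRUR
```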
In the embodiment, a simple recognition sequence is adopted to replace a complex mouse gesture, so that the arrangement sequence of the recognition codes in the recognition sequence is consistent with the sequence of the moving directions generated by the movement of the mouse cursor, and the accuracy of mouse gesture recognition is effectively improved; meanwhile, one repeated identification code in the two adjacent identification codes is eliminated from the identification sequence, so that the direction deviation caused by the shaking of the mouse is avoided, and the accuracy of the mouse gesture identification is further improved.
Referring to fig. 4, fig. 4 is a flowchart illustrating a method for mouse gesture recognition according to a third embodiment of the present invention. This embodiment includes all the steps of the embodiment shown in fig. 2 and describes obtaining the movement direction of the cursor corresponding to the mouse in further detail; step S20 includes:
step S21, collecting cursor position according to set sampling time or sampling distance in a moving period to obtain a plurality of sampling points;
step S22, taking the vector direction between two adjacent sampling points as one movement direction of the cursor;
in the embodiment, when the moving direction of the cursor is determined, the position of the mouse cursor on the display screen is collected according to the time or distance set by the system, and the collected multiple mouse focus positions are multiple sampling points. The set sampling interval or sampling can adopt equal time interval or distance interval, sampling time or sampling distance with specific change rule, or random sampling time or sampling distance and other modes for selection. A vector direction pointing to the next sampling point B from the previous sampling point A exists between the two adjacent sampling points, and the vector direction is the direction in which the mouse cursor moves from the position of the previous sampling point A to the position of the next sampling point B and also corresponds to the moving trend direction of the previous sampling point A. The embodiment determines the moving direction of the cursor through the vector directions of the two sampling points, so that the determination of the moving direction of the cursor is simpler, more convenient and faster.
Referring to fig. 5, fig. 5 is a flowchart illustrating a method for mouse gesture recognition according to a fourth embodiment of the present invention. This embodiment includes all the steps in the embodiment shown in fig. 4, and step S22 includes:
step S23, taking two adjacent sampling points as a moving front sampling point and a moving rear sampling point of one movement respectively;
in this embodiment, a vector direction pointing from a previous sampling point to a next sampling point exists between two adjacent sampling points, and the vector direction is a direction in which a mouse cursor moves from the previous sampling point to the next sampling point, and also corresponds to a moving direction of the mouse. And the former sampling point of the two adjacent sampling points is taken as a moving front sampling point of the movement, and the latter sampling point is taken as a moving rear sampling point of the movement.
Step S24, acquiring coordinates (x1, y1) of a sampling point before the cursor moves this time and coordinates (x2, y2) of the sampling point after the cursor moves this time;
the terminal records the coordinates of the sampling point corresponding to the mouse cursor when the moving direction of the mouse changes every time in a moving period. And subtracting the coordinate values of the sampling points before and after the movement to obtain the sliding distance of the mouse cursor movement. The sliding distance on the X axis is | X2-X1|, and the sliding distance on the Y axis is | Y2-Y1 |.
Step S25, judging whether the sliding distance |x2-x1| of this movement of the cursor on the X axis is greater than or equal to the sliding distance |y2-y1| on the Y axis; if so, go to step S26; if not, go to step S27;
When the sliding distance of the mouse cursor on the X axis is greater than or equal to that on the Y axis, the movement is judged to be horizontal (left or right); when it is smaller, the movement is judged to be vertical (up or down).
Step S26, judging whether x2-x1>0 or x2-x1<0;
step S261, when x2-x1>0, determining that the current moving direction of the cursor is rightward;
step S262, when x2-x1<0, determining that the current moving direction of the cursor is leftward;
Once the movement is judged to be horizontal: if x2-x1 is greater than 0, the mouse is judged to have moved rightward; if it is less than 0, leftward; and if it equals 0, the mouse has not moved.
Step S27, judging whether y2-y1>0 or y2-y1<0;
step S271, when y2-y1>0, determining that the current moving direction of the cursor is upward;
step S272, when y2-y1<0, determining that the current moving direction of the cursor is downward.
Once the movement is judged to be vertical: if y2-y1 is greater than 0, the mouse is judged to have moved upward; if it is less than 0, downward; and if it equals 0, the mouse has not moved.
To keep direction identification simple, this embodiment identifies only the four directions up, down, left, and right, and simplifies the oblique directions (up-left, down-left, up-right, down-right) into those four. For example, under full recognition the letter "A" moves up-right, down-right, up-left, and right in sequence, as shown in FIG. 10c, giving the identification sequence [RU][RD][LU]R; under the simplified recognition of this embodiment it moves up, down, left, and right in sequence, as shown in FIG. 10d, giving the identification sequence UDLR. The simplification also absorbs inaccurate strokes when the user draws a gesture, for example a horizontal or vertical line drawn slightly obliquely, so reducing the oblique directions to the four directions up, down, left, and right makes direction recognition more accurate.
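The comparisons of steps S24 to S27 can be sketched as follows. This is a minimal illustration in Python (the patent names no implementation language), and it follows the patent's convention that a positive y2-y1 means upward movement:

```python
def classify_direction(p_before, p_after):
    """Classify one cursor movement between two adjacent sampling points
    into one of the four simplified directions R/L/U/D (steps S24-S27).
    Returns None when the two sampling points coincide."""
    x1, y1 = p_before
    x2, y2 = p_after
    dx, dy = x2 - x1, y2 - y1
    if abs(dx) >= abs(dy):   # step S25: X-axis distance dominates -> horizontal
        if dx > 0:
            return "R"       # step S261: rightward
        if dx < 0:
            return "L"       # step S262: leftward
        return None          # dx == dy == 0: the cursor did not move
    if dy > 0:               # step S271: upward (patent convention: +y is up)
        return "U"
    return "D"               # step S272: downward
```

A diagonal stroke such as (0, 0) to (5, 1) is thus absorbed into "R", matching the simplification described above.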
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for mouse gesture recognition according to a fifth embodiment of the present invention. This embodiment includes all steps of the embodiment shown in fig. 2 and further details how the mouse gesture trigger instruction is generated; step S10 further includes:
step S51, when the mouse gesture trigger button is pressed, acquiring coordinates (x01, y01) of a sampling point before the cursor moves for the first time and coordinates (x02, y02) of the sampling point after the cursor moves for the first time;
in step S52, when the X-axis sliding distance |x02-x01| or the Y-axis sliding distance |y02-y01| of the first movement of the cursor is greater than or equal to a sliding threshold, a mouse gesture trigger instruction is generated, and the first movement is counted into the moving period.
The mouse gesture trigger key of this embodiment may be the right or middle mouse button, a simultaneous press of two or three of the left, right, and middle buttons, or one or more specific keys on a keyboard. When the trigger key is pressed and held and the mouse is slid, the terminal records the coordinates of the sampling points before and after the first movement in the moving period. To rule out accidental presses of the trigger key, the terminal presets a sliding threshold: only when the sliding distance of the mouse on the X axis or the Y axis reaches this threshold does the terminal judge that a mouse gesture has been triggered and enter the gesture recognition program; otherwise the press is treated as a misoperation and ignored. This effectively improves the accuracy of mouse gesture recognition.
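Steps S51 and S52 amount to a simple threshold test. The sketch below assumes a threshold of 10 pixels, since the patent leaves the concrete value of the sliding threshold unspecified:

```python
SLIDE_THRESHOLD = 10  # pixels; assumed value, not specified in the description

def gesture_triggered(p_before, p_after, threshold=SLIDE_THRESHOLD):
    """Steps S51-S52: a mouse gesture trigger instruction is generated only
    when the first movement's sliding distance on the X axis or the Y axis
    reaches the threshold; otherwise the key press is treated as an
    accidental touch and ignored."""
    x01, y01 = p_before
    x02, y02 = p_after
    return abs(x02 - x01) >= threshold or abs(y02 - y01) >= threshold
```

A press followed by a 3-pixel jitter would thus be discarded, while a deliberate 12-pixel slide starts the moving period.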
FIG. 7 is a flowchart illustrating a method for mouse gesture recognition according to a sixth embodiment of the present invention. This embodiment includes all the steps in the embodiment shown in fig. 2, and further details the custom mouse gesture, before step S10, the method further includes:
step S61, receiving a mouse gesture self-defining instruction;
the terminal database of this embodiment is preset with intuitive mappings between mouse gestures and operation functions, which are easy for the user to memorize and use. To provide richer mouse gestures, the terminal also lets the user define gestures, so that gestures can be changed or added according to the user's own habits. When the user clicks a control for adding or changing a mouse gesture on the application interface, a mouse gesture self-defining instruction is triggered and the custom mode is entered.
Step S62, collecting the cursor position according to the cursor movement trend corresponding to the mouse in a user-defined movement period, obtaining at least one sampling point, and taking the movement trend direction of one sampling point as the primary movement direction of the cursor;
at the moment, in a user-defined movement period, the user moves the mouse to draw a gesture track. For example, assuming that the right mouse button is a mouse gesture trigger button, after entering the user-defined mode, the user clicks the right mouse button, slides the mouse while keeping the right mouse button in a pressed state, draws a track, and obtains the moving directions of the plurality of mouse cursors.
Step S63, converting the moving direction into corresponding identification codes, and generating an identification sequence from all the identification codes in the user-defined moving period;
the moving directions are converted into identification codes and an identification sequence is generated; this process follows the same principle as in the foregoing embodiments and is not repeated here. To ensure the gesture is drawn accurately, after one drawing is finished the terminal may prompt the user to draw it again: if the two drawn gestures are the same, the resulting identification sequences are also the same and the gesture is considered complete; otherwise the terminal warns the user of a possible input error and asks for the gesture to be drawn again.
Step S64, receiving the operation options recorded on the operation option menu;
and after the moving direction recognition is finished and the recognition sequence is generated, popping up an operation option menu, and prompting a user to select the operation which needs to be associated with the mouse gesture. The operation option menu comprises a plurality of preset operation options, and the operation options correspond to background operation programs.
And step S65, associating the identification sequence with the operation corresponding to the operation option and storing the identification sequence in a database.
After the user selects an operation option and clicks to confirm, the terminal associates the selected option with the recognition sequence of the mouse gesture. For example, if the gesture drawn by the user is the letter "G", the corresponding recognition sequence is LDRULR, and the selected operation option is "start *** search engine", the terminal maps the letter "G", the recognition sequence LDRULR, and the operation "start *** search engine" to one another and stores them in the database, completing the custom association.
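The association of step S65 can be illustrated with a plain dictionary standing in for the terminal database; the function name and storage layout here are assumptions made for illustration only:

```python
gesture_db = {}  # identification sequence -> (gesture name, operation)

def associate_gesture(sequence, gesture_name, operation):
    """Step S65: store the one-to-one mapping of mouse gesture,
    identification sequence, and operation in the database (a dict
    stands in for the terminal database in this sketch)."""
    gesture_db[sequence] = (gesture_name, operation)

# The "G" example from the description:
associate_gesture("LDRULR", "G", "start *** search engine")
```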
This embodiment provides the user with a way to define mouse gestures: the user can change or add gestures according to personal habits, the association between gesture and operation function becomes more intuitive and easier to memorize and operate, richer mouse gestures are available to the user, and mouse gestures find wider application.
Fig. 8 is a schematic block diagram of a mouse gesture recognition apparatus according to a first embodiment of the present invention. The device for mouse gesture recognition mentioned in this embodiment includes:
the instruction receiving module 210 is configured to receive a mouse gesture trigger instruction;
the tracking module 220 is configured to acquire a cursor position according to a movement trend of a cursor corresponding to a mouse in a movement period, obtain at least one sampling point, and use a movement trend direction of one sampling point as a primary movement direction of the cursor;
the identification module 230 is configured to convert the moving direction into a corresponding identification code, and generate an identification sequence from all the identification codes in the moving period;
and the operation module 240 is configured to search the database for an operation corresponding to the identification sequence, and execute the operation.
In this embodiment, a user starts a mouse gesture on an application interface on a terminal to generate a mouse gesture trigger instruction. For example, a user presses a gesture trigger key of the mouse, such as a right key or a middle key, or presses a specific key on a keyboard, and slides the mouse while keeping the pressed state, and then the terminal determines that the mouse gesture is triggered at the moment and enters a mouse gesture recognition program. Or sliding the mouse over a specific area, or clicking a specific operation control on the application interface, or sliding a specific track of the mouse can be used as a starting condition for triggering a mouse gesture.
When the user presses the mouse gesture trigger key, a moving period starts. As long as the trigger key is held down, the mouse remains within the moving period; the period ends when the user releases the trigger key or the cursor leaves the terminal screen. During continuous movement the mouse does not necessarily keep one direction: the user can control it to draw various tracks on the terminal screen, such as letters, numbers, circles, polygons, and even Chinese characters. Since the cursor cannot be lifted off the screen, a track must be drawn in one continuous stroke, so the stroke order of some letters or Chinese characters may differ from the conventional one; the user simply completes the movement in a stroke order of his or her own choosing. Letters, numbers, characters, and shapes with many lines produce many cursor moving directions, and a single moving period may contain a track that changes direction many times.
To recognize a mouse-drawn figure, the direction in which the mouse cursor moves must first be identified. A moving cursor has a moving trend, and coordinate points on a track segment with the same trend share the same trend direction, so collecting a single sampling point on that segment is enough to obtain the direction. When the cursor changes direction several times, the trend at its current coordinate point changes each time; a sampling point is collected at each change, its trend direction is obtained, and that direction is taken as one moving direction of the cursor. Combining the successive moving directions finally yields the complete figure, i.e. the content the user drew on the terminal screen with the mouse. For example, the moving directions of the number "3" in fig. 10a are, in order, right, down, left, right, down, left; those of the letter "G" in fig. 10b are left, down, right, up, left, right; and those of the letter "A" in fig. 10c are up-right, down-right, up-left, right. All moving directions may also be restricted to the range up, down, left, right: the letter "A" in fig. 10d then moves up, down, left, right in order, or up, down, up, left, right.
Identification codes corresponding one-to-one to the moving directions are preset in the terminal, for example L for left, R for right, U for up, D for down, RU for up-right, RD for down-right, LU for up-left, and LD for down-left. The identification codes corresponding to a gesture's moving directions are ordered along the moving track to form an identification sequence representing that gesture: the sequence for the number "3" in fig. 10a is RDLRDL, the sequence for the letter "G" in fig. 10b is LDRULR, the sequence for the letter "A" in fig. 10c is [RU][RD][LU]R, and the sequence for the letter "A" in fig. 10d is UDLR or UDULR.
A one-to-one correspondence of mouse gesture, identification sequence, and operation content is preset in the terminal database. When presetting this correspondence, a more intuitive mouse gesture can be chosen from the characteristics of the operation content: for example, the characteristic information of the operation "start *** search engine" is ***, so the letter "G" can be preset as that operation's mouse gesture, establishing an intuitive correspondence between the letter's identification sequence LDRULR and the operation. When the gesture the user slides on the screen is "G", the terminal converts it into the corresponding identification sequence LDRULR, finds the corresponding operation "start *** search engine" in the database, executes it, and the browser on the terminal screen opens the *** search interface.
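The lookup-and-execute step can be sketched as below. Only the "G" binding comes from the description; the binding of RDLRDL (the number "3") to "open history" is hypothetical, added for illustration:

```python
def execute_gesture(sequence, database):
    """Search the database for the operation bound to an identification
    sequence and return it for execution; an unrecognized gesture
    yields None and no operation is executed."""
    entry = database.get(sequence)
    if entry is None:
        return None
    gesture_name, operation = entry
    return operation

db = {
    "LDRULR": ("G", "start *** search engine"),  # from the description
    "RDLRDL": ("3", "open history"),             # hypothetical binding
}
```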
In this embodiment, the mouse gesture is converted into a simple recognition sequence: the terminal only needs to recognize the moving directions of the cursor to obtain the sequence corresponding to the gesture, and however complex the gesture, a corresponding sequence can be obtained, which effectively improves the accuracy of mouse gesture recognition. At the same time, the association between mouse gesture and operation function is more intuitive, so the user need not memorize complicated mappings and operation is more convenient.
Further, to generate the identification sequence, the identification module 230 is further configured to:
sequentially searching identification codes corresponding to each moving direction from a database;
judging whether the identification codes corresponding to the two adjacent moving directions are the same or not;
if yes, keeping any one of the two same identification codes;
if not, the identification codes corresponding to the two adjacent moving directions are reserved;
and arranging the reserved identification codes in sequence according to the sequence of the corresponding moving directions to generate an identification sequence.
In this embodiment, the identification sequence is generated by ordering the identification codes according to the order of the cursor's moving directions. The sequence can be generated in two ways: either all moving directions are identified first, converted into identification codes, and then ordered in sequence; or the first moving direction is identified to obtain the first identification code, which is put into the sequence, then the second moving direction is identified to obtain the second code, which is appended, and so on. Taking fig. 10b as an example of the second way: the first moving direction of the letter "G" is left, so the first identification code L is obtained and added, and the sequence is L; the second moving direction is down, so D is appended and the sequence is LD; the third moving direction is right, so R is appended and the sequence is LDR; and so on, until the user releases the mouse button or the cursor leaves the terminal screen, the final identification sequence being LDRULR.
When the user slides the mouse, the cursor may move unsteadily and the track may jitter, so identification codes that repeat consecutively are filtered out. For the first way of generating the sequence, the identification codes of two adjacent moving directions are compared: if they are the same, the two moving directions are the same or nearly so, typically because hand jitter caused a slight shift in the mouse's direction, and only one of the two adjacent codes is kept in the sequence. For the second way, since codes are appended one by one, each newly obtained code only needs to be compared with the last code already in the sequence, i.e. with the previously obtained code: if the two are the same, the new code is discarded; if not, it is appended to the sequence.
This embodiment replaces a complex mouse gesture with a simple recognition sequence whose identification codes are ordered consistently with the moving directions produced by the cursor's movement, effectively improving recognition accuracy; at the same time, eliminating one of any two adjacent identical identification codes from the sequence removes the direction deviation caused by mouse jitter, further improving the accuracy of mouse gesture recognition.
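The second, code-by-code way of building the sequence, with adjacent duplicates filtered, can be sketched as:

```python
def append_code(sequence, code):
    """Append a newly recognized identification code to the sequence
    only if it differs from the last code already present, filtering
    the duplicates caused by hand jitter."""
    if code is not None and (not sequence or sequence[-1] != code):
        return sequence + code
    return sequence

# A jittery "G" stroke: repeated directions collapse into LDRULR.
seq = ""
for code in ["L", "L", "D", "D", "R", "U", "L", "R"]:
    seq = append_code(seq, code)
```

After the loop, `seq` holds the letter "G" sequence LDRULR despite the duplicated L and D samples.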
Further, to obtain the moving direction of the cursor corresponding to the mouse, the tracking module 220 is further configured to:
acquiring a cursor position according to a set sampling time or sampling distance in a moving period to obtain a plurality of sampling points;
and taking the vector directions of two adjacent sampling points as the primary moving direction of the cursor.
In this embodiment, to determine the moving direction of the cursor, the position of the mouse cursor on the display screen is collected at a time or distance interval set by the system, and the collected cursor positions form the sampling points. The sampling interval may use equal time or distance spacing, a time or distance that varies by a specific rule, or random spacing. Between two adjacent sampling points there is a vector pointing from the previous point A to the next point B; this vector direction is the direction in which the cursor moves from the position of A to the position of B, and also corresponds to the moving-trend direction at point A. Determining the cursor's moving direction from the vector between two sampling points makes the determination simpler and faster.
Further, the tracking module 220 is further configured to:
of two adjacent sampling points, respectively taking the former as the sampling point before one movement and the latter as the sampling point after that movement;
acquiring coordinates (x1, y1) of a sampling point of a cursor before the current movement and coordinates (x2, y2) of the sampling point after the current movement;
judging whether the sliding distance |x2-x1| of this cursor movement on the X axis is greater than or equal to the sliding distance |y2-y1| on the Y axis;
if yes, when x2-x1>0, determining that the current movement direction of the cursor is rightward; when x2-x1<0, determining that the moving direction of the cursor is leftward;
if not, when y2-y1>0, determining that the moving direction of the cursor is upward; when y2-y1<0, the direction of the current movement of the cursor is determined to be downward.
In this embodiment, two adjacent sampling points define a vector pointing from the previous sampling point to the next one; this vector direction is the direction in which the mouse cursor moves between them and corresponds to one moving direction of the mouse. The former of the two points is taken as the sampling point before the movement and the latter as the sampling point after it. Within a moving period, the terminal records the coordinates of the sampling point corresponding to the cursor each time the mouse's moving direction changes. Subtracting the coordinate values of the sampling points before and after the movement gives the sliding distance of the cursor: |x2-x1| on the X axis and |y2-y1| on the Y axis. When the sliding distance on the X axis is greater than or equal to that on the Y axis, the movement is judged to be horizontal; when it is smaller, vertical. Once the movement is judged to be horizontal: if x2-x1 is greater than 0, the mouse is judged to have moved rightward; if less than 0, leftward; and if equal to 0, the mouse has not moved.
Once the movement is judged to be vertical: if y2-y1 is greater than 0, the mouse is judged to have moved upward; if less than 0, downward; and if equal to 0, the mouse has not moved.
To keep direction identification simple, this embodiment identifies only the four directions up, down, left, and right, and simplifies the oblique directions (up-left, down-left, up-right, down-right) into those four. For example, under full recognition the letter "A" moves up-right, down-right, up-left, and right in sequence, as shown in FIG. 10c, giving the identification sequence [RU][RD][LU]R; under the simplified recognition of this embodiment it moves up, down, left, and right in sequence, as shown in FIG. 10d, giving the identification sequence UDLR. The simplification also absorbs inaccurate strokes when the user draws a gesture, for example a horizontal or vertical line drawn slightly obliquely, so reducing the oblique directions to the four directions up, down, left, and right makes direction recognition more accurate.
Further, to generate the mouse gesture triggering instruction, the tracking module 220 is further configured to:
when a mouse gesture trigger key is pressed down, acquiring coordinates (x01, y01) of a sampling point before the cursor moves for the first time and coordinates (x02, y02) of the sampling point after the cursor moves for the first time;
when the X-axis sliding distance |x02-x01| or the Y-axis sliding distance |y02-y01| of the cursor's first movement is greater than or equal to the sliding threshold, a mouse gesture trigger instruction is generated, and the first movement is counted into the moving period.
The mouse gesture trigger key of this embodiment may be the right or middle mouse button, a simultaneous press of two or three of the left, right, and middle buttons, or one or more specific keys on a keyboard. When the trigger key is pressed and held and the mouse is slid, the terminal records the coordinates of the sampling points before and after the first movement in the moving period. To rule out accidental presses of the trigger key, the terminal presets a sliding threshold: only when the sliding distance of the mouse on the X axis or the Y axis reaches this threshold does the terminal judge that a mouse gesture has been triggered and enter the gesture recognition program; otherwise the press is treated as a misoperation and ignored. This effectively improves the accuracy of mouse gesture recognition.
Fig. 9 is a block diagram illustrating a mouse gesture recognition apparatus according to a second embodiment of the present invention. The embodiment includes all modules in the embodiment shown in fig. 8, and further, in order to implement a custom mouse gesture, an entry module 250 and an association module 260 are added;
the instruction receiving module 210 is further configured to receive a mouse gesture custom instruction;
the tracking module 220 is further configured to acquire a cursor position according to a movement trend of a cursor corresponding to the mouse in a user-defined movement period, obtain at least one sampling point, and use a movement trend direction of the sampling point as a primary movement direction of the cursor;
the identification module 230 is further configured to convert the moving direction into a corresponding identification code, and generate an identification sequence from all the identification codes in the user-defined moving period;
the entry module 250 is configured to receive an operation option entered on the operation option menu;
the associating module 260 is configured to associate the identification sequence with the operation corresponding to the operation option, and store the associated operation in the database.
The terminal database of this embodiment is preset with intuitive mappings between mouse gestures and operation functions, which are easy for the user to memorize and use. To provide richer mouse gestures, the terminal also lets the user define gestures, so that gestures can be changed or added according to the user's own habits. When the user clicks a control for adding or changing a mouse gesture on the application interface, a mouse gesture self-defining instruction is triggered and the custom mode is entered. Within a user-defined movement period, the user then moves the mouse to draw the gesture track. For example, assuming the right mouse button is the gesture trigger key: after entering the custom mode, the user presses the right button, slides the mouse while holding it down, draws the track, and the successive moving directions of the cursor are obtained. The moving directions are converted into identification codes and an identification sequence is generated, following the same principle as in the foregoing embodiments, which is not repeated here. To ensure the gesture is drawn accurately, after one drawing is finished the terminal may prompt the user to draw it again: if the two drawn gestures are the same, the resulting identification sequences are also the same and the gesture is considered complete; otherwise the terminal warns the user of a possible input error and asks for the gesture to be drawn again. After direction recognition is finished and the identification sequence is generated, an operation option menu pops up, prompting the user to select the operation to associate with the gesture.
The operation option menu contains a number of preset operation options, each corresponding to a background operation program. After the user selects an option and clicks to confirm, the terminal associates the selected option with the recognition sequence of the mouse gesture. For example, if the gesture drawn by the user is the letter "G", the corresponding recognition sequence is LDRULR, and the selected option is "start *** search engine", the terminal maps the letter "G", the sequence LDRULR, and the operation "start *** search engine" to one another and stores them in the database, completing the custom association.
This embodiment provides the user with a way to define mouse gestures: the user can change or add gestures according to personal habits, the association between gesture and operation function becomes more intuitive and easier to memorize and operate, richer mouse gestures are available to the user, and mouse gestures find wider application.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The serial numbers of the above embodiments of the present invention are for description only and do not imply any ranking of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone; in many cases, the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied as a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) that includes instructions for causing a terminal device (e.g., a mobile phone, computer, server, or network device) to execute the methods of the embodiments of the present invention.
The above is only a preferred embodiment of the present invention and does not limit its scope; all equivalent structural or process modifications made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, fall within the scope of the present invention.

Claims (9)

1. A method for recognizing mouse gestures, applied to a terminal, comprising the following steps:
receiving a mouse gesture trigger instruction; collecting cursor positions according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, and taking the movement trend direction at a sampling point as one moving direction of the cursor; converting each moving direction into a corresponding identification code, and generating an identification sequence from all the identification codes within the movement period;
searching for the operation corresponding to the identification sequence in a database, and executing the operation, wherein the database stores a one-to-one correspondence relationship among mouse gestures, the identification sequence and operation contents, the mouse gestures are set according to the characteristic information of the operation contents, and the identification sequence is determined according to the mouse gestures;
the method comprises the following steps of collecting a cursor position according to the movement trend of a cursor corresponding to a mouse in a movement period, obtaining at least one sampling point, and taking the movement trend direction of one sampling point as the one-time movement direction of the cursor, wherein the step comprises the following steps:
acquiring the cursor position according to set sampling time in a moving period to obtain a plurality of sampling points; taking the vector directions of two adjacent sampling points as the primary moving direction of the cursor;
the step of converting the moving direction into a corresponding identification code and generating an identification sequence from all the identification codes in the moving period comprises:
sequentially searching the database for the identification code corresponding to each moving direction; judging whether the identification codes corresponding to two adjacent moving directions are the same; if so, keeping either one of the two identical identification codes; if not, keeping both identification codes corresponding to the two adjacent moving directions; and arranging the kept identification codes in the order of the corresponding moving directions to generate the identification sequence;
or,
searching the database for the identification code corresponding to the first moving direction, and adding the first identification code to the identification sequence; searching the database for the identification code corresponding to the second moving direction, and judging whether the identification code corresponding to the second moving direction is the same as the identification code corresponding to the first moving direction; if so, discarding the second identification code; if not, adding the second identification code to the identification sequence; continuing to search the database for the identification code corresponding to the moving direction after the second moving direction and judging whether it is the same as the second identification code, and so on, until judging whether the identification code corresponding to the last moving direction in the movement period is the same as the identification code corresponding to the moving direction preceding it; if so, discarding the identification code corresponding to the last moving direction; if not, adding the identification code of the last moving direction to the identification sequence, thereby obtaining the identification sequence.
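As a sketch of the first sequence-generation variant in claim 1 (of two identical adjacent codes only one is kept, differing adjacent codes are both kept), consecutive duplicate direction codes can be collapsed as follows. The function name and the list representation of the per-sample directions are assumptions for illustration:

```python
def build_identification_sequence(directions):
    """Collapse runs of identical adjacent direction codes into an
    identification sequence: of two identical adjacent codes only one
    is kept, while differing adjacent codes are both kept, in order."""
    sequence = []
    for code in directions:
        # Append only when the code differs from the last kept code
        if not sequence or code != sequence[-1]:
            sequence.append(code)
    return "".join(sequence)
```

With per-sample directions such as "LLLDDDRRUULLRR" (a possible sampling of the letter-"G" gesture), this yields the sequence LDRULR used in the description.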
2. The method of mouse gesture recognition according to claim 1, wherein the step of taking the vector direction between two adjacent sampling points as one moving direction of the cursor comprises:
taking two adjacent sampling points respectively as the pre-movement sampling point and the post-movement sampling point of one movement;
acquiring the coordinates (x1, y1) of the pre-movement sampling point of the cursor and the coordinates (x2, y2) of the post-movement sampling point;
judging whether the sliding distance |x2-x1| of this cursor movement on the X axis is larger than or equal to the sliding distance |y2-y1| on the Y axis;
if so: when x2-x1>0, determining that this moving direction of the cursor is rightward; when x2-x1<0, determining that this moving direction of the cursor is leftward;
if not: when y2-y1>0, determining that this moving direction of the cursor is upward; when y2-y1<0, determining that this moving direction of the cursor is downward.
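The axis-dominance rule of claim 2 can be sketched as a small helper. The function name and the single-letter codes L/R/U/D are assumptions (the patent fixes the rule, not the symbols), and the Y axis is taken to grow upward to match the claim's reading of y2-y1>0 as upward:

```python
def move_direction(x1, y1, x2, y2):
    """Quantize one cursor movement into a direction code by comparing
    the X-axis and Y-axis sliding distances, per claim 2."""
    if abs(x2 - x1) >= abs(y2 - y1):
        # X-axis displacement dominates (ties go to the X axis)
        return "R" if x2 - x1 > 0 else "L"
    # Y-axis displacement dominates; y is taken to grow upward here
    return "U" if y2 - y1 > 0 else "D"
```

Feeding each adjacent pair of sampling points through this helper produces the per-movement direction codes from which the identification sequence is generated.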
3. The method of mouse gesture recognition according to claim 2, wherein before the step of receiving a mouse gesture trigger instruction, the method further comprises:
when a mouse gesture trigger key is pressed, acquiring the coordinates (x01, y01) of the sampling point before the first movement of the cursor and the coordinates (x02, y02) of the sampling point after the first movement;
when the X-axis sliding distance |x02-x01| or the Y-axis sliding distance |y02-y01| of the first cursor movement is larger than or equal to a sliding threshold, generating the mouse gesture trigger instruction and counting the first movement into the movement period.
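The trigger condition of claim 3 amounts to a per-axis threshold test on the first movement. A minimal sketch, in which the threshold value of 10 pixels and the function name are assumptions (the patent only requires some sliding threshold):

```python
SLIDE_THRESHOLD = 10  # pixels; assumed value, the patent leaves it unspecified

def should_trigger(x01, y01, x02, y02, threshold=SLIDE_THRESHOLD):
    """Generate a mouse gesture trigger instruction only when the first
    movement with the gesture key held reaches the sliding threshold on
    either axis, filtering out the jitter of an ordinary click."""
    return abs(x02 - x01) >= threshold or abs(y02 - y01) >= threshold
```

A right-click whose cursor drifts only a few pixels thus stays an ordinary click, while a deliberate slide starts the movement period.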
4. The method of mouse gesture recognition according to claim 1, wherein before the step of receiving a mouse gesture trigger instruction, the method further comprises:
receiving a mouse gesture self-defining instruction; collecting cursor positions according to the movement trend of the cursor corresponding to the mouse within a self-defined movement period to obtain at least one sampling point, and taking the movement trend direction at a sampling point as one moving direction of the cursor;
converting each moving direction into a corresponding identification code, and generating an identification sequence from all the identification codes within the self-defined movement period;
receiving an operation option entered on the operation option menu; and
associating the identification sequence with the operation corresponding to the operation option and storing the association in a database.
5. A device for mouse gesture recognition, applied to a terminal, comprising:
an instruction receiving module, configured to receive a mouse gesture trigger instruction;
a tracking module, configured to collect cursor positions according to the movement trend of the cursor corresponding to the mouse within a movement period to obtain at least one sampling point, and to take the movement trend direction at a sampling point as one moving direction of the cursor;
an identification module, configured to convert each moving direction into a corresponding identification code and generate an identification sequence from all the identification codes within the movement period;
an operation module, configured to search a database for the operation corresponding to the identification sequence and execute the operation, wherein the database stores a one-to-one correspondence among mouse gestures, identification sequences, and operation contents, the mouse gestures are set according to characteristic information of the operation contents, and the identification sequences are determined according to the mouse gestures;
wherein the tracking module is further configured to:
collect the cursor position at a set sampling interval within the movement period to obtain a plurality of sampling points, and take the vector direction between two adjacent sampling points as one moving direction of the cursor;
and the identification module is further configured to:
sequentially search the database for the identification code corresponding to each moving direction; judge whether the identification codes corresponding to two adjacent moving directions are the same; if so, keep either one of the two identical identification codes; if not, keep both identification codes corresponding to the two adjacent moving directions; and arrange the kept identification codes in the order of the corresponding moving directions to generate the identification sequence;
or,
search the database for the identification code corresponding to the first moving direction, and add the first identification code to the identification sequence; search the database for the identification code corresponding to the second moving direction, and judge whether the identification code corresponding to the second moving direction is the same as the identification code corresponding to the first moving direction; if so, discard the second identification code; if not, add the second identification code to the identification sequence; continue to search the database for the identification code corresponding to the moving direction after the second moving direction and judge whether it is the same as the second identification code, and so on, until judging whether the identification code corresponding to the last moving direction in the movement period is the same as the identification code corresponding to the moving direction preceding it; if so, discard the identification code corresponding to the last moving direction; if not, add the identification code of the last moving direction to the identification sequence, thereby obtaining the identification sequence.
6. The device for mouse gesture recognition according to claim 5, wherein the tracking module is further configured to:
take two adjacent sampling points respectively as the pre-movement sampling point and the post-movement sampling point of one movement;
acquire the coordinates (x1, y1) of the pre-movement sampling point of the cursor and the coordinates (x2, y2) of the post-movement sampling point;
judge whether the sliding distance |x2-x1| of this cursor movement on the X axis is larger than or equal to the sliding distance |y2-y1| on the Y axis;
if so: when x2-x1>0, determine that this moving direction of the cursor is rightward; when x2-x1<0, determine that this moving direction of the cursor is leftward;
if not: when y2-y1>0, determine that this moving direction of the cursor is upward; when y2-y1<0, determine that this moving direction of the cursor is downward.
7. The device for mouse gesture recognition according to claim 6, wherein the tracking module is further configured to:
when a mouse gesture trigger key is pressed, acquire the coordinates (x01, y01) of the sampling point before the first movement of the cursor and the coordinates (x02, y02) of the sampling point after the first movement;
when the X-axis sliding distance |x02-x01| or the Y-axis sliding distance |y02-y01| of the first cursor movement is larger than or equal to a sliding threshold, generate the mouse gesture trigger instruction and count the first movement into the movement period.
8. The device for mouse gesture recognition according to claim 5, further comprising an input module and an association module, wherein:
the instruction receiving module is further configured to receive a mouse gesture self-defining instruction;
the tracking module is further configured to collect cursor positions according to the movement trend of the cursor corresponding to the mouse within a self-defined movement period to obtain at least one sampling point, and to take the movement trend direction at a sampling point as one moving direction of the cursor;
the identification module is further configured to convert each moving direction into a corresponding identification code and generate an identification sequence from all the identification codes within the self-defined movement period;
the input module is configured to receive the operation option entered on the operation option menu; and
the association module is configured to associate the identification sequence with the operation corresponding to the operation option and store the association in a database.
9. A computer-readable storage medium, characterized in that it comprises a stored program, wherein the program is executable by a terminal device or a computer to perform the method of any of claims 1 to 4.
CN201510397490.8A 2015-07-08 2015-07-08 Method and device for recognizing mouse gestures Active CN106325702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510397490.8A CN106325702B (en) 2015-07-08 2015-07-08 Method and device for recognizing mouse gestures

Publications (2)

Publication Number Publication Date
CN106325702A CN106325702A (en) 2017-01-11
CN106325702B true CN106325702B (en) 2021-01-08

Family

ID=57725562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510397490.8A Active CN106325702B (en) 2015-07-08 2015-07-08 Method and device for recognizing mouse gestures

Country Status (1)

Country Link
CN (1) CN106325702B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109101274A (en) * 2018-09-20 2018-12-28 上海施嵌电子科技有限公司 Computer peripheral equipment macroefficiency device and its gesture identification method
CN114518830A (en) * 2022-01-20 2022-05-20 广东迅扬科技股份有限公司 Mouse button-free click control identification and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2378395A2 (en) * 2008-01-11 2011-10-19 Sony Computer Entertainment Inc. Gesture cataloguing and recognition
KR20120001711A (en) * 2011-11-21 2012-01-04 엔에이치엔(주) System and method for providing web page including mouse gesture function
CN102332024A (en) * 2011-09-30 2012-01-25 奇智软件(北京)有限公司 Touch control type browser for portable mobile terminal
CN103076918A (en) * 2012-12-28 2013-05-01 深圳Tcl新技术有限公司 Remote control method and system based on touch terminal
CN104484115A (en) * 2014-12-01 2015-04-01 林志均 Mouse gesture recognition method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8159457B2 (en) * 2006-09-27 2012-04-17 Yahoo! Inc. Zero-click activation of an application
CN101526861B (en) * 2008-03-06 2012-01-18 郑国书 Recognition of mouse gesture, and method for perfoming relevant order according to recognition results
CN101408824A (en) * 2008-11-18 2009-04-15 广东威创视讯科技股份有限公司 Method for recognizing mouse gesticulation
CN102402361B (en) * 2010-09-08 2015-08-12 腾讯科技(深圳)有限公司 Motion track based on mouse carries out the method and apparatus controlled on computers
CN102662581B (en) * 2012-03-31 2015-06-24 北京奇虎科技有限公司 Method and system for performing control by mouse input

Similar Documents

Publication Publication Date Title
US8577100B2 (en) Remote input method using fingerprint recognition sensor
JP6427559B6 (en) Permanent synchronization system for handwriting input
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
US9207861B2 (en) Method and mobile terminal for processing touch input in two different states
EP3001285B1 (en) Input method and system
CN103135930A (en) Touch screen control method and device
JP2009505175A (en) Improved method and apparatus for selecting database items from a database
JP2017033528A (en) Information display device, information display method, and program
KR101591586B1 (en) Data processing apparatus which detects gesture operation
CN106325702B (en) Method and device for recognizing mouse gestures
Watanabe et al. Remote touch pointing for smart TV interaction
CN104808899A (en) Terminal
KR101451943B1 (en) Method and set-top box for controlling screen
CN105094344B (en) Fixed terminal control method and device
US20150138095A1 (en) Device and method for inputting information
US11726580B2 (en) Non-standard keyboard input system
CN104951293B (en) The key response method and mobile terminal of a kind of mobile terminal
CN113448465A (en) Virtual cursor control method and device, storage medium and electronic equipment
CN111176545B (en) Equipment control method, system, electronic equipment and storage medium
CN114237458A (en) Method, device and storage medium for controlling UI in multidimensional mode independent of OSD position
KR20130136030A (en) Device and method of controlling function by movement of finger on pointing device
KR101348763B1 (en) Apparatus and method for controlling interface using hand gesture and computer-readable recording medium with program therefor
CN112118491A (en) Bullet screen generation method and device and computer readable storage medium
CN110765736B (en) Mathematical expression input method and device and mobile equipment
US12013987B2 (en) Non-standard keyboard input system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant