CN112684895A - Marking method, device, equipment and computer storage medium - Google Patents

Marking method, device, equipment and computer storage medium

Info

Publication number
CN112684895A
CN112684895A
Authority
CN
China
Prior art keywords: gesture, user, marking, cursor, hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011636320.8A
Other languages
Chinese (zh)
Inventor
孙红伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Hongcheng Opto Electronics Co Ltd
Original Assignee
Anhui Hongcheng Opto Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Hongcheng Opto Electronics Co Ltd filed Critical Anhui Hongcheng Opto Electronics Co Ltd
Priority to CN202011636320.8A priority Critical patent/CN112684895A/en
Publication of CN112684895A publication Critical patent/CN112684895A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a marking method, apparatus, device, and computer storage medium. The method includes: acquiring a hand image of a user; determining a user gesture from the hand image; and, when the user gesture matches a preset marking gesture, controlling a cursor to follow the movement of the gesture and marking the path it passes, until the user gesture switches to a gesture other than the marking gesture. With this marking method, apparatus, device, and computer storage medium, a user can perform marking operations through gestures without directly touching any device, making the operation more flexible and improving the user experience.

Description

Marking method, device, equipment and computer storage medium
Technical Field
The present application belongs to the field of interaction technologies, and in particular, to a marking method, apparatus, device, and computer storage medium.
Background
With the continuous development of interaction technology, most electronic devices now provide marking functions such as writing and annotating.
In the prior art, marking operations such as writing and annotating on an electronic device are performed mainly by controlling a cursor through a physical controller such as a mouse or a remote control, or by touching a touch screen directly with the hand.
In the process of implementing the present application, the inventor found that the prior art has at least the following problem: when performing a marking operation, the user must directly touch a physical controller or the screen, which makes the operation inflexible and degrades the user experience.
Disclosure of Invention
The embodiments of the present application provide a marking method, apparatus, device, and computer storage medium that can at least solve the prior-art problem that marking operations are inflexible and degrade the user experience.
In a first aspect, an embodiment of the present application provides a marking method, where the marking method includes:
acquiring a hand image of a user;
determining user gestures from the hand images;
and under the condition that the user gesture is matched with the preset marking gesture, controlling the cursor to move along with the movement of the user gesture, and marking a passing path until the user gesture is switched to other gestures except the marking gesture.
In an alternative embodiment, after determining the user gesture from the hand image, the marking method further comprises:
and under the condition that the user gesture is matched with the preset cursor movement gesture, controlling the cursor to move along with the movement of the user gesture until the user gesture is switched into other gestures except the cursor movement gesture so as to move the cursor to the mark initial position.
In an alternative embodiment, controlling the cursor to move following the movement of the user gesture includes:
acquiring corresponding hand position information according to the hand image;
determining the moving direction and the gesture moving speed of the user gesture according to hand position information respectively corresponding to a plurality of hand images acquired in real time;
and controlling the cursor to move according to the moving direction and the gesture moving speed.
In an optional implementation manner, determining a moving direction and a gesture moving speed of a user gesture according to hand position information corresponding to a plurality of hand images acquired in real time respectively includes:
according to a preset smoothing algorithm, smoothing hand position information corresponding to a plurality of hand images acquired in real time respectively;
and determining the moving direction and the gesture moving speed of the user gesture according to the smoothed hand position information.
In an alternative embodiment, controlling the cursor to move according to the moving direction and the gesture moving speed includes:
determining the cursor moving speed according to the gesture moving speed;
and controlling the cursor to move according to the moving direction and the moving speed of the cursor.
In a second aspect, an embodiment of the present application provides a marking device, including:
the acquisition module is used for acquiring a hand image of a user;
the determining module is used for determining the user gesture according to the hand image;
and the marking module is used for controlling the cursor to move along with the movement of the user gesture under the condition that the user gesture is matched with the preset marking gesture, and marking the passing path until the user gesture is switched to other gestures except for the marking gesture.
In an alternative embodiment, the marking device further comprises:
and the moving module is used for controlling the cursor to move along with the movement of the user gesture under the condition that the user gesture is matched with the preset cursor moving gesture until the user gesture is switched to other gestures except the cursor moving gesture so as to move the cursor to the mark initial position.
In an alternative embodiment, the moving module comprises:
the acquisition sub-module is used for acquiring corresponding hand position information according to the hand image;
the determining submodule is used for determining the moving direction and the gesture moving speed of the user gesture according to hand position information corresponding to the plurality of hand images acquired in real time;
and the moving sub-module is used for controlling the cursor to move according to the moving direction and the gesture moving speed.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory storing computer program instructions;
the processor, when executing the computer program instructions, implements the tagging method as described in any of the embodiments of the first aspect.
In a fourth aspect, the present application provides a computer storage medium having computer program instructions stored thereon, where the computer program instructions, when executed by a processor, implement the marking method as described in any one of the embodiments of the first aspect.
According to the marking method, the marking device, the marking equipment and the computer storage medium, the user gesture is determined through the acquired hand image of the user, the cursor is controlled to move along with the movement of the user gesture under the condition that the user gesture is matched with the preset marking gesture, and the passing path is marked, so that the user can perform marking operation through the user gesture without directly contacting any equipment, the operation mode is more flexible, and the use experience of the user is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below; those skilled in the art can derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic flow diagram illustrating a marking method in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a user gesture according to an exemplary embodiment;
FIG. 3 is a schematic diagram illustrating another user gesture in accordance with an illustrative embodiment;
FIG. 4 is a schematic flow diagram illustrating another marking method in accordance with an exemplary embodiment;
FIG. 5 is a schematic flow diagram illustrating yet another marking method according to an exemplary embodiment;
FIG. 6 is a schematic diagram illustrating the structure of a marking system in accordance with an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating the construction of a marking device according to an exemplary embodiment;
fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment.
Detailed Description
Features and exemplary embodiments of various aspects of the present application will be described in detail below, and in order to make objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative only and are not intended to be limiting. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by illustrating examples thereof.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between the entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
Fig. 1 shows a schematic flow chart of a marking method according to an embodiment of the present application.
As shown in fig. 1, the main execution body of the marking method may be a marking device, and specifically may include the following steps:
firstly, S110, acquiring a hand image of a user;
secondly, S120, determining the gesture of the user according to the hand image;
and finally, S130, under the condition that the user gesture is matched with the preset marking gesture, controlling the cursor to move along with the movement of the user gesture, and marking the passing path until the user gesture is switched to other gestures except for the marking gesture.
Therefore, the user gesture is determined through the acquired hand image of the user, and then under the condition that the user gesture is matched with the preset marking gesture, the cursor is controlled to move along with the movement of the user gesture, and the passing path is marked, so that the user does not need to directly contact any equipment, the marking operation can be carried out through the user gesture, the operation mode is more flexible, and the use experience of the user is improved.
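The three steps S110-S130 can be sketched as a minimal per-frame loop. The gesture labels and the `Canvas` stub below are hypothetical stand-ins for the capture and recognition components the embodiments describe; a real system would feed recognized gestures and hand positions from the image capture device.

```python
MARK_GESTURE = "ok_palm_forward"  # preset marking gesture (OK hand, palm to screen)

class Canvas:
    """Records the cursor position and the path marked while the gesture is held."""
    def __init__(self):
        self.cursor = (0, 0)
        self.marked_path = []

    def move_cursor(self, pos):
        self.cursor = pos

    def mark(self, pos):
        self.marked_path.append(pos)

def run_marking(frames, canvas):
    """frames: iterable of (gesture_label, hand_position) pairs, i.e. the
    per-frame output of S110 (capture) and S120 (gesture recognition)."""
    marking = False
    for gesture, pos in frames:
        if gesture == MARK_GESTURE:      # S130: gesture matches the preset
            marking = True
            canvas.move_cursor(pos)      # cursor follows the gesture
            canvas.mark(pos)             # mark the path it passes
        elif marking:
            break                        # switched to another gesture: stop marking
    return canvas.marked_path
```

The loop marks every position reached while the marking gesture is held and stops as soon as any other gesture is recognized, matching the "until the user gesture is switched" condition above.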
The above steps are described in detail below, specifically as follows:
referring to S110, in the embodiment of the present application, an image capturing device may be disposed on an electronic device integrated with the above-mentioned marking device, and a hand image of a user using the electronic device may be captured by the image capturing device. The image capturing device may be, for example, an image sensor such as a depth Of view (TOF) camera, a binocular vision camera, or a laser radar. When the user puts various gestures with the hand, the gestures can be embodied in the hand image, such as: an extension gesture image as shown in fig. 2, an OK gesture image as shown in fig. 3, and the like. The hand image may be used to determine user gestures.
Referring to S120, the current user gesture can be determined by recognizing the hand image of the user, for example, a stretched palm facing the display screen, as in the gesture image of fig. 2, or an OK hand shape with the palm facing the display screen, as in fig. 3. The recognized gesture can then be compared with the preset gestures to determine its type.
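The compare-with-preset step of S120 can be reduced to a lookup once the hand pose has been recognized. The labels below are hypothetical; a real system would derive them from a hand-keypoint or gesture-classification model rather than receive them directly.

```python
# Map a recognized hand pose label to one of the preset gesture types.
PRESET_GESTURES = {
    "open_palm_forward": "cursor_move",  # fig. 2: palm stretched, facing screen
    "ok_palm_forward": "mark",           # fig. 3: OK hand shape, facing screen
}

def gesture_type(recognized_label):
    """Return the preset gesture type, or 'other' for any unmatched gesture."""
    return PRESET_GESTURES.get(recognized_label, "other")
```

Any gesture that matches neither preset falls through to "other", which is what ends a cursor-move or marking behaviour in the embodiments below.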
Finally, referring to S130, the preset marking gesture is a preset gesture for controlling the cursor to mark the path it passes. For example, it may be an OK hand shape with the palm facing the display screen: the user holds this gesture unchanged while moving the hand, and when the marking apparatus captures the gesture and recognizes that it matches the preset marking gesture, the marking function is turned on, the cursor (for example, a brush) is controlled to follow the movement of the gesture, and the path it passes is marked, until the user gesture switches to another gesture.
Based on this, after the above S120, the marking method may further include:
and under the condition that the user gesture is matched with the preset cursor movement gesture, controlling the cursor to move along with the movement of the user gesture until the user gesture is switched into other gestures except the cursor movement gesture so as to move the cursor to the mark initial position.
Here, the preset cursor movement gesture may be a preset gesture for controlling the cursor to move following the movement of the user gesture.
In a specific example, the preset cursor movement gesture may be a gesture in which a palm extends away and a palm faces the display screen, the user keeps the gesture in which the palm extends away and the palm faces the display screen unchanged and moves at the same time, and when the marking device acquires the gesture and recognizes that the gesture is matched with the preset cursor movement gesture, the marking device controls the cursor to move along with the movement of the user gesture until the user gesture is switched to another gesture.
Therefore, through the process, the cursor can be moved to the mark initial position according to the cursor movement gesture, so that a user can control the cursor to move through the gesture without directly contacting any equipment.
In addition to the above S110-S130, in a possible embodiment, as shown in fig. 4, the above mentioned controlling the cursor to move along with the movement of the user gesture may specifically include:
and S410, acquiring corresponding hand position information according to the hand image.
Here, the hand position information may be the current position of the user's hand, which may be used to determine the real-time position of the user's gesture.
In a specific example, the hand position information may be a coordinate value of the hand, and the coordinate value corresponding to the hand position at the current time may be determined according to the acquired hand image of the user.
And S420, determining the moving direction and the gesture moving speed of the user gesture according to the hand position information corresponding to the plurality of hand images acquired in real time.
Here, the moving direction and the gesture moving speed of the user gesture may be directly determined according to the plurality of pieces of real-time hand position information, or the moving direction and the gesture moving speed of the user gesture may be determined according to the hand position information after smoothing processing by smoothing the plurality of pieces of acquired real-time hand position information.
Specifically, the real-time moving direction and speed of the current user gesture can be determined from any two pieces of hand position information.
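Determining direction and speed from two timestamped hand positions can be sketched as below. The 2-D coordinates and time units are illustrative assumptions; the patent does not fix a coordinate system.

```python
import math

def direction_and_speed(p0, t0, p1, t1):
    """Return (unit direction vector, scalar speed) between two hand samples
    (x, y) taken at times t0 and t1."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    dist = math.hypot(dx, dy)
    dt = t1 - t0
    if dist == 0 or dt <= 0:
        return (0.0, 0.0), 0.0  # no movement, or invalid timestamps
    return (dx / dist, dy / dist), dist / dt
```

With samples 0.5 s apart and 3 cm of upward movement, this yields an upward unit vector and a speed of 6 cm/s.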
In an optional implementation manner, determining the moving direction and the gesture moving speed of the user gesture may specifically include:
according to a preset smoothing algorithm, smoothing hand position information corresponding to a plurality of hand images acquired in real time respectively;
and determining the moving direction and the gesture moving speed of the user gesture according to the smoothed hand position information.
Here, the preset smoothing algorithm may include kalman filtering, particle filtering, curve fitting, and other algorithms, and may be configured to smooth hand position information corresponding to each of the plurality of hand images acquired in real time.
In a specific example, hand coordinates corresponding to a plurality of real-time hand images may be obtained, the hand coordinates are smoothed by a curve fitting algorithm, and a moving direction and a gesture moving speed of a user gesture are determined according to each smoothed hand coordinate.
Therefore, through the process, the moving direction and the gesture moving speed of the user gesture can be determined according to the position information of each hand after smoothing processing, so that the moving track of the cursor is smoother, and the user feels more comfortable.
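The smoothing step can be sketched with a simple moving average over the raw hand coordinates. The patent names Kalman filtering, particle filtering, and curve fitting as candidate algorithms; the windowed mean below is a deliberately simple stand-in for any of them, used only to illustrate the effect of smoothing on the position sequence.

```python
def smooth_positions(positions, window=3):
    """Smooth a list of (x, y) hand coordinates with a trailing moving average."""
    smoothed = []
    for i in range(len(positions)):
        lo = max(0, i - window + 1)            # trailing window over past samples
        xs = [p[0] for p in positions[lo:i + 1]]
        ys = [p[1] for p in positions[lo:i + 1]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

Each smoothed point averages the most recent samples, so jitter in the raw coordinates is damped and the resulting cursor trajectory is smoother.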
And S430, controlling the cursor to move according to the moving direction and the gesture moving speed.
Here, during the movement, the cursor may move in the same direction as the user gesture; it may move at the same speed as the gesture, or at a speed proportional to the gesture moving speed.
In an optional implementation, controlling the cursor to move may specifically include:
determining the cursor moving speed according to the gesture moving speed;
and controlling the cursor to move according to the moving direction and the moving speed of the cursor.
Here, the moving direction may be any direction that is the same as the user gesture moving direction, and the cursor moving speed may be a moving speed that is in a preset proportion to the gesture moving speed. The moving direction and the moving speed of the cursor can be used as the basis for controlling the movement of the cursor. Therefore, the effect of controlling the moving speed of the cursor through the moving speed of the user gesture can be achieved.
In a specific example, the moving direction may be upward, and the preset ratio of gesture moving speed to cursor moving speed may be 1:2; when the user gesture moves upward at 5 cm/s, the cursor moves upward at 10 cm/s.
Therefore, through the process, the movement of the cursor can be controlled according to the movement direction and the movement speed of the user gesture, so that the user can control the movement direction and the movement speed of the cursor through the gesture without directly contacting any equipment.
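The 1:2 speed mapping from the example above can be written out directly: the cursor moves in the gesture's direction at a preset multiple of the gesture speed. The ratio parameter is an assumption made configurable here; 2.0 reproduces the 5 cm/s to 10 cm/s case.

```python
def cursor_velocity(gesture_direction, gesture_speed, ratio=2.0):
    """gesture_direction: unit vector; gesture_speed: scalar (e.g. cm/s).
    Returns the cursor velocity vector under the preset speed ratio."""
    cursor_speed = gesture_speed * ratio
    return (gesture_direction[0] * cursor_speed,
            gesture_direction[1] * cursor_speed)
```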
To better describe the whole scheme, based on the above embodiments, as a specific example, as shown in fig. 5, the marking method may include S501 to S509, which is explained in detail below.
S501, acquiring a hand image of the user.
In a specific example, when the user wants to mark some content on the screen, it is first necessary to determine whether the cursor is at the intended starting point of the mark. If it is, the marking operation can be performed directly; if not, the cursor must first be moved to the starting point. For example, the user makes a gesture with the palm stretched and facing the display screen and moves the hand toward the intended starting point. The marking apparatus then captures the hand image of this stretched-palm gesture.
And S502, determining the user gesture as a cursor moving gesture according to the hand image.
In a specific example, the acquired hand image of the user with the palm stretched and the palm facing the display screen is compared with a preset cursor movement gesture, and if the cursor movement gesture is confirmed, real-time hand position information is acquired; and if the gesture is not the cursor movement gesture, continuously acquiring the hand image of the user.
And S503, acquiring real-time hand position information according to the hand images of the user.
In a specific example, hand position information of the user can be acquired in real time and used for determining a gesture movement track of the user.
And S504, processing the real-time hand position information to obtain a smooth gesture movement track.
In one specific example, the smoothing algorithm may be preset, such as: and performing smoothing treatment on the real-time hand position information by Kalman filtering, particle filtering, curve fitting and the like to obtain a smooth gesture movement track.
And S505, controlling the cursor to move according to the smooth gesture movement track until the gesture of the user is determined to be no longer the cursor movement gesture according to the hand image.
In a specific example, in the process of moving according to the smooth gesture movement trajectory, the direction of the cursor movement and the direction of the gesture movement are kept consistent, and the speed of the cursor movement and the speed of the gesture movement may be converted according to a certain mapping relationship.
After the cursor moves to the start point where the user wants to make the scribe mark, the user may change the gesture to another gesture to end the movement control of the cursor.
S506, determining the user gesture as a marking gesture according to the hand image, starting a marking function, and converting the cursor into a brush.
In one specific example, after the cursor moves to the starting point where the user wants to mark, the user can change the gesture to an OK hand shape with the palm facing the display screen and move along the intended marking path. The marking apparatus can capture hand images of this OK gesture with the palm facing the display screen.
Specifically, the acquired hand image of the user OK hand shape and the palm facing the display screen may be compared with a preset marking gesture, and if the hand image is confirmed to be the marking gesture, real-time hand position information is acquired; and if the gesture is not the mark gesture, continuously acquiring the hand image of the user.
And S507, acquiring real-time hand position information.
In a specific example, hand position information of the user can be acquired in real time and used for determining a gesture movement track of the user.
And S508, processing the real-time hand position information to obtain a smooth gesture movement track.
In one specific example, the smoothing algorithm may be preset, such as: and performing smoothing treatment on the real-time hand position information by Kalman filtering, particle filtering, curve fitting and the like to obtain a smooth gesture movement track.
And S509, moving the brush according to the smooth gesture moving track and marking a path through which the brush moves until the gesture of the user is determined to be no longer a marking gesture according to the hand image.
In a specific example, in the process of moving the brush according to the smooth gesture movement trajectory, the moving direction of the brush and the moving direction of the gesture are kept consistent, and the moving speed of the brush and the moving speed of the gesture can be converted according to a certain mapping relation.
After the marking is completed, the user may transform the gesture into another gesture other than the marking gesture to end the marking behavior. The marking device can acquire a hand image of a user to judge whether the user gesture is a marking gesture. Specifically, the acquired hand image of the user may be compared with a preset marking gesture, and if the marking gesture is confirmed, real-time hand position information is acquired; if the gesture is not the marking gesture, the marking is finished.
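The S501-S509 flow above can be sketched as a small state machine over the two preset gestures. The gesture labels match the hypothetical ones used earlier; each frame is assumed to carry a recognized gesture plus a (smoothed) hand position.

```python
CURSOR_MOVE = "open_palm_forward"  # stretched palm facing the screen
MARK = "ok_palm_forward"           # OK hand shape facing the screen

def run_session(frames):
    """frames: list of (gesture, position) per captured frame.
    Returns (final cursor position, marked path)."""
    cursor, path = (0, 0), []
    for gesture, pos in frames:
        if gesture == CURSOR_MOVE:   # S502-S505: move cursor toward mark start
            cursor = pos
        elif gesture == MARK:        # S506-S509: cursor becomes a brush
            cursor = pos
            path.append(pos)         # mark the path the brush passes
        # any other gesture ends the current behaviour (S505 / S509)
    return cursor, path
```

Holding the cursor-move gesture repositions the cursor without marking; switching to the marking gesture starts laying down the path; any other gesture leaves both behaviours idle.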
Therefore, the user gesture is determined through the acquired hand image of the user, and then under the condition that the user gesture is matched with the preset marking gesture, the cursor is controlled to move along with the movement of the user gesture, and the passing path is marked, so that the user does not need to directly contact any equipment, the marking operation can be carried out through the user gesture, the operation mode is more flexible, and the use experience of the user is improved.
Based on the above marking method, in one possible embodiment, there is a marking system, as shown in fig. 6, comprising: a gesture recognition module 610, a smoothing module 620, a cursor movement module 630, and a marking module 640.
The gesture recognition module 610 is configured to recognize a gesture type of an operator and output coordinates of a hand in real time. The module may specifically include: a gesture image acquisition sub-module 611 and an image information processing sub-module 612. The gesture image acquisition sub-module 611 may select image acquisition devices such as TOF, binocular vision, and laser radar to acquire hand images; the image information processing sub-module 612 may be configured to perform gesture recognition processing on the acquired hand image according to a preset processing algorithm, and output position coordinates of the hand. The image information processing sub-module 612 may be built in a processor of the display system, or may be a single processor and output a recognition result.
The smoothing module 620 may be configured to optimize the real-time hand position coordinates to obtain a smooth gesture movement trajectory. The smoothing module 620 may perform smoothing on the real-time hand three-dimensional position coordinate value output by the image information processing submodule 612 by using algorithms such as kalman filtering, particle filtering, curve fitting, and the like, so as to obtain a smooth gesture movement trajectory in real time.
The cursor moving module 630 may be configured to control a cursor in the display system to move. When the system is in a cursor moving state, for example, when the current gesture of the user is recognized as a cursor moving gesture, the user keeps the cursor moving gesture and moves, and the cursor moves along with the cursor moving gesture.
The marking module 640 can be used to control a brush tool in the display system to implement the marking function. When the system is in the marking state, for example when the user's current gesture is recognized as a marking gesture, the on-screen cursor changes to a brush. The user holds the marking gesture and moves; the brush follows, and a mark is formed along the path the brush passes.
Therefore, the user gesture is determined through the acquired hand image of the user, and then under the condition that the user gesture is matched with the preset marking gesture, the cursor is controlled to move along with the movement of the user gesture, and the passing path is marked, so that the user does not need to directly contact any equipment, the marking operation can be carried out through the user gesture, the operation mode is more flexible, and the use experience of the user is improved.
Based on the same inventive concept, the application also provides a marking device. The marking device provided by the embodiment of the present application will be described in detail with reference to fig. 7.
FIG. 7 is a schematic diagram illustrating the structure of a marking device according to an exemplary embodiment.
As shown in fig. 7, the marking device may include:
an obtaining module 701, configured to obtain a hand image of a user;
a determination module 702 for determining a user gesture from the hand image;
the marking module 703 is configured to, when the user gesture matches a preset marking gesture, control the cursor to move along with the movement of the user gesture, and mark a passing path until the user gesture is switched to another gesture other than the marking gesture.
In one embodiment, the apparatus may further comprise:
and the moving module 704 is configured to, under the condition that the user gesture matches the preset cursor movement gesture, control the cursor to move along with the movement of the user gesture until the user gesture is switched to another gesture other than the cursor movement gesture, so as to move the cursor to the mark start position.
In one embodiment, the moving module 704 may specifically include:
the acquisition sub-module is used for acquiring corresponding hand position information according to the hand image;
the determining submodule is used for determining the moving direction and the gesture moving speed of the user gesture according to hand position information corresponding to the plurality of hand images acquired in real time;
and the moving sub-module is used for controlling the cursor to move according to the moving direction and the gesture moving speed.
In one embodiment, the determining the sub-module may specifically include:
the processing unit is used for smoothing the hand position information corresponding to the hand images acquired in real time according to a preset smoothing algorithm;
and the first determining unit is used for determining the moving direction and the gesture moving speed of the user gesture according to the smoothed hand position information.
In one embodiment, the mobile sub-module may specifically include:
a second determining unit, configured to determine the cursor movement speed from the gesture movement speed;
a moving unit, configured to control the cursor to move according to the movement direction and the cursor movement speed.
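The application does not fix how cursor speed is derived from gesture speed; a linear gain with an upper cap is one plausible mapping, sketched below with assumed gain and cap values.

```python
def cursor_speed(gesture_speed, gain=1.5, max_speed=2000.0):
    """Map hand speed (px/s) to cursor speed (px/s).

    The linear gain and the cap are illustrative assumptions; the
    application only states that cursor speed is determined from
    gesture speed, not the exact relation.
    """
    return min(gesture_speed * gain, max_speed)
```

A gain above 1 lets a small hand motion cover a large screen, while the cap keeps a fast flick from flinging the cursor uncontrollably.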
In this way, the user gesture is determined from the acquired hand image of the user, and when the gesture matches the preset marking gesture, the cursor is controlled to follow the gesture's movement and mark the path it passes. The user can therefore perform marking operations by gesture alone, without directly touching any device, which makes the operation more flexible and improves the user experience.
Fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment.
As shown in fig. 8, the electronic device 8 may include a processor 801 and a memory 802 that stores computer program instructions.
Specifically, the processor 801 may include a Central Processing Unit (CPU) or an Application-Specific Integrated Circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
Memory 802 may include mass storage for data or instructions. By way of example, and not limitation, memory 802 may include a Hard Disk Drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, a Universal Serial Bus (USB) drive, or a combination of two or more of these. Memory 802 may include removable or non-removable (or fixed) media, where appropriate. Memory 802 may be internal or external to the integrated gateway device, where appropriate. In a particular embodiment, the memory 802 is a non-volatile solid-state memory. In a particular embodiment, the memory 802 includes Read-Only Memory (ROM). Where appropriate, the ROM may be mask-programmed ROM, Programmable ROM (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), Electrically Alterable ROM (EAROM), or flash memory, or a combination of two or more of these.
The processor 801 reads and executes the computer program instructions stored in the memory 802 to implement the method of the embodiments shown in fig. 1 or fig. 4 and achieve the corresponding technical effects; for brevity, the details are not repeated here.
In one embodiment, the electronic device 8 may also include a transceiver 803 and a bus 804. As shown in fig. 8, the processor 801, the memory 802, and the transceiver 803 are connected via a bus 804 to complete communication with each other.
Bus 804 includes hardware, software, or both. By way of example, and not limitation, the bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local bus (VLB), another suitable bus, or a combination of two or more of these. Bus 804 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the application.
The embodiment of the present application further provides a computer storage medium, in which computer program instructions are stored, and the computer program instructions are used to implement the marking method described in the embodiment of the present application.
In some possible embodiments, aspects of the methods provided by the present application may also be implemented as a program product including program code. When the program product runs on a computer device, the program code causes the computer device to perform the steps of the methods according to the various exemplary embodiments of the present application described above in this specification; for example, the computer device may perform the marking methods described in the embodiments of the present application.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable information processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable information processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable information processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable information processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A marking method, comprising:
acquiring a hand image of a user;
determining a user gesture from the hand image;
when the user gesture matches a preset marking gesture, controlling a cursor to move along with the movement of the user gesture and marking the path passed, until the user gesture is switched to a gesture other than the marking gesture.
2. The marking method as claimed in claim 1, characterized in that after determining a user gesture from the hand image, the marking method further comprises:
when the user gesture matches a preset cursor movement gesture, controlling the cursor to move along with the movement of the user gesture until the user gesture is switched to a gesture other than the cursor movement gesture, so as to move the cursor to a mark start position.
3. The marking method according to claim 1 or 2, wherein the controlling the cursor to move along with the movement of the user gesture comprises:
acquiring corresponding hand position information according to the hand image;
determining the movement direction and the gesture movement speed of the user gesture according to hand position information respectively corresponding to a plurality of hand images acquired in real time;
and controlling the cursor to move according to the moving direction and the gesture moving speed.
4. The marking method according to claim 3, wherein the determining the moving direction and the gesture moving speed of the user gesture according to the hand position information corresponding to the plurality of hand images acquired in real time comprises:
according to a preset smoothing algorithm, smoothing hand position information corresponding to a plurality of hand images acquired in real time respectively;
and determining the moving direction and the gesture moving speed of the user gesture according to the smoothed hand position information.
5. The marking method according to claim 3, wherein the controlling the cursor to move according to the moving direction and the gesture moving speed comprises:
determining the cursor moving speed according to the gesture moving speed;
and controlling the cursor to move according to the moving direction and the moving speed of the cursor.
6. A marking device, comprising:
the acquisition module is used for acquiring a hand image of a user;
a determination module for determining user gestures from the hand images;
a marking module, configured to, when the user gesture matches a preset marking gesture, control a cursor to move along with the movement of the user gesture and mark the path passed, until the user gesture is switched to a gesture other than the marking gesture.
7. The marking device of claim 6, further comprising:
a moving module, configured to, when the user gesture matches a preset cursor movement gesture, control the cursor to move along with the movement of the user gesture until the user gesture is switched to a gesture other than the cursor movement gesture, so as to move the cursor to the mark start position.
8. The marking device as claimed in claim 6 or 7, wherein the moving module comprises:
the acquisition sub-module is used for acquiring corresponding hand position information according to the hand image;
the determining submodule is used for determining the moving direction and the gesture moving speed of the user gesture according to hand position information corresponding to the hand images acquired in real time;
and the moving submodule is used for controlling the cursor to move according to the moving direction and the gesture moving speed.
9. An electronic device, characterized in that the electronic device comprises: a processor and a memory storing computer program instructions;
the processor, when executing the computer program instructions, implements the marking method of any one of claims 1-5.
10. A computer storage medium having computer program instructions stored thereon which, when executed by a processor, implement the marking method of any one of claims 1 to 5.
CN202011636320.8A 2020-12-31 2020-12-31 Marking method, device, equipment and computer storage medium Pending CN112684895A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011636320.8A CN112684895A (en) 2020-12-31 2020-12-31 Marking method, device, equipment and computer storage medium


Publications (1)

Publication Number Publication Date
CN112684895A true CN112684895A (en) 2021-04-20

Family

ID=75456433

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011636320.8A Pending CN112684895A (en) 2020-12-31 2020-12-31 Marking method, device, equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN112684895A (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102541084A (en) * 2010-12-21 2012-07-04 新奥特(北京)视频技术有限公司 Method for automatically drawing target point trajectory
US20120268372A1 (en) * 2011-04-19 2012-10-25 Jong Soon Park Method and electronic device for gesture recognition
US20140078076A1 (en) * 2012-09-18 2014-03-20 Adobe Systems Incorporated Natural Language Image Tags
CN103955277A (en) * 2014-05-13 2014-07-30 广州三星通信技术研究有限公司 Method and device for controlling cursor on electronic equipment
CN105302464A (en) * 2015-10-28 2016-02-03 北京京东尚科信息技术有限公司 Flow document scribing system and method
CN106125928A (en) * 2016-06-24 2016-11-16 同济大学 PPT based on Kinect demonstrates aid system
CN106909234A (en) * 2015-12-23 2017-06-30 小米科技有限责任公司 Method, control device, terminal and the device being marked to display screen
CN106990840A (en) * 2017-03-27 2017-07-28 联想(北京)有限公司 control method and control system
CN107463331A (en) * 2017-08-15 2017-12-12 上海闻泰电子科技有限公司 Gesture path analogy method, device and electronic equipment
CN108932053A (en) * 2018-05-21 2018-12-04 腾讯科技(深圳)有限公司 Drawing practice, device, storage medium and computer equipment based on gesture
CN111273778A (en) * 2020-02-14 2020-06-12 北京百度网讯科技有限公司 Method and device for controlling electronic equipment based on gestures
CN111382598A (en) * 2018-12-27 2020-07-07 北京搜狗科技发展有限公司 Identification method and device and electronic equipment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhou Tong: "Sports 3D Display Technology: Taking the Taiji *** Interactive Platform as an Example", 31 December 2016 *
Deng Jiahao, Ye Yong, Chen Huimin: "Capacitance Detection Principles and Applications", 30 April 2019 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113610944A (en) * 2021-07-30 2021-11-05 新线科技有限公司 Line drawing method, device, equipment and storage medium
WO2023005139A1 (en) * 2021-07-30 2023-02-02 新线科技有限公司 Line drawing method and apparatus, electronic device and computer-readable storage medium
CN113610944B (en) * 2021-07-30 2024-06-14 新线科技有限公司 Line drawing method, device, equipment and storage medium
CN114637439A (en) * 2022-03-24 2022-06-17 海信视像科技股份有限公司 Display device and gesture track recognition method

Similar Documents

Publication Publication Date Title
CN104076986B (en) A kind of method of toch control for multiple point touching terminal and equipment
CN106934333B (en) Gesture recognition method and system
US20110291926A1 (en) Gesture recognition system using depth perceptive sensors
CN112684895A (en) Marking method, device, equipment and computer storage medium
CN103353935A (en) 3D dynamic gesture identification method for intelligent home system
KR101631011B1 (en) Gesture recognition apparatus and control method of gesture recognition apparatus
TWI496094B (en) Gesture recognition module and gesture recognition method
JP2014119295A (en) Control device and portable terminal
TWI571772B (en) Virtual mouse driving apparatus and virtual mouse simulation method
KR20120026043A (en) Information processing device, information processing method, and program
US10035539B2 (en) Steering wheel control system
CN111414837A (en) Gesture recognition method and device, computer equipment and storage medium
CN109240494B (en) Control method, computer-readable storage medium and control system for electronic display panel
US20150193001A1 (en) Input device, apparatus, input method, and recording medium
CN105159494A (en) Information display method and device
CN105808129B (en) Method and device for quickly starting software function by using gesture
CN105242888A (en) System control method and electronic device
CN106598422B (en) hybrid control method, control system and electronic equipment
US9886085B2 (en) Image processing apparatus, image processing method, and program
CN106569716B (en) Single-hand control method and control system
JP2016167268A (en) Gesture modeling device, gesture modeling method, program for gesture modeling system, and gesture modeling system
JP6033061B2 (en) Input device and program
CN113282164A (en) Processing method and device
CN108073267B (en) Three-dimensional control method and device based on motion trail
US20140301603A1 (en) System and method for computer vision control based on a combined shape

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210420