CN107316025B - Hand gesture recognition method and system - Google Patents

Hand gesture recognition method and system

Info

Publication number
CN107316025B
Authority
CN
China
Prior art keywords
hand
posture
data
standard
gesture
Prior art date
Legal status
Active
Application number
CN201710505926.XA
Other languages
Chinese (zh)
Other versions
CN107316025A (en)
Inventor
那日松
齐越
李楠
Current Assignee
Beijing Combanc Technology Co ltd
Original Assignee
Beijing Combanc Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Combanc Technology Co., Ltd.
Priority to CN201710505926.XA
Publication of CN107316025A
Application granted
Publication of CN107316025B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The hand gesture recognition method addresses the technical problem that high-accuracy hand gestures cannot be recognized because existing recognition methods have low robustness and are affected by the quality of depth image data. The method comprises the following steps: extracting depth local gradient feature data from a preliminary depth image to form hand preliminary posture feature data; passing the hand preliminary posture feature data through a hand posture classifier to form hand current posture data; and determining, through a storage index, hand standard posture data close to the hand current posture data and comparing them to determine the hand standard posture corresponding to the hand current posture. The invention overcomes the defect that depth data holes and noise in the depth image cause large errors in the classifier's prediction result. By means of the storage index, the selected standard hand postures are distributed near the predicted hand posture, which further improves the robustness of the recognition process. The invention also includes a hand gesture recognition system.

Description

Hand gesture recognition method and system
Technical Field
The invention relates to a computer recognition method and a computer recognition system for real objects, in particular to a hand gesture recognition method and a hand gesture recognition system.
Background
With the popularization of depth sensors and growing demand in the field of human-computer interaction, research on hand gesture recognition based on depth data has emerged in recent years. Compared with traditional hand gesture recognition based on RGB (red, green, blue) images, depth data provides three-dimensional information about the hand, which improves the robustness and accuracy of hand gesture recognition.
However, hand gesture recognition from depth data places high demands on depth image quality. Existing depth sensors are limited by their physical parameters, so the depth images they form dynamically are of poor quality and cannot fully meet the data input requirements of the classifier. As a result, the hand image obtained by the classifier contains a large amount of noise and even 'holes', which severely reduces the classifier's prediction accuracy.
Disclosure of Invention
In view of this, embodiments of the present invention provide a hand gesture recognition method and a hand gesture recognition system, which are used to solve the technical problem that a high-accuracy hand gesture cannot be recognized because existing recognition methods have low robustness and are affected by the quality of depth image data.
The hand gesture recognition method comprises the following steps:
extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data;
the hand preliminary posture characteristic data form hand current posture data through a hand posture classifier;
and determining hand standard posture data close to the hand current posture data through the storage index, and comparing to determine the hand standard posture corresponding to the hand current posture.
The hand gesture recognition system comprises the following functional modules:
the preliminary depth image generating device is used for acquiring a preliminary depth image of the current hand gesture through the depth sensor;
the hand preliminary posture characteristic data generating device is used for extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data;
the hand current posture data generating device is used for generating hand current posture data by the hand preliminary posture characteristic data through a hand posture classifier;
and the gesture comparison device is used for determining hand standard gesture data close to the hand current gesture data through the storage index and comparing and determining the hand standard gesture corresponding to the hand current gesture.
The hand gesture recognition system comprises a processor, wherein a program module arranged in the processor comprises:
the preliminary depth image generating device is used for acquiring a preliminary depth image of the current hand gesture through the depth sensor;
the hand preliminary posture characteristic data generating device is used for extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data;
the hand current posture data generating device is used for generating hand current posture data by the hand preliminary posture characteristic data through a hand posture classifier;
and the gesture comparison device is used for determining hand standard gesture data close to the hand current gesture data through the storage index and comparing and determining the hand standard gesture corresponding to the hand current gesture.
The hand gesture recognition method and recognition system avoid performing hand gesture recognition directly on hand preliminary posture feature data that suffers from noise and data defects, and thus overcome the defect that depth data holes and noise in the depth image cause large errors in the classifier's prediction result. With this hand gesture recognition method, even if the raw data collected by the depth sensor is of low quality because the hand moves too fast, the final prediction accuracy is not significantly reduced. The method exploits the property that, after high-dimensional data are mapped to a low-dimensional space, the similarity between high-dimensional data points should be preserved between the corresponding low-dimensional points; the storage index therefore ensures that the selected hand standard postures are distributed near the predicted hand posture, further improving the robustness of the recognition process.
Drawings
FIG. 1 is a flowchart of a hand gesture recognition method according to an embodiment of the present invention.
Fig. 2 is a flowchart of forming a standard depth image of a hand gesture recognition method according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a process of forming depth features of a standard depth image according to a hand gesture recognition method in an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a hand gesture recognition method according to an embodiment of the present invention.
Fig. 5 is a flowchart illustrating a preliminary depth image forming method according to the hand gesture recognition method of the embodiment of the present invention.
FIG. 6 is a flowchart illustrating a process of forming a current hand gesture according to a hand gesture recognition method of an embodiment of the present invention.
Fig. 7 is a flowchart of hand gesture comparison and replacement of the hand gesture recognition method according to the embodiment of the present invention.
FIG. 8 is a hand simulation gesture established during the process of utilizing the hand gesture recognition method of the embodiment of the present invention.
FIG. 9 is a standard depth image of hand simulated poses during a hand pose recognition method according to an embodiment of the present invention.
Fig. 10 is a visualization result of two-dimensional hand standard gesture feature data for dimension reduction in the process of using the hand gesture recognition method according to the embodiment of the present invention.
Fig. 11 is a visualization result of a two-dimensional hand standard pose feature data prediction result of a hand ROI by a hand pose classifier in the process of using the hand pose recognition method according to the embodiment of the present invention.
FIG. 12 is a hand ROI acquired during a hand gesture recognition method in accordance with an embodiment of the present invention.
FIG. 13 shows the predicted hand ROI results and the selected depth data corresponding to the nearest hand ROI gesture during the hand gesture recognition method according to the embodiment of the invention.
Fig. 14 shows the most similar result selected after the similarity degree is measured in the process of the hand gesture recognition method according to the embodiment of the present invention.
Fig. 15 is a schematic structural diagram of a hand gesture recognition system or a program module according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The step numbers in the figures are used only as reference numerals for the steps and do not indicate the execution order.
FIG. 1 is a flowchart of a hand gesture recognition method according to an embodiment of the present invention. The method steps shown in fig. 1 include:
step 100: and establishing a hand simulation gesture, and generating a standard depth image corresponding to the hand simulation gesture.
Step 200: and extracting depth local gradient characteristic data in the standard depth image to form hand standard posture characteristic data.
Step 300: training the classifier through the hand standard posture characteristic data to form a hand posture classifier, and establishing storage indexes of the hand standard posture data and the hand standard posture data through the hand posture classifier.
Step 400: and acquiring a preliminary depth image of the current hand gesture through a depth sensor.
Step 500: and extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data.
Step 600: the hand preliminary posture characteristic data forms hand current posture data through a hand posture classifier.
Step 700: and determining hand standard posture data close to the hand current posture data through the storage index, and comparing to determine the hand standard posture corresponding to the hand current posture.
According to the hand gesture recognition method provided by the embodiment of the invention, standard depth images with accurate depth information are established for various hand standard postures in an offline state, forming indexable hand standard posture data (which can, if necessary, be displayed as an intuitive image of the hand standard posture) and ensuring the standardization of the hand postures. In the online state, the low-quality depth image acquired by the depth sensor forms hand preliminary posture feature data (which can, if necessary, be displayed as an intuitive image of the hand preliminary posture), which is directly matched against the hand standard posture data of a subset of hand standard postures, and the most suitable hand standard posture data is selected by comparison to determine the corresponding hand standard posture. The hand gesture recognition method of the embodiment of the invention thus avoids performing hand gesture recognition directly on hand preliminary posture feature data with noise and data defects, and overcomes the defect that depth data holes and noise in the depth image cause large errors in the classifier's prediction result. With this method, even if the raw data acquired by the depth sensor is of low quality because the hand moves too fast, the final prediction accuracy is not significantly reduced. The method exploits the property that, after high-dimensional data are mapped to a low-dimensional space, the similarity between high-dimensional data points should be preserved between the corresponding low-dimensional points; the storage index therefore ensures that the selected hand standard postures are distributed near the predicted hand posture, further improving the robustness of the recognition process.
Fig. 2 is a flowchart of forming a standard depth image of a hand gesture recognition method according to an embodiment of the present invention. Step 100, shown in fig. 2, comprises:
step 110: and establishing a hand model.
A standard hand posture is established using the skeletal animation technique of computer graphics. Skeletal animation takes the skeletal characteristics of a living being as the basic model of a three-dimensional biological object, then fills and maps the basic model according to the individual characteristics of that object to form a three-dimensional object with those individual characteristics. The hand model may be formed by filling muscle objects and mapping skin texture objects onto a three-dimensional skeletal base model of the hand.
Step 120: and determining a hand model reference point and a Euclidean distance coordinate system.
In computer graphics, a two-dimensional or three-dimensional object consisting of points, lines and planes can be established in a three-dimensional coordinate space, and the specific three-dimensional coordinates of each object, such as the bones, are obtained by algorithms such as the Euclidean distance transform.
Step 130: and adjusting the motion angles of joints of the hand skeleton one by one to form the simulated postures of the hands.
The position of an object in the three-dimensional coordinate space can be changed along six degrees of freedom, and the overall shape of the object can be changed, by applying an appropriate offset or movement rule to the object's three-dimensional coordinates. The motion angle of each joint object in the three-dimensional skeletal base model is limited to the motion range of the corresponding physiological joint. Different hand postures can be formed by adjusting the specific orientation of each bone in the hand model. A single hand comprises 27 bones and more than 15 joints, so at least 1000 hand standard postures (including typical transitional postures) can be formed.
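As a rough, hedged illustration of this joint-by-joint enumeration, the Python sketch below samples joint angles within hypothetical physiological limits; the joint names, angle ranges, and step size are assumptions introduced for this example, and a real implementation would drive a rigged 3D hand model rather than return plain dictionaries of angles.

```python
import itertools

# Hypothetical flexion limits in degrees for a few joints; a real hand model
# has more than 15 joints, each with its own physiological range.
JOINT_LIMITS = {
    "index_mcp": (0, 90),
    "index_pip": (0, 110),
    "thumb_cmc": (0, 60),
}
ANGLE_STEP = 30  # coarse sampling step, illustrative only

def enumerate_poses():
    """Yield one dict of joint-angle assignments per simulated hand posture."""
    names = list(JOINT_LIMITS)
    ranges = [range(lo, hi + 1, ANGLE_STEP) for lo, hi in JOINT_LIMITS.values()]
    for combo in itertools.product(*ranges):
        yield dict(zip(names, combo))

poses = list(enumerate_poses())
print(len(poses), "simulated postures, e.g.", poses[0])
```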
Step 140: and rendering the hand model after each joint adjustment by taking the Euclidean distance as a parameter to obtain a standard depth image of each hand simulation posture.
Through the model rendering techniques of computer animation, the physical characteristics of objects such as muscles and skin can be converted into data, and a mapping is formed between the anchor points of these objects on the three-dimensional skeletal base model and their physical characteristic data. A rendered image of the corresponding hand posture is then obtained by changing the form of the three-dimensional skeletal base model.
A mapping is formed between the gray level and the Euclidean distance from each coordinate point on the hand standard posture to the reference point, so the depth information can be reflected by the gray-level variation of the standard depth image of each hand simulation posture.
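A minimal sketch of the distance-to-gray-level mapping just described, assuming a simple linear mapping in which points closer to the reference point are brighter; the mapping convention, the example distances, and the background handling are illustrative assumptions.

```python
import numpy as np

def distances_to_depth_image(distances):
    """Map per-pixel Euclidean distances to the reference point onto 8-bit gray levels.

    Closer points become brighter; NaN marks background and stays at 0.
    """
    d = np.asarray(distances, dtype=np.float64)
    mask = ~np.isnan(d)
    d_min, d_max = np.nanmin(d), np.nanmax(d)
    gray = np.zeros_like(d, dtype=np.uint8)
    gray[mask] = (255 * (d_max - d[mask]) / (d_max - d_min + 1e-9)).astype(np.uint8)
    return gray

# Example: a 4x4 patch of distances in millimetres, NaN = background.
patch = np.array([[500, 510, np.nan, np.nan],
                  [505, 515, 520, np.nan],
                  [510, 520, 530, 540],
                  [np.nan, 525, 535, 545]])
print(distances_to_depth_image(patch))
```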
Standard depth images are formed for all hand standard postures. In order to reflect the similarity between the hand standard postures before and after any joint change, a correlation label is formed between hand standard postures according to their similarity. The correlation label can be represented by vector data carrying a change direction or change angle, and then serves as an index parameter of the standard depth image.
In the hand gesture recognition method according to an embodiment of the present invention, coordinates of the light source and the visual angle may also be unified during the rendering process, and distance and depth information of each point of the hand within a fixed visual angle is obtained according to the brightness difference of the pixel points of each portion of the hand.
Fig. 3 is a flowchart illustrating a process of forming depth features of a standard depth image according to a hand gesture recognition method in an embodiment of the present invention. Step 200, shown in fig. 3, includes:
step 210: the depth value of each pixel is determined in each standard depth image using the depth information carried by the pixel.
The depth information may be a euclidean coordinate value of each pixel or a luminance value of each pixel.
Step 220: and forming depth local gradient values of the pixels in different directions and distances according to the depth values of the pixels.
The depth local gradient value refers to a depth difference value formed by quantizing depth information among pixels. Usually, the depth difference values of a central pixel and adjacent pixels or several spaced pixels of the same distance form depth gradient values of the central pixel in different directions and at different distances, i.e. depth gradient values in different direction dimensions and distance dimensions.
Specifically, the depth local gradient feature acquisition mode is as follows:
(Formula omitted: it appears in the original publication only as an image.) Here u_i represents the depth local gradient feature data of pixel point i, z(u) represents the depth value of a pixel point, and u is a random offset.
Step 230: and forming high-dimensional hand standard posture characteristic data of each standard depth image according to the depth local gradient value of each pixel in each dimension.
The multi-dimensional depth gradient values in the depth local gradient feature data reflect the attraction or repulsion relations among pixels in different directions and at different distances, forming a characteristic variation pattern of each hand standard posture over the specific depth attributes.
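The sketch below shows one way such per-pixel depth-difference features could be computed at random offsets and stacked into a high-dimensional feature vector per pixel; the number of offsets, the offset range, and the clamping at image borders are assumptions for illustration, and the patent's exact formula (given only as an image) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_gradient_features(depth, n_offsets=64, max_offset=30):
    """Per-pixel depth differences at random offsets (a sketch).

    For pixel i and offset u the feature is z(i + u) - z(i); stacking all
    offsets gives an n_offsets-dimensional feature vector per pixel.
    """
    h, w = depth.shape
    offsets = rng.integers(-max_offset, max_offset + 1, size=(n_offsets, 2))
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.empty((h, w, n_offsets), dtype=np.float32)
    for k, (dy, dx) in enumerate(offsets):
        yy = np.clip(ys + dy, 0, h - 1)   # clamp offsets at the image border
        xx = np.clip(xs + dx, 0, w - 1)
        feats[:, :, k] = depth[yy, xx] - depth
    return feats.reshape(h * w, n_offsets)

depth = rng.uniform(400, 800, size=(64, 64)).astype(np.float32)
print(local_gradient_features(depth).shape)  # (4096, 64)
```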
FIG. 4 is a flowchart illustrating a hand gesture recognition method according to an embodiment of the present invention. Step 300, shown in FIG. 4, includes:
step 310: and normalizing the hand standard posture characteristic data of each standard depth image and performing dimension reduction processing to form two-dimensional hand standard posture characteristic data.
Data normalization ensures measurement consistency of the hand standard posture feature data, completing the data standardization processing and making the data indicators comparable.
A dimension reduction algorithm such as t-SNE (t-distributed stochastic neighbor embedding) is adopted to form the two-dimensional hand standard posture feature data: the hand standard posture feature data is taken as input, dimension reduction is carried out by the t-SNE algorithm, and the high-dimensional hand standard posture feature data is reduced to two-dimensional hand standard posture feature data.
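A minimal sketch of the normalization and t-SNE reduction using scikit-learn; the feature matrix, its size, and the perplexity are illustrative assumptions rather than values from the patent.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

# Hypothetical high-dimensional standard-posture feature matrix:
# one row per standard posture, columns are depth local gradient features.
rng = np.random.default_rng(0)
features_hd = rng.normal(size=(1000, 64))

features_norm = StandardScaler().fit_transform(features_hd)   # normalization
features_2d = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features_norm)
print(features_2d.shape)  # (1000, 2)
```

Note that standard t-SNE has no out-of-sample transform, so embedding the online preliminary features in the same two-dimensional space would require a parametric variant or re-embedding; the patent does not specify which approach is used.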
Step 320: and taking the two-dimensional hand standard posture characteristic data of each standard depth image as input, training the classifier, and forming the hand posture classifier aiming at the hand model.
The classifier may be a random forest classifier or a deep convolutional network. For example, when a random forest classifier is adopted, the two-dimensional hand standard posture feature data is used as its input. Because the depth information of the pixels of each standard depth image is accurate and reliable, the algorithm's parameters can be fully tuned, the resulting classifier can be comprehensively tested on hand posture data, the reliability of its classification predictions can be fully verified, and the simulation test results are highly repeatable.
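A brief sketch of the classifier training step using scikit-learn's RandomForestClassifier; the two-dimensional feature points, the posture labels, and the hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# features_2d: two-dimensional standard-posture feature points (e.g. from the
# t-SNE step); pose_labels: the id of the standard posture each point belongs to.
features_2d = rng.normal(size=(1000, 2))
pose_labels = rng.integers(0, 50, size=1000)

clf = RandomForestClassifier(n_estimators=200, max_depth=12, random_state=0)
clf.fit(features_2d, pose_labels)
print(clf.predict(features_2d[:3]))  # predicted standard-posture ids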
Step 330: outputting the two-dimensional hand standard posture characteristic data of each standard depth image by a hand posture classifier to corresponding hand standard posture data, establishing vector labels among the hand standard posture data, and establishing storage indexes of the hand standard posture data by the vector labels.
A vector label of the hand standard posture data is formed from the correlation label of the hand standard posture, expressing the similarity of hand standard postures as a vector quantity. At the same time, the hand standard posture data is stored in a suitable data structure. For example, a kd-tree (k-dimensional tree) is used to store the hand standard posture data, with the vector labels used for indexing, so that similar hand standard postures can be retrieved quickly.
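A hedged sketch of the kd-tree storage and indexing using SciPy's cKDTree; the two-dimensional vector labels, the posture ids, and the query size are assumptions for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Each standard posture is stored as a low-dimensional point (its vector label);
# the associated posture id lets the rendered standard depth image be looked up later.
pose_points = rng.normal(size=(1000, 2))
pose_ids = np.arange(1000)

tree = cKDTree(pose_points)

# Query: the standard postures nearest to a predicted (current) posture point.
query = rng.normal(size=2)
dists, idx = tree.query(query, k=5)
print("nearest standard posture ids:", pose_ids[idx])
```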
Fig. 5 is a flowchart illustrating a preliminary depth image forming method according to the hand gesture recognition method of the embodiment of the present invention. Step 500, shown in FIG. 5, includes:
step 510: and determining the depth local gradient value of each pixel in different directions by using the depth information carried by the pixel in the preliminary depth image.
The depth local gradient value refers to a depth difference value formed by quantizing depth information among pixels. Usually, the depth difference values of one pixel and adjacent pixels or several spaced pixels with the same distance form gradient values of one central pixel in different directions and at different distances, i.e. depth gradient values in different direction dimensions and distance dimensions.
Specifically, the depth local gradient feature acquisition mode is as follows:
(Formula omitted: it appears in the original publication only as an image.) Here u_i represents the depth local gradient feature data of pixel point i, z(u) represents the depth value of a pixel point, and u is a random offset.
Step 520: and forming high-dimensional hand preliminary pose feature data of the preliminary depth image according to the depth local gradient data of multiple dimensions among the pixels.
The multi-dimensional depth local gradient feature data reflects the attraction or repulsion relations among pixels in different directions and at different distances, forming a characteristic variation pattern of the hand posture over the specific depth attributes.
FIG. 6 is a flowchart illustrating a process of forming a current hand gesture according to a hand gesture recognition method of an embodiment of the present invention. Step 600 shown in FIG. 6 includes:
step 610: and normalizing the hand preliminary posture characteristic data in the preliminary depth image and performing dimension reduction processing to form two-dimensional hand preliminary posture characteristic data.
Data normalization ensures measurement consistency of the hand preliminary posture feature data, completing the data standardization processing and making the data indicators comparable.
A dimension reduction algorithm such as t-SNE (t-distributed stochastic neighbor embedding) is adopted to form the two-dimensional hand preliminary posture feature data: the hand preliminary posture feature data is taken as input, dimension reduction is carried out by the t-SNE algorithm, and the high-dimensional hand preliminary posture feature data is reduced to two-dimensional hand preliminary posture feature data.
Step 620: outputting the two-dimensional hand initial posture characteristic data to corresponding hand current posture data through a hand posture classifier and forming a corresponding vector label.
The classifier may be a random forest classifier or a deep convolutional network. For example, when a random forest classifier is adopted, the two-dimensional posture feature data is used as its input. Because the depth information of the pixels of each standard depth image used in training is accurate and reliable, the algorithm's parameters have been fully tuned, the classifier has been comprehensively tested on hand posture data, the reliability of its classification results has been fully verified, and the simulation test results are highly repeatable.
Fig. 7 is a flowchart of hand gesture comparison and replacement of the hand gesture recognition method according to the embodiment of the present invention. Step 700 shown in fig. 7 includes:
step 710: and determining the index range of the standard hand posture data according to the vector label of the current hand posture data.
And when outputting the current hand posture data through the hand posture classifier, forming a vector label corresponding to the storage index of the hand standard posture data.
Step 720: and acquiring a plurality of hand standard posture data from the index range, and forming a corresponding test depth image set by using the hand standard posture data and the hand current posture data.
The vector label of the hand current posture data is used as a range parameter for retrieval in the storage index of the hand standard posture data, obtaining, for example, a set of k test depth images rendered from k posture data consisting of k-1 hand standard posture data (where k is the storage dimension of the hand standard posture data) and the hand current posture data.
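For illustration, a hedged sketch of how the test depth image set might be assembled; the kd-tree, the posture-id-to-rendered-image mapping, and the value of k are hypothetical structures introduced for this example, not structures defined in the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
# Assumed pre-built offline structures (names are illustrative):
tree = cKDTree(rng.normal(size=(1000, 2)))                       # index over vector labels
standard_depth = {i: rng.uniform(400, 800, size=(96, 96)) for i in range(1000)}  # id -> image

def build_test_set(current_label, current_depth, k=8):
    """Return the k-1 nearest standard depth images plus the current depth image."""
    _, idx = tree.query(current_label, k=k - 1)
    images = [standard_depth[int(i)] for i in np.atleast_1d(idx)]
    images.append(current_depth)
    return images

test_set = build_test_set(rng.normal(size=2), rng.uniform(400, 800, size=(96, 96)))
print(len(test_set), "images in the test depth image set")
```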
Step 730: and comparing the depth information in the preliminary depth image and the depth information in the test depth image set, and replacing the current hand posture with the hand simulation posture of the most approximate depth image.
The similarity degree measurement formula for comparing the preliminary depth image with the depth images in the test depth image set one by one is as follows:
(Formula omitted: it appears in the original publication only as an image.) Here Z represents the depth data acquired from the sensor, R represents the depth data obtained from the rendered image, z_{i,j} represents the depth value of a pixel point in the depth image of the hand current posture data, r_{i,j} represents the depth value of the corresponding pixel point in the rendered image, and p is the relative difference of corresponding pixel points.
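Because the similarity formula itself appears only as an image, the sketch below implements one plausible pixelwise measure consistent with the description: the fraction of corresponding valid pixels whose relative depth difference is below the threshold p. It is an assumption for illustration, not the patent's exact formula.

```python
import numpy as np

def similarity(z, r, p=0.05):
    """Pixelwise similarity between a sensor depth image z and a rendered image r."""
    z = np.asarray(z, dtype=np.float64)
    r = np.asarray(r, dtype=np.float64)
    valid = (z > 0) & (r > 0)                        # ignore depth holes / background
    rel_diff = np.abs(z[valid] - r[valid]) / np.maximum(z[valid], 1e-9)
    return float(np.mean(rel_diff < p)) if valid.any() else 0.0

# The candidate with the highest similarity replaces the current hand posture.
rng = np.random.default_rng(0)
z = rng.uniform(400, 800, size=(96, 96))
candidates = [z + rng.normal(scale=s, size=z.shape) for s in (5, 20, 80)]
best = max(range(len(candidates)), key=lambda i: similarity(z, candidates[i]))
print("most similar candidate:", best)
```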
In the hand gesture recognition method according to another embodiment of the present invention, in order to increase the accuracy of the final comparison and reduce the amount of data to be processed in the comparison, a hand ROI (region of interest) needs to be extracted from the standard depth image and the preliminary depth image.
As shown in fig. 4, further comprising step 340 performed before step 320:
step 340: the center and the boundary of the hand in the hand standard posture characteristic data are determined according to a Principal Component Analysis (PCA) algorithm, and two-dimensional hand standard posture characteristic data of the hand ROI are formed according to the boundary.
As shown in fig. 6, the method further comprises step 630, performed before step 620:
step 630: determining the center and the orientation of a hand in the hand preliminary gesture feature data according to a PCA algorithm, setting a fixed cubic space in the same orientation and comprising hand point data, and projecting the fixed cubic space to a two-dimensional plane in the same orientation; and forming two-dimensional hand preliminary posture characteristic data of the hand ROI according to the boundary on a two-dimensional plane.
The hand standard pose data formed by the steps subsequent to steps 340 and 630 above is the hand standard pose data of the hand ROI, and the hand current pose data is the hand current pose data of the hand ROI.
Therefore, as shown in fig. 7, the similarity measure formula for comparing the preliminary depth image with the depth images in the test depth image set one by one in step 730 is as follows:
(Formula omitted: it appears in the original publication only as an image.) Here Z_roi represents the hand ROI depth data acquired from the sensor, R_roi represents the hand ROI depth data obtained from the rendered image, z_{i,j} represents the depth value of a pixel point in the depth image of the hand current posture data, and r_{i,j} represents the depth value of a pixel point of the hand standard posture data.
FIGs. 8 to 11 (described above in the brief description of the drawings) illustrate the process of establishing the hand ROI standard posture data in the offline state. This process ensures that the hand ROI prediction data acquired from the sensor can be compared against a high-quality set of standard hand posture depth images.
FIGs. 12 to 14 (described above in the brief description of the drawings) illustrate the process of collecting hand ROI depth data in the online state to build the hand predicted posture data. By comparing the hand ROI standard posture data with the hand predicted posture data, the low prediction accuracy and poor robustness that result from relying only on low-quality hand predicted posture data are overcome, and the hand ROI posture selected by the comparison is consistent with the hand ROI posture acquired by the sensor.
Fig. 15 is a schematic structural diagram of a hand gesture recognition system or a program module according to an embodiment of the present invention. As shown in fig. 15, the following devices are included:
the standard depth image generating device 10 is used for establishing a hand simulation gesture and generating a standard depth image corresponding to the hand simulation gesture;
the hand standard posture characteristic data generating device 20 is used for extracting depth local gradient characteristic data in the standard depth image to form hand standard posture characteristic data;
the hand standard posture data indexing device 30 is used for training the classifier through hand standard posture characteristic data to form a hand posture classifier, and establishing storage indexes of the hand standard posture data and the hand standard posture data through the hand posture classifier;
the preliminary depth image generating device 40 is used for acquiring a preliminary depth image of the current hand gesture through a depth sensor;
a hand preliminary pose feature data generation means 50 for extracting depth local gradient feature data in the preliminary depth image to form hand preliminary pose feature data;
hand current posture data generating means 60 for generating hand current posture data from the hand preliminary posture feature data by means of a hand posture classifier;
and gesture comparison means 70 for determining hand standard gesture data close to the hand current gesture data by storing the index and comparing and determining the hand standard gesture corresponding to the hand current gesture.
As shown in fig. 15, a standard depth image generation device 10 in the hand gesture recognition system according to the embodiment of the present invention includes:
the model building module 11 is used for building a hand model;
the distance establishing module 12 is used for determining a hand model reference point and an Euclidean distance coordinate system;
the gesture simulation module 13 is used for adjusting the motion angles of joints of hand bones one by one to form hand simulation gestures;
the standard depth image generation module 14 is configured to render the hand model with the joints adjusted each time by using the euclidean distance as a parameter to obtain a standard depth image of each hand simulation posture;
as shown in fig. 15, a hand standard posture feature data generation device 20 in the hand posture recognition system according to the embodiment of the present invention includes:
a first pixel depth generating module 21, configured to determine a depth value of each pixel in each standard depth image by using depth information carried by the pixel;
a first depth gradient generating module 22, configured to form local depth gradient values of the pixels in different directions and distances according to the depth values of the pixels;
the standard gesture feature data generation module 23 is configured to form high-dimensional hand standard gesture feature data of each standard depth image according to the depth local gradient value of each pixel in each dimension;
as shown in fig. 15, the hand standard gesture data indexing device 30 in the hand gesture recognition system according to the embodiment of the present invention includes:
and a two-dimensional standard posture characteristic data generating module 31, configured to normalize and perform dimension reduction processing on the hand standard posture characteristic data of each standard depth image, so as to form two-dimensional hand standard posture characteristic data.
And the hand gesture classifier training module 32 is configured to train a classifier by taking the two-dimensional hand standard gesture feature data of each standard depth image as input, so as to form a hand gesture classifier for the hand model.
And the storage index generation module 33 is configured to output the two-dimensional hand standard posture feature data of each standard depth image by a hand posture classifier to corresponding hand standard posture data, establish a vector tag between the hand standard posture data, and establish a storage index of the hand standard posture data by the vector tag.
And the ROI two-dimensional standard posture characteristic data generating module 34 is used for determining the center and the boundary of the hand in the hand standard posture characteristic data according to a PCA algorithm and forming two-dimensional hand standard posture characteristic data of the hand ROI according to the boundary.
As shown in fig. 15, the hand preliminary pose feature data generation device 50 in the hand pose recognition system according to the embodiment of the present invention includes:
a second pixel depth generating module 51, configured to determine depth local gradient values of each pixel in different directions in the preliminary depth image by using the depth information carried by the pixel.
And a second depth gradient generation module 52, configured to form high-dimensional hand preliminary pose feature data of the preliminary depth image according to the depth local gradient data of multiple dimensions between pixels.
As shown in fig. 15, a hand current posture data generating device 60 in the hand posture recognition system according to the embodiment of the present invention includes:
and the two-dimensional hand preliminary posture characteristic data generating module 61 is used for normalizing the hand preliminary posture characteristic data in the preliminary depth image and performing dimension reduction processing to form two-dimensional hand preliminary posture characteristic data.
And a hand current posture data generation module 62, configured to output the two-dimensional hand preliminary posture feature data through the hand posture classifier to form corresponding hand current posture data and form a corresponding vector tag.
The ROI two-dimensional standard posture characteristic data generating module 63 is used for determining the center and the direction of a hand in the hand preliminary posture characteristic data according to a PCA algorithm, setting a fixed cubic space with the same direction and containing hand point data, and projecting the space to a two-dimensional plane with the same direction; and forming two-dimensional hand preliminary posture characteristic data of the hand ROI according to the boundary on a two-dimensional plane.
As shown in fig. 15, the gesture matching apparatus 70 in the hand gesture recognition system according to the embodiment of the present invention includes:
and the index range query module 71 is configured to determine an index range of the hand standard posture data according to the vector tag of the hand current posture data.
And the test depth image set generating module 72 is configured to obtain a plurality of hand standard posture data from the index range, and form a corresponding test depth image set by using the hand standard posture data and the hand current posture data.
And the depth information comparison module 73 is used for comparing the depth information in the preliminary depth image and the depth information in the test depth image set, and replacing the current hand posture with the hand simulation posture of the most approximate depth image.
Specific implementation and beneficial effects of the hand gesture recognition system in the embodiment of the invention can be seen in a hand gesture recognition method, and are not described herein again.
Fig. 15 is a schematic structural diagram of a hand gesture recognition system or a program module according to an embodiment of the present invention. As shown in fig. 15, the hand gesture recognition system of the embodiment of the present invention includes a processor, and program modules disposed in the processor include:
and a standard depth image generating device 10 for creating a hand simulation gesture and generating a standard depth image corresponding to the hand simulation gesture.
And the hand standard posture characteristic data generating device 20 is used for extracting the depth local gradient characteristic data in the standard depth image to form hand standard posture characteristic data.
And the hand standard posture data indexing device 30 is used for training the classifier through the hand standard posture characteristic data to form a hand posture classifier, and establishing storage indexes of the hand standard posture data and the hand standard posture data through the hand posture classifier.
And the preliminary depth image generating device 40 is used for acquiring a preliminary depth image of the current hand gesture through the depth sensor.
And the hand preliminary pose feature data generating device 50 is used for extracting the depth local gradient feature data in the preliminary depth image to form hand preliminary pose feature data.
And a hand current posture data generating device 60 for generating hand current posture data from the hand preliminary posture characteristic data by a hand posture classifier.
And gesture comparison means 70 for determining hand standard gesture data close to the hand current gesture data by storing the index and comparing and determining the hand standard gesture corresponding to the hand current gesture.
The standard depth image generation device 10 includes:
and the model establishing module 11 is used for establishing a hand model.
And the distance establishing module 12 is used for determining the hand model reference point and the Euclidean distance coordinate system.
And the gesture simulation module 13 is used for adjusting the motion angles of joints of the hand bones one by one to form each hand simulation gesture.
And the standard depth image generation module 14 is used for rendering the hand model with the joints adjusted each time by using the euclidean distance as a parameter to obtain a standard depth image of each hand simulation posture.
The hand standard posture feature data generation device 20 includes:
a first pixel depth generating module 21, configured to determine a depth value of each pixel in each standard depth image by using the depth information carried by the pixel.
A first depth gradient generating module 22, configured to form depth local gradient values of the pixels in different directions and distances according to the depth values of the pixels.
And a standard pose feature data generation module 23, configured to form high-dimensional hand standard pose feature data of each standard depth image according to the depth local gradient value of each pixel in each dimension.
The hand standard posture data indexing device 30 includes:
and a two-dimensional standard posture characteristic data generating module 31, configured to normalize and perform dimension reduction processing on the hand standard posture characteristic data of each standard depth image, so as to form two-dimensional hand standard posture characteristic data.
And the hand gesture classifier training module 32 is configured to train a classifier by taking the two-dimensional hand standard gesture feature data of each standard depth image as input, so as to form a hand gesture classifier for the hand model.
And the storage index generation module 33 is configured to output the two-dimensional hand standard posture feature data of each standard depth image by a hand posture classifier to corresponding hand standard posture data, establish a vector tag between the hand standard posture data, and establish a storage index of the hand standard posture data by the vector tag.
And the ROI two-dimensional standard posture characteristic data generating module 34 is used for determining the center and the boundary of the hand in the hand standard posture characteristic data according to a PCA algorithm and forming two-dimensional hand standard posture characteristic data of the hand ROI according to the boundary.
The hand preliminary posture characteristic data generation device 50 includes:
a second pixel depth generating module 51, configured to determine depth local gradient values of each pixel in different directions in the preliminary depth image by using the depth information carried by the pixel.
And a second depth gradient generation module 52, configured to form high-dimensional hand preliminary pose feature data of the preliminary depth image according to the depth local gradient data of multiple dimensions between pixels.
The hand current posture data generation device 60 includes:
and the two-dimensional hand preliminary posture characteristic data generating module 61 is used for normalizing the hand preliminary posture characteristic data in the preliminary depth image and performing dimension reduction processing to form two-dimensional hand preliminary posture characteristic data.
And a hand current posture data generation module 62, configured to output the two-dimensional hand preliminary posture feature data through the hand posture classifier to form corresponding hand current posture data and form a corresponding vector tag.
The ROI two-dimensional standard posture characteristic data generating module 63 is used for determining the center and the direction of a hand in the hand preliminary posture characteristic data according to a PCA algorithm, setting a fixed cubic space with the same direction and containing hand point data, and projecting the space to a two-dimensional plane with the same direction; and forming two-dimensional hand preliminary posture characteristic data of the hand ROI according to the boundary on a two-dimensional plane.
The posture alignment device 70 includes:
and the index range query module 71 is configured to determine an index range of the hand standard posture data according to the vector tag of the hand current posture data.
And the test depth image set generating module 72 is configured to obtain a plurality of hand standard posture data from the index range, and form a corresponding test depth image set by using the hand standard posture data and the hand current posture data.
And the depth information comparison module 73 is used for comparing the depth information in the preliminary depth image and the depth information in the test depth image set, and replacing the current hand posture with the hand simulation posture of the most approximate depth image.
Specific implementation and beneficial effects of the hand gesture recognition system in the embodiment of the invention can be seen in a hand gesture recognition method, and are not described herein again.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and the like that are within the spirit and principle of the present invention are included in the present invention.

Claims (10)

1. A hand gesture recognition method, comprising:
establishing a hand simulation gesture, and generating a standard depth image corresponding to the hand simulation gesture;
extracting depth local gradient characteristic data in the standard depth image to form hand standard posture characteristic data;
training a classifier through hand standard posture characteristic data to form a hand posture classifier, and establishing hand standard posture data and a storage index of the hand standard posture data through the hand posture classifier;
acquiring a preliminary depth image of the current gesture of the hand through a depth sensor;
extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data;
classifying the preliminary hand posture characteristic data by a hand posture classifier to form current hand posture data;
determining hand standard posture data close to the hand current posture data through the storage index and comparing to determine the hand standard posture corresponding to the hand current posture,
wherein the establishing a hand simulation gesture and the generating a standard depth image corresponding to the hand simulation gesture comprises:
establishing a hand model;
determining the hand model reference point and an Euclidean distance coordinate system;
adjusting the motion angles of the skeleton joints of the hand model one by one to form a hand simulation posture;
rendering the hand simulation gestures by taking the Euclidean distance as a parameter to obtain the standard depth image of each hand simulation gesture.
2. The hand gesture recognition method of claim 1, wherein the extracting depth local gradient feature data in the standard depth image to form hand standard gesture feature data comprises:
determining the depth value of each pixel in each standard depth image by using the depth information carried by the pixel;
forming depth local gradient values of the pixels in different directions and distances according to the depth values of the pixels;
and forming the hand standard posture characteristic data of the high dimension of each standard depth image according to the depth local gradient value of each dimension of each pixel.
3. The hand gesture recognition method of claim 1, wherein training a classifier with hand standard gesture feature data to form a hand gesture classifier, and establishing the hand standard gesture data and the stored index of the hand standard gesture data with a hand gesture classifier comprises:
normalizing the hand standard posture characteristic data of each standard depth image and performing dimension reduction processing to form two-dimensional hand standard posture characteristic data;
taking the two-dimensional hand standard posture characteristic data of each standard depth image as input, training a classifier, and forming a hand posture classifier aiming at a hand model;
outputting the two-dimensional hand standard posture characteristic data of each standard depth image to corresponding hand standard posture data through the hand posture classifier, establishing vector labels among the hand standard posture data, and establishing the storage index of the hand standard posture data through the vector labels.
4. The hand gesture recognition method of claim 1, wherein the extracting depth local gradient feature data in the preliminary depth image to form hand preliminary gesture feature data comprises:
determining a depth local gradient value of each pixel in different directions by using depth information carried by the pixel in the preliminary depth image;
forming the hand preliminary pose feature data of a high dimension of the preliminary depth image from the depth local gradient values of multiple dimensions between the pixels.
5. The hand gesture recognition method of claim 1, wherein the hand preliminary gesture feature data forming hand current gesture data by a hand gesture classifier comprises:
normalizing the hand preliminary gesture feature data in the preliminary depth image and performing dimension reduction processing to form two-dimensional hand preliminary gesture feature data;
and outputting the two-dimensional hand preliminary posture characteristic data to the corresponding hand current posture data through the hand posture classifier and forming a corresponding vector label.
6. A hand gesture recognition method as recited in claim 1 wherein said determining hand standard gesture data proximate to the hand current gesture data from the stored index and comparing the determined hand standard gesture corresponding to the hand current gesture comprises:
determining the index range of the hand standard posture data according to the vector label of the hand current posture data;
acquiring a plurality of pieces of hand standard posture data from the index range, and forming a corresponding test depth image set by using the hand standard posture data and the hand current posture data;
comparing the depth information in the preliminary depth image and the set of test depth images, and replacing the current hand pose with the hand simulation pose of the most approximate depth image.
7. The hand gesture recognition method of claim 1, wherein training a classifier with hand standard gesture feature data to form a hand gesture classifier, and establishing the hand standard gesture data and the stored index of the hand standard gesture data with a hand gesture classifier comprises:
determining the center and the boundary of a hand in the hand standard posture characteristic data according to a PCA algorithm, and forming two-dimensional hand standard posture characteristic data of a hand ROI according to the boundary;
taking the two-dimensional hand standard posture characteristic data of each hand ROI as input, training a classifier, and forming a hand posture classifier aiming at a hand model;
and outputting the corresponding hand standard posture data of each hand ROI two-dimensional hand standard posture characteristic data through the hand posture classifier, establishing vector labels among the hand standard posture data, and establishing the storage index of the hand standard posture data through the vector labels.
8. The hand gesture recognition method of claim 1, wherein the hand preliminary gesture feature data forming hand current gesture data by a hand gesture classifier comprises:
determining the center and the orientation of a hand in the hand preliminary gesture feature data according to a PCA algorithm, setting a fixed cubic space in the same orientation and comprising hand point data, and projecting the fixed cubic space to a two-dimensional plane in the same orientation;
forming two-dimensional hand preliminary posture characteristic data of a hand ROI on the two-dimensional plane according to a boundary;
and outputting the two-dimensional hand preliminary posture characteristic data of the hand ROI through the hand posture classifier to correspond to the current hand posture data and form a corresponding vector label.
9. A hand gesture recognition system, comprising:
a standard depth image generating device for establishing a hand simulation gesture and generating a standard depth image corresponding to the hand simulation gesture;
a hand standard posture characteristic data generating device for extracting depth local gradient characteristic data in the standard depth image to form hand standard posture characteristic data;
a hand standard posture data indexing device for training a classifier with the hand standard posture characteristic data to form a hand posture classifier, and establishing hand standard posture data and a storage index of the hand standard posture data through the hand posture classifier;
a preliminary depth image generating device for acquiring a preliminary depth image of the current hand posture through a depth sensor;
a hand preliminary posture characteristic data generating device for extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data;
a hand current posture data generating device for forming hand current posture data from the hand preliminary posture characteristic data through the hand posture classifier;
a gesture comparison device for determining hand standard posture data close to the hand current posture data through the storage index and comparing to determine the hand standard posture corresponding to the hand current posture,
wherein said establishing a hand simulation gesture and generating a standard depth image corresponding to the hand simulation gesture comprises:
establishing a hand model;
determining the hand model reference point and a Euclidean distance coordinate system;
adjusting the motion angles of the skeleton joints of the hand model one by one to form a hand simulation posture;
rendering the hand simulation gestures by taking the Euclidean distance as a parameter to obtain the standard depth image of each hand simulation gesture.
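As a rough illustration of the "wherein" clause, the sketch below renders a depth image from an already-posed hand model: each pixel stores the Euclidean distance from the model reference point (taken here as the camera origin) to the nearest model surface point, with a simple z-buffer. The point-sampled model, the pinhole focal length, and the image size are assumptions; posing the skeleton joint by joint (forward kinematics) is assumed to have produced `model_points` beforehand.

```python
import numpy as np

def render_standard_depth(model_points, reference_point, focal=525.0, size=(240, 320)):
    """Sketch of the standard depth image rendering: pixel values are Euclidean
    distances from the reference point to the posed hand model surface."""
    h, w = size
    depth = np.zeros((h, w))                           # 0 marks background

    # Points expressed relative to the reference point (camera origin).
    p = model_points - reference_point
    dist = np.linalg.norm(p, axis=1)                   # Euclidean distance parameter

    # Pinhole projection onto the image plane (z clamped to avoid division by zero).
    z = np.clip(p[:, 2], 1e-6, None)
    u = (focal * p[:, 0] / z + w / 2).astype(int)
    v = (focal * p[:, 1] / z + h / 2).astype(int)
    valid = (p[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Z-buffer: keep the smallest distance per pixel.
    for ui, vi, di in zip(u[valid], v[valid], dist[valid]):
        if depth[vi, ui] == 0 or di < depth[vi, ui]:
            depth[vi, ui] = di
    return depth
```

Sweeping the joint motion angles of the hand model one by one and calling a renderer such as this once per posture would produce the set of standard depth images the classifier is trained on.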
10. A hand gesture recognition system comprising a processor, wherein program modules deployed in the processor comprise:
a standard depth image generating device for establishing a hand simulation gesture and generating a standard depth image corresponding to the hand simulation gesture;
a hand standard posture characteristic data generating device for extracting depth local gradient characteristic data in the standard depth image to form hand standard posture characteristic data;
a hand standard posture data indexing device for training a classifier with the hand standard posture characteristic data to form a hand posture classifier, and establishing hand standard posture data and a storage index of the hand standard posture data through the hand posture classifier;
a preliminary depth image generating device for acquiring a preliminary depth image of the current hand posture through a depth sensor;
a hand preliminary posture characteristic data generating device for extracting depth local gradient characteristic data in the preliminary depth image to form hand preliminary posture characteristic data;
a hand current posture data generating device for forming hand current posture data from the hand preliminary posture characteristic data through the hand posture classifier;
a gesture comparison device for determining hand standard posture data close to the hand current posture data through the storage index and comparing to determine the hand standard posture corresponding to the hand current posture,
wherein said establishing a hand simulation gesture and generating a standard depth image corresponding to the hand simulation gesture comprises:
establishing a hand model;
determining the hand model reference point and a Euclidean distance coordinate system;
adjusting the motion angles of the skeleton joints of the hand model one by one to form a hand simulation posture;
rendering the hand simulation gestures by taking the Euclidean distance as a parameter to obtain the standard depth image of each hand simulation gesture.
CN201710505926.XA 2017-06-27 2017-06-27 Hand gesture recognition method and system Active CN107316025B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710505926.XA CN107316025B (en) 2017-06-27 2017-06-27 Hand gesture recognition method and system

Publications (2)

Publication Number Publication Date
CN107316025A (en) 2017-11-03
CN107316025B (en) 2021-04-06

Family

ID=60180321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710505926.XA Active CN107316025B (en) 2017-06-27 2017-06-27 Hand gesture recognition method and system

Country Status (1)

Country Link
CN (1) CN107316025B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108154176B (en) * 2017-12-22 2021-11-05 Beijing University of Technology 3D human body posture estimation algorithm for a single depth image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4142460B2 (en) * 2003-01-31 2008-09-03 Olympus Corporation Motion detection device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2821916A1 (en) * 2012-02-27 2015-01-07 NEC CASIO Mobile Communications, Ltd. Voice input device, voice input method and program
CN106030610A (en) * 2014-01-05 2016-10-12 Manomotion AB Real-time 3D gesture recognition and tracking system for mobile devices
CN104317391A (en) * 2014-09-24 2015-01-28 Huazhong University of Science and Technology Stereoscopic vision-based three-dimensional palm posture recognition interactive method and system
CN104517097A (en) * 2014-09-24 2015-04-15 Zhejiang University Kinect-based moving human body posture recognition method

Also Published As

Publication number Publication date
CN107316025A (en) 2017-11-03

Similar Documents

Publication Publication Date Title
CN108345869B (en) Driver posture recognition method based on depth image and virtual data
CN108549873B (en) Three-dimensional face recognition method and three-dimensional face recognition system
Rogez et al. Mocap-guided data augmentation for 3d pose estimation in the wild
CN106682598B (en) Multi-pose face feature point detection method based on cascade regression
CN108052942B (en) Visual image recognition method for aircraft flight attitude
US9189855B2 (en) Three dimensional close interactions
CN106096542B (en) Image video scene recognition method based on distance prediction information
JP2016161569A (en) Method and system for obtaining 3d pose of object and 3d location of landmark point of object
CN104392223B (en) Human posture recognition method in two-dimensional video image
CN108182397B (en) Multi-pose multi-scale human face verification method
CN110852182B (en) Depth video human body behavior recognition method based on three-dimensional space time sequence modeling
CN110147767A (en) Three-dimension gesture attitude prediction method based on two dimensional image
CN114758362B (en) Clothing changing pedestrian re-identification method based on semantic perception attention and visual shielding
CN113012122B (en) Category-level 6D pose and size estimation method and device
KR20060057627A (en) Object posture estimation/correlation system using weight information
JP2003346162A (en) Input system by image recognition of hand
US20200057778A1 (en) Depth image pose search with a bootstrapped-created database
Zeng et al. Examplar coherent 3D face reconstruction from forensic mugshot database
CN112634125A (en) Automatic face replacement method based on off-line face database
CN111428689A (en) Multi-pool information fusion human face image feature extraction method
CN107479693A (en) Real-time hand recognition methods based on RGB information, storage medium, electronic equipment
CN110751097A (en) Semi-supervised three-dimensional point cloud gesture key point detection method
CN111274944A (en) Three-dimensional face reconstruction method based on single image
CN116958420A (en) High-precision modeling method for three-dimensional face of digital human teacher
CN106933976B (en) Method for establishing human body 3D net model and application thereof in 3D fitting

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant