CN104899591A - Wrist point and arm point extraction method based on depth camera - Google Patents

Wrist point and arm point extraction method based on depth camera

Info

Publication number
CN104899591A
CN104899591A (application CN201510336118.6A; granted publication CN104899591B)
Authority
CN
China
Prior art keywords
point
wrist
arm
hand
palm
Prior art date
Legal status
Granted
Application number
CN201510336118.6A
Other languages
Chinese (zh)
Other versions
CN104899591B (en)
Inventor
潘志庚
郭双双
张明敏
罗江林
Current Assignee
Jilin Jidong Culture and Art Group Co., Ltd.
Original Assignee
Jilin Jiyuan Space-Time Animation Game Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jilin Jiyuan Space-Time Animation Game Technology Co Ltd
Priority to CN201510336118.6A
Publication of CN104899591A
Application granted
Publication of CN104899591B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/56: Extraction of image or video features relating to colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107: Static hand or arm
    • G06V 40/11: Hand-related biometrics; Hand pose recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for extracting wrist points and arm points based on a depth camera, applied in the fields of virtual reality and augmented reality. The hand region is segmented from the color image and depth map captured by a Kinect depth camera together with the three-dimensional palm-center position provided by OpenNI/NITE (an open natural-interaction API); an accurate hand region is then extracted with a Bayesian skin-color model, and the wrist point and arm point are extracted by visual methods. Multiple human-computer interaction gestures are designed from the extracted wrist and arm points and the palm-center points provided by the NITE library. Since the Kinect depth camera appeared, many researchers have used it for gesture recognition, yet studies involving the wrist and arm are very few, and vision-based methods for extracting wrist points and arm points are rare. The extraction method of the invention is computationally light and simple to implement, and extracts wrist points and arm points in real time, stably, and accurately.

Description

Method for extracting wrist points and arm points based on a depth camera
Technical field
The present invention relates to a method for extracting wrist points and arm points based on a depth camera. Using the color image and depth map captured by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE (an open natural-interaction API), the hand region is first segmented, and the wrist point and arm point are then extracted by visual methods. From the extracted wrist point and arm point, together with the palm-center point provided by the NITE library, multiple human-computer interaction gestures can be designed, and the designed gestures can be applied to fields such as virtual reality and augmented reality.
Background technology
Since depth cameras such as the Kinect appeared, many researchers have used the Kinect for gesture recognition. These studies focus mainly on recognition based on fingertips and the palm center; work that also incorporates the wrist and arm and designs composite gestures from them is scarce. Yet the wrist plays a crucial role in gesture recognition: it is the hinge linking the arm to the fingers, so many wrist-driven actions inevitably require the coordination and participation of the arm. Accurate extraction of the wrist point and arm point is therefore essential to the accuracy of wrist- and arm-based gesture recognition.
Existing depth-camera gesture recognition methods concentrate on extracting fingertip points or palm-center points, and design gestures from the positions or number of these points; such gestures are rather limited in variety and in range of application. If the wrist point and arm point can also be extracted, gestures can be designed from fingertip, palm-center, wrist, and arm points in combination, giving both a wider variety of gestures and a broader range of uses. For example, an arm-waving gesture designed from the positions of the wrist point and arm point can be applied to a virtual ball game, and an arm-raising or arm-lowering gesture designed from the palm-center, wrist, and arm points can be used for virtual barbell lifting.
Summary of the invention
The object of the present invention is to provide a method for extracting wrist points and arm points based on a depth camera, solving the problems of the prior art described above. From the depth map provided by the Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE, the hand region is first segmented out; the resulting binary image is then used as a mask to cut the hand region out of the color image, and a skin-color model with a Bayes classifier is applied to the color image of the hand region to extract an accurate hand region. Within the accurate hand region, the wrist point and arm point are extracted by the wrist-point and arm-point extraction methods, and various gestures are then designed from the wrist point, arm point, and palm-center point.
The above object of the present invention is achieved through the following technical solution:
In the method for extracting wrist points and arm points based on a depth camera, the hand region is first segmented using the color image and depth map captured by the Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE; an accurate hand region is then extracted with a Bayesian skin-color model, and the wrist point and arm point are extracted by visual methods. Multiple human-computer interaction gestures are designed from the extracted wrist point and arm point and the palm-center point provided by OpenNI/NITE, and the designed gestures are applied to the fields of virtual reality and augmented reality.
The hand region segmentation is as follows: the hand region is first segmented in the two-dimensional image plane, then segmented in the depth direction; the segmented hand-region image is then used as a mask to cut the hand region out of the color image provided by the Kinect, and the accurate hand region is extracted with the Bayesian skin-color model.
The wrist point is extracted as follows: an end point of the wrist is first extracted; then, with this end point as one vertex of a rectangle diagonal and each point on the hand contour as the other, axis-aligned inscribed rectangles are constructed, and the center of the inscribed rectangle with the shortest diagonal is taken as the wrist point.
The arm point is extracted as follows: all inscribed circles of the contour are computed; among them, circles whose diameter exceeds a set threshold are selected (see the parameter training of the inscribed-circle diameter threshold) to prevent the circle from landing on a finger, the palm-center point and wrist point must lie outside the circle, and the center of the inscribed circle farthest from the palm-center point and wrist point is the arm point.
Three classes of gestures are designed from the positions of the palm-center point, wrist point, and arm point and the angles between their connecting lines; combined with virtual reality and augmented reality, the designed gestures are applied in those fields.
The beneficial effects of the present invention are as follows. Most existing methods for extracting the wrist and arm place auxiliary markers at the wrist or arm, for example a band of a color distinct from skin worn at the wrist, from which the wrist is then located. Methods that extract the wrist point and arm point by pure vision are rarely found, and in existing gesture-recognition research and applications, interaction designs that use wrist and arm points as tracking points are likewise rare. The present method is simple and computationally light, extracts the wrist point and arm point stably in real time, and applies gestures designed from the extracted wrist and arm points to the fields of virtual reality and augmented reality.
Brief description of the drawings
The accompanying drawings described here provide a further understanding of the present invention and form a part of this application; the illustrative embodiments of the invention and their description explain the invention and do not unduly limit it.
Fig. 1 is a flow chart of the extraction of the accurate hand region in the present invention;
Fig. 2 is the input color image for the extraction of the accurate hand region;
Fig. 3 is the depth map for the extraction of the accurate hand region;
Fig. 4 is the coarse hand region obtained by segmenting the depth map;
Fig. 5 is the color image corresponding to the coarse hand and arm region;
Fig. 6 is the accurate hand-region image obtained;
Fig. 7 is a result image of the wrist end-point extraction;
Fig. 8 is a schematic diagram of an inscribed rectangle in wrist-point extraction;
Fig. 9 is a schematic diagram of a non-inscribed rectangle in wrist-point extraction;
Fig. 10 is a schematic diagram of wrist points extracted for various gestures;
Fig. 11 is a schematic diagram of a non-inscribed circle in arm-point extraction;
Fig. 12 is a schematic diagram of an inscribed circle in arm-point extraction;
Fig. 13 is a schematic diagram of the palm-center point lying inside an inscribed circle;
Fig. 14 is a schematic diagram of the wrist point lying inside an inscribed circle;
Fig. 15 is the parameter-training plot for the inscribed-circle diameter threshold;
Fig. 16 shows arm points of various gestures obtained by computing inscribed circles;
Fig. 17 is a schematic diagram of the arm-stretched-flat gesture;
Fig. 18 is a schematic diagram of the fist-clenched arm-raised gesture;
Fig. 19 is a schematic diagram of the application of the arm-stretched-flat and fist-clenched arm-raised gestures;
Fig. 20 is a schematic diagram of the downward and upward wrist-bending gestures;
Fig. 21 is a schematic diagram of the application of the wrist-bending gestures;
Fig. 22 is a schematic diagram of the downward and upward arm-waving gestures;
Fig. 23 is a schematic diagram of the application of the arm-waving gesture.
Detailed description of the embodiments
The details of the present invention and its embodiments are further illustrated below in conjunction with the accompanying drawings.
Referring to Figs. 1 to 23, in the method of the present invention for extracting wrist points and arm points based on a depth camera, the hand region is first segmented out using the depth map provided by the Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE; the segmented binary image is then used as a mask to cut the hand region out of the color image, and a skin-color model with a Bayes classifier is applied to the color image of the hand region to extract an accurate hand region. Within the accurate hand region, the wrist point and arm point are extracted by the wrist-point and arm-point extraction methods, and various gestures are designed from the wrist point, arm point, and palm-center point.
1. Segmentation of the hand region
According to the three-dimensional palm-center coordinates provided by OpenNI/NITE, segmentation is first performed in the two-dimensional image plane: with the palm-center point as the center of a rectangle, a rectangle containing the hand and arm region is cut out.
Let (x0, y0) be the palm-center coordinates in the image plane and L the side length of the rectangle.
Segmentation in the depth direction: from the depth map provided by the Kinect, the depth value d0 of the palm-center point is extracted to delimit the hand and arm region. A pixel (x, y) with depth value d(x, y) belongs to the hand and arm region if

    |d(x, y) - d0| <= Δd

where d0 is the depth value of the tracked palm-center point, d(x, y) is the depth value at pixel (x, y), and Δd is the maximum depth extent of the hand and arm, taken as 160 mm in the present invention; a mask records whether each pixel belongs to the hand and arm region.
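A minimal Python sketch of this two-stage segmentation is given below, assuming the depth map is a NumPy array in millimetres and the palm-center pixel and depth come from the OpenNI/NITE tracker; the function name and the default crop side length are our illustrative choices, since the text does not fix a value for L.

    import numpy as np

    def segment_hand(depth_map, palm_xy, palm_depth, side=200, max_range=160):
        """Binary mask of the hand/arm region: a 2-D crop around the palm
        point followed by the depth test |d(x, y) - d0| <= max_range."""
        h, w = depth_map.shape
        x0, y0 = palm_xy
        half = side // 2
        # 1) 2-D crop: axis-aligned rectangle of side L centred on the palm point
        x1, x2 = max(0, x0 - half), min(w, x0 + half)
        y1, y2 = max(0, y0 - half), min(h, y0 + half)
        mask = np.zeros((h, w), dtype=np.uint8)
        roi = depth_map[y1:y2, x1:x2].astype(np.int32)
        # 2) depth crop: keep pixels within max_range (160 mm) of the palm depth
        mask[y1:y2, x1:x2] = np.where(np.abs(roi - palm_depth) <= max_range, 255, 0)
        return mask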
Extraction of the skin-color region: noise in the Kinect data produces pronounced jagged edges around the hand and arm region, which would degrade the contour extraction that follows. Therefore, after the hand and arm region is obtained, a Bayesian skin-color model is applied to remove non-skin areas, yielding an accurate hand and arm region.
The present invention builds the skin-color model in the YCbCr color space because of its discreteness, its consistency with human visual perception, the separation of luminance and chrominance, and the compactness of the skin-color cluster. A Bayes classifier then computes, for each Cb*Cr component pair, the probability of skin; pairs with probability greater than 0.6 are treated as skin, and a skin-color lookup table is built. Given an image from which the skin-color region is to be extracted, the image is converted to the YCbCr color space and the Cb*Cr component of each pixel is looked up in the table to decide whether it is skin.
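One way to realize this Cb*Cr lookup table with a Bayes classifier is sketched below, assuming labelled skin and non-skin training pixels are available; the 0.6 posterior threshold is the value stated above, while the uniform prior and the helper names are our assumptions. Note that OpenCV stores the converted image in Y, Cr, Cb channel order.

    import cv2
    import numpy as np

    def build_skin_lut(skin_pixels, nonskin_pixels, p_skin=0.5, thresh=0.6):
        """skin_pixels / nonskin_pixels: (N, 2) arrays of (Cb, Cr) values."""
        def hist(px):
            h, _, _ = np.histogram2d(px[:, 0], px[:, 1],
                                     bins=256, range=[[0, 256], [0, 256]])
            return h / max(h.sum(), 1)          # P(Cb, Cr | class)
        h_skin, h_non = hist(skin_pixels), hist(nonskin_pixels)
        # Bayes posterior P(skin | Cb, Cr)
        denom = h_skin * p_skin + h_non * (1 - p_skin)
        post = np.divide(h_skin * p_skin, denom,
                         out=np.zeros_like(denom), where=denom > 0)
        return post > thresh                    # 256x256 boolean lookup table

    def apply_skin_lut(bgr_image, mask, lut):
        """Keep only masked pixels whose (Cb, Cr) is marked as skin."""
        ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
        cr, cb = ycrcb[:, :, 1], ycrcb[:, :, 2]  # channel order: Y, Cr, Cb
        skin = lut[cb.astype(int), cr.astype(int)]
        return np.where(skin & (mask > 0), 255, 0).astype(np.uint8)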
2. Extraction of the wrist point
2.1. The contour is extracted from the binary image of the hand and arm, and the convex hull of the contour is then computed. The convex hull consists of straight-line segments whose end points are intersection points of the hull with the contour.
2.2. Extraction of the wrist end point
Among the segments forming the convex hull, the longest segment is chosen and the contour arc corresponding to it is obtained; the distance from each point on this arc to the segment is computed, and the point P at maximum distance is taken: P is an end point of the wrist.
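A sketch of steps 2.1 and 2.2 using OpenCV's contour and convex-hull routines follows; the helper name is ours, and ties between equally long hull edges are broken arbitrarily.

    import cv2
    import numpy as np

    def wrist_end_point(binary_mask):
        """Wrist end point P: the contour point farthest from the longest
        convex-hull edge."""
        contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
        hull_idx = np.sort(cv2.convexHull(contour.reshape(-1, 1, 2),
                                          returnPoints=False).flatten())
        # hull edges join consecutive hull points along the contour
        pairs = list(zip(hull_idx, np.roll(hull_idx, -1)))
        i, j = max(pairs,
                   key=lambda p: np.linalg.norm(contour[p[0]] - contour[p[1]]))
        arc = (contour[i:j + 1] if i < j
               else np.vstack([contour[i:], contour[:j + 1]]))
        a, b = contour[i].astype(float), contour[j].astype(float)
        seg = np.linalg.norm(b - a)
        # perpendicular distance from every arc point to the edge a-b
        d = np.abs((b[0] - a[0]) * (arc[:, 1] - a[1])
                   - (b[1] - a[1]) * (arc[:, 0] - a[0])) / max(seg, 1e-9)
        return tuple(int(v) for v in arc[np.argmax(d)])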
2.3. Extraction of the wrist point
With the extracted wrist end point as one vertex and any point on the contour as the other, the segment joining the two vertices is taken as the diagonal of an axis-aligned rectangle; the rectangle is constructed and tested for being inscribed in the contour, i.e., every point of the rectangle must lie on the contour or inside it. In this way all contour points are traversed, each paired with the wrist end point as the two diagonal vertices, rectangles are constructed, and the inscribed ones are collected.
Among all inscribed rectangles found, the one with the shortest diagonal is selected; its center is the wrist point.
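The inscribed-rectangle search might look like the sketch below, assuming the contour and wrist end point from the previous sketch. Testing every border pixel of each candidate rectangle is expensive, so this version samples the border at a fixed number of points, which is our simplification of the test described above.

    import cv2
    import numpy as np

    def wrist_point(contour, p_end, samples=32):
        """Centre of the shortest-diagonal axis-aligned rectangle inscribed
        in the contour with the wrist end point as one diagonal vertex."""
        contour_i = contour.reshape(-1, 1, 2).astype(np.int32)
        px, py = p_end
        best_center, best_diag = None, float("inf")
        for qx, qy in contour.reshape(-1, 2):
            x1, x2 = min(px, qx), max(px, qx)
            y1, y2 = min(py, qy), max(py, qy)
            if x2 - x1 < 2 or y2 - y1 < 2:
                continue  # degenerate rectangle
            xs, ys = np.linspace(x1, x2, samples), np.linspace(y1, y2, samples)
            border = np.vstack([
                np.column_stack([xs, np.full(samples, y1)]),   # top edge
                np.column_stack([xs, np.full(samples, y2)]),   # bottom edge
                np.column_stack([np.full(samples, x1), ys]),   # left edge
                np.column_stack([np.full(samples, x2), ys]),   # right edge
            ])
            # inscribed test: every sampled border point on or inside the contour
            if all(cv2.pointPolygonTest(contour_i, (float(x), float(y)), False) >= 0
                   for x, y in border):
                diag = np.hypot(x2 - x1, y2 - y1)
                if diag < best_diag:
                    best_diag, best_center = diag, ((x1 + x2) / 2, (y1 + y2) / 2)
        return best_center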
3. Extraction of the arm point
The wrist is a narrow region, so the feasible area for the wrist point is small. The arm, by contrast, is a sizable region, so the arm point can be chosen much more flexibly. The inscribed circles of the contour are computed; the diameter of a circle must exceed a set threshold (preventing the circle from landing on a finger), and the palm-center point and wrist point must not lie inside the circle (preventing the circle from landing on the palm or wrist). Among all qualifying inscribed circles, the one farthest from the palm-center point and wrist point is chosen, and its center is the arm point.
3.1. Computing the inscribed circles of the contour
Take a point on the contour as the base point and join it by a segment to another contour point; with the midpoint of the segment as center and the segment length as diameter, construct a circle and test the distance from every contour point to the center: if one or more contour points lie closer to the center than the radius, the circle is not inscribed.
The next contour point is then joined to the base point, the midpoint of the segment is taken as the center and the distance between the two points as the diameter, and the circle is tested again, until all contour points have been paired with the base point; the next contour point then becomes the base point, circles are again constructed with every contour point, and inscribed circles are collected, until every contour point has served as the base point.
3.2. Computing the arm point
For each inscribed circle, the distances from its center to the palm-center point and to the wrist point are computed; if either distance is less than the radius, the palm-center point or wrist point lies inside the circle and the circle is rejected. Otherwise the center, the radius, and the distances from the center to the palm-center point and wrist point are recorded. Among all qualifying inscribed circles, the circle whose diameter exceeds the threshold and whose center is farthest from the palm-center point and wrist point is selected; its center is the arm point.
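The sketch below follows the brute-force chord construction of section 3.1 together with the filtering of section 3.2. The contour subsampling step is our addition for speed, "farthest from the palm-center point and wrist point" is read here as the largest sum of the two distances (our interpretation), and the diameter threshold is the trained value from section 3.3 below.

    import numpy as np

    def arm_point(contour, palm, wrist, diameter_thresh, step=8):
        """Centre of the qualifying inscribed circle farthest from the
        palm-center and wrist points."""
        all_pts = contour.reshape(-1, 2).astype(float)
        pts = all_pts[::step]                      # subsample chord endpoints
        palm, wrist = np.asarray(palm, float), np.asarray(wrist, float)
        best_center, best_dist = None, -1.0
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                center = (pts[i] + pts[j]) / 2.0
                radius = np.linalg.norm(pts[i] - pts[j]) / 2.0
                if 2.0 * radius <= diameter_thresh:
                    continue                       # circle would fit on a finger
                d_palm = np.linalg.norm(center - palm)
                d_wrist = np.linalg.norm(center - wrist)
                if d_palm < radius or d_wrist < radius:
                    continue                       # palm or wrist inside the circle
                # inscribed test from 3.1: no contour point closer than the radius
                if np.min(np.linalg.norm(all_pts - center, axis=1)) < radius - 1e-6:
                    continue
                if d_palm + d_wrist > best_dist:
                    best_dist, best_center = d_palm + d_wrist, center
        return best_center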
3.3. Parameter training of the inscribed-circle diameter threshold
The parameter-training procedure for the inscribed-circle diameter threshold is as follows. Within a selected depth range, one finger is extended at a time, kept straight and unbent, and moved slowly back and forth within that depth range. During the movement, the fingertip point is extracted by a contour-based method; with the extracted fingertip as center, a circle of preset radius r = 20 is drawn, which intersects the finger at two points, and the distance between the two intersection points is the width of the finger. Within the depth range, a different one of the five fingers is extended every so often and the test is repeated, extracting the widths of different fingers in that depth range. The present invention selects six depth ranges, outputs the finger-width values of each range to obtain the range of finger widths, and takes the middle of the width range plus 3 px as the threshold used in arm-point extraction, as in the lookup sketch after the table below.
Depth range | Finger-width range | Chosen threshold (mid width + 3 px)
[1001 mm, 1100 mm] | [10 px, 13 px] | 12 px + 3 px
[901 mm, 1000 mm] | [12 px, 16 px] | 14 px + 3 px
[801 mm, 900 mm] | [15 px, 20 px] | 17 px + 3 px
[701 mm, 800 mm] | [18 px, 23 px] | 20 px + 3 px
[601 mm, 700 mm] | [23 px, 26 px] | 24 px + 3 px
[501 mm, 600 mm] | [24 px, 28 px] | 26 px + 3 px
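A small helper transcribing the trained thresholds might look as follows; the tuples mirror the table above and the mid-width-plus-3 px rule from the training procedure, so this is a transcription sketch rather than a fresh calibration.

    # (min_depth_mm, max_depth_mm, threshold_px) with threshold = mid width + 3 px
    DIAMETER_THRESHOLDS = [
        (501, 600, 26 + 3),
        (601, 700, 24 + 3),
        (701, 800, 20 + 3),
        (801, 900, 17 + 3),
        (901, 1000, 14 + 3),
        (1001, 1100, 12 + 3),
    ]

    def diameter_threshold(depth_mm):
        """Inscribed-circle diameter threshold for a given palm depth."""
        for lo, hi, thresh in DIAMETER_THRESHOLDS:
            if lo <= depth_mm <= hi:
                return thresh
        return None  # outside the six calibrated depth ranges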
4. Design of composite gestures from the palm-center, wrist, and arm points
Let the palm-center point be P, the wrist point be W, and the arm point be A; let θ be the obtuse angle between line PW and line WA, and let φ be the angle between line WA and the horizontal axis of the coordinate system. Three classes of gestures are designed from θ, φ, and the positions of the palm-center point and arm point.
4.1. Design of the arm-stretched-flat and arm-raised gestures
When θ and φ lie within the preset ranges for a horizontal, straightened arm, the gesture is classified as the arm-stretched-flat gesture.
When θ and φ lie within the preset ranges for a raised forearm, the gesture is classified as the fist-clenched arm-raised gesture.
4.2. Design of the wrist-bending gestures
When the obtuse angle θ between line PW and line WA and the angle φ lie within the preset ranges for a downward bend, the gesture is the hand bending downward along the wrist; when they lie within the preset ranges for an upward bend, the gesture is the hand bending upward along the wrist.
4.3. Design of the arm-waving gestures
When φ lies within the preset range for a downward wave, the arm is waving downward.
When φ lies within the preset range for an upward wave, the arm is waving upward.
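The angle computation behind the three gesture classes can be sketched as below; the concrete angle thresholds are not given in this text, so they appear as parameters with purely illustrative defaults, and the classification rule shown is only an example of how θ and φ would be compared against the preset ranges.

    import numpy as np

    def gesture_angles(palm, wrist, arm):
        """theta: obtuse angle at the wrist between lines PW and WA (degrees);
        phi: inclination of line WA to the horizontal axis (degrees)."""
        palm, wrist, arm = (np.asarray(p, float) for p in (palm, wrist, arm))
        v1, v2 = palm - wrist, arm - wrist
        cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        theta = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
        phi = np.degrees(np.arctan2(abs(arm[1] - wrist[1]),
                                    abs(arm[0] - wrist[0])))
        return theta, phi

    def classify(theta, phi, straight_min=160.0, flat_phi_max=15.0,
                 raised_phi_min=60.0):
        # illustrative thresholds only; the patent presets its own ranges
        if theta >= straight_min and phi <= flat_phi_max:
            return "arm stretched flat"
        if theta >= straight_min and phi >= raised_phi_min:
            return "fist clenched, arm raised"
        if theta < straight_min:
            return "wrist bent"
        return "other"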
Embodiment 1: segmentation of the hand region
As shown in Figs. 1 and 2, the present invention segments the hand region using the depth map provided by the Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE, and then extracts the accurate hand region using the lookup table built by the Bayesian skin-color model.
Embodiment 2: extraction of the wrist point
As shown in Figs. 3 to 5, the contour of the hand region is first extracted and its convex hull computed; the contour arc corresponding to the longest hull segment is obtained, the distance from each point on this arc to the segment is computed, and the point at maximum distance from the segment is an end point of the wrist.
With the wrist end point as one vertex and any contour point as the other, the two vertices define the diagonal of an axis-aligned rectangle; each point of the rectangle is tested for lying on the contour or inside it, and a rectangle with any point outside the contour is not inscribed. Traversing every contour point in this way, rectangles are constructed on diagonals ending at the wrist end point and the inscribed ones are collected; the inscribed rectangle with the shortest diagonal is selected, and its center is the wrist point.
Embodiment 3: extraction of the arm point
As shown in Figs. 6 to 8, a point on the contour is taken as the base point and joined by a segment to another contour point; with the midpoint of the segment as center and the segment length as diameter, a circle is constructed, and the distance from every contour point to the center is tested: if one or more contour points lie closer to the center than the radius, the circle is not inscribed.
The next contour point is then joined to the base point and the circle on that segment is tested in the same way, until all contour points have been paired with the base point; the next contour point then becomes the base point, and the process repeats until every contour point has served as the base point, collecting all inscribed circles.
For each inscribed circle, the distances from its center to the palm-center point and to the wrist point are computed; if either distance is less than the radius, the point lies inside the circle and the circle is rejected. Otherwise the center, the radius, and the distances from the center to the palm-center point and wrist point are recorded. Among all qualifying inscribed circles, the circle whose diameter exceeds the threshold and whose center is farthest from the palm-center point and wrist point is selected; its center is the arm point.
Embodiment 4: design and application of gestures combining the palm-center, wrist, and arm points
As shown in Figs. 9 to 14, let the palm-center point be P, the wrist point be W, and the arm point be A; let θ be the obtuse angle between line PW and line WA, and φ the angle between line WA and the horizontal axis. Three classes of gestures are designed from θ, φ, and the positions of the palm-center point and arm point.
Gesture recognition is combined with augmented reality: the designed arm-stretched-flat and fist-clenched arm-raised gestures drive the stretching and raising of a virtual character's arm; the hand-bending-downward gesture makes the virtual hand model in the augmented-reality scene prepare to open a bottle, and the hand-bending-upward gesture opens the bottle; the arm-waving gestures drive the hand and the virtual character in the virtual scene to play table tennis.
The foregoing is only a preferred embodiment of the present invention and does not limit it; for those skilled in the art, the present invention admits various modifications and variations. Any modification, equivalent replacement, or improvement made to the present invention shall fall within its scope of protection.

Claims (1)

1. A method for extracting wrist points and arm points based on a depth camera, characterized in that: the hand region is first segmented using the color image and depth map captured by a Kinect depth camera and the three-dimensional palm-center position provided by OpenNI/NITE; an accurate hand region is then extracted with a Bayesian skin-color model, and the wrist point and arm point are extracted by visual methods; multiple human-computer interaction gestures are designed from the extracted wrist point and arm point and the palm-center point provided by OpenNI/NITE, and the designed gestures are applied to the fields of virtual reality and augmented reality;
the hand region segmentation is as follows: the hand region is first segmented in the two-dimensional image plane, then segmented in the depth direction; the segmented hand-region image is then used as a mask to cut the hand region out of the color image provided by the Kinect, and the accurate hand region is extracted with the Bayesian skin-color model;
the wrist point is extracted as follows: an end point of the wrist is first extracted; then, with this end point as one vertex of a rectangle diagonal and each point on the hand contour as the other, axis-aligned inscribed rectangles are constructed, and the center of the inscribed rectangle with the shortest diagonal is taken as the wrist point;
the arm point is extracted as follows: all inscribed circles of the contour are computed; among them, circles whose diameter exceeds a set threshold are selected to prevent the circle from landing on a finger, the palm-center point and wrist point must lie outside the circle, and the center of the inscribed circle farthest from the palm-center point and wrist point is the arm point.
CN201510336118.6A 2015-06-17 2015-06-17 Method for extracting wrist points and arm points based on a depth camera Active CN104899591B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510336118.6A CN104899591B (en) 2015-06-17 2015-06-17 Method for extracting wrist points and arm points based on a depth camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510336118.6A CN104899591B (en) 2015-06-17 2015-06-17 Method for extracting wrist points and arm points based on a depth camera

Publications (2)

Publication Number Publication Date
CN104899591A true CN104899591A (en) 2015-09-09
CN104899591B CN104899591B (en) 2018-01-05

Family

ID=54032245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510336118.6A Active CN104899591B (en) 2015-06-17 2015-06-17 Method for extracting wrist points and arm points based on a depth camera

Country Status (1)

Country Link
CN (1) CN104899591B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107358215A (en) * 2017-07-20 2017-11-17 重庆工商大学 A kind of image processing method applied to jewelry augmented reality system
CN108563329A (en) * 2018-03-23 2018-09-21 上海数迹智能科技有限公司 A kind of human arm position's parameter extraction algorithm based on depth map
CN108985242A (en) * 2018-07-23 2018-12-11 中国联合网络通信集团有限公司 The method and device of images of gestures segmentation
CN109344701A (en) * 2018-08-23 2019-02-15 武汉嫦娥医学抗衰机器人股份有限公司 A kind of dynamic gesture identification method based on Kinect
CN111354029A (en) * 2020-02-26 2020-06-30 深圳市瑞立视多媒体科技有限公司 Gesture depth determination method, device, equipment and storage medium
CN112036284A (en) * 2020-08-25 2020-12-04 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium
CN112949542A (en) * 2021-03-17 2021-06-11 哈尔滨理工大学 Wrist division line determining method based on convex hull detection

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062736A1 (en) * 2010-09-13 2012-03-15 Xiong Huaixin Hand and indicating-point positioning method and hand gesture determining method used in human-computer interaction system
CN103226387A (en) * 2013-04-07 2013-07-31 华南理工大学 Video fingertip positioning method based on Kinect
CN104063059A (en) * 2014-07-13 2014-09-24 华东理工大学 Real-time gesture recognition method based on finger division

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062736A1 (en) * 2010-09-13 2012-03-15 Xiong Huaixin Hand and indicating-point positioning method and hand gesture determining method used in human-computer interaction system
CN103226387A (en) * 2013-04-07 2013-07-31 华南理工大学 Video fingertip positioning method based on Kinect
CN104063059A (en) * 2014-07-13 2014-09-24 华东理工大学 Real-time gesture recognition method based on finger division

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107358215A (en) * 2017-07-20 2017-11-17 重庆工商大学 A kind of image processing method applied to jewelry augmented reality system
CN107358215B (en) * 2017-07-20 2020-10-09 重庆工商大学 Image processing method applied to hand ornament augmented reality system
CN108563329A (en) * 2018-03-23 2018-09-21 上海数迹智能科技有限公司 A kind of human arm position's parameter extraction algorithm based on depth map
CN108985242A (en) * 2018-07-23 2018-12-11 中国联合网络通信集团有限公司 The method and device of images of gestures segmentation
CN108985242B (en) * 2018-07-23 2020-07-14 中国联合网络通信集团有限公司 Gesture image segmentation method and device
CN109344701A (en) * 2018-08-23 2019-02-15 武汉嫦娥医学抗衰机器人股份有限公司 A kind of dynamic gesture identification method based on Kinect
CN109344701B (en) * 2018-08-23 2021-11-30 武汉嫦娥医学抗衰机器人股份有限公司 Kinect-based dynamic gesture recognition method
CN111354029A (en) * 2020-02-26 2020-06-30 深圳市瑞立视多媒体科技有限公司 Gesture depth determination method, device, equipment and storage medium
WO2021169704A1 (en) * 2020-02-26 2021-09-02 深圳市瑞立视多媒体科技有限公司 Method, device and apparatus for determining depth of gesture, and storage medium
CN112036284A (en) * 2020-08-25 2020-12-04 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium
CN112036284B (en) * 2020-08-25 2024-04-19 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium
CN112949542A (en) * 2021-03-17 2021-06-11 哈尔滨理工大学 Wrist division line determining method based on convex hull detection

Also Published As

Publication number Publication date
CN104899591B (en) 2018-01-05

Similar Documents

Publication Publication Date Title
CN104899591A (en) Wrist point and arm point extraction method based on depth camera
CN101807114B (en) Natural interactive method based on three-dimensional gestures
CN103226387B (en) Video fingertip localization method based on Kinect
CN104063059B (en) A kind of real-time gesture recognition method based on finger segmentation
CN103246891B (en) A kind of Chinese Sign Language recognition methods based on Kinect
CN103093196B (en) Character interactive input and recognition method based on gestures
CN109325398A (en) A kind of face character analysis method based on transfer learning
CN105913416A (en) Method for automatically segmenting three-dimensional human face model area
CN107168527A (en) The first visual angle gesture identification and exchange method based on region convolutional neural networks
CN107728792A (en) A kind of augmented reality three-dimensional drawing system and drawing practice based on gesture identification
CN103824253B (en) Figure five sense organ deformation method based on image local precise deformation
CN108052884A (en) A kind of gesture identification method based on improvement residual error neutral net
CN106055091A (en) Hand posture estimation method based on depth information and calibration method
CN107169455A (en) Face character recognition methods based on depth local feature
CN101216882A (en) A method and device for positioning and tracking on corners of the eyes and mouths of human faces
CN105930784A (en) Gesture recognition method
CN110210426A (en) Method for estimating hand posture from single color image based on attention mechanism
CN105068748A (en) User interface interaction method in camera real-time picture of intelligent touch screen equipment
CN101853523A (en) Method for adopting rough drawings to establish three-dimensional human face molds
CN109558902A (en) A kind of fast target detection method
CN104679242A (en) Hand gesture segmentation method based on monocular vision complicated background
CN103500010A (en) Method for locating fingertips of person through video
CN104517100B (en) Gesture pre-judging method and system
CN107092917A (en) A kind of Chinese-character stroke extraction method based on manifold learning
CN103995595A (en) Game somatosensory control method based on hand gestures

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 130012 No. 168 Boxue Road, Changchun High-tech Industrial Development Zone, Jilin Province

Applicant after: JILIN JIYUAN SPACE-TIME CARTOON GAME SCIENCE AND TECHNOLOGY GROUP CO., LTD.

Address before: No. 2888, Silicon Valley Avenue, Changchun High-tech Zone, Jilin Province

Applicant before: JILIN JIYUAN SPACE-TIME ANIMATION GAME TECHNOLOGY CO., LTD.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 130012 No. 168 Boxue Road, Changchun High-tech Industrial Development Zone, Jilin Province

Patentee after: Jilin Jidong Culture and Art Group Co., Ltd.

Address before: 130012 No. 168 Boxue Road, Changchun High-tech Industrial Development Zone, Jilin Province

Patentee before: JILIN JIYUAN SPACE-TIME CARTOON GAME SCIENCE AND TECHNOLOGY GROUP CO., LTD.

CP01 Change in the name or title of a patent holder