CN102122345A - Finger gesture judging method based on hand movement variation - Google Patents


Info

Publication number
CN102122345A
Authority
CN
China
Prior art keywords
image
gesture
threshold value
finger gesture
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201110042470
Other languages
Chinese (zh)
Inventor
管业鹏
贾新丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN 201110042470 priority Critical patent/CN102122345A/en
Publication of CN102122345A publication Critical patent/CN102122345A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a finger gesture discrimination method based on hand movement variation, which automatically identifies a pointing gesture from the motion-variation characteristics of the arm. A foreground object is extracted by background subtraction, and the hand region of the foreground object is extracted by skin-color segmentation. When the user points at a target, the pointing arm changes from obvious motion to a relatively static state, and the pointing gesture is identified automatically from this change. The method requires no special hardware support, does not restrict the user's range of movement, and is convenient, flexible, and easy to implement.

Description

Finger gesture discrimination method based on hand movement variation
Technical field
The present invention relates to a finger gesture discrimination method based on hand movement variation, used for digital video image analysis and understanding, and belongs to the field of intelligent information processing.
Background technology
Human-computer interaction has long been an advanced topic in computer research. Early human-computer interaction relied mainly on hardware devices such as the keyboard and mouse to realize exchange and communication between people and computers. With the rapid development of computer technology, more and more researchers at home and abroad have turned to novel interaction techniques that better match natural human communication habits, namely human-computer interaction based on human biological features. Gesture is a natural, intuitive, easy-to-learn mode of interaction that provides a natural interactive channel between user and computer. However, since the human hand is a non-rigid body and gestures are diverse and ambiguous in both time and space, gesture recognition is difficult. Compared with general gestures, the pointing (finger) gesture is easy to understand. Pointing is the use of a finger in daily life to indicate a spatial target of interest; it is an important precursor of human language development and ontogeny, reveals human social intelligence, and is an ideal natural mode of interaction. Human-computer interaction based on the pointing gesture can make full use of everyday human skills and free the user from the input constraints of conventional devices (keyboard, mouse, touch screen, and the like). The key to effective pointing-gesture interaction is pointing-gesture discrimination.
The finger gesture discrimination methods currently available are mainly: (1) template matching, which is simple and easy to understand, but because the angle between the pointing hand and the camera varies, the gesture takes different poses, so multiple templates must be prepared and real-time performance is poor; (2) neural networks, which have strong classification ability and noise robustness, but handle temporal sequences poorly and are therefore mainly used for static gesture recognition; (3) hidden Markov models (HMM), which are time-scale invariant and can segment and classify automatically, but whose general topology makes training and recognition computationally expensive and requires evaluating a large number of state probability densities, making real-time operation difficult.
Summary of the invention
The object of the invention is to address the computational complexity and poor real-time performance of existing finger gesture discrimination methods by providing a discrimination method based on hand movement variation. When the user points at a target, the pointing arm exhibits an obvious motion-change characteristic, so pointing gestures can be discriminated under a variety of conditions.
To achieve the above object, the concept of the invention is as follows: extract the foreground object with background subtraction, extract the hand region of the foreground object with skin-color segmentation, and, exploiting the fact that when the user points at a target the pointing arm changes from obvious motion to a relatively static state, discriminate the pointing gesture automatically.
According to the above concept, the invention adopts the following technical scheme:
A finger gesture discrimination method based on hand movement variation. It discriminates the pointing gesture automatically from the motion-variation characteristics of the arm, with the following concrete steps:
1) Start the pointing-gesture image acquisition system: capture video images;
2) Obtain the background image
Continuously capture scene images that do not contain the user. If the difference between two images within a set time interval is less than a set image-difference threshold, take one image within that interval as the background image; otherwise capture again until two images within the set interval differ by less than the threshold;
3) Foreground object segmentation
Subtract the background image obtained in step 2) from the current frame captured by the camera to segment the foreground object region;
4) Extract the hand region;
5) Discriminate the pointing gesture.
The concrete operation steps for extracting the hand region in step 4) are as follows:
(1) Color space conversion, computing the color values Cr and Cb: from the red R, green G, and blue B components of the RGB color space, compute the YCbCr color values Cr and Cb:
Cr = 0.5×R − 0.4187×G − 0.0813×B
Cb = −0.1687×R − 0.3313×G + 0.5×B
(2) Skin-color region extraction: determine the Cr thresholds T1 and T2, the Cb thresholds T3 and T4, and the Cr/Cb ratio threshold T5; the region formed by all pixels satisfying the following formula is taken as the skin-color region S:
S = (T1 ≤ Cr ≤ T2) ∩ (T3 ≤ Cb ≤ T4) ∩ (Cr/Cb ≥ T5)
where ∩ is the logical AND operator;
(3) Extraction of the skin-color region of the possible pointing user: take the image region that simultaneously satisfies step 3) and step (2) as the skin-color region of the possible pointing user;
(4) Hand region extraction: search the binary image of step (3) for connected regions; compute the ratio Sl/Sw of connected-region height Sl to width Sw, the number of holes H in the connected region, and the connected-region size W; the region formed by all pixels satisfying the following formula is regarded as a non-hand region and is removed from the binary image of step (3):
F = (T6 ≤ Sl/Sw ≤ T7) ∩ (H > 1) ∩ (W < T8)
where T6 and T7 are the Sl/Sw ratio thresholds and T8 is the connected-region size threshold.
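As a concrete illustration of steps (1) and (2) above, the chroma computation and per-pixel skin test can be sketched in Python. This is a minimal sketch: the function names and the threshold defaults t1..t5 are illustrative assumptions, not the patent's calibrated values, and the Cr/Cb formulas are kept exactly as the patent writes them (without the conventional +128 offset).

```python
# Sketch of steps (1)-(2): compute Cr/Cb as defined above and apply the
# skin-color condition. Thresholds t1..t5 are illustrative assumptions.

def rgb_to_crcb(r, g, b):
    # Chroma components exactly as written in the patent (no +128 offset).
    cr = 0.5 * r - 0.4187 * g - 0.0813 * b
    cb = -0.1687 * r - 0.3313 * g + 0.5 * b
    return cr, cb

def is_skin(r, g, b, t1=10.0, t2=60.0, t3=-45.0, t4=-5.0, t5=-2.0):
    # S = (T1 <= Cr <= T2) AND (T3 <= Cb <= T4) AND (Cr/Cb >= T5)
    cr, cb = rgb_to_crcb(r, g, b)
    return t1 <= cr <= t2 and t3 <= cb <= t4 and cb != 0 and cr / cb >= t5
```

In practice the five thresholds would be calibrated on labeled skin samples; the defaults above merely make a typical skin tone pass and a saturated green fail.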
The concrete operation steps for discriminating the pointing gesture in step 5) are as follows:
(1) Divide the image plane by angle into 0°~360°;
(2) Moving hand region determination: for the hand region of step 4), count the number of frames Ma in which its motion direction stays continuously within the angular range A1 to A2; if Ma > M0, judge that this hand region is the moving hand region, where M0 is a frame-number threshold;
(3) Pointing action discrimination: for the hand region determined in step (2), count the number of consecutive frames Mb in which it subsequently remains static; if Mb > M1, judge that the user has performed a pointing action, where M1 is a frame-number threshold.
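The motion-then-still rule of step 5) can be sketched as a simple frame-counting loop. This is a hedged illustration, not the patent's implementation: the function name, the representation of a static frame as `None`, and the default values of a1, a2, m0, m1 are all assumptions.

```python
# Sketch of step 5): count consecutive frames whose motion direction lies
# in the angular band [a1, a2]; once that count exceeds m0, the region is a
# moving hand. A subsequent run of more than m1 static frames (None entries)
# then signals a pointing action. All defaults are illustrative.

def detect_pointing(motion_angles, a1=40.0, a2=140.0, m0=10, m1=5):
    """motion_angles: per-frame motion direction in degrees, or None when
    the hand region is static. Returns True once the motion-then-still
    pattern of a pointing gesture is observed."""
    moving = 0
    static = 0
    armed = False                      # True after sustained in-band motion
    for ang in motion_angles:
        if ang is not None and a1 <= ang <= a2:
            moving += 1
            static = 0
            if moving > m0:            # Ma > M0: this is a moving hand
                armed = True
        elif ang is None:
            static += 1
            if armed and static > m1:  # Mb > M1: pointing action occurred
                return True
        else:
            moving = 0                 # out-of-band motion resets the count
    return False
```

With the embodiment's values (band 40°~140°, M0 = 10, M1 = 5), eleven upward-moving frames followed by six static frames would trigger the detection.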
Principle of the present invention is as follows:
In the technical scheme of the invention, background subtraction provides relatively complete object characteristics: any perceptible target motion in the scene is reflected in changes of the scene image sequence, so the foreground object can be detected from the video image using the difference between the current image and the background image.
Within a time interval Δt, take the two frames f(tn-1, x, y) and f(tn, x, y) at the two moments tn-1 and tn, and compute their pixel-wise difference image Diff(x, y):
Diff(x, y) = | f(tn, x, y) − f(tn-1, x, y) |
where DiffR, DiffG, and DiffB are the red, green, and blue components of the difference image, and |f| denotes the absolute value of f.
If, within the time interval Δt, the two sequence images f(tn-1, x, y) and f(tn, x, y) satisfy
(DiffR < T) | (DiffG < T) | (DiffB < T)
where T is a threshold and | is the logical OR operator, this indicates that no object changed within Δt, so the image at any moment between tn-1 and tn can be taken as the background image.
Using the obtained background image and the currently captured frame, apply background subtraction to segment the foreground object region.
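The background-acquisition rule above can be sketched with plain Python lists standing in for frames. This is a minimal sketch under stated assumptions: frames are nested lists of (R, G, B) tuples, and the helper names and threshold default are illustrative. Note it requires every channel difference to be small, a slightly stricter reading than the logical-OR condition quoted above.

```python
# Sketch: per-pixel, per-channel absolute difference of two frames, and a
# static-scene test used to accept a frame as the background image.
# Frames are nested lists of (R, G, B) tuples; t is an illustrative threshold.

def diff_image(f1, f2):
    return [[tuple(abs(a - b) for a, b in zip(p1, p2))
             for p1, p2 in zip(row1, row2)]
            for row1, row2 in zip(f1, f2)]

def is_static(f1, f2, t=15):
    # Scene unchanged when every channel of every pixel differs by < t;
    # either frame can then serve as the background image.
    return all(c < t for row in diff_image(f1, f2) for px in row for c in px)
```

In the acquisition loop, two frames Δt apart are tested with `is_static`; on failure the capture is repeated until a static pair is found.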
Meanwhile, although skin color varies from person to person, in the YCbCr color space the human skin color shows good clustering, is insensitive to pose variation, and is robust against rotation, facial expression, and similar effects. In the YCbCr model, the Y component represents the luminance of the color, while the Cr and Cb components represent the red and blue chrominance respectively. The conversion from the RGB space to the YCbCr space is:
Y = 0.299×R + 0.587×G + 0.114×B
Cr = 0.5×R − 0.4187×G − 0.0813×B
Cb = −0.1687×R − 0.3313×G + 0.5×B
In the YCbCr space the skin-color distribution over Cb, Cr, and Cr/Cb lies within a stable range; that is, in the current image the pixels satisfying
(T1 ≤ Cr ≤ T2) ∩ (T3 ≤ Cb ≤ T4) ∩ (Cr/Cb ≥ T5)
are taken as the skin-color region of the current image, where ∩ is the logical AND operator. To overcome the influence of skin-like colors in the current image (such as wooden floors and wooden cabinets), the region that simultaneously satisfies this skin-color condition and belongs to the foreground object segmented by background subtraction is taken as the user object region.
To extract the finger region of the user object, face regions, which also satisfy both of the above conditions, must be excluded from the user object region. Because features such as the eyes, eyebrows, and lips do not have skin color, the extracted face region contains holes, and the height-to-width ratio of the facial skin area lies within a stable range; the influence of the face on finger region extraction is removed accordingly.
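The hole count H used to reject face regions can be computed with a flood fill on the binary mask. The sketch below is one plausible implementation, not the patent's: a hole is taken to be a background component not connected to the image border, and the function name is illustrative.

```python
# Sketch: count interior holes in a binary region mask (eyes, brows, and
# lips leave holes in a face region, so H > 1 flags a likely face).
# Pure-Python 4-connected flood fill on a 0/1 grid.

def count_holes(mask):
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]

    def flood(sy, sx):
        # Fill one background component; report whether it touches the border.
        stack = [(sy, sx)]
        touches_border = False
        while stack:
            y, x = stack.pop()
            if not (0 <= y < h and 0 <= x < w):
                continue
            if seen[y][x] or mask[y][x]:
                continue
            seen[y][x] = True
            if y in (0, h - 1) or x in (0, w - 1):
                touches_border = True
            stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
        return touches_border

    holes = 0
    for y in range(h):
        for x in range(w):
            if not seen[y][x] and not mask[y][x] and not flood(y, x):
                holes += 1      # enclosed background component = one hole
    return holes
```

A solid hand silhouette yields 0 holes, while a ring-shaped (face-like) region yields at least 1, matching the H > 1 rejection rule.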
When pointing during human-computer interaction, the arm extends naturally, lifts from bottom to top at a certain speed, pauses briefly in the air, and points at the target of interest. Therefore, when the user points at a target, the pointing arm changes from obvious motion to a relatively static state, and the pointing gesture is discriminated from this change.
Compared with the prior art, the invention has the following obvious substantive features and notable advantages. The invention exploits the hand-motion characteristics of the pointing gesture and the good clustering of human skin color in the YCbCr color space (despite individual variation), combined with background subtraction, to extract the hand region of the user object; since the pointing hand exhibits an obvious motion-change characteristic, the pointing gesture can be discriminated with computation that is simple, flexible, and easy to implement. This overcomes the shortcomings of existing pointing-gesture discrimination: the need for contact with external devices (such as gloves) or a simple constrained background, a single pointing subject, sensitivity to dynamic scene changes, large noise, and computational complexity. It improves the robustness of pointing-gesture discrimination and adapts to automatic discrimination under complex background conditions. The method of the invention is simple, flexible, and easy to implement.
Description of drawings
Fig. 1 is the flow chart of the method of the invention.
Fig. 2 is the original background image of an embodiment of the invention.
Fig. 3 is the current image of an embodiment of the invention.
Fig. 4 is the foreground object binary image segmented by background subtraction in an embodiment of the invention.
Fig. 5 is the skin-color binary image segmented in the YCbCr color space in an embodiment of the invention.
Fig. 6 is the skin-color region of the user object in an embodiment of the invention.
Fig. 7 is the extracted hand region binary image.
Fig. 8 is the extracted moving hand region binary image.
Embodiment
A preferred embodiment of the invention is described below with reference to the drawings. The flow of this embodiment is shown in Fig. 1. The original background image of this example is shown in Fig. 2 and the current image in Fig. 3. According to the arm-motion variation characteristic of the pointing behavior, the pointing gesture is discriminated for the color image shown in Fig. 3. The concrete steps of this finger gesture discrimination method based on hand movement variation are as follows:
1) Start the pointing-gesture image acquisition system: capture video images;
2) Obtain the background image
Continuously capture scene images that do not contain the pointing subject. If the difference between two images within a set time interval is less than a set image-difference threshold, take one image within that interval as the background image; otherwise capture again until two images within the set interval differ by less than the threshold;
3) Foreground object segmentation
Subtract the original background image (Fig. 2) from the current image (Fig. 3) to segment the foreground object region, as shown in Fig. 4;
4) Extract the hand region
The concrete operation steps are as follows:
(1) Color space conversion, computing the color values Cr and Cb: from the red R, green G, and blue B components of the RGB color space, compute the YCbCr color values Cr and Cb:
Cr = 0.5×R − 0.4187×G − 0.0813×B
Cb = −0.1687×R − 0.3313×G + 0.5×B
(2) Skin-color region extraction: determine the Cr thresholds, the Cb thresholds, and the Cr/Cb ratio threshold; the region formed by all pixels satisfying
S = (T1 ≤ Cr ≤ T2) ∩ (T3 ≤ Cb ≤ T4) ∩ (Cr/Cb ≥ T5)
is taken as the skin-color region S, as shown in Fig. 5, where ∩ is the logical AND operator;
(3) Extraction of the skin-color region of the possible pointing user: take the binary image region that simultaneously satisfies step 3) and step (2) as the skin-color region of the possible pointing user, as shown in Fig. 6;
(4) Hand region extraction: search the binary image of Fig. 6 for connected regions; compute the ratio Sl/Sw of connected-region height Sl to width Sw, the number of holes H in the connected region, and the connected-region size W; the region formed by all pixels satisfying
F = (T6 ≤ Sl/Sw ≤ T7) ∩ (H > 1) ∩ (W < T8)
is regarded as a non-hand region and is removed from the binary image of Fig. 6, yielding the hand region shown in Fig. 7;
5) Discriminate the pointing gesture
The concrete operation steps are as follows:
(1) Divide the image plane by angle into 0°~360°;
(2) Moving hand region determination: for the hand binary image region shown in Fig. 7, count the number of frames in which its motion direction stays continuously within the angular range 40°~140°; if this number is greater than 10, judge that this hand region is the moving hand region, as shown in Fig. 8;
(3) Pointing action discrimination: for the hand region shown in Fig. 8, count the number of consecutive frames in which it subsequently remains static; if it is greater than 5, judge that the user has performed a pointing action.
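The per-frame motion direction tested against the 40°~140° band in step (2) can be estimated from the displacement of the hand mask's centroid between frames. The patent does not specify this estimator, so the sketch below is an assumption: masks are 0/1 grids, the y axis is flipped so angles grow counterclockwise with "up" at 90°, and all names are illustrative.

```python
# Sketch: motion direction of the hand region between two frames, taken as
# the angle of the centroid displacement of its binary mask (0/1 grid).
import math

def centroid(mask):
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def motion_angle(mask_prev, mask_cur):
    """Direction of centroid motion in degrees within 0°-360°.
    Image rows grow downward, so y is negated to make 'up' = 90°."""
    (x0, y0), (x1, y1) = centroid(mask_prev), centroid(mask_cur)
    return math.degrees(math.atan2(-(y1 - y0), x1 - x0)) % 360.0
```

An upward-lifting arm then produces angles near 90°, inside the embodiment's 40°~140° band.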

Claims (3)

1. A finger gesture discrimination method based on hand movement variation, characterized in that the concrete steps are as follows:
1) Start the pointing-gesture image acquisition system: capture video images;
2) Obtain the background image
Continuously capture scene images that do not contain the user. If the difference between two images within a set time interval is less than a set image-difference threshold, take one image within that interval as the background image; otherwise capture again until two images within the set interval differ by less than the threshold;
3) Foreground object segmentation
Subtract the background image obtained in step 2) from the current frame captured by the camera to segment the foreground object region;
4) Extract the hand region;
5) Discriminate the pointing gesture.
2. The finger gesture discrimination method based on hand movement variation according to claim 1, characterized in that the concrete operation steps for extracting the hand region in step 4) are as follows:
(1) Color space conversion, computing the color values Cr and Cb: from the red R, green G, and blue B components of the RGB color space, compute the YCbCr color values Cr and Cb:
Cr = 0.5×R − 0.4187×G − 0.0813×B
Cb = −0.1687×R − 0.3313×G + 0.5×B
(2) Skin-color region extraction: determine the Cr thresholds T1 and T2, the Cb thresholds T3 and T4, and the Cr/Cb ratio threshold T5; the region formed by all pixels satisfying the following formula is taken as the skin-color region S:
S = (T1 ≤ Cr ≤ T2) ∩ (T3 ≤ Cb ≤ T4) ∩ (Cr/Cb ≥ T5)
where ∩ is the logical AND operator;
(3) Extraction of the skin-color region of the possible pointing user: take the image region that simultaneously satisfies step 3) and step (2) as the skin-color region of the possible pointing user;
(4) Hand region extraction: search the binary image of step (3) for connected regions; compute the ratio Sl/Sw of connected-region height Sl to width Sw, the number of holes H in the connected region, and the connected-region size W; the region formed by all pixels satisfying the following formula is regarded as a non-hand region and is removed from the binary image of step (3):
F = (T6 ≤ Sl/Sw ≤ T7) ∩ (H > 1) ∩ (W < T8)
where T6 and T7 are the Sl/Sw ratio thresholds and T8 is the connected-region size threshold.
3. The finger gesture discrimination method based on hand movement variation according to claim 1, characterized in that the concrete operation steps for discriminating the pointing gesture in step 5) are as follows:
(1) Divide the image plane by angle into 0°~360°;
(2) Moving hand region determination: for the hand region of step 4), count the number of frames Ma in which its motion direction stays continuously within the angular range A1 to A2; if Ma > M0, judge that this hand region is the moving hand region, where M0 is a frame-number threshold;
(3) Pointing action discrimination: for the hand region determined in step (2), count the number of consecutive frames Mb in which it subsequently remains static; if Mb > M1, judge that the user has performed a pointing action, where M1 is a frame-number threshold.
CN 201110042470 2011-02-22 2011-02-22 Finger gesture judging method based on hand movement variation Pending CN102122345A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110042470 CN102122345A (en) 2011-02-22 2011-02-22 Finger gesture judging method based on hand movement variation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110042470 CN102122345A (en) 2011-02-22 2011-02-22 Finger gesture judging method based on hand movement variation

Publications (1)

Publication Number Publication Date
CN102122345A true CN102122345A (en) 2011-07-13

Family

ID=44250899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110042470 Pending CN102122345A (en) 2011-02-22 2011-02-22 Finger gesture judging method based on hand movement variation

Country Status (1)

Country Link
CN (1) CN102122345A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106960418A (en) * 2016-01-11 2017-07-18 安鹤男 The algorithm that sleet is removed in video image
US10466797B2 (en) 2014-04-03 2019-11-05 Huawei Technologies Co., Ltd. Pointing interaction method, apparatus, and system
CN111308969A (en) * 2020-01-16 2020-06-19 浙江大学 Carrier motion mode discrimination method based on time domain differential characteristics

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Journal of Huazhong University of Science and Technology (Natural Science Edition), Oct. 2008, Li Ruifeng et al., "A hand gesture extraction method under complex background", pp. 80-82, vol. 36, cited for claims 1-3 *
TV Engineering (Dianshi Jishu), Jan. 2011, Jia Xinli et al., "Real-time discrimination method of pointing behavior based on motion variation", pp. 129-132, vol. 35, no. 01, cited for claims 1-3 *


Similar Documents

Publication Publication Date Title
CN101719015B (en) Method for positioning finger tips of directed gestures
CN107168527B (en) The first visual angle gesture identification and exchange method based on region convolutional neural networks
CN105528794B (en) Moving target detecting method based on mixed Gauss model and super-pixel segmentation
CN103530613B (en) Target person hand gesture interaction method based on monocular video sequence
CN103186775B (en) Based on the human motion identification method of mix description
CN103530892B (en) A kind of both hands tracking based on Kinect sensor and device
CN106598226A (en) UAV (Unmanned Aerial Vehicle) man-machine interaction method based on binocular vision and deep learning
CN110032932B (en) Human body posture identification method based on video processing and decision tree set threshold
CN105046206B (en) Based on the pedestrian detection method and device for moving prior information in video
CN110956099B (en) Dynamic gesture instruction identification method
Song et al. Design of control system based on hand gesture recognition
CN103150019A (en) Handwriting input system and method
CN101201695A (en) Mouse system for extracting and tracing based on ocular movement characteristic
Pandey et al. Hand gesture recognition for sign language recognition: A review
CN112906550B (en) Static gesture recognition method based on watershed transformation
CN103020614B (en) Based on the human motion identification method that space-time interest points detects
CN103218601B (en) The method and device of detection gesture
CN112329646A (en) Hand gesture motion direction identification method based on mass center coordinates of hand
CN110135237A (en) A kind of gesture identification method
Alksasbeh et al. Smart hand gestures recognition using K-NN based algorithm for video annotation purposes
CN103456012B (en) Based on visual human hand detecting and tracking method and the system of maximum stable area of curvature
CN102073878B (en) Non-wearable finger pointing gesture visual identification method
CN102122345A (en) Finger gesture judging method based on hand movement variation
CN103077383B (en) Based on the human motion identification method of the Divisional of spatio-temporal gradient feature
CN112199015B (en) Intelligent interaction all-in-one machine and writing method and device thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20110713