CN104392211A - Skin recognition method based on saliency detection - Google Patents


Info

Publication number
CN104392211A
Authority
CN
China
Prior art keywords
skin
value
image
detected
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410638233.4A
Other languages
Chinese (zh)
Inventor
张伟
傅松林
张长定
叶志鸿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XIAMEN MEITUWANG TECHNOLOGY Co Ltd
Original Assignee
XIAMEN MEITUWANG TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XIAMEN MEITUWANG TECHNOLOGY Co Ltd
Priority to CN201410638233.4A
Publication of CN104392211A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/162Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a skin recognition method based on saliency detection. The method performs face detection and saliency detection on the image to be detected to obtain the face skin region, computes an average skin color and a skin color probability mapping table from that region, and finally performs skin color recognition on the image to be detected according to the skin color probability mapping table to obtain the skin color probability result map of the image to be detected. The method has the advantages that no large training data set has to be collected, so the recognition result is not affected by bias in training data; and because saliency detection is performed, non-skin regions within the face region can be excluded, so the recognition accuracy is higher and the misrecognition caused by images that are too bright or too dark is overcome.

Description

A skin recognition method based on saliency detection
Technical field
The present invention relates to image recognition methods, and in particular to a skin recognition method based on saliency detection.
Background art
The goal of skin recognition is to automatically identify the skin regions of the human body in an image, so that beautification operations can then be applied to the identified skin regions. In the prior art, skin recognition of the human body in an image suffers from a high false recognition rate, the time-consuming and cumbersome need to build a database, a complicated implementation, and low program efficiency.
Chinese invention patent application 201110185739.0 discloses an illumination-adaptive human skin color detection method in the technical field of pattern recognition and image processing. A training database is collected and used to train a basic skin color model and illumination models; the pixels of the image to be detected are screened with the basic skin color model; the illumination model closest to the image to be detected is selected from the illumination models; the image to be detected and the basic skin color model are corrected with the selected model; and the corrected image to be detected is detected with the corrected basic skin color model and the result is output.
However, because the above technical scheme builds its skin color model from a database, it has inherent limitations. When the training data in the database are biased toward dark images, the darker parts of a bright photograph are automatically recognized as skin, which raises the false recognition rate; when the training data are biased toward bright images, a large part of the skin in a dark photograph is not recognized, and the false recognition rate is also high; and even when the training data are balanced, skin that is noticeably darker or brighter still cannot be recognized.
Summary of the invention
To solve the above problems, the present invention provides a skin recognition method based on saliency detection whose algorithm is simple, whose operational efficiency is high and whose recognition rate is high.
To achieve the above object, the technical solution adopted by the present invention is:
A skin recognition method based on saliency detection, characterized by comprising the following steps:
10. performing face detection on the image to be detected to obtain the face region, and performing saliency detection on the face region to obtain the most salient area of the face, i.e. the face skin region;
20. computing the mean of said face skin region to obtain the average skin color;
30. computing a skin color probability mapping table for the image to be detected according to said average skin color;
40. performing skin color recognition on the image to be detected according to said skin color probability mapping table, and obtaining the skin color probability result map of the image to be detected (an illustrative sketch of this pipeline is given after these steps).
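For orientation only, the following C-style sketch shows how the four steps above might be chained together. Every type and function name in it (Image, Rect, detect_face, detect_saliency, compute_average_skin, build_probability_table, map_skin_probability) is an illustrative assumption and not part of the disclosed method; later sketches flesh out some of these helpers.

/* Orientation sketch of steps 10-40; all names are illustrative assumptions. */
typedef struct { int width, height; unsigned char *rgb; } Image;   /* interleaved RGB */
typedef struct { int x, y, w, h; } Rect;

Rect detect_face(const Image *img);                                      /* step 10: face detection */
void detect_saliency(const Image *img, Rect face, unsigned char *mask);  /* step 10: face skin region */
void compute_average_skin(const Image *img, Rect face, const unsigned char *mask,
                          int *skinRed, int *skinGreen, int *skinBlue);  /* step 20 */
void build_probability_table(int skinRed, int skinBlue,
                             unsigned char table[256][256]);             /* step 30 */
void map_skin_probability(const Image *img, unsigned char table[256][256],
                          unsigned char *result);                        /* step 40 */

void recognize_skin(const Image *img, unsigned char *saliency_mask, unsigned char *result)
{
    unsigned char table[256][256];
    int skinRed, skinGreen, skinBlue;

    Rect face = detect_face(img);                          /* step 10: face region */
    detect_saliency(img, face, saliency_mask);             /* step 10: most salient (skin) area */
    compute_average_skin(img, face, saliency_mask,
                         &skinRed, &skinGreen, &skinBlue); /* step 20: average skin color */
    build_probability_table(skinRed, skinBlue, table);     /* step 30: probability mapping table */
    map_skin_probability(img, table, result);              /* step 40: probability result map */
}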
Preferably, the saliency detection performed on the face region in said step 10 further comprises:
A. extracting image features: filtering and sampling the image to be detected with a Gaussian filter to form a Gaussian pyramid whose bottom layer is the image to be detected; then extracting various image features from each layer of the Gaussian pyramid to form feature pyramids; and computing the feature maps of the image to be detected from these feature pyramids;
B. generating a saliency map: normalizing each of said feature maps, and combining the normalized feature maps to obtain the saliency map corresponding to the image to be detected.
Preferably, the salient region of the image to be detected is marked in said saliency map with white and black, where white represents salient regions of the image to be detected and black represents non-salient regions of the image to be detected.
Preferably, obtaining the average skin color in step 20 further comprises the following steps:
21. initializing an original skin color model;
22. computing the color average of the whole image as the initial skin threshold;
23. computing the average skin color of the face region according to the obtained initial skin threshold.
Preferably, in said step 21, the original skin color model is initialized as follows:
211. creating a skin color model of size 256*256;
212. assigning values to the skin color model in turn; the pseudocode is as follows:
Declare temporary variables AlphaValue, nMax, i, j as integers;
The skin color model variable is SkinModel[256][256];
For (i = 0; i < 256; i++)
{
If i is greater than 128, then AlphaValue = 255; otherwise AlphaValue = i*2;
nMax = min(256, AlphaValue*2);
For (j = 0; j < nMax; j++)
{
SkinModel[i][j] = AlphaValue - (j/2);
}
For (j = nMax; j < 256; j++)
{
SkinModel[i][j] = 0;
}
}
Preferably, said step 22 further comprises:
221. traversing the pixels of the whole image and accumulating the color values of the red, green and blue channels to obtain color accumulated values;
222. dividing the color accumulated values by the total number of pixels to obtain the averages of the red, green and blue channels, which serve as the initial skin threshold.
Preferably, said step 23 further comprises:
231. computing the gray value of the average skin color according to the following formula:
GRAY1 = 0.299*RED + 0.587*GREEN + 0.114*BLUE
where GRAY1 is the gray value of the current pixel of the grayscale map of the image to be detected, and RED, GREEN, BLUE are the color values of the red, green and blue channels of the current pixel of the image to be detected;
232. using said gray value as a threshold to exclude the non-skin parts of the face region;
and traversing the color values of the pixels in the face region in turn, obtaining the average skin color according to the following formula:
skin = SkinModel[red][blue];
where skin is the skin value of the pixel after mapping its color through the skin color model, SkinModel is the original skin color model initialized in step 21, red is the red-channel color value of the pixel in the face region, and blue is the blue-channel color value of the pixel in the face region.
Preferably, the skin color probability mapping table in said step 30 is obtained as follows:
31. creating a skin color probability mapping table of size 256*256;
32. assigning values to the skin color probability mapping table in turn; the pseudocode is as follows:
Declare temporary variables i, j, SkinRed_Left, AlphaValue, Offset, TempAlphaValue, OffsetJ as integers;
The skin color probability mapping table variable is SkinProbability[256][256];
SkinRed is the red-channel average computed in step 222; SkinBlue is the blue-channel average computed in step 222;
SkinRed_Left = SkinRed - 128;
For (i = 0; i < 256; i++)
{
Offset = max(0, min(255, i - SkinRed_Left));
If Offset is less than 128, then AlphaValue = Offset*2; otherwise AlphaValue = 255;
For (j = 0; j < 256; j++)
{
OffsetJ = max(0, j - SkinBlue);
TempAlphaValue = max(AlphaValue - (OffsetJ*2), 0);
If TempAlphaValue is greater than 160, then SkinProbability[i][j] = 255;
if TempAlphaValue is less than 90, then SkinProbability[i][j] = 0; otherwise SkinProbability[i][j] = TempAlphaValue + 30;
}
}
Preferably, said step 40 is implemented with the following formula:
skinColor=SkinProbability[red][blue]
where skinColor is the skin color probability value in the skin color probability result map, SkinProbability is the skin color probability mapping table, red is the red-channel color value of the pixel, and blue is the blue-channel color value of the pixel.
Preferably, when the face detection performed on the image to be detected in said step 10 fails to find a face region, the whole image is defined as the face skin region.
Preferably, said image to be detected is a digital image file, a video file or a GIF animation file.
Preferably, if the image to be detected is a video file or a GIF animation file, the file is split into single-frame images, each single-frame image is recognized separately, and the frames are then reassembled into a video file or GIF animation file.
The beneficial effects of the invention are as follows:
The skin recognition method based on saliency detection of the present invention performs face detection and saliency detection on the image to be detected to obtain the face skin region, computes the average skin color and the skin color probability mapping table from that region, and finally performs skin color recognition on the image to be detected according to said skin color probability mapping table to obtain the skin color probability result map of the image to be detected. It does not need to collect a large amount of training data, so the recognition result is not affected by bias in training data; and because saliency detection is performed, non-skin regions within the face region can be excluded, so the recognition accuracy is higher and the misrecognition caused by images that are too bright or too dark is overcome.
Brief description of the drawings
The drawings described here are provided for a further understanding of the present invention and form a part of the present invention; the schematic embodiments of the present invention and their description are used to explain the present invention and do not constitute an improper limitation of the present invention. In the drawings:
Fig. 1 is the overall flowchart of the skin recognition method based on saliency detection of the present invention.
Detailed description of the embodiments
In order to make the technical problem to be solved by the present invention, its technical solution and its beneficial effects clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
As shown in Fig. 1, the skin recognition method based on saliency detection of the present invention comprises the following steps:
10. performing face detection on the image to be detected to obtain the face region, and performing saliency detection on the face region to obtain the most salient area of the face, i.e. the face skin region; further, when face detection fails to find a face region, the whole image is defined as the face skin region;
20. computing the mean of said face skin region to obtain the average skin color;
30. computing a skin color probability mapping table for the image to be detected according to said average skin color;
40. performing skin color recognition on the image to be detected according to said skin color probability mapping table, and obtaining the skin color probability result map of the image to be detected.
The face detection involved in step 10 is not a main subject of the present invention and is therefore not described in detail. In this embodiment, a conventional face detection method is used, for example the one described in "P. Viola and M. Jones, Rapid Object Detection using a Boosted Cascade of Simple Features, in: Computer Vision and Pattern Recognition, 2001. CVPR 2001. Proceedings of the 2001 IEEE Computer Society Conference on", from which the approximate position of the face region is obtained. The saliency detection performed on the face region further comprises:
A. extracting image features: filtering and sampling the image to be detected with a Gaussian filter to form a Gaussian pyramid whose bottom layer is the image to be detected; then extracting various image features from each layer of the Gaussian pyramid to form feature pyramids; and computing the feature maps of the image to be detected from these feature pyramids;
B. generating a saliency map: normalizing each of said feature maps, and combining the normalized feature maps to obtain the saliency map corresponding to the image to be detected; the salient region of the image to be detected is marked in said saliency map with white and black, where white represents salient regions of the image to be detected and black represents non-salient regions of the image to be detected. A simplified code sketch of this saliency computation is given below.
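The description above specifies the saliency detection only at the level of a Gaussian pyramid, feature maps and normalization. The following is a deliberately simplified, single-feature C sketch of that idea: one intensity channel, two pyramid levels built by 2x2 averaging (standing in for Gaussian filtering and subsampling), a center-surround difference, and normalization to 0..255. A faithful implementation would use several feature channels and scales, so everything below is an assumption made for illustration only.

#include <math.h>
#include <stdlib.h>

/* 2x downsampling of a grayscale image by 2x2 averaging (a stand-in for
 * Gaussian filtering followed by subsampling). */
static void downsample2(const float *src, int w, int h, float *dst)
{
    int dw = w / 2, dh = h / 2;
    for (int y = 0; y < dh; y++)
        for (int x = 0; x < dw; x++)
            dst[y * dw + x] = 0.25f * (src[(2 * y) * w + 2 * x] + src[(2 * y) * w + 2 * x + 1] +
                                       src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1]);
}

/* Simplified saliency map: |fine level - upsampled coarse level|, normalized to 0..255. */
void saliency_map(const float *gray, int w, int h, unsigned char *out)
{
    int w1 = w / 2, h1 = h / 2;
    int w2 = w1 / 2, h2 = h1 / 2;
    float *lvl1 = malloc(sizeof(float) * w1 * h1);
    float *lvl2 = malloc(sizeof(float) * w2 * h2);
    float *diff = malloc(sizeof(float) * w * h);
    float maxv = 1e-6f;

    downsample2(gray, w, h, lvl1);        /* pyramid level 1 */
    downsample2(lvl1, w1, h1, lvl2);      /* pyramid level 2 (the "surround") */

    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            int cx = (x / 4 < w2) ? x / 4 : w2 - 1;   /* nearest-neighbour upsampling */
            int cy = (y / 4 < h2) ? y / 4 : h2 - 1;
            float d = fabsf(gray[y * w + x] - lvl2[cy * w2 + cx]);
            diff[y * w + x] = d;
            if (d > maxv) maxv = d;
        }

    for (int i = 0; i < w * h; i++)       /* normalization, as in step B */
        out[i] = (unsigned char)(255.0f * diff[i] / maxv);

    free(lvl1); free(lvl2); free(diff);
}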
Preferably, obtaining the average skin color in step 20 further comprises the following steps:
21. initializing an original skin color model;
22. computing the color average of the whole image as the initial skin threshold;
23. computing the average skin color of the face region according to the obtained initial skin threshold.
Preferably, in said step 21, the original skin color model is initialized as follows:
211. creating a skin color model of size 256*256;
212. assigning values to the skin color model in turn; the pseudocode is as follows:
Declare temporary variables AlphaValue, nMax, i, j as integers;
The skin color model variable is SkinModel[256][256];
For (i = 0; i < 256; i++)
{
If i is greater than 128, then AlphaValue = 255; otherwise AlphaValue = i*2;
nMax = min(256, AlphaValue*2);
For (j = 0; j < nMax; j++)
{
SkinModel[i][j] = AlphaValue - (j/2);
}
For (j = nMax; j < 256; j++)
{
SkinModel[i][j] = 0;
}
}
Expressed in program code form, the original skin color model is initialized as follows:
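The program-code figure referred to above is not reproduced in this text. The following C sketch is reconstructed directly from the step 212 pseudocode; the function name init_skin_model is an assumption.

/* Fill the 256x256 skin color model as described in step 212. */
static unsigned char SkinModel[256][256];

void init_skin_model(void)
{
    for (int i = 0; i < 256; i++) {
        int AlphaValue = (i > 128) ? 255 : i * 2;
        int nMax = (AlphaValue * 2 < 256) ? AlphaValue * 2 : 256;   /* min(256, AlphaValue*2) */
        for (int j = 0; j < nMax; j++)
            SkinModel[i][j] = (unsigned char)(AlphaValue - j / 2);  /* AlphaValue - (j/2) */
        for (int j = nMax; j < 256; j++)
            SkinModel[i][j] = 0;                                    /* remaining entries are 0 */
    }
}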
Preferably, said step 22 further comprises:
221. traversing the pixels of the whole image and accumulating the color values of the red, green and blue channels to obtain color accumulated values;
222. dividing the color accumulated values by the total number of pixels to obtain the averages of the red, green and blue channels, which serve as the initial skin threshold (a code sketch of these two steps follows).
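A minimal C sketch of steps 221 and 222, assuming the same interleaved-RGB Image layout as the earlier pipeline sketch:

/* Steps 221-222: per-channel mean over the whole image, used as the initial skin threshold. */
void initial_skin_threshold(const Image *img, int *skinRed, int *skinGreen, int *skinBlue)
{
    long sumR = 0, sumG = 0, sumB = 0;
    long n = (long)img->width * img->height;

    for (long p = 0; p < n; p++) {
        sumR += img->rgb[3 * p + 0];   /* accumulate red   */
        sumG += img->rgb[3 * p + 1];   /* accumulate green */
        sumB += img->rgb[3 * p + 2];   /* accumulate blue  */
    }
    *skinRed   = (int)(sumR / n);      /* divide by the number of pixels */
    *skinGreen = (int)(sumG / n);
    *skinBlue  = (int)(sumB / n);
}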
Preferably, said step 23 further comprises:
231. computing the gray value of the average skin color according to the following formula:
GRAY1 = 0.299*RED + 0.587*GREEN + 0.114*BLUE
where GRAY1 is the gray value of the current pixel of the grayscale map of the image to be detected, and RED, GREEN, BLUE are the color values of the red, green and blue channels of the current pixel of the image to be detected;
232. using said gray value as a threshold to exclude the non-skin parts of the face region;
and traversing the color values of the pixels in the face region in turn, obtaining the average skin color according to the following formula (a code sketch of this computation follows):
skin = SkinModel[red][blue];
where skin is the skin value of the pixel after mapping its color through the skin color model, SkinModel is the original skin color model initialized in step 21, red is the red-channel color value of the pixel in the face region, and blue is the blue-channel color value of the pixel in the face region.
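The following C sketch illustrates steps 231 and 232, reusing the Image and Rect types of the pipeline sketch and the SkinModel array filled by init_skin_model. The description does not state exactly how the gray threshold excludes non-skin pixels, so the sketch assumes that face-region pixels whose gray value falls below the threshold are skipped; that rule is an assumption, not a statement of the patented method.

/* Steps 231-232: average skin value of the face region, using the gray value of the
 * initial skin threshold (step 222 channel averages) to exclude assumed non-skin pixels. */
int average_skin_value(const Image *img, Rect face, int skinRed, int skinGreen, int skinBlue)
{
    /* GRAY1 = 0.299*RED + 0.587*GREEN + 0.114*BLUE, applied to the initial threshold */
    float grayThreshold = 0.299f * skinRed + 0.587f * skinGreen + 0.114f * skinBlue;
    long sumSkin = 0, count = 0;

    for (int y = face.y; y < face.y + face.h; y++)
        for (int x = face.x; x < face.x + face.w; x++) {
            const unsigned char *px = &img->rgb[3 * (y * img->width + x)];
            float gray = 0.299f * px[0] + 0.587f * px[1] + 0.114f * px[2];
            if (gray < grayThreshold)
                continue;                            /* assumed exclusion of non-skin pixels */
            sumSkin += SkinModel[px[0]][px[2]];      /* skin = SkinModel[red][blue] */
            count++;
        }
    return count > 0 ? (int)(sumSkin / count) : 0;   /* average skin value */
}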
Preferably, the skin color probability mapping table in said step 30 is obtained as follows:
31. creating a skin color probability mapping table of size 256*256;
32. assigning values to the skin color probability mapping table in turn; the pseudocode is as follows:
Declare temporary variables i, j, SkinRed_Left, AlphaValue, Offset, TempAlphaValue, OffsetJ as integers;
The skin color probability mapping table variable is SkinProbability[256][256];
SkinRed is the red-channel average computed in step 222; SkinBlue is the blue-channel average computed in step 222;
SkinRed_Left = SkinRed - 128;
For (i = 0; i < 256; i++)
{
Offset = max(0, min(255, i - SkinRed_Left));
If Offset is less than 128, then AlphaValue = Offset*2; otherwise AlphaValue = 255;
For (j = 0; j < 256; j++)
{
OffsetJ = max(0, j - SkinBlue);
TempAlphaValue = max(AlphaValue - (OffsetJ*2), 0);
If TempAlphaValue is greater than 160, then SkinProbability[i][j] = 255;
if TempAlphaValue is less than 90, then SkinProbability[i][j] = 0; otherwise SkinProbability[i][j] = TempAlphaValue + 30;
}
}
Expressed in program code form, the skin color probability mapping table is obtained as follows, where SkinRed and SkinBlue are the averages of the red channel and the blue channel obtained in step 222:
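The program-code figure referred to above is likewise not reproduced in this text. The following C sketch is reconstructed from the step 32 pseudocode; the function name and the choice to pass the table as a parameter are assumptions.

/* Fill the 256x256 skin color probability mapping table from the step 222 averages. */
void build_probability_table(int SkinRed, int SkinBlue, unsigned char SkinProbability[256][256])
{
    int SkinRed_Left = SkinRed - 128;

    for (int i = 0; i < 256; i++) {
        int Offset = i - SkinRed_Left;
        if (Offset < 0)   Offset = 0;                    /* Offset = max(0, min(255, i - SkinRed_Left)) */
        if (Offset > 255) Offset = 255;
        int AlphaValue = (Offset < 128) ? Offset * 2 : 255;

        for (int j = 0; j < 256; j++) {
            int OffsetJ = (j > SkinBlue) ? j - SkinBlue : 0;   /* max(0, j - SkinBlue) */
            int TempAlphaValue = AlphaValue - OffsetJ * 2;
            if (TempAlphaValue < 0) TempAlphaValue = 0;        /* max(AlphaValue - OffsetJ*2, 0) */

            if (TempAlphaValue > 160)
                SkinProbability[i][j] = 255;
            else if (TempAlphaValue < 90)
                SkinProbability[i][j] = 0;
            else
                SkinProbability[i][j] = (unsigned char)(TempAlphaValue + 30);
        }
    }
}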
Said step 40 is implemented with the following formula:
skinColor=SkinProbability[red][blue]
where skinColor is the skin color probability value in the skin color probability result map, SkinProbability is the skin color probability mapping table, red is the red-channel color value of the pixel, and blue is the blue-channel color value of the pixel.
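A minimal C sketch of step 40, reusing the assumed Image type and the probability table built above; the result array holds one probability value per pixel, i.e. the skin color probability result map.

/* Step 40: skinColor = SkinProbability[red][blue] for every pixel of the image. */
void map_skin_probability(const Image *img, unsigned char SkinProbability[256][256],
                          unsigned char *result)
{
    long n = (long)img->width * img->height;

    for (long p = 0; p < n; p++) {
        unsigned char red  = img->rgb[3 * p + 0];
        unsigned char blue = img->rgb[3 * p + 2];
        result[p] = SkinProbability[red][blue];   /* probability value for this pixel */
    }
}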
The method of the present invention is widely applicable: it can process digital image files, video files and GIF animation files. If the image to be detected is a video file or a GIF animation file, the file is split into single-frame images, each single-frame image is recognized separately, and the frames are then reassembled into a video file or GIF animation file. Because the skin color model is built from the average skin color obtained after face detection and saliency detection, the method adapts to photographs taken under a wide variety of conditions. Moreover, since saliency detection is performed, non-skin regions within the face region can be excluded, making the detection more precise; in particular, hair is naturally excluded from the face region after saliency detection, so the skin average obtained during skin detection is closer to the real skin color and the skin recognition result is better.
In addition, compared with the prior art, the method is not constrained by training data, so the recognition result is not affected by bias in training data, the brightness preference problem of some prior-art methods is overcome, and the range of application is wider. The method does not need to accumulate training data to reach a certain recognition rate and does not need to collect a large training data set, so it is easy to implement; and because no comparison against a large training data set is required, the operational efficiency is greatly improved. Recognition through the skin color model gives high recognition accuracy and overcomes the inaccurate skin recognition that occurs when images are too bright or too dark.
The above description illustrates and describes the preferred embodiments of the present invention. It should be understood that the present invention is not limited to the form disclosed here, which should not be regarded as excluding other embodiments; the invention may be used in various other combinations, modifications and environments, and may be changed within the scope contemplated here through the above teachings or the skill or knowledge of the related art. All changes and modifications made by those skilled in the art that do not depart from the spirit and scope of the present invention shall fall within the protection scope of the appended claims.

Claims (12)

1. A skin recognition method based on saliency detection, characterized by comprising the following steps:
10. performing face detection on the image to be detected to obtain the face region, and performing saliency detection on the face region to obtain the most salient area of the face, i.e. the face skin region;
20. computing the mean of said face skin region to obtain the average skin color;
30. computing a skin color probability mapping table for the image to be detected according to said average skin color;
40. performing skin color recognition on the image to be detected according to said skin color probability mapping table, and obtaining the skin color probability result map of the image to be detected.
2. The skin recognition method based on saliency detection according to claim 1, characterized in that the saliency detection performed on the face region in said step 10 further comprises:
A. extracting image features: filtering and sampling the image to be detected with a Gaussian filter to form a Gaussian pyramid whose bottom layer is the image to be detected; then extracting various image features from each layer of the Gaussian pyramid to form feature pyramids; and computing the feature maps of the image to be detected from these feature pyramids;
B. generating a saliency map: normalizing each of said feature maps, and combining the normalized feature maps to obtain the saliency map corresponding to the image to be detected.
3. The skin recognition method based on saliency detection according to claim 2, characterized in that the salient region of the image to be detected is marked in said saliency map with white and black, where white represents salient regions of the image to be detected and black represents non-salient regions of the image to be detected.
4. The skin recognition method based on saliency detection according to claim 1, characterized in that obtaining the average skin color in step 20 further comprises the following steps:
21. initializing an original skin color model;
22. computing the color average of the whole image as the initial skin threshold;
23. computing the average skin color of the face region according to the obtained initial skin threshold.
5. The skin recognition method based on saliency detection according to claim 4, characterized in that in said step 21 the original skin color model is initialized as follows:
211. creating a skin color model of size 256*256;
212. assigning values to the skin color model in turn; the pseudocode is as follows:
Declare temporary variables AlphaValue, nMax, i, j as integers;
The skin color model variable is SkinModel[256][256];
For (i = 0; i < 256; i++)
{
If i is greater than 128, then AlphaValue = 255; otherwise AlphaValue = i*2;
nMax = min(256, AlphaValue*2);
For (j = 0; j < nMax; j++)
{
SkinModel[i][j] = AlphaValue - (j/2);
}
For (j = nMax; j < 256; j++)
{
SkinModel[i][j] = 0;
}
}
6. The skin recognition method based on saliency detection according to claim 4, characterized in that said step 22 further comprises:
221. traversing the pixels of the whole image and accumulating the color values of the red, green and blue channels to obtain color accumulated values;
222. dividing the color accumulated values by the total number of pixels to obtain the averages of the red, green and blue channels, which serve as the initial skin threshold.
7. The skin recognition method based on saliency detection according to claim 6, characterized in that said step 23 further comprises:
231. computing the gray value of the average skin color according to the following formula:
GRAY1 = 0.299*RED + 0.587*GREEN + 0.114*BLUE
where GRAY1 is the gray value of the current pixel of the grayscale map of the image to be detected, and RED, GREEN, BLUE are the color values of the red, green and blue channels of the current pixel of the image to be detected;
232. using said gray value as a threshold to exclude the non-skin parts of the face region;
and traversing the color values of the pixels in the face region in turn, obtaining the average skin color according to the following formula:
skin = SkinModel[red][blue];
where skin is the skin value of the pixel after mapping its color through the skin color model, SkinModel is the original skin color model initialized in step 21, red is the red-channel color value of the pixel in the face region, and blue is the blue-channel color value of the pixel in the face region.
8. The skin recognition method based on saliency detection according to claim 6, characterized in that the skin color probability mapping table in said step 30 is obtained as follows:
31. creating a skin color probability mapping table of size 256*256;
32. assigning values to the skin color probability mapping table in turn; the pseudocode is as follows:
Declare temporary variables i, j, SkinRed_Left, AlphaValue, Offset, TempAlphaValue, OffsetJ as integers;
The skin color probability mapping table variable is SkinProbability[256][256];
SkinRed is the red-channel average computed in step 222; SkinBlue is the blue-channel average computed in step 222;
SkinRed_Left = SkinRed - 128;
For (i = 0; i < 256; i++)
{
Offset = max(0, min(255, i - SkinRed_Left));
If Offset is less than 128, then AlphaValue = Offset*2; otherwise AlphaValue = 255;
For (j = 0; j < 256; j++)
{
OffsetJ = max(0, j - SkinBlue);
TempAlphaValue = max(AlphaValue - (OffsetJ*2), 0);
If TempAlphaValue is greater than 160, then SkinProbability[i][j] = 255;
if TempAlphaValue is less than 90, then SkinProbability[i][j] = 0; otherwise SkinProbability[i][j] = TempAlphaValue + 30;
}
}
9. The skin recognition method based on saliency detection according to any one of claims 1 to 8, characterized in that said step 40 is implemented with the following formula:
skinColor=SkinProbability[red][blue]
where skinColor is the skin color probability value in the skin color probability result map, SkinProbability is the skin color probability mapping table, red is the red-channel color value of the pixel, and blue is the blue-channel color value of the pixel.
10. The skin recognition method based on saliency detection according to any one of claims 1 to 8, characterized in that when the face detection performed on the image to be detected in said step 10 fails to find a face region, the whole image is defined as the face skin region.
11. The skin recognition method based on saliency detection according to any one of claims 1 to 8, characterized in that said image to be detected is a digital image file, a video file or a GIF animation file.
12. The skin recognition method based on saliency detection according to claim 11, characterized in that if the image to be detected is a video file or a GIF animation file, the file is split into single-frame images, each single-frame image is recognized separately, and the frames are then reassembled into a video file or GIF animation file.
CN201410638233.4A 2014-11-12 2014-11-12 Skin recognition method based on saliency detection Pending CN104392211A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410638233.4A CN104392211A (en) 2014-11-12 2014-11-12 Skin recognition method based on saliency detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410638233.4A CN104392211A (en) 2014-11-12 2014-11-12 Skin recognition method based on saliency detection

Publications (1)

Publication Number Publication Date
CN104392211A true CN104392211A (en) 2015-03-04

Family

ID=52610112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410638233.4A Pending CN104392211A (en) 2014-11-12 2014-11-12 Skin recognition method based on saliency detection

Country Status (1)

Country Link
CN (1) CN104392211A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105118051A (en) * 2015-07-29 2015-12-02 广东工业大学 Saliency detecting method applied to static image human segmentation
CN106570472A (en) * 2016-11-02 2017-04-19 华为技术有限公司 Skin color detecting method and device and terminal
CN106611415A (en) * 2016-12-29 2017-05-03 北京奇艺世纪科技有限公司 Detection method and device for skin area
CN110516648A (en) * 2019-09-02 2019-11-29 湖南农业大学 Ramie strain number recognition methods based on unmanned aerial vehicle remote sensing and pattern-recognition
CN111079662A (en) * 2019-12-19 2020-04-28 江苏云从曦和人工智能有限公司 Figure identification method and device, machine readable medium and equipment
CN112686965A (en) * 2020-12-25 2021-04-20 百果园技术(新加坡)有限公司 Skin color detection method, device, mobile terminal and storage medium
WO2022135574A1 (en) * 2020-12-25 2022-06-30 百果园技术(新加坡)有限公司 Skin color detection method and apparatus, and mobile terminal and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080112622A1 (en) * 2006-11-13 2008-05-15 Samsung Electro-Mechanics Co., Ltd Skin detection system and method
CN101620673A (en) * 2009-06-18 2010-01-06 北京航空航天大学 Robust face detecting and tracking method
CN101751559A (en) * 2009-12-31 2010-06-23 中国科学院计算技术研究所 Method for detecting skin stains on face and identifying face by utilizing skin stains
CN103455790A (en) * 2013-06-24 2013-12-18 厦门美图网科技有限公司 Skin identification method based on skin color model
CN103955718A (en) * 2014-05-15 2014-07-30 厦门美图之家科技有限公司 Image subject recognition method

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105118051A (en) * 2015-07-29 2015-12-02 广东工业大学 Saliency detecting method applied to static image human segmentation
CN105118051B (en) * 2015-07-29 2017-12-26 广东工业大学 A kind of conspicuousness detection method applied to still image human body segmentation
CN106570472A (en) * 2016-11-02 2017-04-19 华为技术有限公司 Skin color detecting method and device and terminal
WO2018082388A1 (en) * 2016-11-02 2018-05-11 华为技术有限公司 Skin color detection method and device, and terminal
CN106570472B (en) * 2016-11-02 2019-11-05 华为技术有限公司 A kind of skin color detection method, device and terminal
CN106611415A (en) * 2016-12-29 2017-05-03 北京奇艺世纪科技有限公司 Detection method and device for skin area
CN106611415B (en) * 2016-12-29 2020-01-10 北京奇艺世纪科技有限公司 Skin region detection method and device
CN110516648A (en) * 2019-09-02 2019-11-29 湖南农业大学 Ramie strain number recognition methods based on unmanned aerial vehicle remote sensing and pattern-recognition
CN110516648B (en) * 2019-09-02 2022-04-19 湖南农业大学 Ramie plant number identification method based on unmanned aerial vehicle remote sensing and pattern identification
CN111079662A (en) * 2019-12-19 2020-04-28 江苏云从曦和人工智能有限公司 Figure identification method and device, machine readable medium and equipment
CN112686965A (en) * 2020-12-25 2021-04-20 百果园技术(新加坡)有限公司 Skin color detection method, device, mobile terminal and storage medium
WO2022135574A1 (en) * 2020-12-25 2022-06-30 百果园技术(新加坡)有限公司 Skin color detection method and apparatus, and mobile terminal and storage medium

Similar Documents

Publication Publication Date Title
CN103455790B (en) A kind of skin identification method based on complexion model
CN104392211A (en) Skin recognition method based on saliency detection
CN103927719B (en) Picture processing method and device
WO2017092431A1 (en) Human hand detection method and device based on skin colour
Ajmal et al. A comparison of RGB and HSV colour spaces for visual attention models
WO2017084204A1 (en) Method and system for tracking human body skeleton point in two-dimensional video stream
CN104282002B (en) A kind of quick beauty method of digital picture
CN106709964B (en) Sketch generation method and device based on gradient correction and multidirectional texture extraction
WO2009131539A1 (en) A method and system for detecting and tracking hands in an image
CN111476849B (en) Object color recognition method, device, electronic equipment and storage medium
CN103440633B (en) A kind of digital picture dispels the method for spot automatically
CN103218605A (en) Quick eye locating method based on integral projection and edge detection
JP2007257087A (en) Skin color area detecting device and skin color area detecting method
Yang et al. Real-time traffic sign detection via color probability model and integral channel features
JP2007272435A (en) Face feature extraction device and face feature extraction method
Jiang et al. Skin detection using color, texture and space information
CN108345867A (en) Gesture identification method towards Intelligent household scene
CN110599553B (en) Skin color extraction and detection method based on YCbCr
CN108154496A (en) A kind of power equipment appearance suitable for electric operating robot changes recognition methods
CN108711160A (en) A kind of Target Segmentation method based on HSI enhancement models
Yusuf et al. Human face detection using skin color segmentation and watershed algorithm
CN112686800B (en) Image processing method, device, electronic equipment and storage medium
JP3902887B2 (en) Lip extraction method
CN103729624A (en) Photometry method and system based on skin color recognition
KR20190105273A (en) Preprocessing method for color filtering robust against illumination environment and the system thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150304