CN1979523A - 2-D main-element human-face analysis and identifying method based on relativity in block - Google Patents

2-D main-element human-face analysis and identifying method based on relativity in block

Info

Publication number
CN1979523A
CN1979523A, CN200610123197A, CN 200610123197
Authority
CN
China
Prior art keywords
image
matrix
block
eigenmatrix
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200610123197
Other languages
Chinese (zh)
Other versions
CN100449567C (en)
Inventor
马争鸣
***
张成言
邓娜
袁红梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
National Sun Yat Sen University
Original Assignee
National Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Sun Yat Sen University filed Critical National Sun Yat Sen University
Priority to CNB2006101231973A priority Critical patent/CN100449567C/en
Publication of CN1979523A publication Critical patent/CN1979523A/en
Application granted granted Critical
Publication of CN100449567C publication Critical patent/CN100449567C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of pattern recognition and specifically relates to a two-dimensional principal component analysis (2DPCA) face recognition method based on intra-block correlation. Exploiting the local features of a face image, the method divides the image into non-overlapping small blocks, concatenates the elements of each block row by row into a corresponding row vector, arranges these row vectors in order into a new two-dimensional image matrix, and finally uses this blocked and rearranged matrix as the input image for 2DPCA-based face recognition. Its advantages: it makes full use of the correlation between row and column pixels within a local region, better preserves the local feature information of the face, achieves a higher face recognition rate, and retains low computational complexity.

Description

A two-dimensional principal component analysis face recognition method based on intra-block correlation
Technical field
The invention belongs to the technical field of pattern recognition and specifically relates to a two-dimensional principal component analysis face recognition method based on intra-block correlation.
Technical background
Face recognition belongs to the field of pattern recognition. As one of the most successful applications of image analysis and understanding, it has received wide attention in both commercial applications and research. Existing face recognition methods include methods based on statistical analysis and methods based on template matching.
In 1991, Turk and Pentland proposed a face recognition method based on principal component analysis (PCA, also known as the Karhunen-Loeve transform) (reference [1]: M. Turk and A. Pentland, "Face Recognition Using Eigenfaces", Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 1991), also called the eigenface method. It is a face recognition method based on statistical analysis, and its theoretical foundation is the principal component analysis of matrices. PCA is a statistical method for analyzing data; its basic idea is to extract the principal features (principal components) of the data space so that the data can be processed in a low-dimensional feature space while retaining most of the information in the original data, thereby overcoming the bottleneck of excessively high data dimensionality. The so-called eigenfaces are simply the principal components of the face image covariance matrix.
As an effective linear dimensionality-reduction and feature-extraction technique, PCA has received continuing attention from researchers. Around PCA, a series of subspace methods has been developed for face recognition, for example the Fisherfaces method (reference [2]: Peter N. Belhumeur, Joao P. Hespanha, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, July 1997), independent component analysis (ICA) (reference [3]: Hyvarinen, A., Karhunen, J., Oja, E., Independent Component Analysis, John Wiley & Sons, New York, 2001), and kernel principal component analysis (KPCA) (reference [4]: Scholkopf, B., Smola, A. & Muller, K. R., "Nonlinear component analysis as a kernel eigenvalue problem", Neural Computation, pp. 1299-1319). These methods have extended the scope of PCA research and enriched the study of face recognition.
To reduce the computational load of PCA, a method called two-dimensional principal component analysis (2DPCA) has appeared in recent years (reference [5]: Yang, J., Zhang, D., "Two-dimensional PCA: A new approach to appearance-based face representation and recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(1), 131-137, 2004; reference [6]: Yang, J., Yang, J. Y., "From image vector to matrix: A straightforward image projection technique - IMPCA vs. PCA", Pattern Recognition, vol. 35, no. 9, pp. 1997-1999, 2002). 2DPCA (Yang & Zhang, 2004) is an improvement on conventional PCA. Unlike conventional PCA, 2DPCA operates directly on the two-dimensional image matrix and does not first convert the image matrix into a one-dimensional vector. It uses the two-dimensional image matrices directly to construct the image covariance matrix, obtains the principal eigenvectors of that covariance matrix (the projection directions), and then projects each two-dimensional image onto these projection directions to obtain the feature representation of the image.
The effectiveness of 2DPCA-based face recognition has been widely acknowledged. Experiments show that the running time of 2DPCA is roughly 1/10 to 1/20 that of PCA, and its face recognition rate is higher than that of PCA; in particular, when the number of samples is small, the recognition rate is typically improved by more than 5% (reference [5]).
Mathematical analysis shows that 2DPCA constructs its scatter matrix from the correlation between pixels within the same row of the image matrix. It therefore exploits only within-row pixel correlation and completely discards the correlation between pixels across rows and columns; even within one row, the correlation between pixels far apart from each other may be weak. All of this shows that 2DPCA fails to make full use of the correlation between pixels, a shortcoming it does not overcome. In general, facial features manifest themselves mainly in local regions: the major facial organs such as the eyes, nose and mouth are all local features of the face. The region used by 2DPCA, however, is a single row of image pixels, which severs the pixel correlation within local facial regions and makes it difficult to characterize local facial features; this is unfavorable for face recognition.
Summary of the invention
The object of the invention is to propose a two-dimensional principal component analysis (abbreviated 2DPCA) face recognition method based on intra-block correlation. A local block of a face image usually represents a complete semantic unit, such as the nose, the eyes or the mouth. Exploiting this local character, the method divides the face image into non-overlapping small blocks, concatenates the elements of each block row by row into a corresponding row vector, arranges these row vectors in order into a new two-dimensional image matrix, and finally uses the blocked and rearranged two-dimensional image matrix as the input image for 2DPCA face recognition. This approach makes full use of the correlation between row and column pixels, yields a more accurate estimate of the two-dimensional image covariance matrix, and achieves a higher face recognition rate.
The method of the invention comprises building a face database that reflects facial features and a recognition algorithm that compares similarity by computing Euclidean distances; the procedure is divided into a database-building stage and a recognition stage, described in detail below.
1. Database-building stage
The main purpose of this stage is to build a database of facial features. During the recognition stage, the features of the image to be identified are compared with the features in the database, and the face image with the greatest similarity is taken as the recognition result.
(1) The face images used in this stage are standard frontal face images. Since the standardization of the face images has a direct influence on the final recognition result, the input images are first standardized. This work is divided into two parts: illumination normalization and size normalization.
Illumination normalization: the gray levels of all input images are unified to a standard level, and histogram equalization is then applied. The gray-level unification proceeds as follows. A face image of moderate gray level is chosen as the standard image for gray-level normalization, and its mean gray value W is computed and recorded. Then, for each input image: first compute the mean gray value w of the image; next compute the gray-scale adjustment ratio C_w = W / w; finally multiply the gray value of every pixel of the image by C_w.
Size normalization: according to the needs of the algorithm in step (2) (namely, the number of pixels in each sub-block must equal the number of pixels in one row of the image), the sizes of all input images are adjusted to a suitable size.
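For concreteness, the following is a minimal sketch of these two normalization steps, assuming 8-bit grayscale images held as NumPy arrays; the use of OpenCV, the function names and the target size are illustrative assumptions, not the implementation prescribed by the patent.

```python
import cv2
import numpy as np

def normalize(img, standard_mean_gray, target_size):
    """img: uint8 grayscale image; target_size: (width, height)."""
    # 1. Gray-level adjustment: scale so the mean gray matches the standard image.
    w = img.mean()
    adjusted = np.clip(img.astype(np.float64) * (standard_mean_gray / w), 0, 255)
    # 2. Histogram equalization.
    equalized = cv2.equalizeHist(adjusted.astype(np.uint8))
    # 3. Size normalization so that each block can hold exactly one row's worth of pixels.
    return cv2.resize(equalized, target_size)
```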
(2) The standardized image is blocked and rearranged to construct a new two-dimensional image matrix.
Any standardized image can be represented by an m × n two-dimensional image matrix whose elements are the pixel gray values, denoted A:

A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}
The matrix A is partitioned into p × q sub-matrices of equal size:

A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{pmatrix}

where each block A_{kl} (k = 1, 2, ..., p; l = 1, 2, ..., q) is an m' × n' matrix, so that p × m' = m and q × n' = n, and the block size is required to satisfy m' × n' = n (this is ensured by the size normalization in step (1)).
Each image is thus divided into p × q = m blocks, denoted in order D_i (i = 1, 2, ..., m), i.e.

D_{(k-1)q + l} = A_{kl}, \quad k = 1, \ldots, p;\ l = 1, \ldots, q
Each block is scanned row by row to obtain its row-vector representation V_i (i = 1, 2, ..., m), where V_i is the 1 × n row vector formed by concatenating the rows of D_i in order.
These row vectors are then rearranged in order to construct the new two-dimensional image matrix A':

A' = \begin{pmatrix} V_1 \\ V_2 \\ \vdots \\ V_m \end{pmatrix}
The new two-dimensional image matrix A' consists of m row vectors of dimension n (m' × n' = n); its size is the same as that of the original image, namely m × n, but each row vector of the new matrix is drawn from the pixels of one local block of the original image.
All input images undergo the above blocking and rearrangement, yielding a corresponding library of two-dimensional image matrices.
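The blocking-and-rearrangement step can be sketched as follows; the helper name and block-size parameters are illustrative assumptions, and the assertions simply encode the constraints stated above (p × m' = m, q × n' = n, m' × n' = n).

```python
import numpy as np

def block_rearrange(A, block_rows, block_cols):
    """A: (m, n) image matrix; returns the rearranged (m, n) matrix A'."""
    m, n = A.shape
    assert m % block_rows == 0 and n % block_cols == 0
    p, q = m // block_rows, n // block_cols
    assert block_rows * block_cols == n, "each block must hold one row's worth of pixels"
    rows = []
    for k in range(p):
        for l in range(q):
            block = A[k*block_rows:(k+1)*block_rows, l*block_cols:(l+1)*block_cols]
            rows.append(block.reshape(-1))   # scan the block row by row into a vector
    return np.stack(rows)                    # p*q = m rows, each of length n
```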
(3) Computing the optimal projection directions
After blocking and rearrangement, each input image A_i (i = 1, 2, ..., N, where N is the total number of training images) corresponds to an m × n image matrix A_i'. From these two-dimensional image matrices the two-dimensional image covariance matrix is computed as:

G_t = \frac{1}{N} \sum_{i=1}^{N} (A_i' - \bar{A}')^{T} (A_i' - \bar{A}')

where \bar{A}' is the mean of the rearranged training matrices.
The optimal projection direction is then obtained from the criterion function:

J(X) = X^{T} G_t X
Here X is a normalized column vector, called the optimal projection direction. Maximizing this criterion function means that the overall scatter of the feature vectors obtained by projecting the image matrices onto the direction X is maximal. In fact, the optimal projection direction X is the eigenvector of the two-dimensional image covariance matrix G_t corresponding to its largest eigenvalue.
When the number of sample classes is large, one generally selects a group of mutually orthogonal optimal projection vectors X_1, X_2, ..., X_d that maximize the criterion function, i.e. satisfying:

\{X_1, X_2, \ldots, X_d\} = \arg\max J(X), \qquad X_i^{T} X_j = 0,\ i \neq j,\ i, j = 1, 2, \ldots, d
In practice, the optimal projection vector group X_1, X_2, ..., X_d consists of the eigenvectors of the two-dimensional image covariance matrix G_t corresponding to its d largest eigenvalues.
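A minimal sketch of step (3), assuming the block_rearrange helper sketched earlier: the two-dimensional image covariance matrix is built from the rearranged training matrices and its d leading eigenvectors are taken as the optimal projection vector group. The array shapes and parameter names are illustrative assumptions.

```python
import numpy as np

def optimal_projections(train_images, block_rows, block_cols, d):
    """train_images: (N, m, n) array of standardized images; returns X of shape (n, d)."""
    rearranged = np.stack([block_rearrange(A, block_rows, block_cols)
                           for A in train_images])            # (N, m, n)
    mean = rearranged.mean(axis=0)
    centered = rearranged - mean
    # G_t = (1/N) * sum_i (A_i' - mean)^T (A_i' - mean), an (n, n) matrix.
    G_t = np.einsum('imk,iml->kl', centered, centered) / len(train_images)
    eigvals, eigvecs = np.linalg.eigh(G_t)                     # ascending eigenvalues
    # The d eigenvectors with the largest eigenvalues maximize J(X) = X^T G_t X
    # and are mutually orthogonal, as required above.
    return eigvecs[:, ::-1][:, :d]
```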
(4) Extracting the feature matrices of the input images and building the database
After obtaining the optimal projection vector group X_1, X_2, ..., X_d, the following operation is applied to each input image:
For an image A whose blocked and rearranged two-dimensional matrix is A', let

Y_k = A' X_k, \quad k = 1, 2, \ldots, d

This yields a group of projection feature vectors Y_1, Y_2, ..., Y_d, called the principal components of image A. All principal components of an image form an m × d matrix B = [Y_1, Y_2, ..., Y_d], called the feature matrix of image A.
The feature matrices B_1, B_2, ..., B_N obtained in this step (N is the total number of training images) are all stored in the database for use in the recognition stage. Of the steps above, step (2) is the most important and embodies the core of the invention.
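Step (4) can be sketched in a few lines, again assuming the helpers sketched above: the feature matrix of each image is simply its rearranged matrix projected onto the optimal projection vectors.

```python
def build_feature_database(train_images, X, block_rows, block_cols):
    """Returns a list of feature matrices B_i = A_i' @ X (each of shape m x d)."""
    return [block_rearrange(A, block_rows, block_cols) @ X for A in train_images]
```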
2. Recognition stage
For a face image T to be identified, the same image standardization and blocking/rearrangement operations as in steps (1) and (2) of the database-building stage are applied, yielding the rearranged two-dimensional matrix T'. Projecting T' onto the optimal projection directions gives the feature matrix of image T:

B_T = [Y_1^{(T)}, Y_2^{(T)}, \ldots, Y_d^{(T)}], \qquad Y_k^{(T)} = T' X_k

The Euclidean distances d(B_T, B_i) (i = 1, 2, ..., N) between the feature matrix B_T and each feature matrix B_i in the face database are then computed. The Euclidean distance between any two feature matrices B_i = [Y_1^{(i)}, \ldots, Y_d^{(i)}] and B_j = [Y_1^{(j)}, \ldots, Y_d^{(j)}] is defined as

d(B_i, B_j) = \sum_{k=1}^{d} \| Y_k^{(i)} - Y_k^{(j)} \|_2
The class of the face image T to be identified is decided as follows:
All images A_1, A_2, ..., A_N in the face database (N is the total number of training images) each belong to a known class ω_k (k = 1, 2, ..., C), and their feature matrices are B_1, B_2, ..., B_N. For a given face image T to be identified with feature matrix B_T, if

d(B_T, B_l) = \min_j d(B_T, B_j)

and B_l ∈ ω_k, then the classification result is T ∈ ω_k.
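A minimal sketch of the recognition stage under the same assumptions: the distance between two feature matrices is the sum of Euclidean distances between corresponding principal component vectors (columns), and the test image takes the class of its nearest neighbour in the database. The labels list holding the class of each training image is an assumed database layout.

```python
import numpy as np

def feature_distance(B_i, B_j):
    # d(B_i, B_j) = sum_k || Y_k^(i) - Y_k^(j) ||_2, where the Y_k are the columns.
    return np.linalg.norm(B_i - B_j, axis=0).sum()

def recognize(test_image, X, database, labels, block_rows, block_cols):
    """database: list of training feature matrices; labels[i]: class of training image i."""
    B_T = block_rearrange(test_image, block_rows, block_cols) @ X
    distances = [feature_distance(B_T, B) for B in database]
    return labels[int(np.argmin(distances))]
```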
Advantages of the invention
The invention takes the local features of the face image into account and makes full use of the correlation between the row and column pixels inside each image block. As is well known, local facial features such as the nose, eyes and mouth are what characterize the differences between individuals; the image-blocking scheme adopted in the invention preserves this local feature information well. The way intra-block pixel correlation is computed in the invention not only takes the correlation between row and column pixels into account, but also involves pixels whose mutual correlation is stronger. It thus overcomes the shortcoming of the 2DPCA method and achieves a higher face recognition rate, while retaining the low computational complexity of 2DPCA.
Description of drawings
Fig. 1: face recognition method based on intra-block correlation: flow chart of the database-building stage.
Fig. 2: face recognition method based on intra-block correlation: flow chart of the recognition stage.
Fig. 3: schematic diagram of the blocking and rearrangement of a face image, where:
(a) shows the original image partitioned into blocks, each small square representing one block matrix;
(b) shows one block matrix of the original image, each small square representing one pixel;
(c) shows the row vector obtained by scanning a block matrix row by row, each small square representing one pixel;
(d) shows the new two-dimensional image matrix produced by rearranging all row vectors in order, each row being one row vector.
Fig. 4: results of the implementation example: comparison of face recognition rates on the FERET data set.
Fig. 5: results of the implementation example: comparison of the best face recognition rates on the FERET data set.
Embodiment
The implementation of the method is illustrated below using the FERET face database as an example.
From the FERET database, 53 subjects with 6 images each (318 images in total) are selected to form the training and testing face database, covering variations in expression, illumination and occlusion; the image size is 112 × 92.
For each subject, 3 images are randomly chosen for training and the remaining 3 for testing. The random split is repeated 30 times, and the mean recognition rate is taken as the final recognition rate. The concrete procedure is as follows:
1. Database-building stage
(1) Image standardization
The images are preprocessed with illumination normalization and size normalization. After preprocessing, the gray levels of all images are unified to a standard level and the gray levels are more clearly separated (thanks to histogram equalization). The preprocessed image size is 96 × 80.
(2) Blocking and rearrangement of the image
Blocking: each block is 8 × 10 and thus contains 80 pixels, the same as the number of pixels in one row of the standardized image. Each image is divided into 96 blocks, the same as the number of rows of the standardized image.
Rearrangement: each block is scanned row by row to obtain the corresponding row vector, and the 96 row vectors of an image are arranged in order into a 96 × 80 matrix, i.e. the rearranged two-dimensional image matrix. A quick arithmetic check is sketched below.
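The block arithmetic of this example can be checked mechanically; this sketch assumes the 96 × 80 standardized size stated above, which is what the constraints of step (2) require (80 pixels per block equals one row, 96 blocks equal the number of rows).

```python
m, n = 96, 80            # standardized image size (rows, columns)
bm, bn = 8, 10           # block size used in this example
p, q = m // bm, n // bn  # 12 x 8 grid of blocks
assert bm * bn == n      # 80 pixels per block = length of one image row
assert p * q == m        # 96 blocks = number of image rows, so A' is again 96 x 80
```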
(3) Computing the optimal projection vector groups
To find the best number of principal components, experiments are run with every number of vectors (i.e. principal components) in the optimal projection vector group from 1 to 20, yielding 20 different optimal projection vector groups.
(4) Extracting the feature matrices of the images and building the database
The blocked and rearranged two-dimensional image matrices are projected onto the optimal projection vector groups. Since 20 different numbers of principal components are considered, this step is repeated 20 times; each run follows the same procedure, but because the optimal projection vector groups differ, the resulting feature-matrix databases also differ, finally producing 20 feature-matrix databases.
2. Recognition stage
Each test image goes through image standardization, blocking and rearrangement, and feature-matrix extraction in turn; because of the different numbers of principal components, 20 feature matrices are produced. The distance between each feature matrix and every feature matrix in the corresponding database is then computed, and the nearest one is judged to be the matching face. Here "corresponding" means that the feature matrix of the test image and the feature-matrix database were produced with the same number of principal components.
The numbers of correctly and incorrectly matched face images are counted to compute the face recognition rate for each number of principal components; the random experiment is then repeated, and the mean recognition rate over the 30 experiments is taken as the final recognition rate.
The test results for this example are as follows:
Principal components    Average recognition rate    Principal components    Average recognition rate
1     83.58%     11    90.04%
2     86.94%     12    90.15%
3     88.09%     13    89.94%
4     88.45%     14    89.83%
5     88.85%     15    89.94%
6     89.39%     16    90.02%
7     89.87%     17    90.15%
8     90.02%     18    90.15%
9     90.04%     19    90.06%
10    90.29%     20    89.94%
The principal components span a subspace of the image space. As the number of principal components increases, this subspace expands toward the full image space, and the recognition performance of all PCA-type methods (including 2DPCA) gradually converges. The recognition performance of a PCA-type method therefore mainly refers to its ability to achieve a high recognition rate on a low-dimensional subspace. The experimental results of this example show that, when the number of principal components is small, the face recognition rate rises rapidly as the number of principal components increases, and the best recognition result is obtained with 10 principal components.
Fig. 4 shows the difference in face recognition rate between the algorithm that uses intra-block correlation and the algorithm that does not; the upper curve is the result of the algorithm using intra-block correlation, which is clearly better when the number of principal components is small. The experimental conditions are as above, using part of the FERET face database, with 53 × 3 images for training and 53 × 3 images for testing.
Fig. 5 shows, for different numbers of training samples, the difference in best face recognition rate between the algorithm using intra-block correlation and the algorithm not using it; the upper curve is the result of the algorithm using intra-block correlation. The experimental condition is to take k training samples per class (2 ≤ k ≤ 4), i.e. 53 × k images for training and 53 × (6 - k) images for testing.

Claims (4)

1. A face recognition method based on intra-block correlation, characterized by an algorithm that blocks and rearranges the face image, a method for computing optimal projection directions based on two-dimensional principal component analysis (abbreviated 2DPCA), and a recognition algorithm that compares similarity by computing Euclidean distances; the concrete steps are divided into a database-building stage and a recognition stage:
(1) Database-building stage
(a) standardize the training sample images, including illumination normalization and size normalization;
(b) block and rearrange the standardized training sample images to construct new two-dimensional image matrices;
(c) using the two-dimensional image matrices obtained in the previous step as input images and selecting a number of principal components, compute the optimal projection vector group X_1, X_2, ..., X_d with the 2DPCA method;
(d) project the newly constructed two-dimensional image matrices onto the optimal projection directions to extract the data reflecting the facial features, i.e. the feature matrices, and store them all in a database for use in the recognition stage.
(2) Recognition stage
(a) standardize the test image, including illumination normalization and size normalization;
(b) block and rearrange the standardized test image to construct a new two-dimensional image matrix;
(c) project the newly constructed two-dimensional image matrix onto the optimal projection directions to obtain the corresponding feature matrix;
(d) compute the Euclidean distance between the feature matrix of the test image and each feature matrix in the database; the face image corresponding to the nearest feature matrix is judged to be the matching image, and the class of the matching image is the recognition result.
2. The face recognition method according to claim 1, characterized in that the algorithm for blocking and rearranging the face image is as follows:
Any standardized image can be represented by an m × n two-dimensional matrix whose elements are the pixel gray values, denoted A:

A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}
The matrix A is partitioned into p × q sub-matrices of equal size:

A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{pmatrix}

where each block A_{kl} (k = 1, 2, ..., p; l = 1, 2, ..., q) is an m' × n' matrix, so that p × m' = m and q × n' = n, and the block size is required to satisfy m' × n' = n.
Each image is thus divided into p × q = m blocks, denoted in order D_i (i = 1, 2, ..., m), i.e. D_{(k-1)q+l} = A_{kl}. Each block is scanned row by row to obtain its row-vector representation V_i (i = 1, 2, ..., m), formed by concatenating the rows of D_i in order.
These row vectors are then rearranged in order to construct the new two-dimensional image matrix A':

A' = \begin{pmatrix} V_1 \\ V_2 \\ \vdots \\ V_m \end{pmatrix}
The new two-dimensional image matrix A' consists of m row vectors of dimension n (m' × n' = n); its size is the same as that of the original image, namely m × n, but each row vector of the new matrix is drawn from the pixels of one local block of the original image.
3. The face recognition method according to claim 1, characterized in that the method for computing the optimal projection directions based on two-dimensional principal component analysis (abbreviated 2DPCA) is as follows:
After blocking and rearrangement, each input image A_i (i = 1, 2, ..., N, where N is the total number of training images) corresponds to an m × n image matrix A_i'. From these two-dimensional image matrices the two-dimensional image covariance matrix is computed as:

G_t = \frac{1}{N} \sum_{i=1}^{N} (A_i' - \bar{A}')^{T} (A_i' - \bar{A}')
The optimal projection direction is then obtained from the criterion function:

J(X) = X^{T} G_t X
When the number of sample classes is large, one generally selects a group of mutually orthogonal optimal projection vectors X_1, X_2, ..., X_d that maximize the criterion function, i.e. satisfying:

\{X_1, X_2, \ldots, X_d\} = \arg\max J(X), \qquad X_i^{T} X_j = 0,\ i \neq j,\ i, j = 1, 2, \ldots, d
In practice, the optimal projection vector group X_1, X_2, ..., X_d consists of the eigenvectors of the two-dimensional image covariance matrix G_t corresponding to its d largest eigenvalues.
4. The face recognition method according to claim 1, characterized in that the recognition algorithm that compares similarity by computing Euclidean distances is as follows:
For any two images A_i and A_j with feature matrices B_i = [Y_1^{(i)}, Y_2^{(i)}, \ldots, Y_d^{(i)}] and B_j = [Y_1^{(j)}, Y_2^{(j)}, \ldots, Y_d^{(j)}], the distance between the feature matrices B_i and B_j is defined as:

d(B_i, B_j) = \sum_{k=1}^{d} \| Y_k^{(i)} - Y_k^{(j)} \|_2
where \| Y_k^{(i)} - Y_k^{(j)} \|_2 denotes the Euclidean distance between the two principal component vectors Y_k^{(i)} and Y_k^{(j)}.
All training samples A_1, A_2, ..., A_N (N is the total number of training samples) each belong to a known class ω_k (k = 1, 2, ..., C), and their feature matrices are B_1, B_2, ..., B_N. For a given test image A with feature matrix B, if d(B, B_l) = \min_j d(B, B_j) and B_l ∈ ω_k, then the classification result is A ∈ ω_k.
CNB2006101231973A 2006-11-02 2006-11-02 2-D main-element human-face analysis and identifying method based on relativity in block Expired - Fee Related CN100449567C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2006101231973A CN100449567C (en) 2006-11-02 2006-11-02 2-D main-element human-face analysis and identifying method based on relativity in block

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2006101231973A CN100449567C (en) 2006-11-02 2006-11-02 2-D main-element human-face analysis and identifying method based on relativity in block

Publications (2)

Publication Number Publication Date
CN1979523A true CN1979523A (en) 2007-06-13
CN100449567C CN100449567C (en) 2009-01-07

Family

ID=38130680

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2006101231973A Expired - Fee Related CN100449567C (en) 2006-11-02 2006-11-02 2-D main-element human-face analysis and identifying method based on relativity in block

Country Status (1)

Country Link
CN (1) CN100449567C (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101984576A (en) * 2010-10-22 2011-03-09 北京工业大学 Method and system for authenticating anonymous identity based on face encryption
CN102013020A (en) * 2009-09-08 2011-04-13 王晓刚 Method and system for synthesizing human face image
CN101706793B (en) * 2009-11-16 2012-09-26 中兴通讯股份有限公司 Method and device for searching picture
CN105447468A (en) * 2015-12-01 2016-03-30 无锡职业技术学院 Color image over-complete block feature extraction method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9400921B2 (en) * 2001-05-09 2016-07-26 Intel Corporation Method and system using a data-driven model for monocular face tracking

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102013020A (en) * 2009-09-08 2011-04-13 王晓刚 Method and system for synthesizing human face image
CN102013020B (en) * 2009-09-08 2015-03-04 王晓刚 Method and system for synthesizing human face image
CN101706793B (en) * 2009-11-16 2012-09-26 中兴通讯股份有限公司 Method and device for searching picture
CN101984576A (en) * 2010-10-22 2011-03-09 北京工业大学 Method and system for authenticating anonymous identity based on face encryption
CN101984576B (en) * 2010-10-22 2012-07-04 北京工业大学 Method and system for authenticating anonymous identity based on face encryption
CN105447468A (en) * 2015-12-01 2016-03-30 无锡职业技术学院 Color image over-complete block feature extraction method
CN105447468B (en) * 2015-12-01 2019-04-16 无锡职业技术学院 The excessively complete blocking characteristic abstracting method of color image

Also Published As

Publication number Publication date
CN100449567C (en) 2009-01-07

Similar Documents

Publication Publication Date Title
CN100410963C (en) Two-dimensional linear discrimination human face analysis identificating method based on interblock correlation
Guo et al. Support vector machines for face recognition
Ekenel et al. Feature selection in the independent component subspace for face recognition
CN100426314C (en) Feature classification based multiple classifiers combined people face recognition method
Li et al. Overview of principal component analysis algorithm
Hassaballah et al. Efficient eye detection method based on grey intensity variance and independent components analysis
US20240029431A1 (en) A data dimension reduction method based on maximizing ratio sum for linear discriminant analysis
CN106096517A (en) A kind of face identification method based on low-rank matrix Yu eigenface
Sisodia et al. ISVM for face recognition
Aly et al. A multi-modal feature fusion framework for kinect-based facial expression recognition using dual kernel discriminant analysis (DKDA)
CN100449567C (en) 2-D main-element human-face analysis and identifying method based on relativity in block
Kim et al. Teeth recognition based on multiple attempts in mobile device
Lu et al. Automatic gender recognition based on pixel-pattern-based texture feature
Kim et al. Image covariance-based subspace method for face recognition
Khalifa et al. The integrated usage of LBP and HOG transformations and machine learning algorithms for age range prediction from facial images
Ergin et al. Face Recognition by Using 2D Orthogonal Subspace Projections.
Das et al. Dimensionality reduction for handwritten digit recognition
Kawulok et al. Supervised relevance maps for increasing the distinctiveness of facial images
Amine et al. Investigation of feature dimension reduction based DCT/SVM for face recognition
Ali et al. Facial emotion recognition under partial occlusion using empirical mode decomposition
Ismaila et al. A study of features extraction algorithms for human face recognition
Shmaglit et al. Gender classification of human face images based on adaptive features and support vector machines
KHALIFA et al. Gender prediction from facial images using local binary patterns and histograms of oriented gradients transformations
Kathavarayan et al. Identification of untrained facial image in combined global and local preserving feature space
Essa et al. Fusing facial shape and appearance based features for robust face recognition

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090107

Termination date: 20101102