CN103065122A - Facial expression recognition method based on facial motion unit combination features - Google Patents

Facial expression recognition method based on facial motion unit combination features

Info

Publication number
CN103065122A
CN103065122A CN2012105601443A CN201210560144A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012105601443A
Other languages
Chinese (zh)
Inventor
冯晓毅
彭进业
夏召强
范建平
赖阳明
王保平
谢红梅
李会方
何贵青
蒋晓月
吴俊
王珺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN2012105601443A priority Critical patent/CN103065122A/en
Publication of CN103065122A publication Critical patent/CN103065122A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a facial expression recognition method based on facial action unit (AU) combination features, which addresses the low recognition rate for single facial action units in existing AU-based facial expression recognition methods. The technical scheme comprises: building a large-scale facial expression database; clustering the training samples of each expression category with the affinity propagation (AP) clustering algorithm; determining the AU combination of each sub-category; fixing, together with the main AU combinations, the number of sub-categories under the same expression; taking the union of the sub-categories of all expressions as the total number of training-sample classes; and performing classifier training with the support vector machine (SVM) method. The method improves the recognition rate for single facial action units: the average recognition rate for a single AU rises from 87.5% in the prior art to 90.1%, an improvement of 2.6 percentage points.

Description

Facial expression recognition method based on facial action unit combination features
Technical field
The present invention relates to facial expression recognition methods, and in particular to a facial expression recognition method based on facial action unit combination features.
Background technology
Facial expression recognition is an important research direction in human-computer interaction, machine learning, intelligent control, image processing and related fields, and has become a focus of research at home and abroad. Based on the Facial Action Coding System from psychology, expression recognition methods built on facial action units perform recognition by establishing the relations between facial visual features and facial action units, and between facial action units and the six basic expressions. The introduction of facial action units narrows the wide gap between complex facial visual features and the six basic expressions.
Document 1 (Velusamy Sudha, Kannan Hariprasad, Anand Balasubramanian, Navathe Bilva, Anshul Sharma, "A Method to Infer Emotions From Facial Action Units", IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, 2011) discloses a facial expression recognition method based on facial action units. The method uses learned statistical relationships and a suitable matching procedure to establish a mapping between the six basic facial expressions and 15 specific facial action units (Action Units, AU), and then performs expression recognition. The method presupposes that the 15 specific AUs can be identified accurately, but research experience shows that an expression is generally the joint result of several AUs, so it is difficult to detect the 15 single AUs accurately in a facial expression image. For example, when the method is applied to the JAFFE database, its average recognition rate for single AUs is only 87.5%.
Summary of the invention
To overcome the poor recognition rate for single facial action units of existing facial expression recognition methods based on facial action units, the invention provides a facial expression recognition method based on facial action unit combination features. The method builds a large-scale facial expression database; uses the AP clustering algorithm to cluster the training samples of each basic expression category; determines the AU combination of each sub-category and, together with the main AU combinations, the number of sub-categories under the same expression; takes the union of the sub-categories of all expressions as the total number of training-sample classes; and trains classifiers with the SVM method, thereby improving the recognition rate for single facial action units.
The technical solution adopted by the invention to solve the technical problem is a facial expression recognition method based on facial action unit combination features, characterized by comprising the following steps:
(a) Build a large-scale expression database. Perform face detection and normalization on every expression image in the database. Have several people label each expression image with AUs manually; the AUs marked with probability greater than 50% form the AU combination of that image.
(b) Extract face local binary pattern (LBP) features. Divide the face into m × m sub-blocks, compute the LBP histogram of each sub-block, and concatenate the histograms of all sub-blocks to form the local binary pattern of each expression image.
(c) Compute the visual similarity between the LBP features Υ, Ψ of two images x, y with the following kernel function:

K(x, y) = e^(−χ²(Υ, Ψ)/θ) = ∏_{j=1..n} ∏_{i=1..m} e^(−χ²(γ_j^(i), ψ_j^(i))/θ_i)    (1)

where m is the number of sub-blocks each image is divided into, n is the feature dimensionality of each sub-block, and θ = [θ_1, …, θ_m], θ_i being the mean χ² value of the corresponding blocks of the two images. The χ² value is computed as

χ²(α, ξ) = Σ_i (α_i − ξ_i)² / (α_i + ξ_i)    (2)

where α_i and ξ_i are the i-th components of the LBP features of the two images x, y.
For the images in each expression-category training set, compute the pairwise similarities with formula (1), and then cluster the images of each category into different sub-classes with the AP clustering algorithm.
For each sub-class of each expression category, collect statistics of the corresponding AU combinations and determine the final AU combination of the sub-class by the maximum-probability criterion. Delete AU combinations, and their sub-classes, whose probability is below 10%.
The sub-classes of all expression categories together form the classes of the whole expression database.
(d) Perform face normalization and feature extraction, and classify the expression sub-classes with the trained SVM classifier.
The beneficial effect of the invention is: by building a large-scale facial expression database, clustering the training samples of each basic expression category with the AP clustering algorithm, determining the AU combination of each sub-class, fixing the number of sub-classes under the same expression with the main AU combinations, taking the union of the sub-classes of all expressions as the classes of the training samples, and training classifiers with the SVM method, the recognition rate for single facial action units is improved. In tests on the JAFFE database, the average recognition rate of the method of the invention for single AUs rises from 87.5% in the background art to 90.1%, an improvement of 2.6 percentage points.
The present invention is described in detail below with reference to the drawings and embodiments.
Description of drawings
Fig. 1 is the flow chart of the facial expression recognition method of the present invention based on facial action unit combination features.
Fig. 2 shows the geometric relations between the parts of the face.
Fig. 3 shows the local binary pattern features of facial expressions.
Embodiment
With reference to Figs. 1-3, the concrete steps of the method of the invention are as follows:
(a) Building the large-scale expression database.
Following the method of document 2 (X. Wang, X. Feng, and J. Peng, "A Novel Facial Expression Database Construction Method based on Web Images", in Proc. of the Third Intl. Conf. on Internet Multimedia Computing and Service (ICIMCS '11), 2011, pp. 124-127), 2000 images are downloaded from the web for each expression and filtered interactively, leaving 500 images per expression; these constitute a large-scale facial expression database of 500 × 6 = 3000 expression images in total.
For every expression image in this database, the eye positions are located manually, and face detection and normalization are performed according to the geometric proportions of the facial parts (as shown in Fig. 2).
Several people label every expression image with AUs manually, and the majority annotation is taken as the AU annotation of the image.
Synonyms of "happiness" such as "happy", "joyful", "smile" and "laugh" are used as keywords and entered into search engines such as ***.com, and the first 1000 images are selected. Junk images are then filtered out with the method of document 2 (they may also be rejected manually), leaving 500 valid images. This yields the 500 training samples of the "happiness" expression. The training samples of the other 5 basic expressions are obtained in the same way, forming the sample database.
Every expression image in the database is displayed on a computer, the eye positions are located manually with the mouse, and face detection and normalization are carried out according to the geometric relations between the two eyes and between the eyes and the mouth shown in Fig. 2.
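The eye-based normalization step can be sketched as a similarity transform that maps the two manually clicked eye positions onto fixed canonical points. The canonical eye coordinates and the 128 × 128 output size below are illustrative assumptions, and `normalize_face` is a hypothetical helper name; the patent fixes the geometry through the proportions of Fig. 2 without giving numeric values.

```python
import numpy as np

def normalize_face(image, left_eye, right_eye, out_size=128):
    """Rotate, scale and crop so the eyes land on fixed canonical points."""
    (lx, ly), (rx, ry) = left_eye, right_eye
    # Desired eye positions in the normalized image (assumed values).
    dst_l = np.array([0.3 * out_size, 0.35 * out_size])
    dst_r = np.array([0.7 * out_size, 0.35 * out_size])
    # Similarity transform (rotation + scale) aligning the eye segment.
    src_v = np.array([rx - lx, ry - ly], dtype=float)
    dst_v = dst_r - dst_l
    scale = np.linalg.norm(dst_v) / np.linalg.norm(src_v)
    ang = np.arctan2(dst_v[1], dst_v[0]) - np.arctan2(src_v[1], src_v[0])
    c, s = scale * np.cos(ang), scale * np.sin(ang)
    A = np.array([[c, -s], [s, c]])
    t = dst_l - A @ np.array([lx, ly], dtype=float)
    # Inverse-map every output pixel back into the source image
    # (nearest-neighbour sampling, clipped at the borders).
    inv_A = np.linalg.inv(A)
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    coords = np.stack([xs.ravel(), ys.ravel()]).astype(float)
    src = inv_A @ (coords - t[:, None])
    sx = np.clip(np.round(src[0]).astype(int), 0, image.shape[1] - 1)
    sy = np.clip(np.round(src[1]).astype(int), 0, image.shape[0] - 1)
    out = np.zeros((out_size, out_size), dtype=image.dtype)
    out[ys.ravel(), xs.ravel()] = image[sy, sx]
    return out
```

With the eyes at (30, 40) and (70, 40) in a 100 × 100 image, the output is an upright 128 × 128 face crop with the eyes on the assumed canonical row.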
Ten postgraduate students working in image processing label every expression image with AUs manually; the probability with which each AU is marked is computed, and the AUs marked with probability greater than 50% form the AU combination of the image.
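The majority-vote labelling above can be sketched as follows. The AU numbers and the split among ten annotators are made-up examples, and `consensus_au_combination` is a hypothetical helper name, not part of the patent.

```python
from collections import Counter

def consensus_au_combination(annotations, threshold=0.5):
    """Keep every AU that more than `threshold` of the annotators marked.

    `annotations` is a list of per-annotator AU sets for one image,
    e.g. [{6, 12}, {6, 12, 25}, {12}] for three annotators.
    """
    n = len(annotations)
    counts = Counter(au for labels in annotations for au in set(labels))
    return sorted(au for au, c in counts.items() if c / n > threshold)

# Ten annotators, as in the embodiment: AU6 is marked by 9/10 and AU12
# by 10/10, so both survive; AU25 (3/10) does not.
labels = [{6, 12}] * 6 + [{6, 12, 25}] * 3 + [{12}]
print(consensus_au_combination(labels))  # [6, 12]
```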
(b) Face feature extraction.
Following the method of document 3 (T. Ahonen, A. Hadid, and M. Pietikäinen, "Face Description with Local Binary Patterns: Application to Face Recognition", IEEE Trans. Pattern Analysis and Machine Intelligence, 28(12): 2037-2041, 2006), local binary pattern (LBP) features are extracted at the two scales R = 1 and R = 3. The face is divided into 5 × 5 sub-blocks, the LBP histogram of each sub-block is computed, and the histograms of all sub-blocks are concatenated to form the local binary pattern (LBP feature) of each expression image (see Fig. 3).
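A minimal sketch of the block-wise, two-scale LBP feature follows. It uses nearest-pixel sampling on the 8-point circle rather than the interpolated, uniform-pattern LBP of document 3, so it is a simplification of that descriptor, not an exact reimplementation.

```python
import numpy as np

def lbp_image(img, radius=1):
    """8-neighbour LBP codes with nearest-pixel sampling at `radius`."""
    img = img.astype(np.int32)
    h, w = img.shape
    angles = 2 * np.pi * np.arange(8) / 8
    dy = np.round(radius * np.sin(angles)).astype(int)
    dx = np.round(radius * np.cos(angles)).astype(int)
    r = radius
    center = img[r:h - r, r:w - r]
    code = np.zeros_like(center)
    for bit, (oy, ox) in enumerate(zip(dy, dx)):
        # Shifted view of the neighbour at offset (oy, ox).
        nb = img[r + oy:h - r + oy, r + ox:w - r + ox]
        code |= ((nb >= center).astype(np.int32) << bit)
    return code  # values in 0..255

def lbp_feature(img, grid=5, radii=(1, 3)):
    """Concatenate the normalized 256-bin LBP histograms of a
    grid x grid partition, at each radius, into one feature vector."""
    feats = []
    for radius in radii:
        code = lbp_image(img, radius)
        h, w = code.shape
        for i in range(grid):
            for j in range(grid):
                block = code[i * h // grid:(i + 1) * h // grid,
                             j * w // grid:(j + 1) * w // grid]
                hist, _ = np.histogram(block, bins=256, range=(0, 256))
                feats.append(hist / max(block.size, 1))
    return np.concatenate(feats)
```

With grid = 5 and radii (1, 3) as in the embodiment, the feature of one face has 2 × 25 × 256 = 12800 dimensions.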
(c) AP (Affinity Propagation) clustering.
The visual similarity between the LBP features Υ, Ψ of two images x, y is computed with the following kernel function:

K(x, y) = e^(−χ²(Υ, Ψ)/θ) = ∏_{j=1..n} ∏_{i=1..m} e^(−χ²(γ_j^(i), ψ_j^(i))/θ_i)    (1)

where m is the number of sub-blocks each image is divided into, n is the feature dimensionality of each sub-block, and θ = [θ_1, …, θ_m], θ_i being the mean χ² value of the corresponding blocks of the two images. The χ² value is computed as

χ²(α, ξ) = Σ_i (α_i − ξ_i)² / (α_i + ξ_i)    (2)

where α_i and ξ_i are the i-th components of the LBP features of the two images.
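Formulas (1) and (2) translate directly into code. The per-block features below are random toy data, and taking θ_i from the single pair itself is only a way to exercise the formulas; the patent estimates θ as a mean χ² value over corresponding blocks.

```python
import numpy as np

def chi2(a, b):
    """Formula (2): chi-square distance between two histograms."""
    den = a + b
    # Skip bins where both histograms are zero (0/0 terms contribute 0).
    return float(np.sum(np.where(den > 0,
                                 (a - b) ** 2 / np.maximum(den, 1e-12),
                                 0.0)))

def chi2_kernel(feat_x, feat_y, thetas):
    """Formula (1): product over sub-blocks of exp(-chi2 / theta_i).

    `feat_x`, `feat_y` are lists of per-block histograms; `thetas[i]`
    is the scale parameter for block i."""
    k = 1.0
    for gx, gy, th in zip(feat_x, feat_y, thetas):
        k *= np.exp(-chi2(gx, gy) / th)
    return k

# Toy usage: 25 blocks of 59-bin histograms per image.
rng = np.random.default_rng(0)
X = [rng.random(59) for _ in range(25)]
Y = [rng.random(59) for _ in range(25)]
thetas = [chi2(gx, gy) for gx, gy in zip(X, Y)]  # toy theta estimate
similarity = chi2_kernel(X, Y, thetas)
```

By construction the kernel is 1 for identical images and decays toward 0 as block-wise χ² distances grow.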
For the images in each expression-category training set, the pairwise similarities are computed with formula (1), and the AP clustering algorithm then clusters them into different sub-classes.
For each sub-class of each expression category, statistics of the corresponding AU combinations are collected, and the final AU combination of the sub-class is determined by the maximum-probability criterion. AU combinations, and their sub-classes, whose probability is below 10% are deleted.
The sub-classes of all expression categories together form the classes of the whole expression database.
For the 500 images of the "happiness" expression, the visual similarity of every pair of images is computed with formula (1), and the AP clustering algorithm clusters them into different sub-classes; in the present embodiment there are 7.
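Affinity propagation over a precomputed similarity matrix, as scikit-learn implements it, can stand in for this clustering step. The toy two-group similarity matrix below (negative squared distance between 2-D points) replaces the 500 "happiness" images and the kernel of formula (1); with the real data, S[i, j] would be K(x_i, x_j).

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Two tight groups of points stand in for two visually distinct
# sub-classes of one expression category.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.1, (10, 2)),
                 rng.normal(3.0, 0.1, (10, 2))])
# Similarity matrix: higher means more alike (AP maximizes similarity).
S = -((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)

ap = AffinityPropagation(affinity="precomputed", random_state=0)
sub_labels = ap.fit_predict(S)
n_subclasses = len(set(sub_labels.tolist()))
```

AP chooses the number of sub-classes itself (via the `preference` parameter, by default the median similarity), which matches the patent's need to discover the sub-class count per expression rather than fix it in advance.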
For the 7 sub-classes of the "happiness" expression, the corresponding AU combinations are tallied, and the combination of maximum probability is taken as the AU combination of each sub-class. If the maximum probability among a sub-class's AU combinations is below 20%, that combination and the sub-class are deleted. This leaves 4 sub-classes of the "happiness" expression and their corresponding AU combinations.
By this method, the six basic expressions are finally divided into 21 sub-classes; the expression and AU combination corresponding to each class are listed in Table 1.
Table 1 Basic expressions and their AU combinations
(Table 1 appears as an image in the original publication and is not reproduced here.)
This establishes the correspondence rules between AU combinations and the six basic expressions.
(d) SVM classifier training.
An SVM classifier is trained on the 21 expression sub-classes.
When recognizing the expression of a test sample, face normalization and feature extraction are performed with the methods of steps (a) and (b) above, and the trained SVM classifier then classifies the test sample.
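The subclass-level SVM training, and the mapping from a predicted sub-class back to its basic expression, can be sketched as below. The 2-D toy features, the RBF kernel (the patent does not state which SVM kernel is used), and the 4-subclass/2-expression setup are illustrative assumptions standing in for the 21 sub-classes and the LBP features.

```python
import numpy as np
from sklearn.svm import SVC

# Four separable clusters play the role of four expression sub-classes;
# two sub-classes belong to "happiness" and two to "sadness".
rng = np.random.default_rng(2)
centers = np.array([[0.0, 0.0], [0.0, 4.0], [4.0, 0.0], [4.0, 4.0]])
X = np.vstack([c + rng.normal(0.0, 0.2, (30, 2)) for c in centers])
sub_y = np.repeat([0, 1, 2, 3], 30)            # sub-class labels
sub_to_expr = {0: "happiness", 1: "happiness",
               2: "sadness", 3: "sadness"}     # Table-1-style mapping

# One multi-class SVM over all sub-classes, as in step (d).
clf = SVC(kernel="rbf", C=10.0).fit(X, sub_y)
pred_sub = int(clf.predict([[0.1, 3.9]])[0])
pred_expr = sub_to_expr[pred_sub]              # sub-class -> expression
```

Classifying at the sub-class level and then collapsing to the basic expression is exactly the two-step decision the patent describes; only the final mapping table comes from Table 1.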
In tests, the method of the invention applied to the JAFFE database achieves an average recognition rate of 90.1% for single AUs.

Claims (1)

1. A facial expression recognition method based on facial action unit combination features, characterized by comprising the following steps:
(a) building a large-scale expression database, performing face detection and normalization on every expression image in the database, and having several people label each expression image with AUs manually, the AUs marked with probability greater than 50% forming the AU combination of that image;
(b) extracting face local binary pattern (LBP) features: dividing the face into m × m sub-blocks, computing the LBP histogram of each sub-block, and concatenating the histograms of all sub-blocks to form the local binary pattern of each expression image;
(c) computing the visual similarity between the LBP features Υ, Ψ of two images x, y with the following kernel function:

K(x, y) = e^(−χ²(Υ, Ψ)/θ) = ∏_{j=1..n} ∏_{i=1..m} e^(−χ²(γ_j^(i), ψ_j^(i))/θ_i)    (1)

where m is the number of sub-blocks each image is divided into, n is the feature dimensionality of each sub-block, and θ = [θ_1, …, θ_m], θ_i being the mean χ² value of the corresponding blocks of the two images; the χ² value is computed as

χ²(α, ξ) = Σ_i (α_i − ξ_i)² / (α_i + ξ_i)    (2)

where α_i and ξ_i are the i-th components of the LBP features of the two images x, y;
for the images in each expression-category training set, computing the pairwise similarities with formula (1), and then clustering the images of each category into different sub-classes with the AP clustering algorithm;
for each sub-class of each expression category, collecting statistics of the corresponding AU combinations, determining the final AU combination of the sub-class by the maximum-probability criterion, and deleting AU combinations, and their sub-classes, whose probability is below 10%;
the sub-classes of all expression categories together forming the classes of the whole expression database;
(d) performing face normalization and feature extraction, and classifying the expression sub-classes with the trained SVM classifier.
CN2012105601443A 2012-12-21 2012-12-21 Facial expression recognition method based on facial motion unit combination features Pending CN103065122A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012105601443A CN103065122A (en) 2012-12-21 2012-12-21 Facial expression recognition method based on facial motion unit combination features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012105601443A CN103065122A (en) 2012-12-21 2012-12-21 Facial expression recognition method based on facial motion unit combination features

Publications (1)

Publication Number Publication Date
CN103065122A true CN103065122A (en) 2013-04-24

Family

ID=48107745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012105601443A Pending CN103065122A (en) 2012-12-21 2012-12-21 Facial expression recognition method based on facial motion unit combination features

Country Status (1)

Country Link
CN (1) CN103065122A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100266213A1 (en) * 2009-04-16 2010-10-21 Hill Daniel A Method of assessing people's self-presentation and actions to evaluate personality type, behavioral tendencies, credibility, motivations and other insights through facial muscle activity and expressions
CN102799870A (en) * 2012-07-13 2012-11-28 复旦大学 Single-training sample face recognition method based on blocking consistency LBP (Local Binary Pattern) and sparse coding
CN103168314A (en) * 2010-10-21 2013-06-19 三星电子株式会社 Method and apparatus for recognizing an emotion of an individual based on facial action units


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
AHONEN et al.: "Face Description with Local Binary Patterns: Application to Face Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, 30 October 2006 (2006-10-30) *
SUDHA VELUSAMY et al.: "A Method to Infer Emotions from Facial Action Units", 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 27 May 2011 (2011-05-27) *
TOBIAS GEHRIG et al.: "A Common Framework for Real-Time Emotion Recognition and Facial Action Unit Detection", 2011 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 25 June 2011 (2011-06-25) *
XIBO WANG et al.: "A Novel Facial Expression Database Construction Method Based on Web Images", ICIMCS '11: Proceedings of the Third International Conference on Internet Multimedia Computing and Service, 7 August 2011 (2011-08-07) *
ZHAO Hui et al.: "A Survey of Automatic Facial Action Unit Recognition" (人脸活动单元自动识别研究综述), Journal of Computer-Aided Design & Computer Graphics (计算机辅助设计与图形学学报), vol. 22, no. 5, 15 May 2010 (2010-05-15) *

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103559175A (en) * 2013-10-12 2014-02-05 华南理工大学 Spam mail filtering system and method based on clusters
CN103559175B (en) * 2013-10-12 2016-08-10 华南理工大学 A kind of Spam Filtering System based on cluster and method
CN104751198A (en) * 2013-12-27 2015-07-01 华为技术有限公司 Method and device for identifying target object in image
US9798956B2 (en) 2013-12-27 2017-10-24 Huawei Technologies Co., Ltd. Method for recognizing target object in image, and apparatus
CN104751198B (en) * 2013-12-27 2018-04-27 华为技术有限公司 The recognition methods of object in image and device
CN103971095B (en) * 2014-05-09 2017-02-01 西北工业大学 Large-scale facial expression recognition method based on multiscale LBP and sparse coding
CN103971095A (en) * 2014-05-09 2014-08-06 西北工业大学 Large-scale facial expression recognition method based on multiscale LBP and sparse coding
CN104680141B (en) * 2015-02-13 2017-11-14 华中师范大学 Facial expression recognizing method and system based on moving cell layering
CN104680141A (en) * 2015-02-13 2015-06-03 华中师范大学 Motion unit layering-based facial expression recognition method and system
CN106384083A (en) * 2016-08-31 2017-02-08 上海交通大学 Automatic face expression identification and information recommendation method
CN106779047B (en) * 2016-12-30 2019-06-18 纳恩博(北京)科技有限公司 A kind of information processing method and device
CN106779047A (en) * 2016-12-30 2017-05-31 纳恩博(北京)科技有限公司 A kind of information processing method and device
CN107633207A (en) * 2017-08-17 2018-01-26 平安科技(深圳)有限公司 AU characteristic recognition methods, device and storage medium
CN107633207B (en) * 2017-08-17 2018-10-12 平安科技(深圳)有限公司 AU characteristic recognition methods, device and storage medium
CN107704834A (en) * 2017-10-13 2018-02-16 上海壹账通金融科技有限公司 Householder method, device and storage medium are examined in micro- expression face
CN107704834B (en) * 2017-10-13 2021-03-30 深圳壹账通智能科技有限公司 Micro-surface examination assisting method, device and storage medium
CN107862292A (en) * 2017-11-15 2018-03-30 平安科技(深圳)有限公司 Personage's mood analysis method, device and storage medium
CN107862292B (en) * 2017-11-15 2019-04-12 平安科技(深圳)有限公司 Personage's mood analysis method, device and storage medium
CN108564016A (en) * 2018-04-04 2018-09-21 北京红云智胜科技有限公司 A kind of AU categorizing systems based on computer vision and method
CN108717663A (en) * 2018-05-18 2018-10-30 深圳壹账通智能科技有限公司 Face label fraud judgment method, device, equipment and medium based on micro- expression
CN108717663B (en) * 2018-05-18 2023-06-09 深圳壹账通智能科技有限公司 Facial tag fraud judging method, device, equipment and medium based on micro expression
TWI731297B (en) * 2018-05-22 2021-06-21 大陸商深圳壹賬通智能科技有限公司 Risk prediction method and apparatus, storage medium, and server
CN109584050A (en) * 2018-12-14 2019-04-05 深圳壹账通智能科技有限公司 Consumer's risk degree analyzing method and device based on micro- Expression Recognition
CN109784179A (en) * 2018-12-15 2019-05-21 深圳壹账通智能科技有限公司 Intelligent monitor method, apparatus, equipment and medium based on micro- Expression Recognition
CN109840513A (en) * 2019-02-28 2019-06-04 北京科技大学 A kind of micro- expression recognition method of face and identification device

Similar Documents

Publication Publication Date Title
CN103065122A (en) Facial expression recognition method based on facial motion unit combination features
EP2889805A2 (en) Method and system for emotion and behavior recognition
CN103279768B (en) A kind of video face identification method based on incremental learning face piecemeal visual characteristic
CN101464946A (en) Detection method based on head identification and tracking characteristics
CN104268134B (en) Subjective and objective classifier building method and system
CN105512624A (en) Smile face recognition method and device for human face image
CN105389593A (en) Image object recognition method based on SURF
CN104978550A (en) Face recognition method and system based on large-scale face database
CN102521561B (en) Face identification method on basis of multi-scale weber local features and hierarchical decision fusion
CN101930549B (en) Second generation curvelet transform-based static human detection method
CN102722712A (en) Multiple-scale high-resolution image object detection method based on continuity
CN103679189A (en) Method and device for recognizing scene
CN104298981A (en) Face microexpression recognition method
CN105844221A (en) Human face expression identification method based on Vadaboost screening characteristic block
CN105956570B (en) Smiling face's recognition methods based on lip feature and deep learning
CN104021375A (en) Model identification method based on machine learning
CN105117708A (en) Facial expression recognition method and apparatus
CN104268598A (en) Human leg detection method based on two-dimensional scanning lasers
CN105760472A (en) Video retrieval method and system
CN104077594A (en) Image recognition method and device
Paul et al. Extraction of facial feature points using cumulative histogram
CN102129568A (en) Method for detecting image-based spam email by utilizing improved gauss hybrid model classifier
CN103177266A (en) Intelligent stock pest identification system
CN103500323B (en) Based on the template matching method of self-adaptation gray level image filtering
CN105893941B (en) A kind of facial expression recognizing method based on area image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130424