CN106096503A - A three-dimensional face recognition method based on keypoints and local features - Google Patents
Abstract
The invention discloses a three-dimensional face recognition method based on keypoints and local features. The steps are as follows: preprocess the three-dimensional face model, including face region cropping, smoothing, and pose normalization, so that all faces are placed in a reference pose coordinate system; detect keypoints using the valuable contour line and the mean curvature; construct the spatial structure of the local feature following the DAISY descriptor; use the shape index histogram, inclination angle histogram, and direction angle histogram as the local feature; match keypoints, and measure the similarity of two facial surfaces by the number of successfully matched keypoints. The method achieves strong recognition performance and is robust to expression variation.
Description
Technical field
The invention belongs to the technical field of face recognition, and specifically relates to a three-dimensional face recognition method based on keypoints and local features. In particular, it relates to a face recognition method that uses local features at facial keypoints, and it is especially suited to scenarios with large expression variation.
Background art
In modern society, accurate and reliable identity authentication receives increasing attention, with important applications in many settings such as access control, video surveillance, and human-computer interaction. Identification techniques are mainly based on human biometrics such as fingerprints, irises, and faces. Fingerprint- and iris-based authentication offers high accuracy and reliability, since these features are unique, but the need for user cooperation limits their applicability, whereas face recognition, being natural and non-intrusive, has broader application prospects. Research on automatic face recognition began around 1960. Traditional two-dimensional face recognition performs identity verification from the brightness information of an image, but it is strongly affected by illumination, pose, make-up, and other factors, so in recent years face recognition research has expanded from two-dimensional images to three-dimensional data.
Three-dimensional face recognition uses the spatial shape information of the face for identity authentication and is unaffected by illumination, make-up, and similar factors. Although it has clear advantages over two-dimensional face recognition, it still faces two challenges. First, expression variation: expressions cause non-rigid deformation of the face, and a recognition algorithm may struggle to tell whether the difference between two faces is caused by expression change or by different identities. Second, occlusion: occlusion causes loss of three-dimensional face data and degrades recognition.
Summary of the invention
Objective of the invention: to overcome the deficiencies of the prior art, the present invention provides a three-dimensional face recognition method based on keypoints and local features. The method achieves high recognition performance and is robust to expression variation.
Technical solution: to achieve the above objective, the present invention adopts the following technical solution:
A three-dimensional face recognition method based on keypoints and local features, comprising the following steps:
Step 1: preprocess the three-dimensional face model, including face region cropping, smoothing, and pose normalization; place all faces in a reference pose coordinate system to form the three-dimensional face point cloud of the facial surface;
Step 2: extract the central region of the face, extract the valuable contour line, uniformly sample the points on the valuable contour line, and obtain keypoint set P1;
Step 3: screen each point according to the mean curvature of the point and of its neighborhood to obtain keypoint set P2;
Step 4: construct the spatial structure of the local feature following the DAISY descriptor;
Step 5: compute the shape index histogram, inclination angle histogram, and direction angle histogram of each keypoint and its neighborhood as the local feature;
Step 6: match the keypoints on two facial surfaces to obtain the number of successfully matched keypoint pairs, and use this number as the basis for identity authentication.
Further, step 1 specifically comprises: extracting the face region from the input three-dimensional point cloud as a sphere of radius 90 mm centered at the nose tip; meshing the extracted point cloud; applying a mesh-based smoothing algorithm iteratively to denoise the three-dimensional face model; and converting the resulting smooth three-dimensional face mesh back into a three-dimensional face point cloud.
Further, step 2 specifically comprises the following steps:
Step 2.1: extract the central region of the face, defined as a sphere of radius r = 50 mm centered at the nose tip;
Step 2.2: detect the facial symmetry plane, using a mirror-registration method on the central region;
Step 2.3: extract the valuable contour line and obtain keypoint set P1. Specifically, the intersection of the facial symmetry plane with the facial surface is the central profile contour line; the cut-off position of the central profile contour line is selected according to the Y-axis coordinate of the nose tip, and the nasal portion of the central profile contour line is extracted as the valuable contour line; the points obtained by uniformly sampling the valuable contour line form keypoint set P1.
Further, in step 2.2, the mirror-registration method determines the facial symmetry plane as follows:
Step 2.2.1: let the central region surface be F and the initial symmetry plane be S; take S to be the YOZ plane;
Step 2.2.2: negate the x-axis coordinates of the central region surface F to obtain the mirror model with respect to this symmetry plane, and align the mirror model with the original input model using ICP registration;
Step 2.2.3: extract the symmetry plane. After the two models are aligned, take their average as the pose-corrected input model and process it with PCA to extract three orthogonal principal axes of the face; take the principal directions corresponding to the first, second, and third principal components as the Y-, X-, and Z-axis, respectively; the symmetry plane is then the YOZ plane of this coordinate system.
Further, step 3 specifically comprises the following steps:
Step 3.1: compute the mean curvature of each point and of its neighborhood. For a vertex p, its mean curvature C_mean(p) is computed from the maximum principal curvature C_M(p) and the minimum principal curvature C_m(p) as:
C_mean(p) = (C_M(p) + C_m(p)) / 2 (1)
Let its neighborhood be N(p); the mean curvature μ(p) of the neighborhood is the average of C_mean over N(p):
μ(p) = (1 / |N(p)|) · Σ_{q∈N(p)} C_mean(q) (2)
Step 3.2: screen keypoints. A vertex is considered a keypoint if its mean curvature satisfies the following condition; the keypoints obtained by this screening form keypoint set P2:
|C_mean(p)| ≥ (1 + a)·|μ(p)| or |C_mean(p)| ≤ (1 − b)·|μ(p)|, b ≤ 1, a ≥ 0 (3)
where a and b are two parameters that bound the difference between |C_mean(p)| and |μ(p)| and thereby control keypoint selection.
Further, step 4 specifically comprises the following steps:
Step 4.1: establish the standardized direction. For a keypoint p on the facial surface, its neighborhood N(p) is defined as:
N(p) = {q | d_g(p, q) < R} (4)
where d_g(p, q) is the geodesic distance from p to q and R is the neighborhood radius;
Step 4.1.1: transform every point q ∈ N(p) together with its normal vector n(q) into the following temporary local coordinate system:
C = {t(p'), t(p') × n(p'), n(p')} (5)
where p' is the transformed p, the unit normal vector n(p') is the Z-axis, and t(p') is a unit vector chosen at random in the tangent plane T(p') of the facial surface at p';
Step 4.1.2: project each transformed point q' and its normal vector n(q') onto the tangent plane T(p'); the corresponding angle θ(q') and magnitude mag(q') are:
θ(q') = arctan[n_y(q') / n_x(q')], mag(q') = (n_x(q')² + n_y(q')²)^(1/2) (6)
where n_x(q') = t(p') · n(q') and n_y(q') = (t(p') × n(p')) · n(q');
Step 4.1.3: build a Gaussian-weighted gradient histogram on the tangent plane T(p') of the temporary coordinate system; its width is 360 and each bin covers 1°. The Gaussian weight is:
w(p', q') = mag(q') · G_σ(d_g(p', q')) (7)
where the standard deviation σ is set to half of the radius R;
Step 4.1.4: take the peak of the Gaussian-weighted gradient histogram as the standardized direction d(p') at p'; every point in the neighborhood is then transformed into the new coordinate system:
C' = {d(p'), d(p') × n(p'), n(p')} (8)
Step 4.2: descriptor construction. The spatial structure of the descriptor is built on the tangent plane T(p') of the new coordinate system C'. With the keypoint p' and its 8 neighborhood points as centers, 9 circles are obtained and ordered counterclockwise; the 8 neighborhood points are obtained by uniform sampling around the circle centered at p', starting from the standardized direction.
Further, step 5 specifically comprises the following steps:
Step 5.1: compute the local feature of each circular neighborhood of keypoint p. The local feature of each circular region consists of three parts:
1) The shape index histogram hs, of length 8. The shape index is computed as:
SI(q) = 1/2 − (1/π) · arctan[(C_M(q) + C_m(q)) / (C_M(q) − C_m(q))] (9)
where q ∈ N(p); each SI(q) is weighted by a Gaussian kernel whose standard deviation is the geodesic distance from q to the circle center.
2) The inclination angle histogram ha, of length 8. The inclination angle α is defined as the angle between the normal vector n(q) of each point and the standardized direction d(p'):
α(q) = arccos⟨n(q), d(p')⟩ (10)
where q ∈ N(p); each α(q) is weighted by the angle between its normal vector and the normal vector of the keypoint.
3) The direction angle histogram hd, of length 8. The direction angle β is defined as the angle between the line l(q1, q2) connecting any two points q1 and q2 in the neighborhood and the keypoint normal n(p):
β(q1, q2) = arccos⟨l(q1, q2), n(p)⟩ (11)
Each of the above histograms is normalized to a unit vector, and they are concatenated to obtain the feature h_n of the n-th circle:
h_n = (hs_n, ha_n, hd_n) (12)
Step 5.2: compute the local feature of keypoint p by concatenating the features h_n of all circles to obtain the local feature F of keypoint p:
F = (h1, h2, …, h9) (13)
Further, step 6 specifically comprises the following steps:
Step 6.1: compute the angle γ. Let f_i and f_j be the features of the i-th keypoint on a gallery facial surface and the j-th keypoint on a probe facial surface, respectively; γ is computed as:
γ_{i,j} = arccos[(f_i · f_j) / (|f_i| · |f_j|)] (14)
Step 6.2: match keypoints. For each keypoint on the probe facial surface, compute the angle γ to every keypoint on the gallery facial surface and sort the angles in ascending order; if the ratio of the smallest angle to the second smallest is below 0.8, the match is accepted, otherwise it fails;
Step 6.3: recognition. Match the probe facial surface against every facial surface in the gallery; the probe is assigned to the same class as the gallery surface with the largest number of successfully matched keypoints.
Beneficial effects: the three-dimensional face recognition method based on keypoints and local features provided by the present invention detects keypoints using the valuable contour line and the mean curvature, constructs the spatial structure of the local feature following the DAISY descriptor, uses the shape index histogram, inclination angle histogram, and direction angle histogram as the local feature, matches keypoints, and measures the similarity of two facial surfaces by the number of successfully matched keypoints. The method has the following advantages:
1) the keypoints obtained by the algorithm are stable and representative;
2) the spatial structure of the keypoint local feature not only has rotational invariance, but also fully expresses the relationships within the neighborhood;
3) the keypoint local feature combines the shape index histogram, inclination angle histogram, and direction angle histogram, fully describing the geometric information of the facial surface, and therefore has good discriminability.
Description of the drawings
Fig. 1 is the overall flowchart of the three-dimensional face recognition method based on keypoints and local features provided by the present invention;
Fig. 2 shows the original face model;
Fig. 3 shows the central region of the face;
Fig. 4 shows the extracted central profile contour line of the face;
Fig. 5 shows the valuable contour line;
Fig. 6 shows keypoint set P1;
Fig. 7 shows keypoint set P2;
Fig. 8 shows the keypoints;
Fig. 9 shows the spatial structure of the DAISY descriptor.
Detailed description of the invention
The present invention is further described below with reference to the accompanying drawings.
Embodiment 1
The three-dimensional face recognition method based on keypoints and local features of the present invention implements the three-dimensional face recognition process on the Windows operating system with the Matlab R2015b programming environment. The experimental data come from the FRGC v2.0 three-dimensional face database, which contains 4007 face models from 466 subjects.
Fig. 1 is the overall flowchart of the method of the invention; the specific steps are as follows:
Step 1: extract the face region from the input three-dimensional point cloud as a sphere of radius 90 mm centered at the nose tip. Mesh the extracted three-dimensional face point cloud, apply a mesh-based smoothing algorithm iteratively to denoise the three-dimensional face model, and then convert the resulting smooth three-dimensional face mesh back into a three-dimensional face point cloud.
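The cropping in step 1 can be sketched as follows. This is a minimal NumPy illustration rather than the patent's Matlab implementation, and it assumes the nose-tip coordinate has already been detected by some other means.

```python
import numpy as np

def crop_face_region(points, nose_tip, radius=90.0):
    """Keep only the points within `radius` mm of the nose tip (step 1).

    points: (N, 3) array of XYZ coordinates in mm.
    nose_tip: length-3 array; assumed detected beforehand.
    """
    dists = np.linalg.norm(points - nose_tip, axis=1)
    return points[dists <= radius]

# toy cloud: two points inside the 90 mm sphere, one outside
cloud = np.array([[0.0, 0.0, 0.0],
                  [30.0, 40.0, 0.0],
                  [120.0, 0.0, 0.0]])
face = crop_face_region(cloud, np.zeros(3))
print(face.shape[0])  # 2
```

The meshing and mesh-based smoothing that follow are omitted here; they depend on the mesh data structure used.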
Step 2: with the nose tip as center and 50 mm as radius, extract the central region of the face; use the mirror-registration method to determine the facial symmetry plane and extract the central profile contour line of the face, thereby obtaining the valuable contour line; uniformly sample the points on the valuable contour line to obtain keypoint set P1. The specific steps are as follows:
Step 2.1: extract the central region of the face, defined as a sphere of radius r = 50 mm centered at the nose tip.
Step 2.2: detect the facial symmetry plane, using a mirror-registration method on the central region. Let the central region surface be F and the initial symmetry plane be S; here S is chosen as the YOZ plane. The computation proceeds as follows:
(1) Determine the initial symmetry plane. ICP registration is sensitive to the initial condition, so the choice of the initial symmetry plane determines, to some extent, the accuracy of the final symmetry plane. The initial plane is chosen as the YOZ plane of the face data after smoothing, denoising, and cropping.
(2) Negate the x-axis coordinates to obtain the mirror model with respect to this symmetry plane, and align the mirror model with the original input model; this step uses ICP registration.
(3) Extract the symmetry plane. After the two models are aligned, take their average as the pose-corrected input model and process it with PCA to extract three orthogonal principal axes of the face. Take the principal directions corresponding to the first, second, and third principal components as the Y-, X-, and Z-axis, respectively. Different faces then have approximately the same frontal pose in this coordinate system, and the symmetry plane is the YOZ plane of this coordinate system.
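The PCA step of the pose normalization above can be sketched as below; a minimal NumPy version that maps the first, second, and third principal components to the Y-, X-, and Z-axes. Sign disambiguation of the axes (so that, for instance, the face looks toward +Z) is omitted and would be needed in practice.

```python
import numpy as np

def normalize_pose(points):
    """Rotate a centered cloud so its 1st/2nd/3rd principal
    components align with the Y-, X-, and Z-axis (step 2.2.3)."""
    centered = points - points.mean(axis=0)
    # eigen-decomposition of the covariance matrix; reorder so the
    # largest-variance direction comes first
    eigvals, eigvecs = np.linalg.eigh(centered.T @ centered / len(points))
    order = np.argsort(eigvals)[::-1]
    pc = eigvecs[:, order]                               # 1st, 2nd, 3rd PC
    R = np.column_stack([pc[:, 1], pc[:, 0], pc[:, 2]])  # new X, Y, Z
    return centered @ R

# elongated toy cloud: the largest spread should end up along Y
rng = np.random.default_rng(0)
cloud = rng.normal(size=(500, 3)) * np.array([1.0, 5.0, 0.2])
aligned = normalize_pose(cloud)
var = aligned.var(axis=0)
print(int(np.argmax(var)))  # 1  (Y-axis carries the largest variance)
```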
Step 2.3: extract the valuable contour line and obtain keypoint set P1. The intersection of the facial symmetry plane with the facial surface is the central profile contour line. Select the cut-off position of the contour line according to the Y-axis coordinate of the nose tip, extract the nasal portion of the central profile contour line as the valuable contour line, and form keypoint set P1 from the points obtained by uniformly sampling the valuable contour line.
Step 3: screen each point according to the mean curvature of the point and of its neighborhood to obtain keypoint set P2. The specific steps are as follows:
Step 3.1: compute the mean curvature of each point and of its neighborhood. For a vertex p, its mean curvature C_mean(p) is computed from the maximum principal curvature C_M(p) and the minimum principal curvature C_m(p) as:
C_mean(p) = (C_M(p) + C_m(p)) / 2 (1)
Let its neighborhood be N(p); the mean curvature μ(p) of the neighborhood is the average of C_mean over N(p):
μ(p) = (1 / |N(p)|) · Σ_{q∈N(p)} C_mean(q) (2)
Step 3.2: screen keypoints. A vertex is considered a keypoint if its mean curvature satisfies the following condition; the keypoints obtained by this screening form keypoint set P2:
|C_mean(p)| ≥ (1 + a)·|μ(p)| or |C_mean(p)| ≤ (1 − b)·|μ(p)|, b ≤ 1, a ≥ 0 (3)
where a and b are two parameters that bound the difference between |C_mean(p)| and |μ(p)| and thereby control keypoint selection.
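The screening rule of Eq. (3) can be sketched directly. This is a toy illustration where the surface connectivity is given as an adjacency list (the neighborhood N(p)), with hypothetical thresholds a = b = 0.5; the patent does not fix specific values here.

```python
import numpy as np

def screen_keypoints(mean_curv, neighbors, a=0.5, b=0.5):
    """Return vertex indices whose mean curvature deviates enough from
    the neighborhood average, per Eq. (3)."""
    keep = []
    for p, nbrs in enumerate(neighbors):
        mu = float(np.mean([mean_curv[q] for q in nbrs]))  # Eq. (2)
        c = mean_curv[p]
        if abs(c) >= (1 + a) * abs(mu) or abs(c) <= (1 - b) * abs(mu):
            keep.append(p)
    return keep

# 4-vertex toy mesh: vertex 0 is a curvature peak, vertex 1 a flat spot
mean_curv = [1.0, 0.1, 0.5, 0.5]
neighbors = [[2, 3], [2, 3], [0, 1], [0, 1]]
print(screen_keypoints(mean_curv, neighbors))  # [0, 1]
```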
Step 4: construct the spatial structure of the local feature following the DAISY descriptor. The specific steps are as follows:
Step 4.1: establish the standardized direction. To achieve rotational invariance of the DAISY descriptor, a standardized direction must be established. For a keypoint p on the facial surface, its neighborhood N(p) is defined as:
N(p) = {q | d_g(p, q) < R} (4)
where d_g(p, q) is the geodesic distance from p to q and R is the neighborhood radius. First, transform every point q ∈ N(p) together with its normal vector n(q) into the following temporary local coordinate system:
C = {t(p'), t(p') × n(p'), n(p')} (5)
where p' is the transformed p, the unit normal vector n(p') is the Z-axis, and t(p') is a unit vector chosen at random in the tangent plane T(p') of the facial surface at p'. Then project each transformed point q' and its normal vector n(q') onto the tangent plane T(p'); the corresponding angle θ(q') and magnitude mag(q') are:
θ(q') = arctan[n_y(q') / n_x(q')], mag(q') = (n_x(q')² + n_y(q')²)^(1/2) (6)
where n_x(q') = t(p') · n(q') and n_y(q') = (t(p') × n(p')) · n(q'). Finally, build a Gaussian-weighted gradient histogram on the tangent plane T(p') of the temporary coordinate system; its width is 360 and each bin covers 1°. The Gaussian weight is:
w(p', q') = mag(q') · G_σ(d_g(p', q')) (7)
where the standard deviation σ is set to half of the radius R. The peak of the Gaussian-weighted gradient histogram is taken as the standardized direction d(p') at p'. Once d(p') is determined, every point in the neighborhood is transformed into the new coordinate system:
C' = {d(p'), d(p') × n(p'), n(p')} (8)
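The standardized-direction computation above reduces to finding the peak of a 360-bin weighted histogram. The sketch below assumes the projected angles θ(q'), magnitudes mag(q'), and geodesic distances d_g(p', q') are already available; the unnormalized Gaussian is used for the weight of Eq. (7), which does not change the argmax.

```python
import numpy as np

def standardized_direction(theta_deg, mag, geo_dist, R):
    """Peak of the Gaussian-weighted orientation histogram (Eq. 7);
    360 bins of 1 degree each, sigma = R / 2."""
    sigma = R / 2.0
    w = np.asarray(mag) * np.exp(-np.asarray(geo_dist) ** 2 / (2 * sigma ** 2))
    hist = np.zeros(360)
    bins = np.floor(np.mod(theta_deg, 360.0)).astype(int)
    np.add.at(hist, bins, w)           # accumulate weights per 1-degree bin
    return float(np.argmax(hist))      # dominant direction d(p'), in degrees

# two neighbors: the strongly weighted one near 10 degrees wins
print(standardized_direction([10.2, 200.7], [5.0, 1.0], [0.0, 0.0], R=10.0))  # 10.0
```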
Step 4.2: descriptor construction. The spatial structure of the descriptor is built on the tangent plane T(p') of the new coordinate system C'. With the keypoint p' and its 8 neighborhood points as centers, 9 circles are obtained and ordered counterclockwise, as shown in Fig. 9. The 8 neighborhood points are obtained by uniform sampling around the circle centered at p', starting from the standardized direction.
Step 5: compute the shape index histogram, inclination angle histogram, and direction angle histogram of each keypoint and its neighborhood as the local feature. The specific steps are as follows:
Step 5.1: compute the local feature of each circular neighborhood of keypoint p. The local feature of each circular region consists of three parts:
1) The shape index histogram hs, of length 8. The shape index is computed as:
SI(q) = 1/2 − (1/π) · arctan[(C_M(q) + C_m(q)) / (C_M(q) − C_m(q))] (9)
where q ∈ N(p); each SI(q) is weighted by a Gaussian kernel whose standard deviation is the geodesic distance from q to the circle center.
2) The inclination angle histogram ha, of length 8. The inclination angle α is defined as the angle between the normal vector n(q) of each point and the standardized direction d(p'):
α(q) = arccos⟨n(q), d(p')⟩ (10)
where q ∈ N(p); each α(q) is weighted by the angle between its normal vector and the normal vector of the keypoint.
3) The direction angle histogram hd, of length 8. The direction angle β is defined as the angle between the line l(q1, q2) connecting any two points q1 and q2 in the neighborhood and the keypoint normal n(p):
β(q1, q2) = arccos⟨l(q1, q2), n(p)⟩ (11)
Each of the above histograms is normalized to a unit vector, and they are concatenated to obtain the feature h_n of the n-th circle:
h_n = (hs_n, ha_n, hd_n) (12)
Step 5.2: compute the local feature of keypoint p by concatenating the features h_n of all circles to obtain the local feature F of keypoint p:
F = (h1, h2, …, h9) (13)
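The shape index and the 8-bin unit-normalized histograms of step 5 can be sketched as follows. The shape-index formula is the standard definition, used here as an assumption because the equation image of Eq. (9) is not reproduced in this text.

```python
import numpy as np

def shape_index(c_max, c_min):
    """Standard shape index from the principal curvatures (Eq. 9,
    assumed form); arctan2 handles the case c_max == c_min."""
    return 0.5 - np.arctan2(c_max + c_min, c_max - c_min) / np.pi

def histogram8(values, lo=0.0, hi=1.0):
    """8-bin histogram normalized to a unit vector, as in step 5."""
    h, _ = np.histogram(values, bins=8, range=(lo, hi))
    h = h.astype(float)
    n = np.linalg.norm(h)
    return h / n if n > 0 else h

si = shape_index(np.array([1.0, -1.0]), np.array([1.0, -1.0]))
print(si)                                   # [0. 1.]  (two opposite umbilic shapes)
hs = histogram8(si)
print(round(float(np.linalg.norm(hs)), 6))  # 1.0
```

The inclination and direction angle histograms would be built the same way from α(q) and β(q1, q2), with the histogram range set to [0, π].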
Step 6: match the keypoints on two facial surfaces to obtain the number of successfully matched keypoint pairs, and use this number as the basis for identity authentication. The specific steps are as follows:
Step 6.1: compute the angle γ. Let f_i and f_j be the features of the i-th keypoint on a gallery facial surface and the j-th keypoint on a probe facial surface, respectively; γ is computed as:
γ_{i,j} = arccos[(f_i · f_j) / (|f_i| · |f_j|)] (14)
Step 6.2: match keypoints. For each keypoint on the probe facial surface, compute the angle γ to every keypoint on the gallery facial surface and sort the angles in ascending order; if the ratio of the smallest angle to the second smallest is below 0.8, the match is accepted, otherwise it fails.
Step 6.3: recognition. Match the probe facial surface against every facial surface in the gallery; the probe is assigned to the same class as the gallery surface with the largest number of successfully matched keypoints.
In the above method, the gallery faces are processed offline, while probe faces are processed online.
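Steps 6.1 and 6.2 amount to a nearest-neighbor ratio test on feature angles. This is a minimal sketch, assuming Eq. (14) is the usual arccos of the normalized inner product between feature vectors.

```python
import numpy as np

def count_matches(probe_feats, gallery_feats, ratio=0.8):
    """Number of probe keypoints passing the ratio test of step 6.2."""
    matches = 0
    for f in probe_feats:
        gammas = sorted(
            float(np.arccos(np.clip(
                np.dot(f, g) / (np.linalg.norm(f) * np.linalg.norm(g)),
                -1.0, 1.0)))
            for g in gallery_feats)
        # accept when the best angle is clearly smaller than the runner-up
        if len(gammas) >= 2 and gammas[1] > 0 and gammas[0] / gammas[1] < ratio:
            matches += 1
    return matches

probe = [np.array([1.0, 0.0])]
gallery = [np.array([1.0, 0.05]), np.array([0.0, 1.0])]  # one near, one far
print(count_matches(probe, gallery))  # 1
```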
Embodiment 2
The method of embodiment 1 is verified experimentally, with the following steps:
Step 7: identity recognition experiments. All experiments use the rank-one recognition rate (R1RR) as the recognition performance index.
Step 7.1: experiment one uses the FRGC v2.0 database, which contains 4007 face point clouds of 466 subjects, including faces with expressions such as smile, surprise, and anger. The experiment takes the first scan of each subject as the gallery and the remaining scans as the probe set, and obtains a recognition rate of 96.9%.
Step 7.2: experiment two is based on the Bosphorus database, which contains 4666 face point clouds of 105 subjects, with a rich variety of expressions of large amplitude. This experiment uses 194 neutral faces as the gallery and the expressive faces as the probe set. When the probe expression is anger, disgust, fear, happiness, sadness, or surprise, R1RR is 94.4%, 89.9%, 95.7%, 96.9%, 98.5%, and 98.6%, respectively; with all expressive faces as the probe set, R1RR is 95.8%. It follows that even under large expression variation, the algorithm proposed by the present invention still recognizes well.
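The R1RR metric used in step 7 is simply the fraction of probes whose top-ranked gallery identity (here, the gallery surface with the most matched keypoints) is correct; a minimal sketch:

```python
def rank_one_recognition_rate(probe_ids, predicted_ids):
    """R1RR: fraction of probes whose rank-one gallery match has the
    correct identity."""
    correct = sum(p == q for p, q in zip(probe_ids, predicted_ids))
    return correct / len(probe_ids)

# 3 probes, 2 correctly identified at rank one
print(rank_one_recognition_rate(["s1", "s2", "s3"], ["s1", "s2", "s9"]))
```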
The above is only a preferred embodiment of the present invention. It should be pointed out that, for those of ordinary skill in the art, several improvements and modifications can be made without departing from the principles of the invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (8)
1. A three-dimensional face recognition method based on keypoints and local features, characterized by comprising the following steps:
Step 1: preprocess the three-dimensional face model, including face region cropping, smoothing, and pose normalization; place all faces in a reference pose coordinate system to form the three-dimensional face point cloud of the facial surface;
Step 2: extract the central region of the face, extract the valuable contour line, uniformly sample the points on the valuable contour line, and obtain keypoint set P1;
Step 3: screen each point according to the mean curvature of the point and of its neighborhood to obtain keypoint set P2;
Step 4: construct the spatial structure of the local feature following the DAISY descriptor;
Step 5: compute the shape index histogram, inclination angle histogram, and direction angle histogram of each keypoint and its neighborhood as the local feature;
Step 6: match the keypoints on two facial surfaces to obtain the number of successfully matched keypoint pairs, and use this number as the basis for identity authentication.
2. The three-dimensional face recognition method based on keypoints and local features according to claim 1, characterized in that step 1 specifically comprises: extracting the face region from the input three-dimensional point cloud as a sphere of radius 90 mm centered at the nose tip; meshing the extracted point cloud; applying a mesh-based smoothing algorithm iteratively to denoise the three-dimensional face model; and converting the resulting smooth three-dimensional face mesh back into a three-dimensional face point cloud.
3. The three-dimensional face recognition method based on keypoints and local features according to claim 1, characterized in that step 2 specifically comprises the following steps:
Step 2.1: extract the central region of the face, defined as a sphere of radius r = 50 mm centered at the nose tip;
Step 2.2: detect the facial symmetry plane, using a mirror-registration method on the central region;
Step 2.3: extract the valuable contour line and obtain keypoint set P1; specifically, the intersection of the facial symmetry plane with the facial surface is the central profile contour line; select the cut-off position of the central profile contour line according to the Y-axis coordinate of the nose tip, extract the nasal portion of the central profile contour line as the valuable contour line, and form keypoint set P1 from the points obtained by uniformly sampling the valuable contour line.
4. The three-dimensional face recognition method based on keypoints and local features according to claim 3, characterized in that in step 2.2 the mirror-registration method determines the facial symmetry plane as follows:
Step 2.2.1: let the central region surface be F and the initial symmetry plane be S; take S to be the YOZ plane;
Step 2.2.2: negate the x-axis coordinates of the central region surface F to obtain the mirror model with respect to this symmetry plane, and align the mirror model with the original input model using ICP registration;
Step 2.2.3: extract the symmetry plane; after the two models are aligned, take their average as the pose-corrected input model and process it with PCA to extract three orthogonal principal axes of the face; take the principal directions corresponding to the first, second, and third principal components as the Y-, X-, and Z-axis, respectively; the symmetry plane is then the YOZ plane of this coordinate system.
5. The three-dimensional face recognition method based on keypoints and local features according to claim 1, characterized in that step 3 specifically comprises the following steps:
Step 3.1: compute the mean curvature of each point and of its neighborhood; for a vertex p, its mean curvature C_mean(p) is computed from the maximum principal curvature C_M(p) and the minimum principal curvature C_m(p) as:
C_mean(p) = (C_M(p) + C_m(p)) / 2 (1)
let its neighborhood be N(p); the mean curvature μ(p) of the neighborhood is the average of C_mean over N(p):
μ(p) = (1 / |N(p)|) · Σ_{q∈N(p)} C_mean(q) (2)
Step 3.2: screen keypoints; a vertex is considered a keypoint if its mean curvature satisfies the following condition, and the keypoints obtained by this screening form keypoint set P2:
|C_mean(p)| ≥ (1 + a)·|μ(p)| or |C_mean(p)| ≤ (1 − b)·|μ(p)|, b ≤ 1, a ≥ 0 (3)
where a and b are two parameters that bound the difference between |C_mean(p)| and |μ(p)| and thereby control keypoint selection.
It is the most according to claim 1 based on key point with the three-dimensional face identification method of local feature, it is characterised in that: institute
State step 4 and specifically include following steps:
Step 4.1: establish the standardized direction: for a key point p on the face surface, its neighborhood N(p) is defined as:
N(p)={q | dg(p,q)<R} (4)
where dg(p,q) is the geodesic distance from point p to point q, and R is the neighborhood radius;
Step 4.1.1: transform every point q∈N(p), together with its normal vector n(q), into the following temporary local coordinate system:
C={t(p'), t(p')×n(p'), n(p')} (5)
where p' is the transformed p, the unit normal vector n(p') is the Z-axis, and t(p') is a randomly chosen unit vector in the tangent plane T(p') of the face surface at p';
Step 4.1.2: project the transformed points q' and their normal vectors n(q') onto the tangent plane T(p'); the corresponding angle θ(q') and modulus mag(q') are:
θ(q')=arctan(ny(q')/nx(q')), mag(q')=(nx(q')²+ny(q')²)^(1/2) (6)
where nx(q')=t(p')·n(q') and ny(q')=(t(p')×n(p'))·n(q');
Step 4.1.3: construct a Gaussian-weighted gradient histogram on the tangent plane T(p') under the temporary coordinate system; the histogram is 360 bins wide, each bin covering 1°, and the Gaussian weight is defined as:
w(p',q')=mag(q')·Gσ(dg(p',q')) (7)
where the standard deviation σ is set to half of the radius R;
Step 4.1.4: take the peak of said Gaussian-weighted gradient histogram as the standardized direction d(p') at p', and transform all points in the neighborhood into the new coordinate system:
C'={d(p'), d(p')×n(p'), n(p')} (8)
Step 4.2: construct the descriptor: the spatial structure of the descriptor is laid out on the tangent plane T(p') under the new coordinate system C'; nine circles are obtained by taking the key point p' and its 8 neighborhood points as centers and ordering them counterclockwise, said 8 neighborhood points being obtained by uniform sampling around the key point p', starting from the standardized direction.
The three-dimensional face recognition method based on key points and local features according to claim 1, characterized in that said step 5 specifically comprises the following steps:
Step 5.1: calculate the local feature of each circular neighborhood of key point p: the local feature of each circular region consists of three parts:
1) the shape index histogram hs, of length 8; the shape index is calculated by the following formula:
SI(q)=1/2−(1/π)·arctan[(CM(q)+Cm(q))/(CM(q)−Cm(q))] (9)
where q∈N(p); each SI(q) is weighted by a Gaussian kernel whose standard deviation is the geodesic distance from q to the circle center;
2) the inclination angle histogram ha, of length 8; the inclination angle α is defined as the angle between the normal vector n(q) of each point and the standardized direction d(p'):
α(q)=arccos(n(q)·d(p')) (10)
where q∈N(p); each α(q) is weighted by the angle between the normal vector of q and the normal vector of the key point;
3) the direction histogram hd, of length 8; the direction β is defined as the angle between the line l(q1,q2) connecting any two points q1 and q2 in the neighborhood and the key point normal vector n(p):
β(q1,q2)=arccos(l(q1,q2)·n(p)/|l(q1,q2)|) (11)
Each histogram described above is normalized to a unit vector; concatenating the three gives the feature hn of the n-th circle:
hn=(hsn,han,hdn) (12)
Step 5.2: calculate the local feature of key point p: concatenate the features hn of the nine circles to obtain the local feature f of key point p:
f=(h1,h2,…,h9) (13).
The three-dimensional face recognition method based on key points and local features according to claim 1, characterized in that said step 6 specifically comprises the following steps:
Step 6.1: calculate the angle γ: let fi and fj be the features of the i-th key point on a gallery face surface and the j-th key point on a test face surface respectively; γ is calculated as follows:
γi,j=arccos(fi·fj/(|fi||fj|)) (14)
Step 6.2: match the key points: for each key point on the test face surface, calculate the angle γi,j between its feature and the feature of every key point on the gallery face surface, and sort the angles γi,j in ascending order; if the ratio of the smallest angle to the second smallest is less than 0.8, the match is accepted, otherwise the match fails;
Step 6.3: recognition: match the key points of the test face surface against each face surface in the gallery set; the test face is judged to belong to the same class as the gallery face surface with the largest number of successfully matched key points.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610367550.6A CN106096503A (en) | 2016-05-30 | 2016-05-30 | A kind of based on key point with the three-dimensional face identification method of local feature |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106096503A true CN106096503A (en) | 2016-11-09 |
Family
ID=57229642
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610367550.6A Pending CN106096503A (en) | 2016-05-30 | 2016-05-30 | A kind of based on key point with the three-dimensional face identification method of local feature |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106096503A (en) |
- 2016-05-30 CN CN201610367550.6A patent/CN106096503A/en active Pending
Non-Patent Citations (7)
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107748859A (en) * | 2017-08-10 | 2018-03-02 | 东南大学 | A kind of three-dimensional face identification method under partial occlusion based on RADIAL |
CN107748859B (en) * | 2017-08-10 | 2021-04-27 | 东南大学 | Three-dimensional face recognition method under local shielding based on radial lines |
CN107679515A (en) * | 2017-10-24 | 2018-02-09 | 西安交通大学 | A kind of three-dimensional face identification method based on curved surface mediation shape image depth representing |
WO2019080488A1 (en) * | 2017-10-27 | 2019-05-02 | 东南大学 | Three-dimensional human face recognition method based on multi-scale covariance descriptor and local sensitive riemann kernel sparse classification |
CN108416331A (en) * | 2018-03-30 | 2018-08-17 | 百度在线网络技术(北京)有限公司 | Method, apparatus, storage medium and the terminal device that face symmetrically identifies |
CN108416331B (en) * | 2018-03-30 | 2019-08-09 | 百度在线网络技术(北京)有限公司 | Method, apparatus, storage medium and the terminal device that face symmetrically identifies |
CN109147430A (en) * | 2018-10-19 | 2019-01-04 | 渭南师范学院 | A kind of teleeducation system based on cloud platform |
CN111144169A (en) * | 2018-11-02 | 2020-05-12 | 深圳比亚迪微电子有限公司 | Face recognition method and device and electronic equipment |
CN109481248A (en) * | 2018-12-26 | 2019-03-19 | 浙江师范大学 | A kind of smart guide glasses |
CN109801236A (en) * | 2018-12-29 | 2019-05-24 | 中国科学院遥感与数字地球研究所 | A kind of photon point cloud denoising method based on mixed Gauss model |
CN109871818A (en) * | 2019-02-27 | 2019-06-11 | 东南大学 | Face identification method based on normal vector distribution histogram and covariance description |
CN110096983A (en) * | 2019-04-22 | 2019-08-06 | 苏州海赛人工智能有限公司 | The safe dress ornament detection method of construction worker in a kind of image neural network based |
CN110210318A (en) * | 2019-05-06 | 2019-09-06 | 深圳市华芯技研科技有限公司 | A kind of three-dimensional face identification method based on characteristic point |
WO2020248096A1 (en) * | 2019-06-10 | 2020-12-17 | 哈尔滨工业大学(深圳) | Local feature-based three-dimensional face recognition method and system |
CN110298275A (en) * | 2019-06-19 | 2019-10-01 | 东南大学 | Three-dimensional human ear identification method based on key point and local feature |
CN110298275B (en) * | 2019-06-19 | 2022-02-22 | 东南大学 | Three-dimensional human ear identification method based on key points and local features |
CN110335297B (en) * | 2019-06-21 | 2021-10-08 | 华中科技大学 | Point cloud registration method based on feature extraction |
CN110335297A (en) * | 2019-06-21 | 2019-10-15 | 华中科技大学 | A kind of point cloud registration method based on feature extraction |
CN110879972A (en) * | 2019-10-24 | 2020-03-13 | 深圳云天励飞技术有限公司 | Face detection method and device |
CN113168729B (en) * | 2019-12-09 | 2023-06-30 | 深圳大学 | 3D shape matching method and device based on local reference coordinate system |
CN113168729A (en) * | 2019-12-09 | 2021-07-23 | 深圳大学 | 3D shape matching method and device based on local reference coordinate system |
CN111414862A (en) * | 2020-03-22 | 2020-07-14 | 西安电子科技大学 | Expression recognition method based on neural network fusion key point angle change |
CN111414862B (en) * | 2020-03-22 | 2023-03-24 | 西安电子科技大学 | Expression recognition method based on neural network fusion key point angle change |
CN111652086B (en) * | 2020-05-15 | 2022-12-30 | 汉王科技股份有限公司 | Face living body detection method and device, electronic equipment and storage medium |
CN111652086A (en) * | 2020-05-15 | 2020-09-11 | 汉王科技股份有限公司 | Face living body detection method and device, electronic equipment and storage medium |
CN112183276A (en) * | 2020-09-21 | 2021-01-05 | 西安理工大学 | Partially-occluded face recognition method based on feature descriptors |
CN112183276B (en) * | 2020-09-21 | 2024-02-09 | 西安理工大学 | Partial occlusion face recognition method based on feature descriptors |
CN115839675A (en) * | 2023-02-20 | 2023-03-24 | 宜科(天津)电子有限公司 | Object contour line recognition system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106096503A (en) | A kind of based on key point with the three-dimensional face identification method of local feature | |
CN107145842B (en) | Face recognition method combining LBP characteristic graph and convolutional neural network | |
CN107194341B (en) | Face recognition method and system based on fusion of Maxout multi-convolution neural network | |
Wang et al. | Robust 3D face recognition by local shape difference boosting | |
WO2018107979A1 (en) | Multi-pose human face feature point detection method based on cascade regression | |
Islam et al. | Efficient detection and recognition of 3D ears | |
CN100517370C (en) | Identification recognizing method based on binocular iris | |
CN102270308B (en) | Facial feature location method based on five sense organs related AAM (Active Appearance Model) | |
CN109902590A (en) | Pedestrian's recognition methods again of depth multiple view characteristic distance study | |
CN103246875B (en) | A kind of three-dimensional face identification method based on facial contours Elastic Matching | |
CN106250858A (en) | A kind of recognition methods merging multiple face recognition algorithms and system | |
CN103679158A (en) | Face authentication method and device | |
CN103268483A (en) | Method for recognizing palmprint acquired in non-contact mode in open environment | |
Liu et al. | Finger vein recognition with superpixel-based features | |
CN108681737B (en) | Method for extracting image features under complex illumination | |
TW201137768A (en) | Face recognition apparatus and methods | |
CN104850838A (en) | Three-dimensional face recognition method based on expression invariant regions | |
CN104361313A (en) | Gesture recognition method based on multi-kernel learning heterogeneous feature fusion | |
CN109784219A (en) | A kind of face identification method, system and device based on concentration cooperated learning | |
CN106778489A (en) | The method for building up and equipment of face 3D characteristic identity information banks | |
CN112686191B (en) | Living body anti-counterfeiting method, system, terminal and medium based on three-dimensional information of human face | |
CN204791050U (en) | Authentication equipment | |
Jiang | A review of the comparative studies on traditional and intelligent face recognition methods | |
CN106022241A (en) | Face recognition method based on wavelet transformation and sparse representation | |
CN101533466A (en) | Image processing method for positioning eyes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20161109 |
RJ01 | Rejection of invention patent application after publication | |