CN107403161B - Biological feature recognition method and device - Google Patents

Biological feature recognition method and device

Info

Publication number
CN107403161B
CN107403161B (application CN201710637494.8A)
Authority
CN
China
Prior art keywords
image
network
palm vein
network node
node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710637494.8A
Other languages
Chinese (zh)
Other versions
CN107403161A (en)
Inventor
张晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co., Ltd.
Original Assignee
Goertek Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Technology Co., Ltd.
Priority to CN201710637494.8A
Publication of CN107403161A
Priority to PCT/CN2017/113585 (WO2019024350A1)
Application granted
Publication of CN107403161B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/25: Fusion techniques
    • G06F 18/253: Fusion techniques of extracted features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12: Fingerprints or palmprints
    • G06V 40/1347: Preprocessing; Feature extraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12: Fingerprints or palmprints
    • G06V 40/1365: Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a biometric recognition method and device. The method comprises: acquiring a palmprint image and a palm vein image of a user to be identified; fusing the palmprint image and the palm vein image to obtain a fused image; taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes; constructing, under each of multiple constraint conditions, a complex network from the network nodes that satisfy that constraint condition, thereby obtaining multiple complex networks; and forming a feature to be identified from the network features of the multiple complex networks. The invention improves the validity and accuracy of identity recognition.

Description

Biological feature recognition method and device
Technical field
The invention belongs to the field of biometric recognition, and in particular relates to a biometric recognition method and device.
Background art
Each individual has physiological or behavioral characteristics that can be uniquely measured or automatically identified and verified, i.e., biometric features. Biometric technology performs identity recognition and authentication using the biometric features that are unique to each individual; it usually refers to technologies in which a computer authenticates a user's identity by means of intrinsic human characteristics such as fingerprints, faces, or voice.
In the prior art, fingerprint recognition can be performed by acquiring each individual's fingerprint and converting it into fingerprint features. Face recognition can be performed by acquiring a facial image and converting it into facial features. Voice recognition can likewise be performed by acquiring the sound produced by each individual and converting it into voice features.
However, biometric features such as fingerprints, faces, and voice are easily tampered with: a fingerprint can be forged, a face can be occluded, and a voice can be altered with a voice changer. Effective identity recognition therefore cannot be guaranteed.
Summary of the invention
In view of this, the present invention provides a biometric recognition method and device that combine palmprint recognition with palm vein recognition, thereby solving the problem that effective identity recognition cannot be guaranteed in the prior art and improving the validity and accuracy of recognition.
To solve the above technical problem, a first aspect of the present invention provides a biometric recognition method, the method comprising:
acquiring a palmprint image and a palm vein image of a user to be identified;
fusing the palmprint image and the palm vein image to obtain a fused image;
taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes;
constructing, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition, so as to obtain multiple complex networks;
forming a feature to be identified from the network features of the multiple complex networks.
Preferably, each constraint condition is that the node distance between any two network nodes is less than a constraint distance, and different constraint conditions have different constraint distances;
the method further comprises:
calculating the node distance between any two network nodes;
and the step of constructing, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition so as to obtain multiple complex networks comprises:
constructing, based on the different constraint distances, a complex network from the network nodes whose node distance is less than the given constraint distance, so as to obtain multiple complex networks.
Preferably, calculating the node distance between any two network nodes comprises:
calculating the coordinate distance between the two network nodes according to their corresponding pixel coordinates;
normalizing the coordinate distance to obtain the node distance.
Preferably, fusing the palmprint image and the palm vein image to obtain a fused image comprises:
binarizing the palmprint image and the palm vein image, converting the pixels corresponding to the palmprint and the palm vein to a first value and converting the pixels not corresponding to the palmprint or the palm vein to a second value;
fusing the binarized palmprint image and palm vein image to obtain the fused image.
Preferably, taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes comprises:
taking the pixels in the fused image whose pixel value is the first value as network nodes.
Preferably, forming a feature to be identified from the network features of the multiple complex networks comprises:
determining the node degree of each network node in each complex network;
calculating the network features of each complex network according to the node degrees;
combining the network features of all of the multiple complex networks;
taking the combined network features as the feature to be identified.
Preferably, fusing the palmprint image and the palm vein image to obtain a fused image comprises:
denoising the palmprint image and the palm vein image of the user to be identified;
fusing the denoised palmprint image and palm vein image to obtain the fused image.
A second aspect of the present invention provides a biometric recognition device, the device comprising:
an image acquisition module, configured to acquire a palmprint image and a palm vein image of a user to be identified;
an image fusion module, configured to fuse the palmprint image and the palm vein image to obtain a fused image;
a node determination module, configured to take the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes;
a network construction module, configured to construct, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition, so as to obtain multiple complex networks;
a feature construction module, configured to form a feature to be identified from the network features of the multiple complex networks.
Preferably, each constraint condition is that the node distance between any two network nodes is less than a constraint distance, and different constraint conditions have different constraint distances;
the device further comprises:
a distance calculation module, configured to calculate the node distance between any two network nodes;
and the network construction module comprises:
a network construction unit, configured to construct, based on the different constraint distances, a complex network from the network nodes whose node distance is less than the given constraint distance, so as to obtain multiple complex networks.
Preferably, the distance calculation module comprises:
a distance calculation unit, configured to calculate the coordinate distance between any two network nodes according to their corresponding pixel coordinates;
a distance normalization unit, configured to normalize the coordinate distance to obtain the node distance.
Preferably, the image fusion module comprises:
an image conversion unit, configured to binarize the palmprint image and the palm vein image, converting the pixels corresponding to the palmprint and the palm vein to a first value and converting the pixels not corresponding to the palmprint or the palm vein to a second value;
a first fusion unit, configured to fuse the binarized palmprint image and palm vein image to obtain the fused image.
Preferably, the node determination module comprises:
a node determination unit, configured to take the pixels in the fused image whose pixel value is the first value as network nodes.
Preferably, the feature construction module comprises:
a first determination unit, configured to determine the node degree of each network node in each complex network;
a feature calculation unit, configured to calculate the network features of each complex network according to the node degrees;
a feature combination unit, configured to combine the network features of all of the multiple complex networks;
a second determination unit, configured to take the combined network features as the feature to be identified.
Preferably, the image fusion module comprises:
an image denoising unit, configured to denoise the palmprint image and the palm vein image of the user to be identified;
a second fusion unit, configured to fuse the denoised palmprint image and palm vein image to obtain the fused image.
Compared with the prior art, the present invention can achieve the following technical effects:
In the present invention, the palmprint image and the palm vein image of a user to be identified are acquired and fused to obtain a fused image, which combines the features of the palmprint image and the palm vein image and thus has greater discriminative power. The pixels in the fused image that correspond to the palmprint and the palm vein are taken as network nodes, and, under each of multiple constraint conditions, a complex network is constructed from the network nodes that satisfy that constraint condition, yielding multiple complex networks whose network features form the feature to be identified. Because the multiple complex networks are built from the highly discriminative palmprint and palm vein pixels and combine the features of both, the feature to be identified formed from their network features better characterizes the user to be identified and improves the validity and accuracy of recognition.
Brief description of the drawings
The drawings described herein are provided for a further understanding of the present invention and constitute a part of the present invention. The illustrative embodiments of the present invention and their description are used to explain the present invention and do not constitute an improper limitation of the present invention. In the drawings:
Fig. 1 is a flowchart of one embodiment of a biometric recognition method according to an embodiment of the present invention;
Fig. 2 is a flowchart of another embodiment of a biometric recognition method according to an embodiment of the present invention;
Fig. 3 is a structural schematic diagram of one embodiment of a biometric recognition device according to an embodiment of the present invention;
Fig. 4 is a structural schematic diagram of another embodiment of a biometric recognition device according to an embodiment of the present invention.
Detailed description of the embodiments
The embodiments of the present invention are described in detail below with reference to the accompanying drawings and examples, so that how the present invention applies technical means to solve technical problems and achieve technical effects can be fully understood and implemented.
The embodiments of the present invention are mainly applied in the field of biometric recognition: two kinds of features, palmprint and palm vein, are acquired to characterize the biometric features of the user to be identified, which can improve the validity and accuracy of recognition.
In the prior art, biometric recognition mostly uses face recognition, iris recognition, or voice recognition, but these identification features are easily tampered with, so effective recognition cannot be guaranteed.
The inventor found through research that the palmprint and the palm vein are both biometric features of the human body and that each person's palmprint and palm vein are different, so they can be used for biometric recognition. However, the inventor also found that when palmprint recognition is used alone, the palmprint is an exposed, external biometric feature that is easy to forge; and when palm vein recognition is used alone, the palm vein provides relatively few features, so the recognition accuracy is low. To overcome these drawbacks, the inventor conceived of fusing palmprint recognition and palm vein recognition to obtain a fused feature of the palmprint and the palm vein, which both guarantees the recognition accuracy of the identification feature of the user to be identified and ensures that this identification feature cannot be forged; the technical solution of the present invention is proposed accordingly.
In the embodiments of the present invention, a palmprint image and a palm vein image of a user to be identified are acquired and fused to obtain a fused image; network nodes are determined from the fused image, and different constraint conditions are used to determine multiple different complex networks; the feature to be identified is formed from the network features of the multiple complex networks. The constructed complex networks are the basis for determining the feature to be identified, and since they are obtained from the pixels of the fused image, the constructed identification information is guaranteed to contain both kinds of biometric features at the same time, which improves the validity and accuracy of recognition.
The embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, which is a flowchart of one embodiment of a biometric recognition method provided by an embodiment of the present invention, the method may include the following steps:
101: acquiring a palmprint image and a palm vein image of a user to be identified.
The palmprint image and the palm vein image have the same size; for example, both may be images with a pixel height of h and a pixel width of w, so that the size of the pixel matrix is w*h.
Optionally, the palmprint image and the palm vein image may be acquired with a multispectral palm imaging sensor. A multispectral palm imaging sensor can acquire different types of palm images under different light sources; for example, the palmprint image can be acquired under a visible light source and the palm vein image under an infrared light source.
Optionally, the palmprint image and the palm vein image of the user to be identified may be acquired consecutively within a short time interval, so that the position of the palm in the acquired palmprint image and palm vein image remains unchanged. The palmprint image and the palm vein image may also be acquired at different times, as long as the palm position does not change between the acquisition of the palmprint image and that of the palm vein image.
102: fusing the palmprint image and the palm vein image to obtain a fused image.
The palmprint image and the palm vein image contain palmprint features and palm vein features respectively, so fusing the two images also fuses the palmprint features and the palm vein features.
Fusing the palmprint image and the palm vein image may mean overlaying the two images pixel by pixel according to the position of each pixel, so that each position corresponds to two pixel values. Taking the case in which the pixel height of the palmprint image and the palm vein image is h and the pixel width is w as an example, each image can be represented as a w*h two-dimensional matrix, and after the two are overlaid the fused image can be represented as a 2*w*h three-dimensional matrix; a minimal sketch of this stacking is given below.
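The following is a minimal sketch, in Python with NumPy, of the stacking-style fusion just described, assuming both images have already been cropped to the same size; the function name and array layout are illustrative and are not taken from the patent.

import numpy as np

def fuse_images(palmprint: np.ndarray, palm_vein: np.ndarray) -> np.ndarray:
    """Stack same-sized palmprint and palm vein images into the 2*w*h fused image described above."""
    if palmprint.shape != palm_vein.shape:
        raise ValueError("palmprint and palm vein images must have the same size")
    # Axis 0 indexes the modality: fused[0] is the palmprint plane, fused[1] is the palm vein plane.
    return np.stack([palmprint, palm_vein], axis=0)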
103: taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes.
In the fused image, the pixel values corresponding to the palmprint and the palm vein differ greatly from the pixel values of the surrounding non-palmprint and non-palm-vein regions; therefore, this difference in pixel values can be used to determine which pixels in the fused image correspond to the palmprint and the palm vein.
104: constructing, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition, so as to obtain multiple complex networks.
A constraint condition is mainly used to constrain the network nodes: when network nodes satisfy a constraint condition, those network nodes can be used to construct a complex network.
Optionally, under different constraint conditions the network nodes that satisfy the constraint condition are also different, so the complex networks built from different sets of network nodes are also different.
Since the network nodes are the pixels corresponding to the palmprint and the palm vein, the nodes of each complex network are in fact determined by the palmprint and palm vein pixels, and the complex networks can therefore be used to determine features related to the palmprint and the palm vein.
Optionally, the network features of the multiple complex networks can be obtained; for example, the network features may be characteristic values such as the average degree, the degree variance, and the maximum degree.
105: forming a feature to be identified from the network features of the multiple complex networks.
Optionally, the network features of the multiple complex networks may be concatenated in series into the feature to be identified. Taking the case in which the network feature of each complex network is a 1*3 two-dimensional matrix as an example, if there are 10 complex networks, the concatenated feature to be identified is a 10*3 matrix (a minimal sketch of this concatenation is given after this passage). In the embodiment of the present invention, the palmprint image and the palm vein image are acquired to obtain their fused image, so that the two biometric features, palmprint and palm vein, can be fused; the pixels corresponding to the palmprint and the palm vein are used as network nodes to determine different complex networks, i.e., multiple network associations between the palmprint and the palm vein are obtained, from which different network features can be determined. The feature to be identified formed from the network features of the multiple complex networks has more comprehensive content and can improve the validity and accuracy of recognition.
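A minimal sketch of the serial concatenation, assuming each complex network yields a fixed-length feature vector (the 1*3 shape follows the example in the text; the function name is illustrative):

import numpy as np

def build_feature(network_features: list[np.ndarray]) -> np.ndarray:
    """Stack per-network feature vectors, e.g. ten 1*3 vectors, into one 10*3 feature to be identified."""
    return np.vstack(network_features)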
As shown in Fig. 2, which is a flowchart of another embodiment of a biometric recognition method provided by an embodiment of the present invention, in this embodiment the constraint conditions may be that the node distance between any two network nodes is less than a constraint distance, with different constraint conditions having different constraint distances. The method may include the following steps:
201: acquiring a palmprint image and a palm vein image of a user to be identified;
202: fusing the palmprint image and the palm vein image to obtain a fused image;
203: taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes;
204: calculating the node distance between any two network nodes.
205: constructing, based on the different constraint distances, a complex network from the network nodes whose pairwise node distance is less than each constraint distance, so as to obtain multiple complex networks.
When the fused image is regarded as a three-dimensional matrix, every pixel can be understood as being arranged in the manner of a three-dimensional matrix, so every pixel has a corresponding coordinate point and the coordinate distance between any two network nodes can be determined.
Assuming that the coordinate position of a pixel in the palmprint image is (X1, Y1, Z1) and the coordinate position of a pixel in the palm vein image is (X2, Y2, Z2), the coordinate distance D between the two network nodes can be calculated as the Euclidean distance D = sqrt((X1-X2)^2 + (Y1-Y2)^2 + (Z1-Z2)^2).
Of course, besides the coordinate distance between a palmprint pixel and a palm vein pixel, the coordinate distances between two palmprint pixels and between two palm vein pixels may also be included.
The constraint distance is a distance constant used to constrain the node distance between two network nodes, and it is estimated from the node distances. For different palmprint and palm vein images the node distances may differ, and the differences between node distances may be large; for node distances that differ greatly, correspondingly different constraint distances would have to be determined. When a large number of palmprint and palm vein images are involved, determining the constraint distances one by one is rather complicated and is not conducive to wide application of the invention.
Therefore, optionally, calculating the node distance between any two network nodes may include:
calculating the coordinate distance between the two network nodes according to their corresponding pixel coordinates;
normalizing the coordinate distance to obtain the node distance.
The normalization computes the ratio of the coordinate distance between the network nodes to the size of the fused image, yielding the corresponding normalized distance; for example, one normalization consistent with this description is d = D / sqrt(w^2 + h^2 + 2^2), i.e., dividing by the diagonal of the 2*w*h fused volume,
where w is the width and h is the height of the palmprint image and the palm vein image; since the fused image is obtained by overlaying the palmprint image and the palm vein image, the third dimension (depth) of the fused image is 2.
Normalizing the coordinate distances in this way unifies them and maps the constraint distances into the interval (0, 1]. Consequently, when a large number of palmprint and palm vein images are processed, the same set of constraint distances can be used and the constraint distances do not need to be redefined repeatedly, which simplifies the calculation process and facilitates large-scale application of the invention. A minimal sketch covering the distance calculation, normalization, and thresholded network construction is given below.
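The following sketch of steps 204 and 205 uses the diagonal-based normalization suggested above; the normalization constant, the boolean adjacency-matrix representation, and the helper names are assumptions for illustration and are not taken from the patent.

import numpy as np

def node_distances(nodes: np.ndarray, w: int, h: int) -> np.ndarray:
    """Pairwise normalized node distances for node coordinates of shape (n, 3) in the 2*w*h fused image."""
    diff = nodes[:, None, :] - nodes[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))            # Euclidean coordinate distances
    return dist / np.sqrt(w ** 2 + h ** 2 + 2 ** 2)     # normalize into (0, 1] by the fused-volume diagonal

def build_networks(norm_dist: np.ndarray, constraint_distances: list[float]) -> list[np.ndarray]:
    """One adjacency matrix per constraint distance: connect node pairs whose node distance is below it."""
    n = norm_dist.shape[0]
    networks = []
    for t in constraint_distances:
        adj = (norm_dist < t) & ~np.eye(n, dtype=bool)  # threshold the distances, no self-loops
        networks.append(adj)
    return networks

Because the constraint distances lie in (0, 1] after normalization, the same list of thresholds can be reused for every acquired palm, which is the simplification the text describes.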
206: forming a feature to be identified from the network features of the multiple complex networks.
The network nodes are formed from the palmprint and palm vein pixels, and each of the multiple complex networks is formed by the links between pairs of network nodes that satisfy the constraint distance; it therefore records the distance relationships between network nodes rather than their absolute spatial positions. Since the feature to be identified is formed from the network features of the multiple complex networks, it represents the relative positional relationships of the palmprint and palm vein pixels and is robust to errors such as rotation and displacement. In other words, each time the palmprint image and the palm vein image are acquired to obtain the feature to be identified, the result is not affected by rotation, displacement, and the like, and a more stable feature to be identified can be obtained.
For example, suppose that when the palmprint image and the palm vein image are acquired for the first time the fingers point toward the 12 o'clock direction, and when they are acquired for the second time the fingers point toward the 1 o'clock direction. Although the two acquisitions differ by 30 degrees, the relative positional relationships between the palmprint and palm vein pixels, and hence the constructed complex networks, are unchanged, so the obtained feature to be identified is relatively stable.
In the embodiment of the present invention, different complex networks are determined from the node distances between pairs of network nodes, so multiple different complex networks can be obtained; because the complex networks are built from the palmprint and palm vein pixels, multiple network features of the palmprint and the palm vein can be determined, making the network features more accurate and yielding a better recognition effect and accuracy.
As another embodiment, fusing the palmprint image and the palm vein image to obtain a fused image includes:
binarizing the palmprint image and the palm vein image, converting the pixels corresponding to the palmprint and the palm vein to a first value and converting the pixels not corresponding to the palmprint or the palm vein to a second value;
and fusing the binarized palmprint image and palm vein image to obtain the fused image.
Optionally, taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes includes:
taking the pixels in the fused image whose pixel value is the first value as network nodes.
Binarizing the palmprint image and the palm vein image means extracting the palmprint in the palmprint image and the palm vein in the palm vein image, marking their corresponding pixels with a defined first value and marking all other, non-palmprint and non-palm-vein pixels with a defined second value, so that the palmprint in the palmprint image and the palm vein in the palm vein image can be clearly identified.
Optionally, to make the palmprint and the palm vein more distinct, the first value may be 1 and the second value may be 0.
Optionally, the palmprint image and the palm vein image may be binarized using a binarization algorithm.
The binarization algorithm may be, for example, LBP (Local Binary Patterns) or a mean-window filtering algorithm.
In the embodiment of the present invention, before the palmprint image and the palm vein image are fused, they are first binarized, so that features such as the palmprint in the palmprint image and the palm vein in the palm vein image are retained while other useless features are discarded. The pixels belonging to the palmprint and the palm vein can thus be determined accurately, and using the accurately determined palmprint and palm vein pixels as network nodes allows accurate complex networks to be constructed, from which a more accurate feature to be identified can be determined, further improving the accuracy and validity of the identification feature. A minimal sketch of this binarization and node extraction is given below.
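The following sketch uses a simple global threshold as a stand-in for the binarization step (the patent names LBP and mean-window filtering as options; the threshold value is an assumption for illustration only) and assumes the binarized images are then fused with the stacking sketch shown earlier.

import numpy as np

def binarize(image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Map palmprint or palm vein pixels to the first value 1 and all other pixels to the second value 0."""
    return (image > threshold).astype(np.uint8)

def extract_nodes(fused: np.ndarray) -> np.ndarray:
    """Return the (modality, row, column) coordinates of pixels whose value is the first value 1."""
    return np.argwhere(fused == 1)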
As another embodiment, forming a feature to be identified from the network features of the multiple complex networks includes:
determining the node degree of each network node in each complex network;
calculating the network features of each complex network according to the node degrees;
combining the network features of all of the multiple complex networks;
and taking the combined network features as the feature to be identified.
The node degree of a network node refers to the number of connections between that network node and other network nodes. For example, if a network node is connected to 3 other network nodes, the degree of that network node is 3.
Determining the node degree of each network node in a complex network is in fact determining the connection relationships between each network node and the other network nodes, and the corresponding network features can then be determined from these connection relationships.
Calculating the network features of a complex network may include:
calculating the average degree, degree variance, and/or maximum degree of the complex network.
The average degree is the mean of the degrees of all network nodes; the degree variance is the variance of the degrees, calculated from the average degree and the individual node degrees; the maximum degree is the largest of the degrees of all network nodes. A minimal sketch of these degree-based features is given below.
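A minimal sketch, assuming the adjacency-matrix representation used in the earlier sketch; the feature order (average degree, degree variance, maximum degree) follows the definitions above, and the 1*3 shape matches the earlier example.

import numpy as np

def network_feature(adjacency: np.ndarray) -> np.ndarray:
    """Compute (average degree, degree variance, maximum degree) of one complex network."""
    degrees = adjacency.sum(axis=1)  # node degree = number of connected network nodes
    return np.array([degrees.mean(), degrees.var(), degrees.max()])

Applying network_feature to each of the multiple complex networks and stacking the results with the concatenation sketch given earlier yields the 10*3-style feature to be identified from the previous example.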
In the embodiment of the present invention, the multiple network features of the fused image are calculated based on the node degree of each network node. Using node degrees in this way allows the network features of each complex network to be determined more accurately, so more accurate network features can be obtained, which in turn improves the validity and accuracy of recognition.
As another embodiment, fusing the palmprint image and the palm vein image to obtain a fused image includes:
denoising the palmprint image and the palm vein image of the user to be identified;
and fusing the denoised palmprint image and palm vein image to obtain the fused image.
Denoising the palmprint image and the palm vein image of the user to be identified may mean filtering out the high-frequency components in these images.
Optionally, the high-frequency components in the palmprint image and the palm vein image of the user to be identified may be filtered out using a denoising algorithm such as a wavelet transform, a Kalman filtering algorithm, or a median filtering algorithm; a minimal sketch using median filtering is given below.
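A minimal sketch of this denoising step using median filtering, one of the algorithms listed above; the SciPy call and the 3x3 window size are illustrative assumptions and are not specified by the patent.

import numpy as np
from scipy.ndimage import median_filter

def denoise(image: np.ndarray, window: int = 3) -> np.ndarray:
    """Suppress high-frequency noise in a palmprint or palm vein image with a median filter."""
    return median_filter(image, size=window)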
In the embodiment of the present invention, the palmprint image and the palm vein image are denoised before they are fused, which makes the palmprint in the palmprint image and the palm vein in the palm vein image clearer. When the palmprint and the palm vein are determined, the reduction of noise makes the pixels corresponding to the palmprint and the palm vein more accurate, so more accurate complex networks can be determined and a more accurate feature to be identified can be obtained, further improving the validity and accuracy of recognition.
As shown in Fig. 3, which is a structural schematic diagram of one embodiment of a biometric recognition device according to an embodiment of the present invention, the device may include the following modules:
an image acquisition module 301, configured to acquire a palmprint image and a palm vein image of a user to be identified.
The palmprint image and the palm vein image have the same size; for example, both may be images with a pixel height of h and a pixel width of w, so that the size of the pixel matrix is w*h.
Optionally, the palmprint image and the palm vein image may be acquired with a multispectral palm imaging sensor, which can acquire different types of palm images under different light sources; for example, the palmprint image can be acquired under a visible light source and the palm vein image under an infrared light source.
Optionally, the palmprint image and the palm vein image of the user to be identified may be acquired consecutively within a short time interval, so that the position of the palm in the acquired images remains unchanged; they may also be acquired at different times, as long as the palm position does not change between the two acquisitions.
An image fusion module 302, configured to fuse the palmprint image and the palm vein image to obtain a fused image.
The palmprint image and the palm vein image contain palmprint features and palm vein features respectively, so fusing the two images also fuses the palmprint features and the palm vein features.
Fusing the palmprint image and the palm vein image may mean overlaying the two images pixel by pixel according to the position of each pixel, so that each position contains two pixel values. Taking the case in which the pixel height of both images is h and the pixel width is w as an example, each image can be represented as a w*h two-dimensional matrix, and after the two are overlaid the fused image can be represented as a 2*w*h three-dimensional matrix.
A node determination module 303, configured to take the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes.
In the fused image, the pixel values of the palmprint and the palm vein differ greatly from the pixel values of the surrounding normal skin; therefore, this difference in pixel values can be used to determine which pixels in the fused image correspond to the palmprint and the palm vein.
A network construction module 304, configured to construct, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition, so as to obtain multiple complex networks.
A constraint condition is mainly used to constrain the network nodes: when network nodes satisfy a constraint condition, those network nodes can be used to construct a complex network.
Optionally, under different constraint conditions the network nodes that satisfy the constraint condition are also different, so the complex networks built from different sets of network nodes are also different.
Since the network nodes are the pixels corresponding to the palmprint and the palm vein, the nodes of each complex network are in fact determined by the palmprint and palm vein pixels, and the complex networks can therefore be used to determine features related to the palmprint and the palm vein.
Optionally, the network features of the multiple complex networks can be obtained; for example, the network features may be characteristic values such as the average degree, the degree variance, and the maximum degree.
A feature construction module 305, configured to form a feature to be identified from the network features of the multiple complex networks.
Optionally, the network features of the multiple complex networks may be concatenated in series into the feature to be identified. Taking the case in which the network feature of each complex network is a 1*3 two-dimensional matrix as an example, if there are 10 complex networks, the concatenated feature to be identified is a 10*3 matrix.
In the embodiment of the present invention, the palmprint image and the palm vein image are acquired to obtain their fused image, so that the two biometric features, palmprint and palm vein, can be fused; the pixels corresponding to the palmprint and the palm vein are used as network nodes to determine different complex networks, i.e., multiple network associations between the palmprint and the palm vein are obtained, from which different network features can be determined. The feature to be identified formed from the network features of the multiple complex networks has more comprehensive content and can improve the validity and accuracy of recognition.
As shown in Fig. 4, which is a structural schematic diagram of another embodiment of a biometric recognition device according to an embodiment of the present invention, the device may include the following modules:
an image acquisition module 401, configured to acquire a palmprint image and a palm vein image of a user to be identified;
an image fusion module 402, configured to fuse the palmprint image and the palm vein image to obtain a fused image;
a node determination module 403, configured to take the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes;
a distance calculation module 404, configured to calculate the node distance between any two network nodes.
When the fused image is regarded as a three-dimensional matrix, every pixel can be understood as being arranged in the manner of a three-dimensional matrix, so every pixel has a corresponding coordinate point and the coordinate distance between any two network nodes can be determined.
Of course, besides the coordinate distance between a palmprint pixel and a palm vein pixel, the coordinate distances between two palmprint pixels and between two palm vein pixels may also be included.
The constraint distance is a distance constant used to constrain the node distance between two network nodes, and it is estimated from the node distances. For different palmprint and palm vein images the node distances may differ, and the differences between node distances may be large; for node distances that differ greatly, correspondingly different constraint distances would have to be determined. When a large number of palmprint and palm vein images are involved, determining the constraint distances one by one is rather complicated and is not conducive to wide application of the invention.
Optionally, the distance calculation module may include:
a distance calculation unit, configured to calculate the coordinate distance between any two network nodes according to their corresponding pixel coordinates;
a distance normalization unit, configured to normalize the coordinate distance to obtain the node distance.
The normalization computes the ratio of the coordinate distance between the network nodes to the size of the fused image, yielding the corresponding normalized distance.
Normalizing the coordinate distances in this way unifies them and maps the constraint distances into the interval (0, 1]. Consequently, when a large number of palmprint and palm vein images are processed, the same set of constraint distances can be used and the constraint distances do not need to be redefined repeatedly, which simplifies the calculation process and facilitates large-scale application of the invention.
A network construction module 405, configured to construct, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition, so as to obtain multiple complex networks.
The constraint conditions may be that the node distance between any two network nodes is less than a constraint distance, with different constraint conditions having different constraint distances.
The network construction module may include:
a network construction unit 4051, configured to construct, based on the different constraint distances, a complex network from the network nodes whose node distance is less than the given constraint distance, so as to obtain multiple complex networks.
A feature construction module 406, configured to form a feature to be identified from the network features of the multiple complex networks.
The network nodes are formed from the palmprint and palm vein pixels, and each of the multiple complex networks is formed by the links between pairs of network nodes that satisfy the constraint distance; it therefore records the distance relationships between network nodes rather than their absolute spatial positions. Since the feature to be identified is formed from the network features of the multiple complex networks, it represents the relative positional relationships of the palmprint and palm vein pixels and is robust to errors such as rotation and displacement. In other words, each time the palmprint image and the palm vein image are acquired to obtain the feature to be identified, the result is not affected by rotation, displacement, and the like, and a more stable feature to be identified can be obtained.
In the embodiment of the present invention, different complex networks are determined from the node distances between pairs of network nodes, so multiple different complex networks can be obtained; because the complex networks are built from the palmprint and palm vein pixels, multiple network features of the palmprint and the palm vein can be determined, making the network features more accurate and yielding a better recognition effect and accuracy.
As another embodiment, the image fusion module may include:
an image conversion unit, configured to binarize the palmprint image and the palm vein image, converting the pixels corresponding to the palmprint and the palm vein to a first value and converting the pixels not corresponding to the palmprint or the palm vein to a second value;
a first fusion unit, configured to fuse the binarized palmprint image and palm vein image to obtain the fused image.
Optionally, the node determination module may include:
a node determination unit, configured to take the pixels in the fused image whose pixel value is the first value as network nodes.
Binarizing the palmprint image and the palm vein image means extracting the palmprint in the palmprint image and the palm vein in the palm vein image, marking their corresponding pixels with a defined first value and marking all other, non-palmprint and non-palm-vein pixels with a defined second value, so that the palmprint in the palmprint image and the palm vein in the palm vein image can be clearly identified.
Optionally, to make the palmprint and the palm vein more distinct, the first value may be 1 and the second value may be 0.
Optionally, the palmprint image and the palm vein image may be binarized using a binarization algorithm.
The binarization algorithm may be, for example, LBP (Local Binary Patterns) or a mean-window filtering algorithm.
In the embodiment of the present invention, before the palmprint image and the palm vein image are fused, they are first binarized, so that features such as the palmprint in the palmprint image and the palm vein in the palm vein image are retained while other useless features are discarded. The pixels belonging to the palmprint and the palm vein can thus be determined accurately, and using the accurately determined palmprint and palm vein pixels as network nodes allows accurate complex networks to be constructed, from which a more accurate feature to be identified can be determined, further improving the accuracy and validity of the identification feature.
As another embodiment, the feature construction module may include:
a first determination unit, configured to determine the node degree of each network node in each complex network;
a feature calculation unit, configured to calculate the network features of each complex network according to the node degrees;
a feature combination unit, configured to combine the network features of all of the multiple complex networks;
a second determination unit, configured to take the combined network features as the feature to be identified.
The node degree of a network node refers to the number of connections between that network node and other network nodes. For example, if a network node is connected to 3 other network nodes, the degree of that network node is 3.
Determining the node degree of each network node in a complex network is in fact determining the connection relationships between each network node and the other network nodes, and the corresponding network features can then be determined from these connection relationships.
Calculating the network features of a complex network may include:
calculating the average degree, degree variance, and/or maximum degree of the complex network.
The average degree is the mean of the degrees of all network nodes; the degree variance is the variance of the degrees, calculated from the average degree and the individual node degrees; the maximum degree is the largest of the degrees of all network nodes.
In the embodiment of the present invention, the multiple network features of the fused image are calculated based on the node degree of each network node. Using node degrees in this way allows the network features of each complex network to be determined more accurately, so more accurate network features can be obtained, which in turn improves the validity and accuracy of recognition.
As another embodiment, the image fusion module may include:
an image denoising unit, configured to denoise the palmprint image and the palm vein image of the user to be identified;
a second fusion unit, configured to fuse the denoised palmprint image and palm vein image to obtain the fused image.
Denoising the palmprint image and the palm vein image of the user to be identified may mean filtering out the high-frequency components in these images.
Optionally, the high-frequency components in the palmprint image and the palm vein image of the user to be identified may be filtered out using a denoising algorithm such as a wavelet transform, a Kalman filtering algorithm, or a median filtering algorithm.
In the embodiment of the present invention, the palmprint image and the palm vein image are denoised before they are fused, which makes the palmprint in the palmprint image and the palm vein in the palm vein image clearer. When the palmprint and the palm vein are determined, the reduction of noise makes the pixels corresponding to the palmprint and the palm vein more accurate, so more accurate complex networks can be determined and a more accurate feature to be identified can be obtained, further improving the validity and accuracy of recognition.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, a network interface, and memory.
The memory may include non-persistent memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Certain words are used in the specification and claims to refer to particular components. Those skilled in the art should understand that hardware manufacturers may use different names for the same component. This specification and the claims do not distinguish components by differences in name but by differences in function. The term "comprising" used throughout the specification and claims is an open term and should be interpreted as "including but not limited to". "Substantially" means within an acceptable error range within which a person skilled in the art can solve the described technical problem and basically achieve the described technical effect. In addition, the term "coupled" herein includes any direct or indirect means of electrical coupling; therefore, if a first device is described herein as being coupled to a second device, the first device may be directly electrically coupled to the second device or indirectly electrically coupled to the second device through other devices or coupling means. The subsequent description of the specification describes preferred embodiments for implementing the present invention; the description is intended to illustrate the general principles of the present invention and is not intended to limit the scope of the present invention, which is defined by the appended claims.
It should also be noted that the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a product or system that includes a series of elements includes not only those elements but also other elements that are not explicitly listed, or elements inherent to such a product or system. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the product or system that includes that element.
The above description shows and describes several preferred embodiments of the present invention. However, as stated above, it should be understood that the present invention is not limited to the forms disclosed herein, that these forms should not be regarded as excluding other embodiments, and that the invention may be used in various other combinations, modifications, and environments and may be modified within the scope of the inventive concept described herein through the above teachings or the skill or knowledge of the relevant field. Modifications and changes made by those skilled in the art that do not depart from the spirit and scope of the present invention shall all fall within the protection scope of the appended claims of the present invention.

Claims (12)

1. A biometric recognition method, characterized by comprising:
acquiring a palmprint image and a palm vein image of a user to be identified;
fusing the palmprint image and the palm vein image to obtain a fused image;
taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes;
constructing, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition, so as to obtain multiple complex networks;
forming a feature to be identified from the network features of the multiple complex networks;
wherein each constraint condition is that the node distance between any two network nodes is less than a constraint distance, and different constraint conditions have different constraint distances;
the method further comprises:
calculating the node distance between any two network nodes;
and constructing, based on multiple constraint conditions, a complex network from the network nodes that satisfy each constraint condition so as to obtain multiple complex networks comprises:
constructing, based on the different constraint distances, a complex network from the network nodes whose node distance is less than the given constraint distance, so as to obtain multiple complex networks.
2. The method according to claim 1, characterized in that calculating the node distance between any two network nodes comprises:
calculating the coordinate distance between the two network nodes according to their corresponding pixel coordinates;
normalizing the coordinate distance to obtain the node distance.
3. The method according to claim 1, wherein fusing the palmprint image and the palm vein image to obtain a fused image comprises:
binarizing the palmprint image and the palm vein image, converting the pixel values corresponding to the palmprint and the palm vein into a first value and converting the pixel values not corresponding to the palmprint or the palm vein into a second value; and
fusing the binarized palmprint image and palm vein image to obtain the fused image.
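The binarization and fusion of claim 3 could look like the following sketch, which assumes 8-bit grayscale inputs whose palmprint and palm vein lines are darker than the background, and which fuses the two binary images with a logical OR; the threshold and the specific first and second values are assumptions of this sketch.

import numpy as np

def binarize_and_fuse(palmprint_img, palm_vein_img, threshold=128, first_value=1, second_value=0):
    # Line pixels (palmprint or palm vein) become first_value, everything else second_value.
    palmprint_bin = np.where(np.asarray(palmprint_img) < threshold, first_value, second_value)
    palm_vein_bin = np.where(np.asarray(palm_vein_img) < threshold, first_value, second_value)
    # A pixel is a line pixel in the fused image if it is a line pixel in either source image.
    return np.where((palmprint_bin == first_value) | (palm_vein_bin == first_value),
                    first_value, second_value)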
4. The method according to claim 3, wherein taking the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes comprises:
taking the pixels in the fused image whose pixel value is the first value as network nodes.
5. The method according to claim 1, wherein forming a feature to be identified from the network features of the plurality of complex networks comprises:
determining the node degree of each network node in any one of the complex networks, the node degree of a network node being the number of connections between that network node and other network nodes;
calculating the network feature of the complex network according to the node degrees;
performing feature combination on the network features of the plurality of complex networks; and
taking the combined network features as the feature to be identified.
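Claim 5 leaves open which statistics of the node degrees form the network feature; the sketch below uses the mean, maximum and standard deviation of the degree distribution purely as an assumed example, and concatenates the per-network features as the combined feature to be identified.

import numpy as np

def network_feature(adjacency):
    degrees = adjacency.sum(axis=1)   # node degree: number of connections of each network node
    # Degree-distribution statistics stand in for the unspecified network feature.
    return np.array([degrees.mean(), degrees.max(), degrees.std()])

def feature_to_identify(networks):
    # Feature combination: concatenate the network features of all complex networks.
    return np.concatenate([network_feature(a) for a in networks])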
6. The method according to claim 1, wherein fusing the palmprint image and the palm vein image to obtain a fused image comprises:
denoising the palmprint image and the palm vein image of the user to be identified; and
fusing the denoised palmprint image and palm vein image to obtain the fused image.
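Claim 6 does not name a particular noise-reduction method; a median filter is one plausible choice, sketched here with SciPy.

from scipy.ndimage import median_filter

def denoise(image, size=3):
    # Median filtering suppresses isolated noise pixels before binarization and fusion.
    return median_filter(image, size=size)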
7. A biometric recognition device, characterized by comprising:
an image acquisition module, configured to acquire a palmprint image and a palm vein image of a user to be identified;
an image fusion module, configured to fuse the palmprint image and the palm vein image to obtain a fused image;
a node determination module, configured to take the pixels in the fused image that correspond to the palmprint and the palm vein as network nodes;
a network construction module, configured to construct, based on different constraint conditions, a complex network from the network nodes satisfying any one of the constraint conditions, so as to obtain a plurality of complex networks; and
a feature construction module, configured to form a feature to be identified from the network features of the plurality of complex networks;
wherein each constraint condition comprises that the node distance between any two network nodes is less than a constraint distance, and the constraint distances of different constraint conditions are different;
the device further comprises:
a distance calculation module, configured to calculate the node distance between any two network nodes;
and the network construction module comprises:
a network construction unit, configured to construct, based on the different constraint distances, a complex network from the network nodes whose node distance to any network node is less than any one of the constraint distances, so as to obtain a plurality of complex networks.
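Reusing the helper functions sketched after claims 1 through 6 (an assumption of this sketch, as are the example constraint distances), the module chain of claim 7 could be composed roughly as follows.

import numpy as np

def extract_feature(palmprint_img, palm_vein_img, constraint_distances=(0.05, 0.1, 0.2)):
    # Image acquisition is assumed to have already produced the two grayscale images.
    fused = binarize_and_fuse(denoise(palmprint_img), denoise(palm_vein_img))   # image fusion module
    node_coords = np.argwhere(fused == 1)                                       # node determination module
    networks = build_complex_networks(node_coords, constraint_distances)        # network construction module
    return feature_to_identify(networks)                                        # feature construction module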
8. The device according to claim 7, wherein the distance calculation module comprises:
a distance calculation unit, configured to calculate the coordinate distance between any two network nodes according to the pixel coordinates corresponding to the two network nodes; and
a distance normalization unit, configured to normalize the coordinate distance to obtain the node distance.
9. The device according to claim 7, wherein the image fusion module comprises:
an image conversion unit, configured to binarize the palmprint image and the palm vein image, converting the pixel values corresponding to the palmprint and the palm vein into a first value and converting the pixel values not corresponding to the palmprint or the palm vein into a second value; and
a first fusion unit, configured to fuse the binarized palmprint image and palm vein image to obtain the fused image.
10. The device according to claim 9, wherein the node determination module comprises:
a node determination unit, configured to take the pixels in the fused image whose pixel value is the first value as network nodes.
11. The device according to claim 7, wherein the feature construction module comprises:
a first determination unit, configured to determine the node degree of each network node in any one of the complex networks, the node degree of a network node being the number of connections between that network node and other network nodes;
a feature calculation unit, configured to calculate the network feature of the complex network according to the node degrees;
a feature combination unit, configured to perform feature combination on the network features of the plurality of complex networks; and
a second determination unit, configured to take the combined network features as the feature to be identified.
12. The device according to claim 7, wherein the image fusion module comprises:
an image denoising unit, configured to denoise the palmprint image and the palm vein image of the user to be identified; and
a second fusion unit, configured to fuse the denoised palmprint image and palm vein image to obtain the fused image.
CN201710637494.8A 2017-07-31 2017-07-31 Biological feather recognition method and device Active CN107403161B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710637494.8A CN107403161B (en) 2017-07-31 2017-07-31 Biological feather recognition method and device
PCT/CN2017/113585 WO2019024350A1 (en) 2017-07-31 2017-11-29 Biometric recognition method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710637494.8A CN107403161B (en) 2017-07-31 2017-07-31 Biological feather recognition method and device

Publications (2)

Publication Number Publication Date
CN107403161A CN107403161A (en) 2017-11-28
CN107403161B true CN107403161B (en) 2019-07-05

Family

ID=60401655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710637494.8A Active CN107403161B (en) 2017-07-31 2017-07-31 Biological feather recognition method and device

Country Status (2)

Country Link
CN (1) CN107403161B (en)
WO (1) WO2019024350A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107403161B (en) * 2017-07-31 2019-07-05 歌尔科技有限公司 Biological feather recognition method and device
CN109344849B (en) * 2018-07-27 2022-03-11 广东工业大学 Complex network image identification method based on structure balance theory
TWI678661B (en) * 2018-08-31 2019-12-01 中華電信股份有限公司 Palm print recognition apparatus and method having data extension
CN109614988B (en) * 2018-11-12 2020-05-12 国家电网有限公司 Biological identification method and device
CN117542090B (en) * 2023-11-25 2024-06-18 一脉通(深圳)智能科技有限公司 Palm print vein recognition method based on fusion network and SIF characteristics

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116741A (en) * 2013-01-28 2013-05-22 天津理工大学 Capture and identification system for blending images of palm veins and palm prints
CN103200096A (en) * 2013-03-13 2013-07-10 南京理工大学 Heuristic routing method avoiding key nodes in complex network
CN106022218A (en) * 2016-05-06 2016-10-12 浙江工业大学 Palm print palm vein image layer fusion method based on wavelet transformation and Gabor filter
CN106548134A (en) * 2016-10-17 2017-03-29 沈阳化工大学 GA optimizes palmmprint and the vena metacarpea fusion identification method that SVM and normalization combine

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8229178B2 (en) * 2008-08-19 2012-07-24 The Hong Kong Polytechnic University Method and apparatus for personal identification using palmprint and palm vein
CN107403161B (en) * 2017-07-31 2019-07-05 歌尔科技有限公司 Biological feather recognition method and device

Also Published As

Publication number Publication date
CN107403161A (en) 2017-11-28
WO2019024350A1 (en) 2019-02-07

Similar Documents

Publication Publication Date Title
CN107403161B (en) Biological feather recognition method and device
Ming et al. A survey on anti-spoofing methods for facial recognition with rgb cameras of generic consumer devices
Abate et al. 2D and 3D face recognition: A survey
US20160162673A1 (en) Technologies for learning body part geometry for use in biometric authentication
KR20160101973A (en) System and method for identifying faces in unconstrained media
Thavalengal et al. Iris liveness detection for next generation smartphones
US20080298644A1 (en) System and method for controlling image quality
Perakis et al. Feature fusion for facial landmark detection
Ratyal et al. Deeply learned pose invariant image analysis with applications in 3D face recognition
CN111222380B (en) Living body detection method and device and recognition model training method thereof
CN108932504A (en) Identity identifying method, device, electronic equipment and storage medium
Lee et al. Enhanced iris recognition method by generative adversarial network-based image reconstruction
Li et al. Design and learn distinctive features from pore-scale facial keypoints
Sams et al. Hq-fingan: High-quality synthetic fingerprint generation using gans
CN111723762B (en) Face attribute identification method and device, electronic equipment and storage medium
Lin et al. Human face recognition using a spatially weighted modified Hausdorff distance
CN111814738A (en) Human face recognition method, human face recognition device, computer equipment and medium based on artificial intelligence
Singh et al. Face liveness detection through face structure analysis
Suh et al. Face liveness detection for face recognition based on cardiac features of skin color image
Shyam et al. Automatic face recognition in digital world
Guo et al. Human face recognition using a spatially weighted Hausdorff distance
Monteiro et al. Multimodal hierarchical face recognition using information from 2.5 D images
Cohen et al. Iris identification in 3D
CN110502996A (en) A kind of dynamic identifying method towards fuzzy finger vein image
Caroppo et al. Vision-Based Heart Rate Monitoring in the Smart Living Domains

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant