CN110472501A - Neural network-based fingerprint sweat pore coding classification method - Google Patents

Neural network-based fingerprint sweat pore coding classification method

Info

Publication number
CN110472501A
CN110472501A (application CN201910618569.7A)
Authority
CN
China
Prior art keywords
fingerprint
neural network
pore
layer
hidden layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910618569.7A
Other languages
Chinese (zh)
Other versions
CN110472501B (en)
Inventor
张明
戴建新
张沼斌
邹聪
何勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications
Priority to CN201910618569.7A priority Critical patent/CN110472501B/en
Publication of CN110472501A publication Critical patent/CN110472501A/en
Application granted granted Critical
Publication of CN110472501B publication Critical patent/CN110472501B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1365 Matching; Classification
    • G06V40/1371 Matching features related to minutiae or pores

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention proposes a neural network-based fingerprint sweat pore coding classification method comprising the following steps: step 1) extract a high-resolution fingerprint image, preprocess the fingerprint image by segmentation and normalization, and obtain the position features of the fingerprint pores with a Gabor filter; step 2) train on the position features using a neural network to obtain a classification training set. The method is simple and direct, reduces the influence of the external environment, lowers the complexity of the recognition algorithm, improves system robustness, and saves recognition time; it has broad application prospects in security systems, advanced door-lock access control systems, and the field of criminal investigation.

Description

Neural network-based fingerprint sweat pore coding classification method
Technical field
The present invention relates to a neural network-based fingerprint pore classification method, and specifically to a method that trains a BP neural network on pore positions; it belongs to the technical field of fingerprint recognition.
Background technique
At present, traditional fingerprint recognition systems based on fingerprint minutiae are relatively mature, but their accuracy still needs to be improved; for applications with high security requirements, recognition systems based on fingerprint pores are more accurate. Jain A K et al. proposed an improved iterative closest point algorithm to match pores, with good noise resistance; Zhao Q et al. proposed a pore-valley descriptor that addresses the translation and rotation invariance of pore features and improves accuracy. However, fingerprint pores are usually numerous, and current algorithms process them with low efficiency, very high complexity, and long running times.
Summary of the invention
The object of the present invention is to provide a method for encoding and then classifying fingerprint pores, so as to reduce the complexity of fingerprint pore matching algorithms, improve system robustness, and save time.
The object of the present invention is achieved as follows: a neural network-based fingerprint sweat pore coding classification method, comprising the following steps:
Step 1) extract a high-resolution fingerprint image, preprocess the fingerprint image by segmentation and normalization, and obtain the position features of the fingerprint pores with a Gabor filter;
Step 2) train on the position features using a neural network to obtain a classification training set.
As a further limitation of the present invention, step 1) specifically comprises: obtaining the fingerprint pore model by Gabor filtering:
where δi denotes the pore scale perpendicular to the ridge direction, δj denotes the pore scale along the ridge direction, and θ denotes the ridge direction, which is obtained by computing the orientation field; Rot denotes the operation of rotating P0 by the angle θ. The position features of the fingerprint pores are obtained through the above filtering operation.
As a further limitation of the present invention, step 2) specifically comprises: performing offline training with a BP neural network to obtain a function model for fingerprint pore recognition, used to identify fingerprints in a fingerprint database; the function model is as follows:
Y = purelin(W2 × tansig(W1 × Xn + θ1) + θ2)
where Xn is the input vector of the BP neural network; W1 is the weight matrix between the input layer and the hidden layer; W2 is the weight matrix between the hidden layer and the output layer; θ1 is the threshold between the input layer and the hidden layer; θ2 is the threshold between the hidden layer and the output layer; Y is the output vector of the BP neural network; tansig(·) is the tanh sigmoid transfer function between the input layer and the hidden layer; purelin(·) is the linear transfer function between the hidden layer and the output layer.
As a further limitation of the present invention, for the BP neural network input layer, the number of input nodes is set to 500, these 500 input values being the local detail features extracted from the fingerprint pores; the number of hidden-layer nodes is then calculated from nI, nO and nc, where nI is the number of input-layer nodes, nO is the number of output-layer nodes, and nc is a constant; with nc = 8 and the number of output-layer nodes set to 200, the number of hidden-layer nodes is n = 34;
For the output layer, each fingerprint image is assigned a 200-bit binary code in advance, the coding range being 0 to 2^200 − 1; each binary value forms one class, corresponding to one person;
In the training module, a three-layer BP neural network consisting of an input layer, a hidden layer and an output layer is used to construct the BP neural network model; the learning rate γ of the model is set to 0.8, the momentum coefficient α to 0.9, the maximum number of iterations to 5000, and the target error to 1e-5; a fingerprint recognition function is obtained after offline training and used for subsequent recognition operations;
In the recognition module, the 500 feature values extracted from a fingerprint image are stored in a feature vector X, and the vector X is then fed into the BP neural network model; at this point the model has been trained and the corresponding weights W and thresholds θ have been obtained, so that an output Y is produced. The output Y is a binary code corresponding to the fingerprint pore number in the database, and each element of the vector Y is 0 or 1; in this way the fingerprint pores are classified, and since each class corresponds to one person, the system can identify which person a fingerprint image comes from.
Compared with the prior art, the present invention adopting the above technical scheme has the following technical effects: through neural network training, each fingerprint image in the fingerprint database receives a unique code, which improves recognition accuracy; in addition, after preprocessing, the fingerprint recognition function can encode directly and rapidly, so matching complexity is reduced, the time cost is shortened, and system robustness is increased.
Detailed description of the invention
Fig. 1 is the neural network model diagram of the present invention.
Specific embodiment
The technical solution of the present invention is described in further detail below with reference to the accompanying drawing:
This embodiment proposes a neural network-based fingerprint sweat pore coding classification method, with the following procedure:
1. Extract a high-resolution fingerprint image and preprocess it by segmentation and normalization. Obtain the fingerprint pore model by Gabor filtering:
where δx denotes the scale factor along the x-axis and δy denotes the scale factor along the y-axis;
where δi denotes the pore scale perpendicular to the ridge direction, δj denotes the pore scale along the ridge direction, and θ denotes the ridge direction, which is obtained by computing the orientation field; Rot denotes the operation of rotating P0 by the angle θ.
The position features of the fingerprint pores are obtained through the above filtering operation.
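The Gabor-filter and pore-model formulas are not reproduced in this text. As a rough illustration of this step only, the sketch below assumes the pore is modelled as an anisotropic 2-D Gaussian P0 with scales δi (across the ridge) and δj (along the ridge), rotated by the local ridge orientation θ, with pore positions taken as local maxima of the filter response; the function names, default scales, and threshold are illustrative, not the patent's.

```python
# Hedged sketch of pore localisation; the exact filter used in the patent is not
# reproduced here. Assumes an anisotropic Gaussian pore model Rot(P0, theta).
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter, rotate

def pore_template(delta_i, delta_j, size=15):
    """Hypothetical anisotropic Gaussian pore model P0."""
    ax = np.arange(size) - size // 2
    i, j = np.meshgrid(ax, ax, indexing="ij")
    return np.exp(-(i**2 / (2 * delta_i**2) + j**2 / (2 * delta_j**2)))

def pore_positions(image, theta_deg, delta_i=2.0, delta_j=3.0, thr=0.6):
    """Correlate the rotated pore template with the image and keep local maxima."""
    p = rotate(pore_template(delta_i, delta_j), theta_deg, reshape=False)  # Rot(P0, theta)
    resp = fftconvolve(image, p - p.mean(), mode="same")                   # filtering step
    resp = (resp - resp.min()) / (np.ptp(resp) + 1e-9)                     # normalise to [0, 1]
    peaks = (resp == maximum_filter(resp, size=7)) & (resp > thr)
    return np.argwhere(peaks)                                              # (row, col) pore positions
```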
2. For the position of each pore, perform BP neural network training and encode the pores of the fingerprint image with 0s and 1s.
The fingerprint pore classification algorithm is based on a BP (error back-propagation) neural network; the method only needs a sufficient number of training samples for the BP network to adjust its own weights and thresholds to adapt to the various features of fingerprint pores. The model of the BP neural network is shown in Fig. 1, and its formula is as follows:
Y = purelin(W2 × tansig(W1 × Xn + θ1) + θ2)   (3)
where Xn is the input vector of the BP neural network; W1 is the weight matrix between the input layer and the hidden layer; W2 is the weight matrix between the hidden layer and the output layer; θ1 is the threshold between the input layer and the hidden layer; θ2 is the threshold between the hidden layer and the output layer; Y is the output vector of the BP neural network; tansig(·) is the tanh sigmoid transfer function between the input layer and the hidden layer; purelin(·) is the linear transfer function between the hidden layer and the output layer.
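For concreteness, a minimal NumPy sketch of the forward pass in formula (3) is given below; the dimensions follow the text (500 inputs, 34 hidden nodes, 200 outputs), and the random weights and thresholds are placeholders standing in for trained values.

```python
# Minimal sketch of Y = purelin(W2 . tansig(W1 . Xn + theta1) + theta2).
# Random W and theta are placeholders, not trained values.
import numpy as np

def tansig(x):
    return np.tanh(x)        # tanh sigmoid transfer between input and hidden layer

def purelin(x):
    return x                 # linear transfer between hidden and output layer

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 500, 34, 200
W1 = rng.standard_normal((n_hidden, n_in))
theta1 = rng.standard_normal(n_hidden)
W2 = rng.standard_normal((n_out, n_hidden))
theta2 = rng.standard_normal(n_out)

def forward(Xn):
    """Forward pass of the three-layer BP network described by formula (3)."""
    return purelin(W2 @ tansig(W1 @ Xn + theta1) + theta2)

Y = forward(rng.standard_normal(n_in))   # output vector of length 200
code = (Y > 0.5).astype(int)             # thresholded 0/1 code (illustrative)
```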
A complete fingerprint image contains roughly 500 pores, so the number of input nodes of the BP neural network can be set to 500; these 500 input values are the extracted local detail features. According to the formula, the number of hidden-layer nodes is calculated from nI, nO and nc, where nI is the number of input-layer nodes, nO is the number of output-layer nodes, and nc is a constant; setting nc = 8 and the number of output-layer nodes to 200 gives a hidden-layer node count of n = 34.
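The node-count formula itself is not reproduced in this text. For orientation only: a common empirical sizing rule, n = ⌊√(nI + nO)⌋ + nc, is consistent with the stated values, since ⌊√(500 + 200)⌋ + 8 = 26 + 8 = 34; whether this is the exact formula intended in the patent is an assumption.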
In the training module, a three-layer BP neural network consisting of an input layer, a hidden layer and an output layer is used to construct the BP neural network model; the learning rate γ of the model is set to 0.8, the momentum coefficient α to 0.9, the maximum number of iterations to 5000, and the target error to 1 × 10^-5.
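As an illustration only, the stated hyper-parameters map roughly onto scikit-learn's MLPRegressor as sketched below; the patent describes a hand-written BP implementation, not this library, and the training data here are placeholders.

```python
# Hedged approximation of the stated training setup (learning rate 0.8,
# momentum 0.9, at most 5000 iterations, target error 1e-5) using scikit-learn.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X_train = rng.random((100, 500))                         # placeholder pore feature vectors
Y_train = rng.integers(0, 2, (100, 200)).astype(float)   # placeholder 200-bit person codes

model = MLPRegressor(hidden_layer_sizes=(34,), activation="tanh", solver="sgd",
                     learning_rate_init=0.8, momentum=0.9,
                     max_iter=5000, tol=1e-5, random_state=0)
model.fit(X_train, Y_train)                              # offline training
```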
In the recognition module, the 500 feature values extracted from a fingerprint image are stored in a feature vector X, and the vector X is then fed into the BP neural network model; at this point the model has been trained and the corresponding weights W and thresholds θ have been obtained, so that an output Y is produced. The output Y is a binary code corresponding to the fingerprint pore number in the database, and each element of the vector Y is 0 or 1; in this way the fingerprint pores are classified.
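The recognition step then reduces to thresholding the network output and looking the resulting code up among the enrolled codes; the sketch below is a hypothetical illustration of that lookup, where trained_net and code_book stand for whatever the offline training stage produced.

```python
# Sketch of the recognition module: threshold the 200-dimensional output Y to a
# 0/1 code and map it to a person. `trained_net` and `code_book` are hypothetical.
import numpy as np

def identify(trained_net, X, code_book):
    """X: 500 extracted pore features; code_book: {200-bit tuple -> person id}."""
    Y = np.asarray(trained_net(X))            # forward pass of the trained BP network
    code = tuple((Y > 0.5).astype(int))       # binary coding of the output
    return code_book.get(code, "unknown")     # person whose enrolled code matches
```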
The above is only a specific embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any transformation or replacement that a person familiar with the art could readily conceive within the technical scope disclosed by the invention shall be covered within the scope of the invention; therefore, the scope of protection of the invention shall be subject to the scope of protection defined by the claims.

Claims (4)

1. A neural network-based fingerprint sweat pore coding classification method, characterized by comprising the following steps:
Step 1) extract a high-resolution fingerprint image, preprocess the fingerprint image by segmentation and normalization, and obtain the position features of the fingerprint pores with a Gabor filter;
Step 2) train on the position features using a neural network to obtain a classification training set.
2. The neural network-based fingerprint sweat pore coding classification method according to claim 1, characterized in that step 1) specifically comprises: obtaining the fingerprint pore model by Gabor filtering:
where δi denotes the pore scale perpendicular to the ridge direction, δj denotes the pore scale along the ridge direction, and θ denotes the ridge direction, which is obtained by computing the orientation field; Rot denotes the operation of rotating P0 by the angle θ; the position features of the fingerprint pores are obtained through the above filtering operation.
3. The neural network-based fingerprint sweat pore coding classification method according to claim 2, characterized in that step 2) specifically comprises: performing offline training with a BP neural network to obtain a function model for fingerprint pore recognition, used to identify fingerprints in a fingerprint database; the function model is as follows:
Y = purelin(W2 × tansig(W1 × Xn + θ1) + θ2)
where Xn is the input vector of the BP neural network; W1 is the weight matrix between the input layer and the hidden layer; W2 is the weight matrix between the hidden layer and the output layer; θ1 is the threshold between the input layer and the hidden layer; θ2 is the threshold between the hidden layer and the output layer; Y is the output vector of the BP neural network; tansig(·) is the tanh sigmoid transfer function between the input layer and the hidden layer; purelin(·) is the linear transfer function between the hidden layer and the output layer.
4. The neural network-based fingerprint sweat pore coding classification method according to claim 3, characterized in that, for the BP neural network input layer, the number of input nodes is set to 500, these 500 input values being the local detail features extracted from the fingerprint pores; the number of hidden-layer nodes is then calculated from nI, nO and nc, where nI is the number of input-layer nodes, nO is the number of output-layer nodes, and nc is a constant; with nc = 8 and the number of output-layer nodes set to 200, the number of hidden-layer nodes is n = 34;
For the output layer, each fingerprint image is assigned a 200-bit binary code in advance, the coding range being 0 to 2^200 − 1; each binary value forms one class, corresponding to one person;
In the training module, a three-layer BP neural network consisting of an input layer, a hidden layer and an output layer is used to construct the BP neural network model; the learning rate γ of the model is set to 0.8, the momentum coefficient α to 0.9, the maximum number of iterations to 5000, and the target error to 1e-5; a fingerprint recognition function is obtained after offline training and used for subsequent recognition operations;
In the recognition module, the 500 feature values extracted from a fingerprint image are stored in a feature vector X, and the vector X is then fed into the BP neural network model; at this point the model has been trained and the corresponding weights W and thresholds θ have been obtained, so that an output Y is produced; the output Y is a binary code corresponding to the fingerprint pore number in the database, and each element of the vector Y is 0 or 1; in this way the fingerprint pores are classified, and since each class corresponds to one person, it can be identified which person a fingerprint image comes from.
CN201910618569.7A 2019-07-10 2019-07-10 Neural network-based fingerprint sweat pore coding classification method Active CN110472501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910618569.7A CN110472501B (en) 2019-07-10 2019-07-10 Neural network-based fingerprint sweat pore coding classification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910618569.7A CN110472501B (en) 2019-07-10 2019-07-10 Neural network-based fingerprint sweat pore coding classification method

Publications (2)

Publication Number Publication Date
CN110472501A true CN110472501A (en) 2019-11-19
CN110472501B CN110472501B (en) 2022-08-30

Family

ID=68507469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910618569.7A Active CN110472501B (en) 2019-07-10 2019-07-10 Neural network-based fingerprint sweat pore coding classification method

Country Status (1)

Country Link
CN (1) CN110472501B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107480649A (en) * 2017-08-24 2017-12-15 浙江工业大学 Fingerprint sweat pore extraction method based on full convolution neural network
CN108449295A (en) * 2018-02-05 2018-08-24 西安电子科技大学昆山创新研究院 Combined modulation recognition methods based on RBM networks and BP neural network
CN108959833A (en) * 2018-09-26 2018-12-07 北京工业大学 Tool wear prediction technique based on improved BP neural network
CN109547431A (en) * 2018-11-19 2019-03-29 国网河南省电力公司信息通信公司 A kind of network security situation evaluating method based on CS and improved BP

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112668595A (en) * 2021-01-25 2021-04-16 数网金融有限公司 Image processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110472501B (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN111524525B (en) Voiceprint recognition method, device, equipment and storage medium of original voice
Fahmy Online handwritten signature verification system based on DWT features extraction and neural network classification
Kashi et al. A Hidden Markov Model approach to online handwritten signature verification
CN107967695B (en) A kind of moving target detecting method based on depth light stream and morphological method
CN109241995B (en) Image identification method based on improved ArcFace loss function
Ng et al. Iris recognition using rapid Haar wavelet decomposition
CN107832747B (en) Face recognition method based on low-rank dictionary learning algorithm
CN108564040B (en) Fingerprint activity detection method based on deep convolution characteristics
CN108681698B (en) Large-scale iris recognition method with privacy protection function
CN113312989B (en) Finger vein feature extraction network based on aggregated descriptors and attention
CN113779643B (en) Signature handwriting recognition system and method based on pre-training technology and storage medium
CN107403153A (en) A kind of palmprint image recognition methods encoded based on convolutional neural networks and Hash
CN105118509A (en) Security authentication method based on voiceprint two-dimensional code
CN107122725B (en) Face recognition method and system based on joint sparse discriminant analysis
CN110956082A (en) Face key point detection method and detection system based on deep learning
Saponara et al. Recreating fingerprint images by convolutional neural network autoencoder architecture
Öztürk et al. Minnet: Minutia patch embedding network for automated latent fingerprint recognition
CN110472501A (en) A kind of fingerprint pore coding specification method neural network based
Saffar et al. Online signature verification using deep representation: a new descriptor
CN115984906A (en) Palm print image recognition method, device, equipment and storage medium
CN113076930B (en) Face recognition and expression analysis method based on shared backbone network
CN111950333B (en) Electronic handwritten signature recognition method based on neural network
CN108537213A (en) Enhance the system and method for iris recognition precision
CN114155554A (en) Transformer-based camera domain pedestrian re-recognition method
CN108197573A (en) The face identification method that LRC and CRC deviations based on mirror image combine

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant