CN107169471A - Fingerprint recognition system based on image fusion - Google Patents

Fingerprint recognition system based on image fusion

Info

Publication number
CN107169471A
Authority
CN
China
Prior art keywords
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710421000.2A
Other languages
Chinese (zh)
Other versions
CN107169471B (en)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hui Gu Artificial Intelligence Studies Institute (Nanjing) Co., Ltd.
Original Assignee
Shenzhen City Creative Industry Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen City Creative Industry Technology Co Ltd
Priority to CN201710421000.2A
Publication of CN107169471A
Application granted
Publication of CN107169471B
Legal status: Expired - Fee Related
Anticipated expiration

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/13 - Sensors therefor
    • G06V40/1318 - Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/30 - Noise filtering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Input (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a fingerprint recognition system based on image fusion, comprising a fingerprint acquisition module, a D/A conversion module, a fingerprint image processing module, a fingerprint database module and a result verification module. The fingerprint acquisition module is used to collect an infrared image and an ultraviolet image of the target fingerprint; the D/A conversion module is used to perform digital-to-analogue conversion on the infrared image and the ultraviolet image of the target fingerprint respectively; the fingerprint image processing module is used to denoise, decompose and fuse the infrared image and the ultraviolet image of the target fingerprint to obtain the target fingerprint image; the fingerprint database module stores standard fingerprint images; the result verification module compares the target fingerprint image with the standard fingerprint images, obtains a fingerprint verification result and displays the result. The present invention fuses the infrared image of the target fingerprint with its ultraviolet image and performs fingerprint recognition on the fused fingerprint image, which improves the accuracy of fingerprint recognition.

Description

Fingerprint recognition system based on image fusion
Technical field
The present invention relates to the field of fingerprint recognition, and in particular to a fingerprint recognition system based on image fusion.
Background technology
Fingerprint recognition systems in the prior art generally acquire the target fingerprint with a single camera or a single sensor. A single camera or single sensor can satisfy application scenarios in which the precision requirement is not very high, such as fingerprint unlocking, fingerprint attendance and fingerprint access control; however, for application scenarios in which the accuracy requirement is very high and the fingerprint is incomplete, a fingerprint recognition system based on a single camera or single sensor usually cannot cope.
Content of the invention
In view of the above problems, the present invention aims to provide a fingerprint recognition system based on image fusion.
The purpose of the present invention is achieved by the following technical scheme:
A fingerprint recognition system based on image fusion, comprising a fingerprint acquisition module, a D/A conversion module, a fingerprint image processing module, a fingerprint database module and a result verification module. The fingerprint acquisition module is used to collect an infrared image and an ultraviolet image of the target fingerprint; the D/A conversion module is used to perform digital-to-analogue conversion on the infrared image and the ultraviolet image of the target fingerprint respectively; the fingerprint image processing module is used to denoise, decompose and fuse the infrared image and the ultraviolet image of the target fingerprint to obtain the target fingerprint image; the fingerprint database module stores standard fingerprint images; the result verification module compares the target fingerprint image with the standard fingerprint images, obtains a fingerprint verification result and displays the result.
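For illustration only, the following Python sketch shows how the result verification module could compare a fused target fingerprint image against the standard fingerprint images in the database. The patent does not specify the comparison algorithm, so the normalised cross-correlation score, the function names and the acceptance threshold of 0.9 are assumptions, not part of the claimed system.

```python
import numpy as np

def normalised_correlation(a, b):
    """Illustrative similarity score (the patent does not name a comparison metric)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def verify_fingerprint(fused_image, database, threshold=0.9):
    """Sketch of the result verification module: compare the fused target fingerprint
    image against every stored standard fingerprint image and report the best match
    if its score clears the (assumed) acceptance threshold."""
    best_id, best_score = None, -np.inf
    for user_id, template in database.items():
        score = normalised_correlation(fused_image, template)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= threshold else None
```

A call such as verify_fingerprint(fused, {"user_1": standard_image}) then returns "user_1" only when the fused image is sufficiently similar to that enrolled standard image, and None otherwise.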
Beneficial effects of the present invention: the infrared image of the target fingerprint is fused with its ultraviolet image, and fingerprint recognition is performed on the fused fingerprint image, which greatly improves the accuracy of fingerprint recognition.
Brief description of the drawings
The accompanying drawings are used to further describe the invention, but the embodiments shown in the drawings do not constitute any limitation of the invention; for one of ordinary skill in the art, other drawings can be obtained from the following drawings without creative work.
Fig. 1 is the frame structure diagram of the present invention;
Fig. 2 is the frame structure diagram of the fingerprint image processing module of the present invention.
Reference numerals:
Fingerprint acquisition module 1, D/A conversion module 2, fingerprint image processing module 3, fingerprint database module 4, result verification module 5, fingerprint image preprocessing submodule 31, fingerprint image decomposition submodule 32, fingerprint image fusion submodule 33.
Embodiment
The invention is further described with reference to the following application scenario.
Referring to Fig. 1, the system comprises a fingerprint acquisition module 1, a D/A conversion module 2, a fingerprint image processing module 3, a fingerprint database module 4 and a result verification module 5. The rear of the fingerprint acquisition module 1 is connected to the D/A conversion module 2, and the module is used to collect the infrared image and the ultraviolet image of the target fingerprint; the D/A conversion module 2 is used to perform digital-to-analogue conversion on the infrared image and the ultraviolet image of the target fingerprint respectively; the fingerprint image processing module 3 is used to denoise, decompose and fuse the infrared image and the ultraviolet image of the target fingerprint to obtain the target fingerprint image; the fingerprint database module 4 stores standard fingerprint images; the result verification module 5 is connected with the fingerprint image processing module 3 and the fingerprint database module 4, and is used to compare the target fingerprint image with the standard fingerprint images, obtain a fingerprint verification result and display the result.
Preferably, when the fingerprint acquisition module acquires the infrared image and the ultraviolet image of the target fingerprint, an infrared CMOS imaging lens is used to acquire the infrared image of the target fingerprint and an ultraviolet CCD imaging lens is used to acquire the ultraviolet image of the target fingerprint. An LED excitation assembly is connected in front of the ultraviolet CCD imaging lens, an ultraviolet image intensifier is connected behind it, and the infrared CMOS imaging lens is connected last; all lenses are coaxial in optical path and placed in parallel.
Preferably, the infrared CMOS imaging lens and the ultraviolet CCD imaging lens are both of a dual-color integrated structure.
In the above embodiment of the present invention, the infrared image of the target fingerprint is fused with its ultraviolet image, and fingerprint recognition is performed on the fused fingerprint image, which greatly improves the accuracy of fingerprint recognition.
Preferably, as shown in Fig. 2, the fingerprint image processing module includes a fingerprint image preprocessing submodule, a fingerprint image decomposition submodule and a fingerprint image fusion submodule. The fingerprint image preprocessing submodule applies a wavelet transform to the noisy infrared image and the noisy ultraviolet image to obtain the corresponding wavelet coefficients, which at this stage comprise both noise-free wavelet coefficients and noisy wavelet coefficients; it then denoises the infrared image and the ultraviolet image of the target fingerprint with an improved wavelet threshold function, specifically:
(1) The obtained wavelet coefficients are thresholded with the improved threshold function so as to filter out the noisy wavelet coefficients and obtain the noise-free wavelet coefficients. The improved threshold function used is

$$\tilde{\psi}=\begin{cases}\operatorname{sgn}(\psi)\left[\psi-\left|\dfrac{p\upsilon}{\psi}\right|^{q-1}+\sqrt{2\ln\varepsilon}\right], & \psi\geq\upsilon\\ 0, & \psi<\upsilon\end{cases}$$

$$\operatorname{sgn}(i)=\begin{cases}1, & i>0\\ 0, & i=0\\ -1, & i<0\end{cases}$$

where $\tilde{\psi}$ denotes a noise-free wavelet coefficient, $\psi$ is a wavelet coefficient obtained by applying the wavelet transform to the noisy infrared and ultraviolet images, $\operatorname{sgn}(\cdot)$ is the sign function and $i$ is its argument, $p$ and $q$ are adjustable parameters, $\upsilon$ is the wavelet coefficient threshold, and $\varepsilon$ is the noise standard deviation;
When $p=0$ or $q=\infty$ the improved threshold function reduces to the hard threshold function, and when $p=1$ and $q=1$ it reduces to the soft threshold function;
(2) The noise-free wavelet coefficients obtained after thresholding are used to reconstruct the infrared image and the ultraviolet image of the target fingerprint, yielding a noise-free infrared image and a noise-free ultraviolet image.
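A minimal Python sketch of steps (1) and (2) is given below, assuming the PyWavelets package (pywt) for the forward and inverse wavelet transforms. The threshold υ and the noise standard deviation ε are estimated here with a common median-absolute-deviation heuristic, and the magnitude |ψ| is compared against the threshold; both choices are stand-ins, since the patent only states that υ, ε, p and q are given.

```python
import numpy as np
import pywt  # PyWavelets

def improved_threshold(psi, v, p=0.5, q=2.0, eps=2.0):
    """Element-wise improved threshold function as reconstructed above: coefficients
    whose magnitude falls below the threshold v are set to 0, the remaining ones are
    shrunk by sgn(psi) * [|psi| - |p*v/psi|**(q-1) + sqrt(2*ln(eps))]."""
    psi = np.asarray(psi, dtype=float)
    out = np.zeros_like(psi)
    keep = np.abs(psi) >= v
    kept = psi[keep]
    out[keep] = np.sign(kept) * (np.abs(kept)
                                 - np.abs(p * v / kept) ** (q - 1)
                                 + np.sqrt(2.0 * np.log(eps)))
    return out

def denoise_image(img, wavelet="db4", level=2, p=0.5, q=2.0):
    """Steps (1)-(2): threshold the detail wavelet coefficients, then reconstruct the image."""
    img = np.asarray(img, dtype=float)
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Noise standard deviation estimated from the finest diagonal subband (MAD heuristic).
    eps = np.median(np.abs(coeffs[-1][-1])) / 0.6745 + 1e-12
    v = eps * np.sqrt(2.0 * np.log(img.size))      # universal threshold used as a stand-in for v
    new_coeffs = [coeffs[0]]                       # the approximation band is kept unchanged
    for details in coeffs[1:]:                     # (cH, cV, cD) per decomposition level
        new_coeffs.append(tuple(improved_threshold(c, v, p, q, eps=max(eps, 1.0))
                                for c in details))
    return pywt.waverec2(new_coeffs, wavelet)
```

Setting p=0 pushes improved_threshold towards hard-thresholding behaviour, while p=1, q=1 moves it towards the soft-thresholding end, mirroring the remark about p and q above.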
In the above embodiment of the present invention, the improved threshold function filters out the noisy wavelet coefficients from the wavelet coefficients obtained by applying the wavelet transform to the infrared and ultraviolet fingerprint images, and the remaining wavelet coefficients are then used to reconstruct the images, yielding a noise-free infrared image and a noise-free ultraviolet image. Preprocessing the fingerprint images in this way removes the noise from them, so that a high-quality target fingerprint image can be obtained when the target fingerprint images are fused. Moreover, by adjusting the values of p and q, the improved threshold function can be tuned between the soft and hard threshold functions, which increases the flexibility of the coefficient filtering.
Preferably, the fingerprint image decomposition submodule first applies the non-subsampled contourlet transform (NSCT) to the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ obtained from the fingerprint image preprocessing submodule, decomposing each of them into one low-frequency subband coefficient and a series of high-frequency subband coefficients, i.e. $X_1^{Low}(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_2^{Low}(j,k)$, $X_{2,m,n}^{High}(j,k)$ with $1\le m\le M$ and $1\le n\le n_m$, where $X_1^{Low}(j,k)$ and $X_2^{Low}(j,k)$ denote the low-frequency subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_{2,m,n}^{High}(j,k)$ denote their high-frequency subband coefficients in the $n$-th direction of the $m$-th scale at pixel $(j,k)$, $M$ is the number of scales, $m$ denotes the $m$-th scale, $n$ denotes the $n$-th direction, and $n_m$ is the number of directions at the $m$-th scale;
Then the subband coefficient at pixel $(j,k)$ obtained from the NSCT decomposition is compared with the subband coefficient values of the four surrounding pixels, and a self-defined activity calculation formula is used to compute, pixel by pixel, the activity of the two noise-free images. Using the high-frequency subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$, the high-frequency subband activity at pixel $(j,k)$ is calculated. The self-defined activity calculation formula is

$$W_{1,m,n}^{High}(j,k)=\frac{1}{2}\Big[\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j,k+1))^{2}\big|+\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j,k-1))^{2}\big|+\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j-1,k))^{2}\big|+\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j+1,k))^{2}\big|\Big]^{\frac{1}{2}}$$

$$W_{2,m,n}^{High}(j,k)=\frac{1}{2}\Big[\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j,k+1))^{2}\big|+\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j,k-1))^{2}\big|+\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j-1,k))^{2}\big|+\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j+1,k))^{2}\big|\Big]^{\frac{1}{2}}$$

where $W_{1,m,n}^{High}(j,k)$ and $W_{2,m,n}^{High}(j,k)$ respectively denote the high-frequency subband activities of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ in the $n$-th direction of the $m$-th scale at pixel $(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_{2,m,n}^{High}(j,k)$ denote their high-frequency subband coefficients in the $m$-th scale and $n$-th direction at $(j,k)$, and $X_{1,m,n}^{High}(j\pm 1,k)$, $X_{1,m,n}^{High}(j,k\pm 1)$, $X_{2,m,n}^{High}(j\pm 1,k)$ and $X_{2,m,n}^{High}(j,k\pm 1)$ denote the high-frequency subband coefficients in the $m$-th scale and $n$-th direction of the pixels adjacent to $(j,k)$ in the up, down, left and right directions;
Similarly, replacing the high-frequency subband coefficients with the low-frequency subband coefficients in the same self-defined activity calculation formula, the activity of the low-frequency subbands of the two noise-free images is calculated, giving the low-frequency subband activities $W_1^{Low}(j,k)$ and $W_2^{Low}(j,k)$ of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$.
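The activity measure can be computed directly from the formula above. The short Python sketch below applies it to any subband array (low- or high-frequency); the patent does not say how border pixels are handled, so edge replication of the missing neighbours is an assumption here.

```python
import numpy as np

def activity_map(subband):
    """Per-pixel activity W(j, k) of one subband, following the formula above: half the
    square root of the summed absolute differences between the squared coefficient at
    (j, k) and the squared coefficients of its four neighbours."""
    s2 = np.asarray(subband, dtype=float) ** 2
    up    = np.pad(s2, ((1, 0), (0, 0)), mode="edge")[:-1, :]   # neighbour at (j-1, k)
    down  = np.pad(s2, ((0, 1), (0, 0)), mode="edge")[1:, :]    # neighbour at (j+1, k)
    left  = np.pad(s2, ((0, 0), (1, 0)), mode="edge")[:, :-1]   # neighbour at (j, k-1)
    right = np.pad(s2, ((0, 0), (0, 1)), mode="edge")[:, 1:]    # neighbour at (j, k+1)
    total = (np.abs(s2 - right) + np.abs(s2 - left)
             + np.abs(s2 - up) + np.abs(s2 - down))
    return 0.5 * np.sqrt(total)
```

Applying activity_map to the low-frequency subband and to each high-frequency subband of the two noise-free images yields the activity maps $W_1^{Low}$, $W_2^{Low}$, $W_{1,m,n}^{High}$ and $W_{2,m,n}^{High}$ used by the fusion submodule described next.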
In the above embodiment of the present invention, the activity of each pixel of the noise-free infrared image and the noise-free ultraviolet image is calculated pixel by pixel with the self-defined activity calculation formula. The activity reflects the sharpness of a pixel: the higher the activity, the sharper the point. When the infrared and ultraviolet images are fused according to the activity, the sharper pixels are more likely to be selected, so that the fused target fingerprint image is of higher quality and contains more detail features.
Preferably, the fingerprint image fusion submodule first compares the activities of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at point $(j,k)$ and obtains the corresponding activity comparison result. According to the comparison result, the subband coefficients are re-assigned to obtain new high-frequency subband coefficients and new low-frequency subband coefficients, and the new subband coefficients are taken as the subband coefficients of the fused target fingerprint image, specifically:
In the formula, $X(j,k)$ is the re-assigned subband coefficient of the fused target fingerprint image at point $(j,k)$; $X_1(j,k)$ and $X_2(j,k)$ are respectively the subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at point $(j,k)$; $W_1(j,k)$ and $W_2(j,k)$ are respectively the activities of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$, where $W_1(j,k)$ stands for $W_1^{Low}(j,k)$ and $W_{1,m,n}^{High}(j,k)$, $W_2(j,k)$ stands for $W_2^{Low}(j,k)$ and $W_{2,m,n}^{High}(j,k)$, $X_1(j,k)$ stands for $X_1^{Low}(j,k)$ and $X_{1,m,n}^{High}(j,k)$, and $X_2(j,k)$ stands for $X_2^{Low}(j,k)$ and $X_{2,m,n}^{High}(j,k)$; when $X_1(j,k)=X_1^{Low}(j,k)$, then $X_2(j,k)=X_2^{Low}(j,k)$, $W_1(j,k)=W_1^{Low}(j,k)$ and $W_2(j,k)=W_2^{Low}(j,k)$, and when $X_1(j,k)=X_{1,m,n}^{High}(j,k)$, then $X_2(j,k)=X_{2,m,n}^{High}(j,k)$, $W_1(j,k)=W_{1,m,n}^{High}(j,k)$ and $W_2(j,k)=W_{2,m,n}^{High}(j,k)$; the adjustment threshold is set to 0.5; $X_1^{Low}(j,k)$ and $X_2^{Low}(j,k)$ respectively denote the low-frequency subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_{2,m,n}^{High}(j,k)$ denote their high-frequency subband coefficients in the $n$-th direction of the $m$-th scale at pixel $(j,k)$, and $J$ and $K$ respectively denote the width and height of the two noise-free images;
The new high-frequency subband coefficients and the new low-frequency subband coefficients are used as the high-frequency and low-frequency subband coefficients of the composite image, the image is reconstructed by the inverse NSCT, and the fused target fingerprint image is obtained and output.
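The exact re-assignment formula is not legible in the source text, so the sketch below substitutes a plain maximum-activity selection rule: each fused subband coefficient is taken from whichever noise-free image is more active at that pixel. No standard Python NSCT implementation is assumed to be available, so a decimated wavelet decomposition from PyWavelets stands in for the NSCT and its inverse; activity_map and denoise_image refer to the earlier sketches.

```python
import numpy as np
import pywt  # PyWavelets

def fuse_subbands(x1, x2, w1, w2):
    """Maximum-activity selection: keep, per pixel, the coefficient of the more active
    image (a common fusion rule used here in place of the patent's re-assignment formula)."""
    return np.where(w1 >= w2, x1, x2)

def fuse_images(x1, x2, wavelet="db4", level=2):
    """Decompose both noise-free images (assumed co-registered and equal in size), fuse the
    low- and high-frequency subbands by activity comparison, and reconstruct the fused image."""
    c1 = pywt.wavedec2(np.asarray(x1, dtype=float), wavelet, level=level)
    c2 = pywt.wavedec2(np.asarray(x2, dtype=float), wavelet, level=level)
    fused = [fuse_subbands(c1[0], c2[0], activity_map(c1[0]), activity_map(c2[0]))]
    for bands1, bands2 in zip(c1[1:], c2[1:]):          # one tuple of detail subbands per scale
        fused.append(tuple(fuse_subbands(s1, s2, activity_map(s1), activity_map(s2))
                           for s1, s2 in zip(bands1, bands2)))
    return pywt.waverec2(fused, wavelet)                # inverse transform of the fused coefficients
```

In this sketch the fused output of fuse_images(denoise_image(ir), denoise_image(uv)) would then be handed to the result verification module sketched earlier.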
In the above embodiment of the present invention, the pixel activities of the noise-free infrared image and the noise-free ultraviolet image are compared, and according to the comparison result, different values are assigned to the subband coefficients of the fused target fingerprint image. This makes the details of the noise-free infrared image and the noise-free ultraviolet image complementary, so that the fused target fingerprint image contains the detail features of both images, and the recognition accuracy is greatly improved when fingerprint verification is performed.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solution of the present invention and are not a limitation of its scope of protection. Although the present invention has been explained with reference to preferred embodiments, one of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently substituted without departing from the substance and scope of the technical solution of the present invention.

Claims (6)

1. A fingerprint recognition system based on image fusion, characterized by comprising a fingerprint acquisition module, a D/A conversion module, a fingerprint image processing module, a fingerprint database module and a result verification module, wherein the fingerprint acquisition module is used to collect an infrared image and an ultraviolet image of a target fingerprint; the D/A conversion module is used to perform digital-to-analogue conversion on the infrared image and the ultraviolet image of the target fingerprint respectively; the fingerprint image processing module is used to denoise, decompose and fuse the infrared image and the ultraviolet image of the target fingerprint to obtain a target fingerprint image; the fingerprint database module stores standard fingerprint images; and the result verification module compares the target fingerprint image with the standard fingerprint images, obtains a fingerprint verification result and displays the result.
2. The fingerprint recognition system based on image fusion according to claim 1, characterized in that when the fingerprint acquisition module acquires the infrared image and the ultraviolet image of the target fingerprint, an infrared CMOS imaging lens is used to acquire the infrared image of the target fingerprint and an ultraviolet CCD imaging lens is used to acquire the ultraviolet image of the target fingerprint; an LED excitation assembly is connected in front of the ultraviolet CCD imaging lens, an ultraviolet image intensifier is connected behind it, and the infrared CMOS imaging lens is connected last; all lenses are coaxial in optical path and placed in parallel.
3. The fingerprint recognition system based on image fusion according to claim 2, characterized in that the infrared CMOS imaging lens and the ultraviolet CCD imaging lens are both of a dual-color integrated structure.
4. The fingerprint recognition system based on image fusion according to claim 3, characterized in that the fingerprint image processing module includes a fingerprint image preprocessing submodule, a fingerprint image decomposition submodule and a fingerprint image fusion submodule; the fingerprint image preprocessing submodule applies a wavelet transform to the noisy infrared image and the noisy ultraviolet image to obtain the corresponding wavelet coefficients, which at this stage comprise both noise-free wavelet coefficients and noisy wavelet coefficients, and then denoises the infrared image and the ultraviolet image of the target fingerprint with an improved wavelet threshold function, specifically:
(1) The obtained wavelet coefficients are thresholded with the improved threshold function so as to filter out the noisy wavelet coefficients and obtain the noise-free wavelet coefficients, the improved threshold function used being:
$$\tilde{\psi}=\begin{cases}\operatorname{sgn}(\psi)\left[\psi-\left|\dfrac{p\upsilon}{\psi}\right|^{q-1}+\sqrt{2\ln\varepsilon}\right], & \psi\geq\upsilon\\ 0, & \psi<\upsilon\end{cases}$$

$$\operatorname{sgn}(i)=\begin{cases}1, & i>0\\ 0, & i=0\\ -1, & i<0\end{cases}$$
where $\tilde{\psi}$ denotes a noise-free wavelet coefficient, $\psi$ is a wavelet coefficient obtained by applying the wavelet transform to the noisy infrared and ultraviolet images, $\operatorname{sgn}(\cdot)$ is the sign function and $i$ is its argument, $p$ and $q$ are adjustable parameters, $\upsilon$ is the wavelet coefficient threshold, and $\varepsilon$ is the noise standard deviation;
(2) The noise-free wavelet coefficients obtained after thresholding are used to reconstruct the infrared image and the ultraviolet image of the target fingerprint, yielding a noise-free infrared image and a noise-free ultraviolet image.
5. The fingerprint recognition system based on image fusion according to claim 4, characterized in that the fingerprint image decomposition submodule first applies the non-subsampled contourlet transform (NSCT) to the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ obtained from the fingerprint image preprocessing submodule, decomposing each of them into one low-frequency subband coefficient and a series of high-frequency subband coefficients, i.e. $X_1^{Low}(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_2^{Low}(j,k)$, $X_{2,m,n}^{High}(j,k)$ with $1\le m\le M$ and $1\le n\le n_m$, where $X_1^{Low}(j,k)$ and $X_2^{Low}(j,k)$ respectively denote the low-frequency subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_{2,m,n}^{High}(j,k)$ denote their high-frequency subband coefficients in the $n$-th direction of the $m$-th scale at pixel $(j,k)$, $M$ is the number of scales, $m$ denotes the $m$-th scale, $n$ denotes the $n$-th direction, and $n_m$ is the number of directions at the $m$-th scale;
Then the subband coefficient at pixel $(j,k)$ obtained from the NSCT decomposition is compared with the subband coefficient values of the four surrounding pixels, and a self-defined activity calculation formula is used to compute, pixel by pixel, the activity of the two noise-free images; using the high-frequency subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$, the high-frequency subband activity at pixel $(j,k)$ is calculated, the self-defined activity calculation formula being:
$$W_{1,m,n}^{High}(j,k)=\frac{1}{2}\Big[\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j,k+1))^{2}\big|+\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j,k-1))^{2}\big|+\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j-1,k))^{2}\big|+\big|(X_{1,m,n}^{High}(j,k))^{2}-(X_{1,m,n}^{High}(j+1,k))^{2}\big|\Big]^{\frac{1}{2}}$$

$$W_{2,m,n}^{High}(j,k)=\frac{1}{2}\Big[\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j,k+1))^{2}\big|+\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j,k-1))^{2}\big|+\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j-1,k))^{2}\big|+\big|(X_{2,m,n}^{High}(j,k))^{2}-(X_{2,m,n}^{High}(j+1,k))^{2}\big|\Big]^{\frac{1}{2}}$$
where $W_{1,m,n}^{High}(j,k)$ and $W_{2,m,n}^{High}(j,k)$ respectively denote the high-frequency subband activities of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ in the $n$-th direction of the $m$-th scale at pixel $(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_{2,m,n}^{High}(j,k)$ denote their high-frequency subband coefficients in the $m$-th scale and $n$-th direction at $(j,k)$, and $X_{1,m,n}^{High}(j\pm 1,k)$, $X_{1,m,n}^{High}(j,k\pm 1)$, $X_{2,m,n}^{High}(j\pm 1,k)$ and $X_{2,m,n}^{High}(j,k\pm 1)$ denote the high-frequency subband coefficients in the $m$-th scale and $n$-th direction of the pixels adjacent to $(j,k)$ in the up, down, left and right directions;
Similarly, replacing the high-frequency subband coefficients with the low-frequency subband coefficients in the same self-defined activity calculation formula, the activity of the low-frequency subbands of the two noise-free images is calculated, giving the low-frequency subband activities $W_1^{Low}(j,k)$ and $W_2^{Low}(j,k)$ of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$.
6. The fingerprint recognition system based on image fusion according to claim 5, characterized in that the fingerprint image fusion submodule first compares the activities of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at point $(j,k)$ and obtains the corresponding activity comparison result; according to the comparison result, the subband coefficients are re-assigned to obtain new high-frequency subband coefficients and new low-frequency subband coefficients, and the new subband coefficients are taken as the subband coefficients of the fused target fingerprint image, specifically:
In the formula, $X(j,k)$ is the re-assigned subband coefficient of the fused target fingerprint image at point $(j,k)$; $X_1(j,k)$ and $X_2(j,k)$ are respectively the subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at point $(j,k)$; $W_1(j,k)$ and $W_2(j,k)$ are respectively the activities of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$, where $W_1(j,k)$ stands for $W_1^{Low}(j,k)$ and $W_{1,m,n}^{High}(j,k)$, $W_2(j,k)$ stands for $W_2^{Low}(j,k)$ and $W_{2,m,n}^{High}(j,k)$, $X_1(j,k)$ stands for $X_1^{Low}(j,k)$ and $X_{1,m,n}^{High}(j,k)$, and $X_2(j,k)$ stands for $X_2^{Low}(j,k)$ and $X_{2,m,n}^{High}(j,k)$; when $X_1(j,k)=X_1^{Low}(j,k)$, then $X_2(j,k)=X_2^{Low}(j,k)$, $W_1(j,k)=W_1^{Low}(j,k)$ and $W_2(j,k)=W_2^{Low}(j,k)$, and when $X_1(j,k)=X_{1,m,n}^{High}(j,k)$, then $X_2(j,k)=X_{2,m,n}^{High}(j,k)$, $W_1(j,k)=W_{1,m,n}^{High}(j,k)$ and $W_2(j,k)=W_{2,m,n}^{High}(j,k)$; an adjustment threshold is used in the re-assignment; $X_1^{Low}(j,k)$ and $X_2^{Low}(j,k)$ respectively denote the low-frequency subband coefficients of the noise-free infrared image $X_1$ and the noise-free ultraviolet image $X_2$ at pixel $(j,k)$, $X_{1,m,n}^{High}(j,k)$ and $X_{2,m,n}^{High}(j,k)$ denote their high-frequency subband coefficients in the $n$-th direction of the $m$-th scale at pixel $(j,k)$, and $J$ and $K$ respectively denote the width and height of the two noise-free images;
The new high-frequency subband coefficients and the new low-frequency subband coefficients are used as the high-frequency and low-frequency subband coefficients of the composite image, the image is reconstructed by the inverse NSCT, and the fused target fingerprint image is obtained and output.
CN201710421000.2A 2017-06-07 2017-06-07 Fingerprint recognition system based on image fusion Expired - Fee Related CN107169471B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710421000.2A CN107169471B (en) 2017-06-07 2017-06-07 Fingerprint recognition system based on image fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710421000.2A CN107169471B (en) 2017-06-07 2017-06-07 Fingerprint recognition system based on image fusion

Publications (2)

Publication Number Publication Date
CN107169471A true CN107169471A (en) 2017-09-15
CN107169471B CN107169471B (en) 2018-10-12

Family

ID=59825411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710421000.2A Expired - Fee Related CN107169471B (en) 2017-06-07 2017-06-07 Fingerprint recognition system based on image fusion

Country Status (1)

Country Link
CN (1) CN107169471B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8073234B2 (en) * 2007-08-27 2011-12-06 Acushnet Company Method and apparatus for inspecting objects using multiple images having varying optical properties
CN101510007A (en) * 2009-03-20 2009-08-19 北京科技大学 Real time shooting and self-adapting fusing device for infrared light image and visible light image
CN104834895A (en) * 2015-04-03 2015-08-12 南京理工大学 Ultraviolet-visible light dual-band fusion portable fingerprint detector
CN104866845A (en) * 2015-06-11 2015-08-26 武汉华炬光电有限公司 Ultraviolet infrared LED fingerprint detection system

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108038937A (en) * 2017-11-22 2018-05-15 同观科技(深圳)有限公司 A kind of methods of exhibiting, device, terminal device and the storage medium of welcome's information
CN108038937B (en) * 2017-11-22 2021-01-29 同观科技(深圳)有限公司 Method and device for showing welcome information, terminal equipment and storage medium
CN108932492A (en) * 2018-06-28 2018-12-04 福州昌宇五金锁具制品有限公司 A kind of image fingerprint extracting method based on non-sampled shearing wave conversion
CN109946308A (en) * 2019-04-22 2019-06-28 深圳市阿赛姆电子有限公司 A kind of electronic component appearance delection device
US11531430B2 (en) 2019-07-12 2022-12-20 Shenzhen GOODIX Technology Co., Ltd. Fingerprint detection apparatus and electronic device
CN111052141A (en) * 2019-08-02 2020-04-21 深圳市汇顶科技股份有限公司 Fingerprint detection device and electronic equipment
WO2021022425A1 (en) * 2019-08-02 2021-02-11 深圳市汇顶科技股份有限公司 Fingerprint detection apparatus and electronic device
US11776301B2 (en) 2019-08-02 2023-10-03 Shenzhen GOODIX Technology Co., Ltd. Fingerprint detection apparatus and electronic device
CN112986169A (en) * 2021-03-11 2021-06-18 广东新一代工业互联网创新技术有限公司 Ultraviolet spectrum pollutant classification detection method based on sampling contourlet transformation

Also Published As

Publication number Publication date
CN107169471B (en) 2018-10-12

Similar Documents

Publication Publication Date Title
CN107169471A (en) A kind of fingerprint recognition system based on image co-registration
Schuckers et al. On techniques for angle compensation in nonideal iris recognition
CN106504222B (en) A kind of underwater Polarization Image Fusion system based on bionic visual mechanism
Ellmauthaler et al. Multiscale image fusion using the undecimated wavelet transform with spectral factorization and nonorthogonal filter banks
CN109801250A (en) Infrared and visible light image fusion method based on ADC-SCM and low-rank matrix expression
CN102005037B (en) Multimodality image fusion method combining multi-scale bilateral filtering and direction filtering
CN110223242A (en) A kind of video turbulent flow removing method based on time-space domain Residual Generation confrontation network
Sutthiwichaiporn et al. Adaptive boosted spectral filtering for progressive fingerprint enhancement
CN104809734A (en) Infrared image and visible image fusion method based on guide filtering
CN101430759A (en) Optimized recognition pretreatment method for human face
CN109191416A (en) Image interfusion method based on sparse dictionary study and shearing wave
CN105761234A (en) Structure sparse representation-based remote sensing image fusion method
CN113837147B (en) Transform-based false video detection method
CN107918748A (en) A kind of multispectral two-dimension code recognition device and method
CN106204601A (en) A kind of live body parallel method for registering of EO-1 hyperion sequence image based on wave band scanning form
Bhateja et al. Medical image fusion in wavelet and ridgelet domains: a comparative evaluation
CN103632341A (en) Noisy CS-MRI reconstruction method for pyramid decomposition and dictionary learning
Yang Multiresolution Image Fusion Based on Wavelet Transform By Using a Novel Technique for Selection Coefficients.
CN109816617A (en) Multimode medical image fusion method based on Steerable filter and graph theory conspicuousness
Lu et al. Infrared and visible image fusion based on tight frame learning via VGG19 network
Lan et al. Multimodal medical image fusion using wavelet transform and human vision system
CN106780398B (en) A kind of image de-noising method based on noise prediction
CN113628148B (en) Method and device for reducing noise of infrared image
CN114998708A (en) Tea type identification method and device based on map signals
Chen et al. SFCFusion: Spatial-Frequency Collaborative Infrared and Visible Image Fusion

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180830

Address after: 210012 room 1601-1604, 3 building, Yun Mi Cheng, 19 ningshuang Road, Yuhuatai District, Nanjing, Jiangsu, China

Applicant after: Hui Gu Artificial Intelligence Studies Institute (Nanjing) Co., Ltd.

Address before: 518000 West Tower 1708, Nanshan Software Park, Nanshan Digital Culture Industry Base, 10128 Shennan Avenue, Nanshan Street, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN CHUANGYI INDUSTRIAL TECHNOLOGY CO.,LTD.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20181012

CF01 Termination of patent right due to non-payment of annual fee