GB2451888A - Processing fingerprints to identify sweat pores, i.e. third level information, from ridge structures, i.e. macroscopic first level information. - Google Patents


Info

Publication number: GB2451888A
Application number: GB0716021A
Authority: GB (United Kingdom)
Legal status: Withdrawn
Inventors: Li Wang, Abhir Bhalerao, Roland Wilson
Assignee (original and current): Warwick Warp Ltd
Events: application filed by Warwick Warp Ltd; priority to GB0716021A; publication of GB0716021D0; priority to PCT/GB2008/050670 (published as WO2009024811A1); publication of GB2451888A

Classifications

    • G06K9/00073
    • G06K9/0008
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction
    • G06V40/1353 Extracting features related to minutiae or pores
    • G06V40/1359 Extracting features related to ridge properties; Determining the fingerprint type, e.g. whorl or loop


Abstract

The present invention relates to feature extraction methods and systems for fingerprint images using third level feature information such as sweat pores, and to using this information for enrolment, matching and storage of fingerprint templates. The apparatus identifies candidate sweat pore regions from an intensity profile, models the intensity profile in at least one dimension in each candidate pore region and identifies at least one of the centre, the size and the shape of the sweat pore from said modelled intensity profile. In one embodiment the ridge structure images are converted to a feature space, also known as a canonical representation, to estimate alignment before pattern searching.

Description

METHOD AND APPARATUS FOR IDENTIFYING AND MATCHING
FINGERPRINTS USING SWEAT PORES
Field of the Invention
The present invention relates to feature extraction methods for fingerprint images.
More specifically, the present invention relates to methods of extracting third level feature information, such as sweat pores, and using this information, alone or along with minutiae information, for enrolment, matching and storage of fingerprint templates.
Background of the invention
A fingerprint is characterised by smoothly flowing ridges and valleys, which are distinguished by their orientation, separation, shape and minutiae. Minutiae are ridge endings and ridge bifurcations.
Traditionally, fingerprints have been the most widely accepted biometric. The formation and distinctiveness of the fingerprint has been understood since the early twentieth century (see for example Handbook of Fingerprint Recognition, D. Maltoni et al., Springer, 2003).
Most automatic fingerprint matching algorithms use minutiae information to determine whether two fingerprints are from the same finger. Some techniques use other ridge features (e.g. ridge direction, ridge spacing, ridge shape etc).
Broadly speaking, the process of fingerprint verification/identification involves two phases: (1) enrolment and; (2) matching.
In the enrolment phase, a person's fingerprint image(s) are processed by computer programs and converted into a template. The template is then associated with meta-data of the person's identity (e.g. name, age, sex, address, etc.) and stored in a database. Only those features of the print which are distinguishing are extracted and represented in the template.
In the matching phase, in verification mode (1:1 matching), a person's fingerprint images will be matched against the template which belongs to the claimed identity, whereas in identification mode (1:N matching), a person's fingerprint images will be matched against all or a subset of templates stored in the database. A matching score may be calculated for each comparison of the test fingerprint to a stored template.
Thus, a newly presented test fingerprint is compared against the set of stored templates and a matching score is returned. Because the test print has to be compared with each stored template, it is necessary to convert it into the same representation as the templates. The system can then return a score based on how closely the presented (test) print matches each template. If this score is sufficiently high, as determined by a user-defined threshold, a match is declared.
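As a minimal sketch of this decision rule, the threshold test in verification (1:1) and identification (1:N) mode might look as follows; the function names, scores and threshold values are illustrative, not taken from the patent:

```python
# Illustrative sketch of the threshold decision described above.

def verify(score, threshold):
    """1:1 matching: declare a match if the score clears the user-defined threshold."""
    return score >= threshold

def identify(scores, threshold):
    """1:N matching: return the identity of the best-scoring template, or None."""
    best_id = max(scores, key=scores.get)
    return best_id if scores[best_id] >= threshold else None

print(verify(0.82, 0.75))                              # True
print(identify({"alice": 0.41, "bob": 0.88}, 0.75))    # bob
```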
When analysed at different levels, a fingerprint pattern exhibits different types of features:
1. At the first level, the ridge flow forms a particular pattern configuration which can be broadly classified as left loop, right loop, whorl, arch or tented arch. These distinctions are insufficient to facilitate accurate fingerprint matching, but are nevertheless useful in categorising and indexing fingerprint images.
2. At the second level, the geometric location of each minutia on the print is extracted, and the relationship between second level details enables individualisation.
3. At the third level, intra-ridge details include ridge path deviation, width, shape, pore edge contour, incipient ridges, breaks, creases and scars.
Among those features, the most dominant and commonly used third level features are sweat pores whose position, shape and distribution are considered highly distinctive.
The third level of analysis is often used manually by fingerprint experts in forensic science when only a partial print can be reliably obtained and the second level data (minutiae) are insufficient to make a conclusive match. However, with advanced scanning technology (1000 dpi scanners), third level features are clearly visible in the scanned images (see figure 1). It is therefore possible for automatic fingerprint recognition systems to use this information for identifying individuals. By using third level feature information alone or combined with first and second level information, the false acceptance rate and false rejection rate can be reduced.
Third level feature information is especially useful when the number of reliably detected minutiae is small or only a partial print is scanned.
One major advantage of using third level data, for example sweat pore data, is that sweat pores are considerably more difficult to fake. With first and second level fingerprint information alone, it is possible to overlay the finger with a forged fingerprint, essentially an artificially created fingerprint, usually cast in latex, which is used to create a print. As sweat pores do not lend themselves to such counterfeiting techniques, fingerprint technology using sweat pore information, such as the current invention, is inherently more reliable and secure than technologies based on first and second level data.
Description of the related art
Methods for identifying individuals from fingerprints based upon examination of ridge-based data are well known.
Figure 2 is a flow chart showing the steps generally performed by a typical prior art system.
At the first step, depending on the application, a fingerprint image is acquired either by scanning an inked print or from a live finger. At the second step, once the image has been acquired into computer memory or onto a hard disk, it is often put through an enhancement process to improve the quality of the ridge pattern. This normally includes contrast enhancement, noise removal, filtering and smoothing. Some prior art systems also extract the foreground, i.e. the ridge pattern, from the background at this step. At the third step, either an image correlation method or a feature extraction process is employed.
Figure 3 is a flow chart showing a commonly adopted feature extraction technique proposed in "Adaptive flow orientation based feature extraction in fingerprint images", Pattern Recognition, Vol. 28, No. 11, pp. 1657-1672, Nov. 1995 and in US Patent 6,049,621.
Firstly, the image is divided into a set of blocks and the principal ridge direction of each block is estimated. A foreground/background segmentation technique is then used to separate the finger part of the image from the background. At the next step, a binarisation technique is often used to separate ridge pixels (labelled 1) from non-ridge pixels (labelled 0). The ridge features are often more than one pixel wide and may contain noisy artefacts. These artefacts are removed at the smoothing step and the longer structures are smoothed. At the next step, the smoothed ridge structure is thinned to one pixel wide. The locations and orientations of the minutiae features are then extracted from the thinned ridge structures. In some systems, a clean-up post-processing step is employed to remove spurious minutiae features.
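A common way to implement the per-block principal ridge direction step is the doubled-angle least-squares average of local intensity gradients. The following sketch (a standard technique, not necessarily the exact method of the cited paper) illustrates it on a synthetic block; the block size and ridge period are invented for the example:

```python
import math

def block_orientation(block):
    """Estimate the principal ridge direction of a block (radians, in [0, pi))
    from the doubled-angle least-squares average of local intensity gradients."""
    h, w = len(block), len(block[0])
    gxy = 0.0
    gxx_minus_gyy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (block[y][x + 1] - block[y][x - 1]) / 2.0  # central differences
            gy = (block[y + 1][x] - block[y - 1][x]) / 2.0
            gxy += 2.0 * gx * gy
            gxx_minus_gyy += gx * gx - gy * gy
    # Doubling the angle makes opposite gradients reinforce rather than cancel;
    # the ridge direction is perpendicular to the dominant gradient direction.
    gradient_dir = 0.5 * math.atan2(gxy, gxx_minus_gyy)
    return (gradient_dir + math.pi / 2.0) % math.pi

# Synthetic block whose ridges run horizontally (intensity varies with y only):
ridges = [[math.cos(2.0 * math.pi * y / 8.0) for x in range(16)] for y in range(16)]
theta = block_orientation(ridges)   # expected: ~0 (horizontal ridges)
```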
The fourth step of the matching flow is normally an alignment step. Most of the prior art systems use the minutiae locations or cross-correlation information to identify a global affine transformation to eliminate the geometric variation including shift, and rotation between the query fingerprint and the template fingerprint.
Stosz et al. proposed a method of using sweat pore locations to identify individuals in a paper entitled "Automated System for Fingerprint Authentication" (SPIE Vol. 2277, pp. 210-233). They proposed a multilevel verification process wherein pore locations and minutiae data are used separately to confirm or cross-check the identity of individuals. Firstly, pore locations from the query and template fingerprint images are matched against each other and a correlation score is obtained, which results in either a successful or a failed identification result. Next, assuming the pore match indicated a successful identification result, minutiae points are independently matched to verify the identification established by pore matching.
Some other prior art systems, e.g. WO99/06942, propose a method using a combination of sweat pores and ridge-based information to increase the amount of distinctive features extracted from fingerprint images, thus reducing the error rate.
WO99/06942 discloses a method of and a device for identifying individuals from the association of finger sweat pores and macrofeatures (e.g. minutiae). The method comprises obtaining from an individual, during a registration process, a fingerprint image having at least one registration pore and at least one registration macrofeature, wherein registration pore data is derived from the registration pores and registration macrofeature data is derived from the registration macrofeature. In the subsequent step, bid associated data is derived by associating the bid pore data with the bid macrofeature data, and registration associated data is constructed by associating the registration pore data with the registration macrofeature data. In the matching step, the bid associated data is compared to the registration associated data to produce a correlation score, and a successful or failed identification result is arrived at based on comparison of the correlation score to a predetermined threshold value.
WO2005/022446 discloses a method and device for identifying an individual finger from intra-skin images in which the ridge pattern and sweat pores are clearly visible whereas oil, dirt and noise are minimised. By matching said pore locations with reference pore locations of a reference intra-skin image, a pore correlation score is produced. A decision for successful or failed pore-based identification is subsequently made by comparison of the pore matching score with a predetermined pore threshold.
Deficiencies of prior art systems
The matching of a password or PIN to another password or PIN involves the comparison of two absolutely defined parameters, facilitating potentially exact matches. The matching of fingerprints, or of any biometric, on the other hand, involves the comparison of highly complex functions and is inherently more difficult.
Measurements and matching within biometric systems are subject to two types of error: a False Match or a False Non-Match. The imperfect accuracy of performance is mainly due to the large variability in different impressions of the same finger (intra-class variation), caused by displacement, rotation, partial overlap, elastic deformation, variable pressure, varying skin condition, lighting effects, noise and feature extraction errors. Reliably matching fingerprint images becomes a hard problem when fingerprints from the same finger may look very different, or when fingerprints from different fingers may appear quite similar.
One drawback of prior art systems that use only first and second level features is that they use only partial information present in fingerprint images, mainly the ridge pattern and minutiae information. Due to the difficulty of consistently detecting and establishing correspondence of sweat pores, those permanent, immutable and individual characteristics are ignored by prior art automatic fingerprint recognition systems. Third level features (sweat pores) are identified manually by fingerprint experts in forensic science because it is sometimes only possible to acquire partial prints, in which there may be insufficient second level features. For certain skin types, or for manual labourers, the number of minutiae that can be reliably detected is small and often results in failure at the enrolment stage.
Incorporation of sweat pores can thus enrich the feature space many times over and, when complemented with minutiae, can reduce the failure-to-enrol rate and increase the matching accuracy. However, although third level analyses are thus considerably more accurate, they also require high resolution scanning. In view of the high density of sweat pores, consistent feature extraction and correspondence between prints can be difficult to achieve. The high scanning requirements (e.g. 1000 dpi and above) have prevented widespread application of third level analyses. However, due to the reducing cost and improving technology of high resolution fingerprint imaging devices, such as that of WO 2005/022446, there is renewed interest in combining sweat pore representations in order to supplement information from the minutiae features and improve the accuracy and usability of fingerprints in non-forensic applications.
As mentioned in the previous section, some prior art (Stosz et al.) has proposed using pore location information to identify individual fingerprints. One drawback of this approach is that sweat pores are highly dense features: simple tests show that a 50% correlation score between two large sets of randomly located pores is possible when a location proximity threshold of about the maximum pore size is used. This shows that methods using pore location alone will have difficulty discriminating between two fingerprints which both have a large number of pores.
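The chance-correlation effect described above is easy to reproduce with a small simulation. All the numbers below (image size, pore count, proximity threshold) are illustrative assumptions, not figures from the patent:

```python
import random

random.seed(0)                 # reproducible illustration
W = H = 300                    # print area in pixels (assumed)
N = 200                        # pores per print (pores are dense at 1000 dpi)
R = 15                         # proximity threshold ~ maximum pore size (assumed)

def random_pores(n):
    """Generate n uniformly random pore locations, standing in for two
    completely unrelated fingerprints."""
    return [(random.uniform(0, W), random.uniform(0, H)) for _ in range(n)]

query, template = random_pores(N), random_pores(N)

# Fraction of query pores falling within R of *some* template pore:
matched = sum(
    1 for (qx, qy) in query
    if any((qx - tx) ** 2 + (qy - ty) ** 2 <= R * R for (tx, ty) in template)
)
chance_score = matched / N     # typically well over 50% with these settings
```

With dense enough random pores, a location-only correlation score is high even for unrelated prints, which is exactly the discrimination problem the text describes.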
To overcome this issue, some prior art systems, e.g. WO99/06942, propose a method that associates sweat pore information with second level features to establish correspondence. A correlation score is then produced by comparing the associated second and third level information. The score is subsequently compared with a predetermined threshold to declare a successful or failed match.
One drawback of this approach is that the successful correspondence of third level information between the query image and the template is highly dependent on the second level feature extraction. Inconsistency of the extracted minutiae points and ridge structures between the query and template fingerprints can therefore be propagated and cause errors.
Another drawback of prior art systems is that they use only the sweat pore locations, ignoring the shape of the sweat pores and the relative position between each sweat pore and its adjacent ridges. The false positives of pore matching could potentially be reduced by using this extra information in addition to the pore locations.
Prior to any matching of the second or third level features, fingerprint systems typically include an alignment process to ensure that images are aligned or justified in order to facilitate an accurate comparison of any two images. This alignment is usually a combination of translations and rotations, which together form the transformation parameters defining the overall alignment. Prior art alignment methods that depend on minutiae and/or third level features inevitably fall into a combinatorial problem of two unknowns: the correspondence between minutiae or sweat pore points, and the transformation between the images. The transformation parameter calculation depends on the correspondence, while the establishment of correspondence relies on an accurate estimation of the transformation parameters. Any error in either estimate will propagate and degrade the accuracy of the subsequent matching.
Furthermore, because the transformation parameter calculation in prior art systems relies on the comparison between query and template fingerprints, the process has to be carried out in a pairwise fashion, i.e. the alignment process needs to be repeated for each comparison between the query print and the templates. In identification applications, when the system is trying to find out to whom the query fingerprint belongs from a large database, the alignment procedure has to be repeated multiple times until the identity of the query print is established.
Another drawback of the prior art systems which utilise third level information is that the decision process is over-simplified. Prior art systems obtain a matching score by comparing the sweat pore locations in the query and template fingerprint images. If the locations of corresponding sweat pores overlap, or the distance between corresponding sweat pores is smaller than a predetermined threshold, the corresponding sweat pores are regarded as being from the same fingerprint. In reality, sweat pores vary greatly in size and shape, and this information should be used in conjunction with the location when comparing corresponding sweat pores. However, due to oil, dirt and noise on the skin, and differences in the pressure applied to the scanning surface during image acquisition, the visibility and appearance of sweat pores can vary, increasing the difficulty of consistently obtaining the additional information such as shape and size.
As mentioned above, because sweat pores are densely distributed, systems that use only their locations are prone to error, and a more sophisticated decision process is needed to reduce false positive sweat pore matches.
Therefore, in the light of the above, there is a need for a method which overcomes the deficiencies of prior art systems by eliminating or reducing intra-class variation in a normalised framework, and which has the ability to complement minutiae information with sweat pore information, thereby enabling a more reliable and accurate fingerprint match.
Summary of the invention
The present invention is an image processing method which can detect the location, size and shape of sweat pores and their relative position to adjacent ridge structures. The present invention also provides a method of matching sweat pores.
One aspect of the invention involves the construction of the canonical framework.
Another aspect of the invention involves the detection of sweat pore features from the fingerprint images. Another aspect of the invention involves a nonlinear alignment method to correct elastic deformation. Further aspects of the invention involve an enrolment procedure and a matching procedure.
In a main embodiment of the invention an apparatus is provided for processing fingerprint images having an intensity profile of a fingerprint having ridges, valleys and sweat pores, the apparatus comprising: a means for identifying candidate sweat pore regions from said intensity profile; a means for modelling the intensity profile in at least one dimension in each candidate pore region; and a means for identifying at least one of the centre, the size and the shape of the sweat pore from said modelled intensity profile. Advantageously, the identifying means may filter out valleys in the image and thereafter identify regions of high intensity as candidate pore regions. The modelling means comprises a means for fitting to the said intensity profile a model profile of predetermined type, to provide a specific model profile of that type which approximates to the said intensity profile, this being a Hermite model profile, a Gaussian intensity model profile or any other suitable model profile. An error measure may be calculated, the error measure being the difference between the modelled intensity profile and the unmodelled intensity profile; by minimising this error measure, using a Minimum Mean Square Error model profile, an iterative maximum likelihood model or any other suitable modelling error calculation method, the said intensity profile is approximated.
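As an illustrative sketch of fitting a model profile to a candidate pore region, the following fits a one-dimensional Gaussian intensity model by minimising the mean square error over a coarse parameter grid. This is a simple stand-in for the MMSE or iterative maximum-likelihood fits mentioned above; the grid ranges and the synthetic profile are assumptions:

```python
import math

def gaussian(x, amp, mu, sigma):
    """Gaussian intensity model: a bright pore on a dark ridge."""
    return amp * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def fit_pore_profile(profile):
    """Fit a 1-D Gaussian intensity model to a candidate pore region by
    minimising the mean square error over a coarse parameter grid
    (a real system would use a proper optimiser)."""
    n = len(profile)
    amp = max(profile)                     # peak intensity as the amplitude
    best = None
    for mu10 in range(0, 10 * n):          # candidate centres, 0.1-pixel steps
        mu = mu10 / 10.0
        for sigma10 in range(5, 50):       # candidate widths, 0.1-pixel steps
            sigma = sigma10 / 10.0
            mse = sum((profile[x] - gaussian(x, amp, mu, sigma)) ** 2
                      for x in range(n)) / n
            if best is None or mse < best[0]:
                best = (mse, mu, sigma)
    return best                            # (error measure, centre, size)

# Synthetic candidate pore: a bright blob centred at x = 6.0 with width 1.5
profile = [gaussian(x, 200.0, 6.0, 1.5) for x in range(13)]
error, centre, size = fit_pore_profile(profile)
```

The recovered centre and width are exactly the pore centre and size the apparatus is said to identify from the modelled intensity profile.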
The apparatus identifies any divergences from the ideal model and eliminates divergences above a predetermined threshold, for example by applying Principal Component Analysis (PCA) to the modelled intensity profile.
The invention also facilitates the matching of one fingerprint against another, e.g. a suspect fingerprint against a template fingerprint taken from a stock of fingerprints, library or database. The system comprises a matching facility for comparing query images (Q) against stored template images (T) in order to identify matches.
Moreover, the system advantageously includes means for calculating the probability that the two images are from the same fingerprint, said probability being a function of the location, shape and size of the pores. The probability P_m of a match between a pore at location (x,y) in the query image (Q) and the corresponding pore in the stored template image (T) may be computed as P_m = P(location|Q,T) * P(shape|Q,T).
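A hedged sketch of this probability computation follows. The patent does not specify the functional forms of the two factors, so Gaussian likelihood kernels and the tolerance constants below are assumptions for illustration:

```python
import math

SIGMA_LOC = 3.0     # assumed tolerance on pore displacement (pixels)
SIGMA_SHAPE = 0.5   # assumed tolerance on the shape/size discrepancy

def p_location(displacement):
    """P(location|Q,T): likelihood that a displacement between corresponding
    pore centres arises from the same pore (Gaussian kernel assumed)."""
    return math.exp(-displacement ** 2 / (2.0 * SIGMA_LOC ** 2))

def p_shape(shape_error):
    """P(shape|Q,T): likelihood derived from the discrepancy between the
    modelled shapes and sizes of the two pores (Gaussian kernel assumed)."""
    return math.exp(-shape_error ** 2 / (2.0 * SIGMA_SHAPE ** 2))

def p_match(displacement, shape_error):
    # P_m = P(location|Q,T) * P(shape|Q,T)
    return p_location(displacement) * p_shape(shape_error)
```

A perfectly aligned, identically shaped pore pair gives P_m = 1; the probability falls off as either the displacement or the shape discrepancy grows.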
Preferably the apparatus may comprise means for identifying fingerprint ridges within the image. The centres of said candidate pore regions and the centres of said ridges may be defined by predetermined intensity levels: candidate regions for which the distance between the centre of the candidate pore region and the corresponding ridge centreline is more than a predetermined threshold may be eliminated. Similarly, the modelling means may exclude those candidate pore regions which occur at a frequency other than the fundamental frequency of pore regions on a fingerprint ridge.
Advantageously, the apparatus may also calculate a pore distance between the centre of a pore at location (x,y) in the query image (Q) and that of the corresponding pore in the stored template image (T). As part of the matching process, the apparatus may classify a pore at location (x,y) in the query image (Q) as a match with the corresponding pore in the stored template image (T) only if the corresponding pore distance is less than a predetermined first threshold; the pores are not considered to match if the pore distance is equal to or exceeds said predetermined first threshold.
Where the pore distance is less than the predetermined first threshold, the matching means may classify a pore at location (x,y) in the query image (Q) as being matched with the corresponding pore in the stored template image (T) only if a second error measure, being the difference in size and shape between the corresponding pores in the respective images, is less than a predetermined second threshold; the pores are not considered to match if the said second error measure is equal to or exceeds said predetermined second threshold. If the second error measure is equal to or exceeds the predetermined second threshold, then the matching means may nevertheless classify a pore at location (x,y) in the query image (Q) as being matched with the corresponding pore in the stored template image (T) if the differences in size, shape and position between the pores neighbouring those used in the computation of the second error measure in the query image (Q) and the corresponding neighbouring pores in the template image (T) are below the predetermined second threshold. The apparatus may advantageously comprise contribution means for controlling the contribution of the second error measure to a correlation score, wherein the error measure contributes to the said correlation score for matches and does not contribute to the correlation score for non-matches.
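The cascaded decision described above might be sketched as follows. All threshold and error values are illustrative, and the neighbourhood fallback is simplified to a precomputed list of neighbour error values:

```python
def pores_match(distance, shape_error, neighbour_errors, t_dist, t_shape):
    """Cascaded match decision: location first, then shape/size, with a
    neighbourhood check as a fallback when the shape error alone fails."""
    if distance >= t_dist:
        return False                # too far apart to correspond at all
    if shape_error < t_shape:
        return True                 # location and shape both agree
    # Fallback: still accept if there are neighbouring pores and they all
    # agree in size, shape and position.
    return bool(neighbour_errors) and all(e < t_shape for e in neighbour_errors)

print(pores_match(2.0, 0.1, [], 5.0, 0.4))           # True  (shape agrees)
print(pores_match(2.0, 0.9, [0.1, 0.2], 5.0, 0.4))   # True  (neighbours agree)
print(pores_match(8.0, 0.1, [], 5.0, 0.4))           # False (too far apart)
```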
In a further aspect of the invention the system may also comprise means for recovering low-visibility pores, operable to interpolate and extrapolate from the locations of visible pores in the modelled fingerprint and so provide the locations of low-visibility pores.
The system may also comprise image acquisition means for recording fingerprint images and for forwarding them either directly to the identifying means, or to a storage means and then to the identifying means.
In another aspect of the invention the system includes an alignment means, which may comprise a means for identifying the biological centre and the biological axis of the fingerprint in the image, a means for setting a common reference point and common reference axis, a means for translating the image so that the biological centre of the fingerprint is re-located at the common reference point and a means for rotating the image so that the biological axis of the fingerprint orientation coincides with the common reference axis.
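The translate-then-rotate alignment described above can be sketched as a rigid transform of feature (e.g. pore) coordinates into the canonical frame. The sketch assumes the biological centre and axis have already been estimated; the coordinate values are invented for illustration:

```python
import math

def to_canonical(points, bio_centre, bio_axis_angle,
                 ref_point=(0.0, 0.0), ref_angle=0.0):
    """Translate the print so its biological centre lands on the common
    reference point, then rotate it so its biological axis coincides with
    the common reference axis (angles in radians). Every print aligned this
    way can be compared without pairwise registration."""
    dtheta = ref_angle - bio_axis_angle
    c, s = math.cos(dtheta), math.sin(dtheta)
    aligned = []
    for (x, y) in points:
        tx, ty = x - bio_centre[0], y - bio_centre[1]  # centre to origin
        aligned.append((c * tx - s * ty + ref_point[0],
                        s * tx + c * ty + ref_point[1]))
    return aligned

# A pore 10 pixels up the (vertical) biological axis from the centre ends up
# 10 pixels along the common reference axis:
pts = to_canonical([(10.0, 10.0), (10.0, 20.0)],
                   bio_centre=(10.0, 10.0), bio_axis_angle=math.pi / 2.0)
```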
If the biological centre of the fingerprint in question is not present in the image, the off-image location of the biological centre is estimated using a combination of extrapolation of ridges in the on-image portion of the fingerprint pattern and known patterns of fingerprints.
In a further embodiment of the invention a method of processing fingerprint images having an intensity profile of a fingerprint having ridges, valleys and sweat pores is provided, the method comprising the steps of, for each image, identifying candidate sweat pore regions from said intensity profile, modelling the intensity profile in at least one dimension in each candidate pore region and identifying at least one of the centre, the size and the shape of the sweat pore from said modelled intensity profile. As part of the identifying step the method of the invention may include filtering out valleys in the image and thereafter identifying regions of high intensity as candidate pore regions. The modelling step comprises fitting a model to the said intensity profile, the model profile being of predetermined type, to provide a specific model profile of that type which approximates to the said intensity profile. The model form may be a Hermite polynomial, a Gaussian intensity function or any other suitable model.
The method may further comprise an error measure calculating step in which an estimation error is determined, the error measure being the difference between the modelled intensity profile and the unmodelled intensity profile. The method may also comprise fitting the model to the said intensity profile by minimising said error measure, using Minimum Mean Square Error, iterative maximum likelihood or any other error minimisation process to improve the model and the fit of the model to the profile data.
The method may also comprise identifying the magnitude of any divergences from the ideal model and eliminating divergences above a predetermined threshold, for example by applying Principal Component Analysis (PCA) to the modelled intensity profile to eliminate regions that diverge by more than a predetermined threshold from the ideal model.
The invention also foresees a matching process in which query images (Q) are compared against stored template images (T) in order to identify matches. The probability that the two images are from the same fingerprint may be calculated, said probability being a function of the location, shape and size of the pores. The probability calculating step may comprise calculating the probability P_m of a match between a pore at location (x,y) in the query image (Q) and the corresponding pore in the stored template image (T), using the posterior probability function defined as P_m = P(location|Q,T) * P(shape|Q,T). The method may comprise identifying fingerprint ridges within the image. The centres of said candidate pore regions and the centreline of said ridges may be defined by predetermined intensity levels, the method further comprising the step of eliminating candidate regions for which the distance between the centre of the candidate pore region and the corresponding ridge centreline is more than a predetermined threshold. The method may also include the step of eliminating candidate regions which occur at a frequency other than the fundamental frequency of pore regions on a fingerprint ridge.
The pore distance between the centre of a pore at location (x,y) in the query image (Q) and that of the corresponding pore in the stored template image (T) may be calculated. In one embodiment, corresponding pores in the query image (Q) are classified as a match with the corresponding pores in the template image (T) only if the pore distance is less than a predetermined first threshold, and are not classified as a match if the pore distance is equal to or exceeds said predetermined threshold. In a further aspect, if the pore distance is less than the predetermined first threshold, then the method includes classifying a pore at location (x,y) in the query image (Q) as being matched with the corresponding pore in the stored template image (T) only if a second error measure, being the difference in size and shape between the corresponding pores in the respective images, is less than a predetermined second threshold; the pores are not classified as a match if the said second error measure is equal to or exceeds said predetermined second threshold. If the second error measure is equal to or exceeds the predetermined second threshold, then a pore at location (x,y) in the query image (Q) may nevertheless be classified as being matched with the corresponding pore in the stored template image (T) if the differences in size, shape and position between the pores neighbouring those used in the computation of the second error measure in the query image (Q) and the corresponding neighbouring pores in the template image (T) are below the predetermined second threshold. The contribution of the error measure to a correlation score may be controlled, wherein the error measure contributes to the said correlation score for matches and does not contribute to the correlation score for non-matches.
In another embodiment the method may comprise a step of recovering low visibility pores, which is operable to interpolate and extrapolate the locations of visible pores in the modelled fingerprint and provide the locations of low visibility pores.
The method may also comprise an image acquisition step for recording fingerprint images and forwarding them either directly to the identifying means, or to a storage means and then to the identifying means.
In another aspect the method may also comprise an alignment step, which advantageously comprises identifying the biological centre and the biological axis of the fingerprint in the image, setting a common reference point and common reference axis, translating the image so that the biological centre of the fingerprint is re-located at the common reference point, and rotating the image so that the biological axis of the fingerprint orientation coincides with the common reference axis.
If the biological centre of the fingerprint in question is not present in the image, the off-image location of the biological centre may be estimated using a combination of extrapolation of ridges in the on-image portion of the fingerprint pattern and known patterns of fingerprints.
In another aspect of the invention the processes and method steps described above are automated in a computer system. The computer instructions for implementing the above method are stored on a computer program product comprising a readable medium.
In an embodiment of the invention a method of processing fingerprint images is provided, comprising the steps of, for each image to be processed, justifying the image by translation and/or rotation and partitioning the image into a number of regions; for each region of the image, measuring at least one of the following parameters: the prevailing ridge orientation, the average ridge separation and the phase, and storing said measurement values; and, for all the processed images, projecting the said measured values for each region into a multidimensional first coordinate system and representing the images in said first coordinate system, wherein a representation distance between representations of corresponding parameters of two images is indicative of the dissimilarity of the corresponding images.
The justifying step may comprise identifying the biological centre and the biological axis of the fingerprint in the image, setting a common reference point and common reference axis, translating the image so that the biological centre of the fingerprint is re-located at the common reference point and rotating the image so that the biological axis of the fingerprint orientation coincides with the common reference axis. If the biological centre of the fingerprint in question is not present in the image, the off-image location of the biological centre may be estimated using a combination of extrapolation of ridges in the on-image portion of the fingerprint pattern and known patterns of fingerprints.
A periodic wave function, possibly sinusoidal, may be used as a model to simulate that part of the image in the region, wherein the said parameters are measured on said model image and/or on said real image. An estimation error may be computed, the estimation error being the difference between parameter measurements on the model and on the unmodelled image. If the estimation error in a particular region exceeds a predetermined threshold, then a further partitioning step may be applied to that region to create sub-regions of the region and the measuring step is applied within the sub-regions.
Images may be represented in the first coordinate system by vectors V corresponding to the measured parameter values of each region of each image, the coordinate system forming a vector space. The variance or visibility of the said representation distance may be enhanced by various techniques. In one embodiment of the invention, the measurement data is projected into a second coordinate system, wherein the representation distance between representations of two images in the second coordinate system is greater than the representation distance between representations of the two images in the first coordinate system. In another embodiment the variance or visibility may be enhanced by dimension reduction, such as Principal Component Analysis (PCA), wherein at least one dimension is eliminated from the first coordinate system. Eigenvalue/eigenvector decomposition is used on the covariance matrix C of the original set of patterns, V, to produce eigenvectors V', where the mean pattern vector M = E[V] and C = E[(V-M)(V-M)^T].
A variance score may be assigned to each enhanced system according to the representation distance between the representations, the variance score being indicative of dissimilarity between the representations.
In an embodiment of the invention a portion of the dimensions of the coordinate system are excluded from processing and only a non-excluded fraction K of all the dimensions are permitted to be processed, the excluded dimensions being those whose elimination causes a variance score below that of a predetermined second threshold.
Images may be categorised according to locations of the corresponding representations in the coordinate system. For all images of a particular category of image, a class template image may be determined, this being the mean pattern M_c for that class, c. Each of the images in the class may be partitioned into a number of regions, region size being based on distance from the core.
If the size of the sub-region is below that of a predetermined second threshold, then the region or sub-region is not partitioned and the transformation parameter is applied without further partitioning.
A transformation to apply to each region of an image may be determined, wherein representations of parameters of a candidate image are compared to representations of corresponding parameters in the same regions of the class template image M_c and the representation distance between the candidate image representations and the template image is determined to be the region transformation parameter.
A score may be assigned to each comparison of representations according to the degree of similarity between the representations in the regions of the candidate image and those in corresponding regions of the class template image. If the score equals or exceeds a predetermined third threshold, the transformation parameter is applied to the parameters in that region of the candidate image, transforming the representations by the transformation parameter. If the score is less than the predetermined third threshold, then a further partitioning step is applied to that region to create sub-regions of the region, the measuring step is applied within the sub-regions and the comparison step is repeated.
Further images may be acquired and the stored data and coordinate system may be updated accordingly. The locations and orientation of minutiae may be identified in each justified fingerprint image and stored. Data relating to sweat pores may be identified in each justified fingerprint image and stored, the data comprising at least location, shape and size of the sweat pore. The sweat pore and/or minutiae data may be projected into the coordinate system which is accordingly updated to include representations of these data.
Representations may be grouped into clusters by application of a clustering technique, which may be k-means clustering or an iterative clustering technique wherein the representations are clustered according to the representation distance between them and the relative spread of current clusters.
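The k-means clustering mentioned above may be sketched as follows. This is an illustrative, hedged example: the two-dimensional toy vectors stand in for the much higher-dimensional representation vectors of the invention, and the function names are assumptions.

```python
import random

# Hedged sketch: cluster representation vectors with a basic k-means loop,
# assigning each vector to its nearest cluster centre and re-computing
# centres as cluster means until the assignment stabilises.

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centres = rng.sample(points, k)          # initial centres: random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each representation to the nearest centre (squared distance)
            i = min(range(k),
                    key=lambda c: sum((a - b)**2 for a, b in zip(p, centres[c])))
            clusters[i].append(p)
        centres = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                   else centres[i]           # keep old centre if cluster empties
                   for i, cl in enumerate(clusters)]
    return centres, clusters

# Toy data: two well-separated groups of representation vectors.
points = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
centres, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # [2, 2]
```

The iterative variant described in the passage would additionally weigh the relative spread of the current clusters when deciding assignments.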
In one embodiment of the invention candidate images are compared against stored images in order to identify matches. This may further comprise the following steps: acquiring a candidate image; measuring and storing parameters of the candidate image; projecting measured values into the first coordinate system, which is updated accordingly; applying the categorisation, partitioning, transformation identification, scoring and transformation steps to the candidate image; identifying the locations and orientation of minutiae in the candidate image and projecting them into the first coordinate system; assigning a probability score to the candidate image, the probability score being the probability that the image will qualify into a predetermined class of images; classifying said candidate image into one or more classes of images according to its probability score in those classes; comparing representations of minutiae data of the candidate image to representations of minutiae data of the template image of the same class; identifying and storing sweat pore data in the candidate image, the data comprising at least location, shape and size of the sweat pore; comparing representations of sweat pore data of the candidate image to representations of sweat pore data of the template image of the same class; a matching score assigning step wherein a matching score is assigned to each comparison of representations according to the degree of similarity between the representations in the regions of the candidate image and those in corresponding regions of the template image of the same class; and declaring a match if the matching score is above a predetermined threshold. The probability assigning step may further comprise comparing the candidate image representations to the mean of the predetermined class, and assessing the probability that the image will qualify in that class taking into account that mean and the spread of the representations within the class.
When the assessment indicates that the candidate image does not qualify into the predetermined class, a non- match may be declared.
In a further main embodiment of the invention an apparatus for processing fingerprint images is provided comprising: means adapted to justify images by translation and/or rotation and partition images into a number of regions, and means for measuring in each region of each image at least one of the following parameters: the prevailing ridge orientation; the average ridge separation; the phase, and means for storing said measurement values, and means for projecting the said measured values for each region into a multidimensional first coordinate system and means for representing the images in said first coordinate system, the representation distance between representations of corresponding parameters of two images being indicative of the dissimilarity of the corresponding images. The apparatus may also comprise means to: identify the biological centre and the biological axis of the fingerprint in the image; set a common reference point and common reference axis; translate the image so that the biological centre of the fingerprint is re-located at the common reference point; rotate the image so that the biological axis of the fingerprint orientation coincides with the common reference axis.
The apparatus may be adapted to estimate the off-image location of the biological centre using a combination of extrapolation of ridges in the on-image portion of the fingerprint pattern and known patterns of fingerprints, if the biological centre of the fingerprint in question is not present in the image.
In one embodiment the apparatus may comprise modelling means for applying a periodic wave function model, such as a sinusoidal function, to simulate that part of the image in the region, wherein the said parameters are measured on said model image and/or on the unmodelled image.
The apparatus may determine an estimation error, the estimation error being the difference between parameter measurements on the model and on the unmodelled image.
The apparatus may comprise a second partitioning means wherein, if the estimation error in a particular region exceeds a predetermined first threshold, then the second partitioning means applies a second partitioning to that region to create sub-regions of the region and the measuring step is applied within the sub-regions.
In an embodiment of the invention the apparatus may advantageously comprise means for representing images in said first coordinate system by vectors V which correspond to the measured parameter values of each region of each image, the coordinate system forming a vector space.
The apparatus may also comprise means for enhancing the visibility of the said representation distance, which may be a means for projecting the measurement data into a second coordinate system, wherein the representation distance between representations of two images in the second coordinate system is greater than the representation distance between representations of the two images in the first coordinate system.
Alternatively the enhancement means may be a means for reducing the dimensions of the first coordinate system, wherein at least one dimension is eliminated from the first coordinate system, which may be a means for applying Principal Component Analysis (PCA) obtained by eigenvector/eigenvalue decomposition.
The apparatus may further comprise a variance score assignment means for, after each enhancement step, assigning a variance score to each enhanced system according to the representation distance between the representations, the variance score being indicative of dissimilarity between the representations.
Advantageously, the apparatus may include means for excluding a portion of the dimensions of the coordinate system from processing and only a non-excluded fraction K of all the dimensions are permitted to be processed, the excluded dimensions being those the elimination of which causes a variance score below that of a predetermined second threshold.
A further embodiment of the invention comprises means for categorising images according to locations of the corresponding representations in the coordinate system and may comprise, for all images of a particular category of image, means for determining a class template image, this being the mean pattern M_c for that class, c. The embodiment may comprise means for partitioning each of the images in the class into a number of regions, region size being based on distance from the core. The apparatus may be adapted not to partition the region or sub-region and to allow the transformation parameter to be applied without further partitioning if the size of the sub-region is below that of a predetermined second threshold. The embodiment may also comprise means for identifying a transformation, the means being adapted to compare representations of parameters of a candidate image to representations of corresponding parameters in the same regions of the class template image M_c and to determine whether the representation distance between the candidate image representations and the template image is to be the region transformation parameter.
There may also be a means for assigning a score to each comparison of representations, the score being assigned according to the degree of similarity between the representations in the regions of the candidate image and those in corresponding regions of the class template image.
The apparatus may be further adapted to apply the transformation parameter to the parameters in that region of the candidate image and transform the representations by the transformation parameter, if the score equals or exceeds a predetermined third threshold. It may also be further adapted to partition that region further to create sub-regions of the region, apply the measuring step within the sub-regions and repeat the comparison, if the score is less than the predetermined third threshold.
In another embodiment of the invention there is an image acquisition means for acquiring further images and updating the stored data and coordinate system accordingly. This may comprise a minutiae locating means for identifying and storing the locations and orientation of minutiae in each justified fingerprint image. It may also comprise a sweat pore locating means for identifying and storing data relating to sweat pores in each justified fingerprint image, the data comprising at least location, shape and size of the sweat pore. The apparatus may include means for projecting sweat pore and/or minutiae data into the coordinate system, which is accordingly updated to include representations of these data.
In another aspect of the invention there is also a clustering means for grouping representations into clusters by application of a clustering technique, which may be k-means clustering. Alternatively the technique may be an iterative clustering technique wherein the representations are clustered according to the representation distance between them and the relative spread of current clusters.
An embodiment of the invention further comprises matching means adapted to compare candidate images against stored images in order to identify matches. This may be adapted to: acquire a candidate image, measure and store parameters of the candidate image, and project measured values into the first coordinate system, which is updated accordingly. This matching means may comprise means for: applying the categorisation, partitioning, transformation identification, scoring and transformation steps to the candidate image; identifying the locations and orientation of minutiae in the candidate image and projecting them into the first coordinate system; assigning a probability score to the candidate image, the probability score being the probability that the image will qualify into a predetermined class of images; classifying said candidate image into one or more classes of images according to its probability score in those classes; comparing representations of minutiae data of the candidate image to representations of minutiae data of the template image of the same class; identifying and storing sweat pore data in the candidate image, the data comprising at least location, shape and size of the sweat pore; comparing representations of sweat pore data of the candidate image to representations of sweat pore data of the template image of the same class; a matching score assigning step wherein a matching score is assigned to each comparison of representations according to the degree of similarity between the representations in the regions of the candidate image and those in corresponding regions of the template image of the same class; and declaring a match if the matching score is above a predetermined threshold. In this embodiment the probability assigning means may further comprise means for comparing the candidate image representations to the mean of the predetermined class, and assessing the probability that the image will qualify in that class taking into account that mean and the spread of the representations within the class. The assessment means may comprise means for declaring a non-match when the candidate image does not qualify into the predetermined class, according to the probability assessment.
In a further main embodiment a computer program product comprises a readable medium containing instructions for implementing the method herein described.
More specifically, the construction of a canonical frame involves the step of: dividing the input image into a set of blocks (regions); using a parametric modelling technique to model the feature of interest within the block, specifically for fingerprint images, it is to model the ridge direction and separation within the block; identifying the intrinsic centre and orientation of the image; aligning the direction and separation pattern according to the intrinsic centre and orientation of the impression; reducing the dimensionality of the direction and separation pattern by transforming them into a new co-ordinate system; and projecting the reduced vector onto this co-ordinate system.
Another aspect of the invention provides a method that extracts the sweat pore information. The method includes the steps of: identifying the possible sweat pore locations; modelling the local intensity information around the sweat pore candidates; and removing the spurious sweat pores by combining the local intensity information and its relative distance and orientation from the adjacent ridges.
Another aspect of the invention provides a method that removes elastic deformation between two images. The method includes the steps of: dividing the images into a set of local regions; estimating the transformation parameter between the data (query image) and the target (template image); applying the transformation parameter to each region in the query image and obtaining an alignment error; if the error is sufficiently large, which suggests that there is still an elastic deformation within the region, subdividing the region into a set of smaller regions and repeating the estimation for each smaller region (the estimation process will not stop until the error for each region is sufficiently low); and applying the final transformation parameters to the corresponding regions in the query image, thereby transforming it into the template frame.
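The coarse-to-fine subdivision loop described above may be sketched as follows. This is an illustrative, hedged example: the transform estimation and error functions are stubbed out with toy stand-ins, and all names, thresholds and region sizes are assumptions.

```python
# Illustrative sketch of the elastic-deformation removal loop: estimate a
# transform per region and subdivide any region whose alignment error
# remains above a threshold, recursing until every piece aligns well.

def align_region(region, estimate, error, min_size=8, max_error=1.0):
    """Recursively refine until each (sub)region's error is sufficiently low."""
    params = estimate(region)
    if error(region, params) <= max_error or region['size'] <= min_size:
        return [(region, params)]          # accept this region's transform
    results = []
    for sub in subdivide(region):          # quadtree-style split into four
        results.extend(align_region(sub, estimate, error, min_size, max_error))
    return results

def subdivide(region):
    half = region['size'] // 2
    x, y = region['x'], region['y']
    return [{'x': x + dx, 'y': y + dy, 'size': half}
            for dx in (0, half) for dy in (0, half)]

# Toy stand-ins: the error is high only for the top-level region, so the
# loop performs exactly one split.
estimate = lambda r: (0.0, 0.0)                  # dummy (dx, dy) translation
error = lambda r, p: 2.0 if r['size'] > 16 else 0.5

pieces = align_region({'x': 0, 'y': 0, 'size': 32}, estimate, error)
print(len(pieces))  # 4: four sub-regions, each below the error threshold
```

In the method itself, `estimate` would fit a local transformation between query and template regions and `error` would measure the residual misalignment after applying it.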
According to another aspect of the present invention, a system of enrolling fingerprint images is provided that includes the steps of: acquiring a fingerprint image; modelling the ridge structures; projecting the model parameters onto the canonical frame; extracting a minutiae set from the image; extracting the sweat pore information; projecting the minutiae information onto the canonical frame; constructing and storing the template for future use in the matching procedure; and classifying the templates based on their distance in the canonical feature space.
A further aspect of the present invention provides a method that identifies the query fingerprints from one or more stored templates. The method includes the steps of: acquiring the query fingerprint images; modelling the ridge structures; identifying the intrinsic centre and orientation; projecting the model parameters onto the canonical feature space; calculating the probability that the query image belongs to a template class; estimating the elastic deformation between the query print and the mean of each template class when the probability is high enough; applying the global and local deformation to the query fingerprint and normalising it to the mean of the corresponding class; extracting the normalised minutiae information; determining a minutiae matching score by comparing the normalised minutiae information and the information stored in each template within the class; generating an overall score based on a combination of the probability and minutiae matching information; and making a matching decision by comparing the overall score with a predefined value.
Brief Description of the Drawings
The above and other aspects, features and advantages will be understood more fully from the following detailed description of the preferred embodiments of the invention, which, however, should not be taken to restrict the invention, but is for explanation purposes only.
Fig. 1 is a diagram showing a digitized example fingerprint image illustrating an intrinsic centre (core), orientation, ridges, minutiae and sweat pores.
Fig. 2 is a flowchart showing the prior art steps of a typical fingerprint matching system.
Fig. 3 is a flowchart illustrating the prior art steps of a typical feature extraction method.
Fig. 4 is a diagram showing one embodiment of tessellating a fingerprint image using a multi-resolution method.
Fig. 5 is a flowchart showing the steps of one embodiment of constructing a canonical frame using the method of the present invention.
Fig. 6 is a flowchart showing the steps of one embodiment of extracting and modelling sweat pore information using the method of the present invention.
Fig. 7 is a flowchart showing the steps of one embodiment of a nonlinear alignment process using the method of the present invention.
Fig. 8 is a flowchart showing the steps of one embodiment of extracting and modelling sweat pore information using the method of the present invention.
Fig. 9 is a flowchart showing the steps of one embodiment of identifying the query fingerprint image against one or more stored templates using the method of the present invention.
Detailed description of the invention
The present invention will be more clearly understood from the detailed description and figures of preferred embodiments given below.
In the following description, well-known functions and operators are not described in great detail to avoid obscuring the invention with unnecessary detail.
In the following section we shall explain a method and apparatus for processing fingerprint images where templates may be converted to points in a feature space that is invariant to presentation of the prints: the space is referred to as a canonical representation, meaning that the fingerprint images are located in a standardised space. Furthermore, the feature space representation may be compact and robust to noise in the acquired images (such as scratches). We also describe a non-linear extension to the space, which allows us to handle elastic deformation of the presented prints. Subsequent passages demonstrate the use of the canonical feature space in complete image processing methods.
An important advantage of the invention is that it may be applied exclusively to the model of the ridge structures (their local directions and spacing). Second level ridge features (minutiae) and third level fingerprint features (sweat pores) are not essential to convert the fingerprint into canonical form. This canonical representation is independent of the 2nd and 3rd level features. Thus it avoids the dilemma of having to establish the location and correspondence of such features between the test and template, while simultaneously estimating the alignment.
Advantageously, in one particular embodiment, the feature space may also be readily partitioned into prints belonging to the first level (pattern types such as arch, whorl etc), thus reducing the computational complexity of any 1:N pattern search.
Construction of the canonical feature space: According to an embodiment of the present invention, Fig. 5 is a flowchart showing the steps of constructing the canonical framework. Prior art studies show that there is a limited number of topological configurations of the ridge pattern, such as left loop, right loop, whorl, arch and tented arch. The formation of the feature space can be done in an offline mode, i.e. processing a collection of pre-stored fingerprint images to construct the initial feature space, or in an online mode where the construction of the feature space proceeds incrementally when enrolling and matching the prints in real time. In both cases, the process begins by using a parametric model to represent the ridge patterns.
According to one embodiment of the current invention, after a fingerprint image is acquired from a scanning device or a stored template, a hierarchical tessellation technique is used to divide the image into a set of blocks (step 502), and a parametric model is then used to model the ridge segments for each block (step 503). The blocks may be square, although any convenient partition of the image may be chosen.
For each block of the image, an estimate of characteristic parameters of the periodic ridge pattern within the block is derived by fitting the pattern to a suitable modelling function. In a preferred embodiment, a sinusoidal model is adopted to represent the local ridge features, although other periodic smooth wave functions can be used as the mathematical model. By fitting the model to the image an estimate of the parameters of the pattern can be derived from corresponding measurements on the model. The orientation and frequency of the model can be estimated by many alternative techniques (step 504). Thus, in a preferred embodiment, each block is transformed to the frequency domain and the orientation and frequency of the signal can be estimated by locating the peak of the magnitude spectrum. The ridge pattern within the block may then be synthesized using the sinusoidal model with the estimated parameters of frequency and orientation. In a preferred embodiment, the phase of the ridge segments is also estimated, by calculating an inner product between the data and the synthesized model.
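The frequency-domain estimation in step 504 may be sketched as follows. This is a hedged illustration: the block size, the synthetic ridge pattern and the function names are assumptions, not part of the specification.

```python
import numpy as np

# Hedged sketch: estimate local ridge orientation and frequency from the
# peak of a block's magnitude spectrum, as described for step 504.

def estimate_ridge_params(block):
    n = block.shape[0]
    # remove the DC component, then locate the dominant spectral peak
    spectrum = np.fft.fftshift(np.fft.fft2(block - block.mean()))
    mag = np.abs(spectrum)
    cy, cx = np.unravel_index(np.argmax(mag), mag.shape)
    fy, fx = cy - n // 2, cx - n // 2      # peak offset from the DC bin
    frequency = np.hypot(fx, fy) / n       # cycles per pixel
    orientation = np.arctan2(fy, fx)       # direction normal to the ridges
    return orientation, frequency

# Synthesize a toy block of vertical ridges: 4 cycles across 32 pixels.
n = 32
x = np.arange(n)
block = np.cos(2 * np.pi * 4 * x / n)[None, :].repeat(n, axis=0)

orientation, frequency = estimate_ridge_params(block)
print(round(frequency * n))  # 4 cycles across the block, as synthesized
```

A subsequent synthesis step would regenerate the block from the sinusoidal model with these parameters, and the phase could be recovered from an inner product between the data and the synthesized model, as the passage states.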
In one embodiment of the invention an estimation error can also be calculated by comparing the synthesized model and the real data. The error may vary from region to region, as a result of non-uniform effects, such as displacement, rotation, partial overlap, elastic deformation, variable pressure, varying skin condition, lighting effects, noise and feature extraction errors, as indicated earlier. The error may be estimated for each region: in regions where the error is higher than a predetermined level, this may be considered as an indicator that the data in the region is too complex to be estimated by the current model. Where this occurs the region can be sub-divided into a set of smaller regions. Step 504 may be repeated until all the regions of the image are modelled and the modelling error is lower than a pre-determined level in each region.
Steps 505 and 506 are centring and alignment steps: these aim to re-centre the biological centre of the fingerprint with the image's geometric centre and to re-align the fingerprint orientation with a general axis. Figure 5 illustrates these as occurring after Step 504, but they may also take place before Step 504. A fingerprint core (intrinsic centre) is an area located within the innermost ridges. Normally it is located in the middle of the fingerprint; however, depending on the scanned area, it might not be positioned in the middle of the image, or indeed might not even be present in the image. When the core is present in the image, it can be detected by many alternative techniques. In a preferred embodiment of our invention, a set of circular symmetric functions is used to determine the location and orientation of the core (step 505). One embodiment is to convolve the spacing/direction values of the local regions, represented by a 2D array of vector values (s, d), with a 19 x 19 vector-valued filter kernel which is circularly symmetric and has values (i, j), -9 <= i, j <= 9.
The output of the convolution is taken as the sum of the absolute values of the dot products of the (S, D) image and the kernel (I, J). The image position of the maximum value is taken as the nominal intrinsic centre. The principal orientation of the print is estimated by the modal value of a histogram of the directions (D) in a circular region of radius 96 pixels around the centre. The centre (x, y) of the core and the principal direction P are stored for the print.
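The core-detection convolution described above may be sketched as follows. This is a hedged illustration: the radial form of the kernel, the toy direction field and the function names are assumptions introduced for the example; only the sum-of-absolute-dot-products response and the 19 x 19 kernel size come from the passage.

```python
import numpy as np

# Hedged sketch: slide a circularly symmetric vector-valued kernel over a
# field of local ridge-direction unit vectors; the response at each position
# is the sum of |dot products|, and its maximum marks the nominal core.

def make_radial_kernel(size=19):
    c = size // 2
    ky, kx = np.mgrid[-c:c + 1, -c:c + 1].astype(float)
    r = np.hypot(kx, ky)
    r[c, c] = 1.0                       # avoid divide-by-zero at the centre
    return kx / r, ky / r               # unit vectors pointing radially

def find_core(dir_x, dir_y, size=19):
    kx, ky = make_radial_kernel(size)
    c = size // 2
    h, w = dir_x.shape
    best, best_pos = -1.0, (0, 0)
    for y in range(c, h - c):
        for x in range(c, w - c):
            wx = dir_x[y - c:y + c + 1, x - c:x + c + 1]
            wy = dir_y[y - c:y + c + 1, x - c:x + c + 1]
            score = np.abs(wx * kx + wy * ky).sum()   # sum of |dot products|
            if score > best:
                best, best_pos = score, (x, y)
    return best_pos

# Toy direction field: radial unit vectors centred at (20, 20) on a 41x41 grid.
h = w = 41
yy, xx = np.mgrid[0:h, 0:w].astype(float)
r = np.hypot(xx - 20, yy - 20)
r[20, 20] = 1.0
core = find_core((xx - 20) / r, (yy - 20) / r)
print(core)  # (20, 20): the response peaks where the field matches the kernel
```

In the method itself the field would be the estimated (spacing, direction) values of the local regions rather than this synthetic radial pattern.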
When the core is not present, its position can only be estimated by convolution (step 505) with respect to a representative template prototype of each partition (step 509). This prototype pattern is the mean of a population of learnt templates.
The output is therefore one or more core locations (x, y) and principal directions P each of which are subsequently characterized (step 510).
Once the orientation and location of the intrinsic centre (core) are located, the collection of estimated ridge directions and separations are shifted and rotated with respect to a common origin (core) and principal orientation (step 506).
In a typical fingerprint image, there are around 3000 regions, although this may vary greatly depending on the scanning device and the fingerprint size. The parameters measured in each region may be represented by a three dimensional vector corresponding to the derived estimates of ridge orientation, separation and phase from the mathematical modelling of the ridge pattern in Steps 503 and 504. The total dimensions of the pattern vector for all regions of each complete fingerprint may be of the order of hundreds or thousands, each fingerprint being represented by a collection of regional vectors. By duplicating this mapping process for each vector group of each fingerprint, the entire set of stored fingerprints may be regenerated in a corresponding vector space within a multi-dimensional coordinate system.
The matching process (not part of Figure 5), whether verification or identification, is essentially a comparison between two fingerprints (see earlier). In the context of a vector space, the comparison between prints requires a distance measure between the collections of vectors for each print. This involves calculations in a vector space of very high dimension, which requires considerable computing capacity and may be inefficient and expensive.
An object of the invention is that it advantageously lowers the computation requirement to within normal computing capacity. In an aspect of the invention the dimensionality of the feature space may be reduced by various techniques (step 507): the collection of vectors is projected into a new co-ordinate system such that the greatest variance by any projection of the data comes to lie on the first axis (called the first principal component), the second greatest variance on the second axis, and so on.
In an aspect of the invention a technique known as Principal Component Analysis (PCA) is applied to the fingerprint vector space to reduce the dimensions to a manageable level and thereby to facilitate analysis of the fingerprint vectors therein. To describe the implementation in more detail, the aligned pattern feature vectors are averaged to produce a mean pattern vector, M = E[V]. The covariance of the pattern vectors, C = E[(V - M)(V - M)^T], which is the expected value of the outer products of the differences of the patterns from the mean, V - M, is calculated, ^T being the transpose of the vector or matrix and E[] the expectation operator. Then a new set of principal feature directions, V', is obtained by PCA.
The set of eigenvalues of the eigenvectors produced by the PCA, E, then form the basis set of the canonical feature space.
The feature vectors V' may be made compact by only taking a subset which encapsulates some proportion, P, of the variation; e.g. P can typically be set to be 95% of the total variation. V' = (E1, E2, E3, E4, ..., EK)(a1, a2, a3, a4, ..., aK)^T, where a1..aK are a set of scalars, and E1..EK are the unit length eigenvectors of the covariance matrix C (^T is the transpose as before). K is selected in step 507 such that the total variation is less than some percentage, e.g. 95%. The scalars a1..aK are calculated in the standard way by measuring the projection of V on the said eigenvectors, i.e. aj = V^T Ej, for the jth scalar and jth eigenvector.
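The PCA compression described above may be sketched, for illustration only, as follows; the use of `numpy.linalg.eigh` and the cumulative-variance selection of K are assumptions of this sketch:

```python
import numpy as np

def pca_compress(patterns, keep=0.95):
    """Sketch of step 507: project pattern vectors onto principal axes.

    `patterns` is an (n, d) array of aligned feature vectors V.  The mean
    M = E[V] and covariance C = E[(V-M)(V-M)^T] are formed, the K
    eigenvectors capturing `keep` of the total variance are retained, and
    each vector is then described by K scalars a_j = (V - M)^T E_j.
    """
    M = patterns.mean(axis=0)
    centred = patterns - M
    C = np.cov(centred, rowvar=False)
    evals, evecs = np.linalg.eigh(C)           # returned in ascending order
    order = np.argsort(evals)[::-1]            # sort descending
    evals, evecs = evals[order], evecs[:, order]
    cum = np.cumsum(evals) / evals.sum()
    K = int(np.searchsorted(cum, keep)) + 1    # smallest K reaching `keep`
    coeffs = centred @ evecs[:, :K]            # the scalars a_1..a_K
    return M, evecs[:, :K], coeffs
```

With `keep=1.0` the projection is lossless, so `M + coeffs @ E.T` reconstructs the original pattern vectors exactly.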
In another embodiment of the method, the unsupervised mode, the system may use the successful matches to learn and update the parameters of the stored template (its mean and covariance in the canonical feature space). The more data that is learnt by the system, the better it will be at distinguishing prints: the inter- to intra-class variability can be better maximized using a classifier such as Linear Discriminant Analysis (LDA) or a non-linear kernel learning method, such as kernel LDA.
The aligned vectors, which represent each fingerprint image, can thus be projected onto a vector space V and thereafter, using the above dimension reduction techniques or other alternative techniques, onto a common feature space V', which is invariant to the presentation of the print (step 508).
After processing all the images either in an offline mode or online mode, each image will be presented as a point in the reduced feature space. Depending on the dimension reduced, different variations between points may become apparent, thereby enhancing or suppressing similarities between points.
An advantage of the invention is that by projecting values of fingerprint parameters, measured region by region, into a fingerprint space containing, for example, vectors representing those parameters, the data can be processed flexibly. Enhancing the data by the methods indicated above, such as dimensionality reduction, allows the user to bring out or diminish similarities between fingerprints in a way vastly more convenient than any prior art techniques.
By grouping points according to their location in any particular reduced dimension space, proximity between points may be utilised as a proxy for parameter similarity. Many clustering techniques can then be applied to partition the space of prints (step 509). One method is k-means clustering, which is a two-step procedure: each template is first associated with, or labelled by, the closest prototype of an initial set of M cluster prototypes; the locations of these prototypes are then updated according to the current labelling. Another iterative method is to use both the distance between templates and the relative spread of the current set of clusters. In another method, the clusters are discovered by hierarchical grouping into larger and larger clusters. In some methods, the number of clusters M may be an input to the algorithm, or it may be discovered, as is the case with certain hierarchical agglomerative clustering methods.
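The two-step k-means procedure described above can be sketched as follows; the random initialisation from the data points and the fixed iteration count are illustrative choices:

```python
import numpy as np

def kmeans(points, M=3, iters=20, seed=0):
    """Sketch of the two-step k-means partitioning (step 509): label each
    template with its closest prototype, then move each prototype to the
    mean of the templates currently labelled with it."""
    rng = np.random.default_rng(seed)
    # Initialise the M prototypes from randomly chosen templates.
    protos = points[rng.choice(len(points), M, replace=False)]
    for _ in range(iters):
        # Step 1: label each point with its nearest prototype.
        dists = np.linalg.norm(points[:, None] - protos[None], axis=2)
        labels = dists.argmin(axis=1)
        # Step 2: move each prototype to the mean of its labelling.
        for m in range(M):
            if np.any(labels == m):
                protos[m] = points[labels == m].mean(axis=0)
    return labels, protos
```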
As will be appreciated by the reader, the invention offers the advantage of managing a large and highly complex data set. By projecting the data associated with regions of the fingerprint into a canonical feature space, i.e. a representative multidimensional coordinate system, the data can be manipulated more conveniently and dissimilarities between images brought out more easily. The effects of noise and scratches may be eliminated relatively easily in the canonical feature space.
A further advantage of the invention as proposed is the non-reliance on minutiae and sweat pores, which, in many prior art systems, are essential for achieving any degree of accuracy in measuring or matching fingerprints. The invention disclosed may indeed be combined with data related to sweat pores and minutiae, but in its simplest form is independent of these.
These components (steps) of constructing a canonical feature space could be implemented using alternative techniques. These alternative combinations are within the contemplation of the inventors.
Detecting the sweat pore information

The present invention relates to feature extraction methods for fingerprint images.
More specifically, the present invention relates to methods of extracting third level feature information, such as sweat pores, and using that information alone or along with minutiae information for enrolment, matching and storage of fingerprint templates. The invention may also be used in combination with various methods for detecting whether the sample is taken directly from a human finger or whether the print is taken from an artificially produced or forged fingerprint, using for example latex "fingerprints" applied over fingertips. Such detection methods are not part of the invention and are not described further. According to an embodiment of one aspect of the present invention, Fig 6 is a flowchart showing the steps of detecting the sweat pore information. Using the steps previously proposed (steps 502, 503, 504), ridge patterns can be explicitly represented by some periodic mathematical model. The ridge information can thus be removed from the original image by subtracting the reconstruction of the ridge pattern from the original data (step 601). The residual information will contain sweat pores and other background feature noise. The generic profile of sweat pores is believed to be a round (blob) shaped type of feature, generally having higher intensity values than the background; however, their size and shape can vary and sometimes the boundary can be highly irregular. A consistent and robust identification of sweat pore features that measures not only the location but also their size and shape is therefore challenging.
In one preferred embodiment of the present invention, a 2D Hermite polynomial is used to model the sweat pore features. The candidate or putative region may be first identified by locating the pixels with high intensity values; a window is placed around those pixels, which are then labelled as a candidate or putative region (step 602). In the next step of the embodiment, a parametric model is used to represent the putative pore in each region (step 603). In one embodiment of the current invention, the original data within the region is first modelled by a Gaussian intensity profile, where the parameters, i.e. the mean and covariance of the intensity model, can be estimated using an iterative maximum likelihood method. An alternative embodiment is to use a Minimum Mean Square Error technique to estimate the parameters. Both techniques are well known to those skilled in the art. The Gaussian profile can very accurately model round-shaped features; however, it is insufficient for features with irregular shapes. The Hermite polynomial is used to increase the flexibility of the model and therefore improve the modelling accuracy. In a preferred embodiment, only the first few Hermite coefficients are used to represent sweat pore features.
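As an illustration of the Gaussian intensity profile of step 603, the sketch below estimates the mean and covariance by intensity-weighted moments, a simple stand-in for the iterative maximum likelihood estimate described above (the moment method is an assumption of this sketch, not a disclosed detail):

```python
import numpy as np

def fit_pore_gaussian(patch):
    """Sketch of step 603: model a putative pore region by a Gaussian
    intensity profile.  The mean and covariance of the profile are
    estimated here from intensity-weighted first and second moments of
    the patch."""
    patch = patch.astype(float)
    w = patch / patch.sum()                      # normalise to weights
    yy, xx = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    mean = np.array([(w * yy).sum(), (w * xx).sum()])
    dy, dx = yy - mean[0], xx - mean[1]
    cov = np.array([[(w * dy * dy).sum(), (w * dy * dx).sum()],
                    [(w * dy * dx).sum(), (w * dx * dx).sum()]])
    return mean, cov
```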
Whatever model is applied, it is applied in at least one dimension or direction; normally, however, all images will be modelled in two dimensions.
Ideally, we could use all the features identified and modelled at step 603 for matching fingerprints; however, many other background features, including scanning noise and dirt or oil on either the skin or the scanner surface, will also produce pore-like features. To improve the consistency of pore detection, a filtering step (step 604) is proposed in an embodiment of the present invention.
A filtering of the putative pores based on their shape can be applied. In a preferred embodiment, the covariance of the Hermite polynomial parameters estimated in the pore modelling step 603 is analysed using Principal Component Analysis.
The principal modes that encapsulate some proportion of the total variation, e.g. 90% or 95%, are used to filter out pores that have projections of their Hermite polynomial coefficients that lie outside the chosen region of variation in the feature (shape) space calculated by the PCA. A user-defined threshold can be used to control the strictness of the shape filtering which results. This particular embodiment is a linear method of filtering unlikely shape differences, but the invention does not exclude the use of non-linear shape modelling techniques such as kernel PCA.
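The shape filter of step 604 may be sketched as follows for illustration; the per-mode standard-deviation bound used as the user-defined strictness threshold is an assumption of this sketch:

```python
import numpy as np

def shape_filter(coeff_vectors, keep=0.95, thresh=4.0):
    """Sketch of step 604's shape filter: PCA on the pore-model coefficient
    vectors, then rejection of pores whose projections fall outside the
    chosen region of variation (here a bound of `thresh` standard
    deviations per retained principal mode; `thresh` plays the role of the
    user-defined strictness parameter)."""
    X = np.asarray(coeff_vectors, float)
    M = X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(X - M, rowvar=False))
    order = np.argsort(evals)[::-1]
    evals, evecs = evals[order], evecs[:, order]
    # Keep the principal modes covering `keep` of the total variation.
    K = int(np.searchsorted(np.cumsum(evals) / evals.sum(), keep)) + 1
    proj = (X - M) @ evecs[:, :K]
    sd = np.sqrt(np.maximum(evals[:K], 1e-12))
    # True = plausible pore shape, False = filtered out.
    return np.all(np.abs(proj) <= thresh * sd, axis=1)
```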
In a secondary filtering step, the locations of the pores are considered in relation to the ridge patterns of the impression (step 503) and the distance between sweat pores that lie on the same ridge. Firstly, since pores are always located on the ridges, only putative pores that overlap with the ridges are considered. In the preferred embodiment, those putative pores whose nominal area significantly overlaps a ridge are passed on to the next filter; those that do not are removed.
The second location filter considers those pores that lie more or less on the centre line of the ridge (along the direction of the ridge). A small variability in position perpendicular to the ridge direction is allowed, in proportion to the nominal ridge spacing. This is a predefined parameter. The next location filter considers the relative frequency of the pores along the entire ridge: a harmonic expansion of the pore locations along the ridge (along the arc length of the ridge centre line) is used to determine the fundamental periodic frequency. Pore locations at higher harmonic frequencies are deleted. The fundamental frequency can be learned from similar, high quality impressions, or learnt from the current impression. In the preferred embodiment, a Fourier analysis of the 1D signal of putative pore positions along the arc length can be used to perform the harmonic analysis.
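The Fourier analysis of the 1D pore-position signal can be illustrated by the sketch below, which estimates the fundamental pore frequency along a ridge; the fixed signal length and impulse representation are assumptions of this sketch:

```python
import numpy as np

def fundamental_pore_frequency(arc_positions, arc_length, n=256):
    """Sketch of the harmonic location filter: a 1D impulse signal of
    putative pore positions along the ridge arc length is Fourier
    analysed and the strongest non-DC component is taken as the
    fundamental pore frequency (pores per unit arc length)."""
    signal = np.zeros(n)
    # Mark each putative pore as an impulse on the resampled arc.
    idx = np.clip((np.asarray(arc_positions) / arc_length * n).astype(int),
                  0, n - 1)
    signal[idx] = 1.0
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    k = 1 + int(np.argmax(spectrum[1:]))     # skip the DC bin
    return k / arc_length
```

Pores whose positions align only with higher harmonics of this fundamental frequency would then be candidates for deletion.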
The intensity profiles of sweat pores are sensitive to noise and pressure. It is possible that some sweat pores have very low visibility or are even invisible due to the low pressure applied when the fingerprint image is acquired. Using the estimated sweat pore frequency along the ridge and the shape and size variation of the neighbouring sweat pores, missing pores can also be recovered (step 606).
In one embodiment, the locality around each visible pore from good quality enrolled data is associated with an estimate of the pore separations along each ridge by the harmonic analysis. These separations can be interpolated to produce a per-pixel pore-ridge separation map. Then, from any corresponding visible pore location after alignment, the putative locations of neighbouring pores can be inferred. The inference can be made by associating with a putative location a prior probability from any visible pore on the test fingerprint and then combining it with a likelihood, also expressed as a probability, given the intensity and morphology of any adjacent putative pore location being tested.
In the preferred embodiment, these pore shape and pore location filters may be applied in this order, although other combinations of ordering can be envisaged. In the preferred embodiment, both shape and location filters can be used.
As mentioned above, due to the high density of sweat pore features, reliably matching sweat pores between template and query images is extremely challenging. Before the matching procedure can be carried out, the correspondence between sweat pores has to be established. One aspect of this invention is to construct a canonical framework and remove the linear and non-linear deformation independently of minutiae or sweat pore features. The alignment method will be described in detail in the following section. Once the geometric variation has been removed by the alignment process, the correspondence can be established, with the ridge information, by comparing the locations of the sweat pores in the query and template images that lie on the same ridge. In prior art systems (WO 99/06942, WO 2005/022446), the decision process of matching sweat pores is over-simplified (Figure 10). After establishing the correspondence, those systems simply calculate the distance between corresponding sweat pores and, if the distance is lower than a predetermined value, the pair of corresponding sweat pores is considered as being from the same fingerprint and therefore contributes to the score. Otherwise, they are classed as non-matching and will not contribute to the score. Although it has been shown that using the sweat pore matching score can reduce the identification error, especially when combined with minutiae information, prior art systems using sweat pores are based on a simplified model in which the sweat pores are represented as point locations. However, the appearance of sweat pores and their contrast are highly related to the pressure applied when the fingerprint is scanned. Some of the sweat pores might not be visible due to light pressure, which will cause false negative errors.
On the other hand, due to the high density of those features, even when comparing sweat pores from different fingerprints the chance that one of them is co-located or very nearby is relatively high; therefore a simple decision based on the distance measure alone is sub-optimal and prone to high false positive errors.
In one preferred embodiment of the present invention (figure 11), we use not only the location of the sweat pores but also their shape and size. Furthermore, the local variation of the shape, size and frequency of sweat pores lying on the same ridge is also used to recover sweat pores missing due to noise, low contrast and pressure variation. After establishing the sweat pore correspondence between the template (T) and query image (Q) (step 1101), a distance between corresponding pores is calculated (step 1102). If the distance is greater than a predefined threshold, the pair is considered a non-match and does not contribute to the correlation score (step 1108). If the distance is not greater than the predefined threshold, an error measure is calculated using the estimated Gaussian and Hermite parameters of the corresponding sweat pores, which represent their size and shape (step 1104). If the error is less than a predefined threshold, the corresponding sweat pores are considered a match and will contribute to the score (step 1109). If the error is greater than the threshold, neighbouring sweat pores that lie on the same ridge on either side of the corresponding pair are compared by the same flow (steps 1101-1109) (figure 11). If both neighbouring corresponding sweat pores are considered a match and their local variation in size and shape also supports the match hypothesis, the sweat pore is considered a missing sweat pore from the query or template image; a new correspondence is therefore established, the pair is regarded as being from the same finger and it contributes to the score (step 1109); otherwise it is considered a non-match and does not contribute to the score (step 1108). The score is calculated as S = Number of (matched pores) / Number of (corresponding pores). In another embodiment of the present invention, the distance, shape and size are all considered at the same time and a probability of whether the corresponding sweat pores are from the same finger is calculated.
A final score can be derived by a combination of the probabilities of each corresponding sweat pore pair.
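The per-pair thresholded decision flow of figure 11 can be sketched, for illustration only, as follows; the threshold values are illustrative assumptions:

```python
def match_pore_pair(dist, shape_err, d_thresh=6.0, e_thresh=0.5):
    """Sketch of the per-pair decision (steps 1102-1109): a corresponding
    pore pair contributes to the score only if its distance is within
    threshold and its Gaussian/Hermite shape-and-size error is small
    enough.  The two thresholds stand in for the predefined values."""
    if dist > d_thresh:
        return False              # non-match (step 1108)
    return shape_err <= e_thresh  # match (step 1109) or non-match

def pore_score(pairs):
    """S = Number of (matched pores) / Number of (corresponding pores)."""
    matches = [match_pore_pair(d, e) for d, e in pairs]
    return sum(matches) / len(pairs)
```

For example, four corresponding pairs of which two pass both tests yield a score S = 0.5.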
The probability of a match between a pore at location (x, y) in the query image (Q) and the corresponding pore in the template image (T) is P_m, which may be computed as follows: P_m = P(location|Q,T) P(shape|Q,T); i.e. a product of the location posterior P(location|Q,T) and the shape posterior P(shape|Q,T).
The following passage considers each of the constituent parts of P_m: 1. Firstly, the location posterior P(location|Q,T), which may be computed as follows: P(location|Q,T) = Prior(location) L(location|Q,T), i.e. a product of two sub-constituents. 1.1. The first of these sub-constituents is the prior, Prior(location), which may be expressed as follows: Prior(location) = P(location|RS(T)) P(location|m_w(T), s_w(T)) P(location|RS(Q)) P(location|m_w(Q), s_w(Q)), where: - P(location|RS(T)) is a location prior, where RS((x,y), T) is the parameter of the pore separation along the ridge for the template image T. - P(location|m_w(T), s_w(T)) is the likelihood of the variation of the position perpendicular to the ridge direction, where m_w(T) and s_w(T) are the mean and standard deviation of the pixel variation across the ridge width in image T.
- P(location|RS(Q)) is a location prior, where RS((x,y), Q) is the parameter of the pore separation along the ridge for the query image Q.
- P(location|m_w(Q), s_w(Q)) is the likelihood of the variation of the position perpendicular to the ridge direction, where m_w(Q) and s_w(Q) are the mean and standard deviation of the pixel variation (i.e. expressed as a number of pixels) across the ridge width in image Q. 1.2. The second sub-constituent is the likelihood function L, which may be expressed as: L(location|Q,T) = P(I_Q - I_T | m_I, s_I)
which expresses the likelihood, from the intensity profiles in Q and T, that the pore intensities I_Q and I_T are sufficiently similar given the estimated mean and standard deviation of the intensity variation across the template and query images.
2. Secondly, the posterior on the shape is similarly expressed as the difference in shape variation between the query and template pores given estimates of the likely variation: P(S_Q - S_T | H_T, H_Q), where H_T and H_Q are the parameters of the Gaussian/Hermite modelling of pore shapes from images T and Q.
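The product P_m = P(location|Q,T) P(shape|Q,T) may be illustrated by the sketch below. The use of unnormalised Gaussian densities for the likelihood terms is an assumption of this sketch; as noted below, the exact forms of the priors and likelihoods are left open:

```python
import math

def gauss(x, mean, sd):
    """Unnormalised Gaussian likelihood used as a stand-in density."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2)

def pore_match_probability(loc_prior_T, loc_prior_Q, dI, m_I, s_I, dS, sS):
    """Sketch of P_m = P(location|Q,T) * P(shape|Q,T).

    `loc_prior_T` and `loc_prior_Q` stand for the ridge-separation and
    across-ridge priors for T and Q; `dI = I_Q - I_T` is the intensity
    difference with estimated statistics (m_I, s_I); `dS = S_Q - S_T` is
    the shape difference with an assumed spread sS."""
    p_location = loc_prior_T * loc_prior_Q * gauss(dI, m_I, s_I)
    p_shape = gauss(dS, 0.0, sS)
    return p_location * p_shape
```

Identical corresponding pores with unit priors give P_m = 1, and P_m decays continuously toward 0 as the intensity or shape difference grows, which is the probability continuum described below.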
The advantage of this approach over the prior art step-by-step discrete matching of pores is that it provides a probability continuum (between 0 and 1) of matching scores. These can easily be combined for all putative pores on the query image against matched locations on the template, and weakly visible pores can still contribute to a matching score through the priors that use the ridge separation parameters (RS) and the intensity variation. This embodiment employs probability methods, the basics of which require no further elaboration and should be familiar to practitioners of statistical image analysis. However, the exact form and combination of the priors and likelihoods may be changed whilst maintaining the proposed method for pore matching, and, for example, variations in the choice of probability density functions are within the contemplation of the inventors.
By employing the above mathematical model, e.g. the Gaussian/Hermite modelling technique, the method uses not only the location but combines it with the estimated shape and size of the sweat pore. This allows us to overcome the shortcoming of prior art systems, i.e. an over-simplified decision process (fig 10). It also takes into account the variation of sweat pore appearance due to pressure differences.
More importantly, it reduces the false positive rate by comparing not only the location but also the shape and size of sweat pores (fig 11). The process (figure 11) is repeated for each pair of corresponding sweat pores and a final score can then be calculated as a combination of the scores for each corresponding sweat pore.
Image alignment

Image alignment removes the geometric variation among prints that is caused by scanning the fingerprint images at different angles and displacements on the scanning device. As described above, prior art systems align the fingerprint images based on the minutiae information, which inevitably falls into a combinatorial problem of two unknowns, i.e. the transformation between the minutiae sets and the correspondence between the minutiae points.
Image alignment according to the invention may be based on the previously mentioned ridge parameters and may therefore be independent of minutiae distributions. In addition to the global transformation, the alignment method of the invention, as described below, can advantageously also remove the elastic deformation caused by scanning distortion, uneven downward pressure when the fingerprint is placed to the scanning surface, twisting and various other factors.
Under prior art minutiae alignment techniques, such deformations and distortions are difficult to accommodate in a systematic way and can lead to inaccuracies.
Fig 7 is a flowchart showing an embodiment of the image alignment method of the current invention. According to one embodiment of the current invention, the two inputs of Fig 7 are representations of a query image and of the mean of the class the query image belongs to. According to another embodiment of the current invention, the two inputs of Fig 7 are a representation of a query image or candidate image and a representation of an image template that has previously been enrolled. This latter implementation is a preferred embodiment for 1:1 matching (verification). In both embodiments, the representation of the image is the multidimensional feature set {separation, direction, phase} generated by using steps 501-506.
At step 701, the candidate and stored template may be tessellated, thereby creating a number of regions for each image. The feature set for each input is grouped into a set based on its relative position to the core location. Each group will represent a region of the image. The size of the region may thus be dependent on the distance from the core.
The goal of step 702 is to estimate the transformation parameters {scale, rotation, shift} to be applied to each group (region) to offset the deformation and distortion processes indicated above. According to a preferred embodiment of the current invention, a recursive filtering technique is used to estimate the transformation parameters, this being the transformation to be applied to each region. The candidate image is essentially compared to a template representing a given class of images, with which the candidate image is associated. The representation points for each region of the candidate image are compared to the corresponding representations in the same regions of the class template, the difference between them being indicative of the transformation required to eliminate the distortion in that particular region.
The estimation of the transformation process is an iterative process, with a prediction and update step at each iteration. The prediction step is based on the estimates at previous iterations and the updating is based on the error in the new measurement, according to the prediction. The above implementation may be considered to have converged when the difference between the estimate at the previous and current iterations is sufficiently small. According to the above implementation, optimal transformation parameters that best align two corresponding regions in query and template images are thus obtained.
The estimated transformation parameters are then applied to the corresponding regions of a query image to align each region to the corresponding one in the template (step 703). An alignment score for each region is then calculated at the next step (step 704) by comparing the similarity between the corresponding region in the template image and the aligned query image. If the alignment score is not high enough, this suggests that there is still some elastic deformation within the region; it should then be divided into a number of smaller regions and steps 702-704 repeated until the alignment score for each region is high enough or the data are too few to carry out the recursive calculation at step 702. Once all subgroups (regions) are processed, the transformation parameters for both global and elastic deformations are obtained by interpolating the transformation parameters estimated for each region (step 707). The interpolated transformation is then applied to the query image, aligning it to either the mean of the class or the individual template, depending on the implementation.
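By way of illustration only, the iterative prediction-and-update estimate of step 702 may be sketched for the shift component alone; the damping factor and least-squares residual are assumptions of this sketch, and the full method would also estimate scale and rotation:

```python
import numpy as np

def estimate_region_shift(query_pts, template_pts, iters=50, tol=1e-6):
    """Sketch of the iterative estimate of step 702, restricted to the
    shift component: predict from the previous estimate, update from the
    residual between the corresponding region representations, and stop
    when the change between iterations is sufficiently small
    (convergence)."""
    shift = np.zeros(2)
    for _ in range(iters):
        # Update step: residual between template and shifted query points.
        residual = (template_pts - (query_pts + shift)).mean(axis=0)
        new_shift = shift + 0.5 * residual      # damped update
        converged = np.linalg.norm(new_shift - shift) < tol
        shift = new_shift
        if converged:
            break
    return shift
```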
The advantage of the alignment system according to the invention is that distortions in the fingerprint image, due to excess or uneven pressure during capture, finger roll etc (as described earlier), may be compensated for more easily than in prior art systems. The subsequent tessellation, whereby further partitioning into subregions occurs according to the distance of the region from the core of the fingerprint, permits greater accuracy in modelling the areas of the print most likely to be subject to such distortions. The improvement in alignment enhances the normalisation effect throughout the data set and minimises the chances of, for example, a mismatch of two images of the same finger.
Enrolment

Fingerprint recognition systems have two principal modes of operation: enrolment and matching. During enrolment, acquired fingerprints are stored in a template database, where only those features of the print that are distinguishing are extracted and represented in some form. Fig 8 is a flowchart showing a preferred embodiment according to the enrolment mode of the current invention.
The enrolment process according to an embodiment of the invention starts by acquiring a fingerprint image from a scanning device (step 801). At the next step (step 802), the image is divided into a set of regions (step 502), the ridge pattern of each region is modelled according to previously described methods (step 503), and the ridge orientation and separation of each region are estimated. The location and orientation of the core are also found in the same way as described at step 505.
The parameter set {separation, direction} is then projected to the canonical feature space as suggested at Fig 5.
In an embodiment of the invention, if the location of the projected pattern overlaps or is very near to an existing pattern from a template enrolled previously, then step 809 is carried out and a duplicate enrolment is declared, i.e. the fingerprint has already been enrolled in the database. Otherwise, the feature space is updated with the new candidate and the classification and clustering steps (509, 510) will also need to be re-calculated with the information from the new image. At the next step (806), the minutiae information, i.e. the location, orientation and ridge count, may be extracted from the enrolment image. This information may also be aligned to the canonical feature space. Step 807 is an optional step, which extracts the pore information from the input image. The information includes the location, shape and distribution of sweat pore features. A detailed embodiment is given above and shown in Fig 6. If the number of detected minutiae and the number of detected sweat pores are lower than a predefined value, which is configurable by the end user, step 811 is performed and an enrolment failure is declared. Otherwise, the location of the image in the canonical feature space, the information on which class and sub-class it belongs to, the ridge parameters, the location and orientation of its core, and the minutiae and sweat pore information are encoded and stored as a template image. To maintain the interoperability of the template, it can be generated in a multi-layer fashion so that any part of the information can be used by any foreign method or computer program that can only utilize that part of the information, such as the location, type and/or orientation of the minutiae.
Matching

The process of fingerprint matching involves comparing a query print with a set of one or more template prints. There are primarily two modes in this part of the process: a verification mode, i.e. a one-to-one match is sought; and an identification mode, i.e. a one-to-many match is sought. The following disclosure is based on the identification mode; however, with only a slight modification of the flow, the implementation can easily be adapted to the verification mode. Fig 9 is a flowchart showing a preferred embodiment of the current invention.
After acquiring the image (query) from the scanning device, it is divided into a set of local regions and each region is modelled and ridge information is identified (step 901) and further mapped into the canonical feature space (step 902). Both steps use the same methodology as the one explained in the enrolment process.
At step 903, based on its position in the feature space, the probability that the query image belongs to a certain class and sub-class is calculated. The probability of membership of a given query print, Q, in a particular class, c, can be expressed as the posterior probability P(c | V(Q), M_c, C_c), where M_c is the mean of pattern class c and C_c is the covariance of pattern class c. V(Q) is the projection of Q into the canonical feature space V (step 508). In one embodiment, the probability can be obtained without taking into account the spread of each pattern class, C_c (i.e. its covariance). Then the probability can be based simply on the Euclidean distance, e.g. P(c) = Exp[-||V(Q) - M_c||], where Exp[] is the exponentiation function and ||.|| the Euclidean vector norm or distance. Otherwise, to take into account the spread of each class, a probability model, such as a normal or Gaussian distribution, can be assumed. Then the posterior probability is a quantity calculated from a Gaussian probability distribution function, e.g. Exp[-(V(Q) - M_c)^T C_c^-1 (V(Q) - M_c)], where ^T is the transpose and C_c^-1 is the inverse of the covariance matrix C_c. Such a formulation is familiar to experts in pattern classification. The actual embodiment of step 509 will suggest other ways to calculate the membership or classification probability: P(c | V(Q), H_c), where this time H_c is a set of parameters under a different probability model for the distribution of patterns in the class c. The choice of probability model, whether parametric or non-parametric, may be connected with the partitioning process used at step 510. Such variations are within the contemplation of this invention.
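The two membership scores of step 903 can be sketched directly from the expressions above; the direct matrix inversion is an illustrative shortcut:

```python
import numpy as np

def class_membership(VQ, M_c, C_c=None):
    """Sketch of step 903: membership score of a query projection V(Q)
    for class c.  Without the class spread, the Euclidean form
    Exp[-||V(Q) - M_c||]; with a covariance C_c, the Gaussian form
    Exp[-(V(Q) - M_c)^T C_c^-1 (V(Q) - M_c)]."""
    d = np.asarray(VQ, float) - np.asarray(M_c, float)
    if C_c is None:
        return float(np.exp(-np.linalg.norm(d)))
    return float(np.exp(-d @ np.linalg.inv(C_c) @ d))
```

A query projected exactly onto the class mean scores 1 under either form, and the score decays toward 0 with distance from the mean.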
If there is no qualified class, step 914 is performed and a non-match declared.
Otherwise the system will proceed to step 905. At this step, for each qualified class, the ridge parameter set of the query image is compared with the mean parameter set for that class, and both linear and non-linear deformation between the two parameter sets is removed using the method described at the image alignment step.
The estimated transformation parameters are then applied to the query image.
At step 906, the minutiae information may be extracted from the aligned query image and compared with the corresponding information in each stored template within the partition class. If the number of minutiae is too low, the algorithm may proceed to step 908, which extracts the sweat pore information (explained in detail in the description of Fig 6). If there are sufficient minutiae present at step 906, the query and stored templates are compared and a matching score is generated purely from the minutiae and the probability (step 909). Otherwise the sweat pore information is used to compare the query and the corresponding information in each template within the class.
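The branch between minutiae-based and pore-based comparison (steps 906 and 908) can be sketched as below. The point-matching score is a toy stand-in for the actual minutiae/pore similarity measures, and the sufficiency threshold and all names are hypothetical:

```python
def match_points(query_pts, template_pts, tol=5.0):
    """Fraction of query points with a template point within tol pixels
    (a toy stand-in for the minutiae/pore similarity scores)."""
    if not query_pts:
        return 0.0
    hits = sum(
        1 for (qx, qy) in query_pts
        if any((qx - tx) ** 2 + (qy - ty) ** 2 <= tol ** 2
               for (tx, ty) in template_pts)
    )
    return hits / len(query_pts)

MIN_MINUTIAE = 12  # hypothetical sufficiency threshold

def feature_score(minutiae_q, minutiae_t, pores_q, pores_t):
    # Step 906: prefer minutiae when enough are present in the aligned
    # query; otherwise fall back to sweat-pore information (step 908).
    if len(minutiae_q) >= MIN_MINUTIAE:
        return match_points(minutiae_q, minutiae_t)
    return match_points(pores_q, pores_t)
```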
A combined matching score (909) may then be generated by combining the probability measurement calculated at step 903 with the minutiae and/or sweat pore information. In a preferred embodiment, the two scores, one expressing the belonging to the nearest pattern class c and the other expressing the similarity of the minutiae/sweat pore patterns between the query print Q and each template R_c in class c, are combined by taking the product of the two scores expressed as probabilities. Thus, let P(Q matches R_c) = P(c | V(Q), H_c) x P(minutiae score of Q and R_c). The first probability is the same as estimated in step 903, and there are many known ways in which a similarity score based on minutiae and/or pore locations can be readily expressed as a probability. If any of the matching scores within the class is higher than a predefined threshold, the system will declare a match between the query image and the corresponding template. Otherwise, step 905 is returned to and the templates within another qualifying class are processed. In the preferred embodiment, the order of processing subsequent qualifying classes (and the templates contained within) is determined by taking the largest of the 'belonging' scores calculated at step 903 first. When all the templates in all the qualifying classes have been compared against the query image, and none of the matching scores is higher than the predefined threshold value, the system will declare a non-match.
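The combination rule of step 909 — the product of the class-membership probability and the feature similarity, compared against a predefined threshold — might be sketched as follows; the threshold value and names are illustrative only:

```python
def combined_score(p_class, p_feature):
    """Step 909: combine the class-membership probability (step 903)
    with the minutiae/pore similarity, both expressed as probabilities,
    by taking their product."""
    return p_class * p_feature

MATCH_THRESHOLD = 0.5  # hypothetical predefined threshold

def decide(scores):
    # Declare a match if any per-template combined score exceeds the
    # threshold; otherwise the caller moves on to the next qualifying
    # class, and finally declares a non-match.
    return any(s > MATCH_THRESHOLD for s in scores)
```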
It is to be understood that the present description is merely an example of the principles of the invention, is not to be taken as restrictive of the invention, and is intended for explanation and understanding purposes only. To those skilled in the art, it will be obvious from the foregoing disclosure that alternative embodiments of the elements and methods illustrated can be employed without departing from the principles of the invention described herein, the invention being defined in the accompanying claims.

Claims (55)

1. An apparatus for processing fingerprint images having an intensity profile of a fingerprint having ridges, valleys and sweat pores, the apparatus comprising: -a means for identifying candidate sweat pore regions from said intensity profile -a means for modelling the intensity profile in at least one dimension in each candidate pore region -a means for identifying at least one of the centre, the size and the shape of the sweat pore from said modelled intensity profile.
2. An apparatus as in Claim 1 wherein the identifying means is operable to filter out valleys in the image and to thereafter identify regions of high intensity as candidate pore regions.
3. An apparatus as in Claim 1 or 2 wherein the modelling means comprises a means for fitting to the said intensity profile a model profile of predetermined type, to provide a specific model profile of that type which approximates to the said intensity profile.
4. An apparatus as in Claim 3 wherein the modelling means is operable to apply and fit a Hermite model profile.
5. An apparatus as in Claim 3 wherein the modelling means is operable to apply and fit a Gaussian intensity model profile.
6. An apparatus as in any previous claim further comprising an error measure calculating means for calculating an error measure, the error measure being the difference between the modelled intensity profile of the pore and the unmodelled intensity profile.
7. An apparatus as in Claim 3 wherein the fitting means further comprises a means for approximating to the said intensity profile by minimising said error measure.
8. An apparatus as in Claim 7 wherein said fitting means is operable to apply a Minimum Mean Square Error model profile.
9. An apparatus as in Claim 7 wherein said fitting means is operable to apply an iterative maximum likelihood model.
10.An apparatus as in any previous claim wherein the modelling means further comprises filter means for identifying the magnitude of any divergences from the ideal model and eliminating divergences above a predetermined threshold.
11.An apparatus as in Claim 10 wherein the filter means comprises means for performing Principal Component Analysis (PCA) on the modelled intensity profile.
12.An apparatus according to any previous claim further comprising matching means adapted to compare query images (Q) against stored template images (T) in order to identify matches.
13.An apparatus as in Claim 12 further comprising probability calculating means for calculating the probability that the two images are from the same fingerprint, said probability being a function of location, shape and size of the pores.
14.An apparatus as in Claim 13 wherein the probability calculating means is operable to compute the probability of a match, P_m, between a pore at location (x,y) in the query image (Q) and the corresponding pore in stored template image (T), which is computed as follows: P_m = P(location|Q,T)P(shape|Q,T).
15.An apparatus as in any previous claim further comprising means for identifying fingerprint ridges within the image.
16.An apparatus as in Claim 15 wherein the centres of said candidate pore regions and the centrelines of said ridges are defined by predetermined intensity levels, the apparatus further comprising means for eliminating candidate regions for which the distance between the centre of said candidate pore region and the corresponding ridge centreline is more than a predetermined threshold.
17.An apparatus as in Claim 16 further comprising means for withholding from the modelling means those candidate pore regions which occur at a frequency other than the fundamental frequency of pore regions on a fingerprint ridge.
18.An apparatus as in any one of Claims 12 to 17 further comprising means for calculating a pore distance between the centre of a pore at location (x,y) in the query image (Q) and that of a corresponding pore in stored template image (T).
19.An apparatus as in Claim 18 wherein the matching means is operable to classify a pore at location (x,y) in the query image (Q) as a match with the corresponding pore in stored template image (T) only if the corresponding pore distance is less than a predetermined first threshold, the pores not being considered to match if the pore distance is equal to or exceeds said predetermined threshold.
20.An apparatus as in Claim 19 wherein, if the pore distance is less than a predetermined first threshold, the matching means is operable to classify a pore at location (x,y) in the query image (Q) as being matched with the corresponding pore in stored template image (T) only if a second error measure, being the difference between the size and shape of the corresponding pores in the respective images, is less than a predetermined second threshold, the pores not being considered to match if the said second error measure is equal to or exceeds said predetermined second threshold.
21.An apparatus as in Claim 20 wherein, if the second error measure is equal to or exceeds a predetermined second threshold, then the matching means is operable to classify a pore at location (x,y) in the query image (Q) as being matched with the corresponding pore in stored template image (T) only if the differences between the size, shape and position of pores neighbouring those used in the computation of the second error measure of the query image (Q) and the corresponding neighbouring pores in the template image (T) are below the predetermined second threshold.
22.An apparatus as in any one of claims 18 to 21 further comprising contribution means for controlling the contribution of the second error measure to a correlation score, wherein the error measure contributes to the said correlation score for matches and does not contribute to the correlation score for non-matches.
23.An apparatus as in any previous claim further comprising a means for recovering low visibility pores which is operable to interpolate and to extrapolate the locations of visible pores in the modelled fingerprint and provide the locations of low visibility pores.
24.An apparatus as in any previous claim further comprising image acquisition means for recording fingerprint images and for forwarding them, either directly to the identifying means, or to a storage means and then to the identifying means.
25.An apparatus according to any previous claim further comprising an alignment means.
26.An apparatus as in Claim 25 wherein the alignment means comprises: -a means for identifying the biological centre and the biological axis of the fingerprint in the image -a means for setting a common reference point and common reference axis -a means for translating the image so that the biological centre of the fingerprint is re-located at the common reference point -a means for rotating the image so that the biological axis of the fingerprint orientation coincides with the common reference axis.
27.An apparatus as in Claim 26 wherein, if the biological centre of the fingerprint in question is not present in the image, the off-image location of the biological centre is estimated using a combination of extrapolation of ridges in the on-image portion of the fingerprint pattern and known patterns of fingerprints.
28.A method of processing fingerprint images having an intensity profile of a fingerprint having ridges, valleys and sweat pores, the method comprising the steps of, for each image: -identifying candidate sweat pore regions from said intensity profile -modelling the intensity profile in at least one dimension in each candidate pore region -identifying at least one of the centre, the size and the shape of the sweat pore from said modelled intensity profile.
29.A method as in Claim 28 wherein the identifying step comprises: -filtering out valleys in the image -thereafter identifying regions of high intensity as candidate pore regions.
30.A method as in Claim 28 or 29 further comprising fitting to the said intensity profile a model profile of predetermined type, to provide a specific model profile of that type which approximates to the said intensity profile.
31.A method as in Claim 30 wherein the model profile is of the form of a Hermite polynomial.
32.A method as in Claim 30 wherein the model profile is of the form of a Gaussian intensity function.
33.A method as in any one of Claims 28 to 32 further comprising an error measure calculating step in which an estimation error is determined, the error measure being the difference between the modelled intensity profile of the pore and the unmodelled intensity profile.
34.A method as in Claim 30 further comprising fitting the model to the said intensity profile by minimising said error measure.
35.A method as in Claim 34 further comprising applying a Minimum Mean Square Error criterion to fit the model to the said intensity profile.
36.A method as in Claim 34 further comprising applying an iterative maximum likelihood method to fit the model to the said intensity profile.
37.A method as in any one of Claims 28 to 34 further comprising identifying the magnitude of any divergences from the ideal model and eliminating divergences above a predetermined threshold.
38.A method as in Claim 37, wherein Principal Component Analysis (PCA) on the modelled intensity profile is applied to eliminate regions that diverge more than a predetermined threshold from the ideal model.
39.A method according to any one of Claims 28 to 38 further comprising comparing query images (Q) against stored template images (T) in order to identify matches.
40.A method as in Claim 39 further comprising calculating the probability that the two images are from the same fingerprint, said probability being a function of location, shape and size of the pores.
41.A method as in Claim 40 wherein the probability calculating step comprises calculating the probability of a match, P_m, between a pore at location (x,y) in the query image (Q) and the corresponding pore in stored template image (T), using the posterior probability function defined as follows: P_m = P(location|Q,T)P(shape|Q,T).
42.A method as in any one of Claims 28 to 41 further comprising step of identifying fingerprint ridges within the image.
43.A method as in Claim 42 wherein the centres of said candidate pore regions and the centreline of said ridges are defined by predetermined intensity levels, the method further comprising the step of eliminating candidate regions for which the distance between the centre of said candidate pore region and the corresponding ridge centreline is more than a predetermined threshold.
44.A method as in Claim 43 further comprising the step of eliminating candidate regions which occur at a frequency other than the fundamental frequency of pore regions on a fingerprint ridge.
45.A method as in any one of Claims 39 to 44 further comprising a step of calculating a pore distance between the centre of a pore at location (x,y) in the query image (Q) and that of a corresponding pore in stored template image (T).
46.A method as in Claim 45 wherein the corresponding pores in the query image (Q) are classified as a match with the corresponding pores in the template image (T) only if the pore distance is less than a predetermined first threshold, and are not classified as a match if the pore distance is equal to or exceeds said predetermined threshold.
47.A method as in Claim 46 wherein, if the pore distance is less than a predetermined first threshold, a pore at location (x,y) in the query image (Q) is classified as being matched with the corresponding pore in stored template image (T) only if a second error measure, being the difference between the size and shape of the corresponding pores in the respective images, is less than a predetermined second threshold, and is not classified as a match if the said second error measure is equal to or exceeds said predetermined second threshold.
48.A method as in Claim 47 wherein, if the second error measure is equal to or exceeds a predetermined second threshold, a pore at location (x,y) in the query image (Q) is classified as being matched with the corresponding pore in stored template image (T) only if the differences between the size, shape and position of pores neighbouring those used in the computation of the second error measure of the query image (Q) and the corresponding neighbouring pores in the template image (T) are below the predetermined second threshold.
49.A method as in any one of claims 45 to 48 further comprising controlling the contribution of the error measure to a correlation score, wherein the error measure contributes to the said correlation score for matches and does not contribute to the correlation score for non-matches.
50.A method as in any one of claims 28-49 further comprising a step of recovering low visibility pores, in which the locations of visible pores in the modelled fingerprint are interpolated and extrapolated to provide the locations of low visibility pores.
51.A method as in any one of claims 28-50 further comprising an image acquisition step for recording fingerprint images and for forwarding them, either directly to the identifying step, or to a storage means and then to the identifying step.
52.A method according to any one of claims 28-51 further comprising an alignment step.
53.A method as in Claim 52 wherein the alignment step comprises: -identifying the biological centre and the biological axis of the fingerprint in the image -setting a common reference point and common reference axis -translating the image so that the biological centre of the fingerprint is re-located at the common reference point -rotating the image so that the biological axis of the fingerprint orientation coincides with the common reference axis.
54.A method as in Claim 53 wherein, if the biological centre of the fingerprint in question is not present in the image, the off-image location of the biological centre is estimated using a combination of extrapolation of ridges in the on-image portion of the fingerprint pattern and known patterns of fingerprints.
55.A computer program product comprising a readable medium for storing instructions for implementing the method of Claims 28-54.
GB0716021A 2007-08-17 2007-08-17 Processing fingerprints to identify sweat pores, i.e. third level information, from ridge structures, i.e. macroscopic first level information. Withdrawn GB2451888A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0716021A GB2451888A (en) 2007-08-17 2007-08-17 Processing fingerprints to identify sweat pores, i.e. third level information, from ridge structures, i.e. macroscopic first level information.
PCT/GB2008/050670 WO2009024811A1 (en) 2007-08-17 2008-08-06 Method and apparatus for identifying and matching fingerprints using sweat pores

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0716021A GB2451888A (en) 2007-08-17 2007-08-17 Processing fingerprints to identify sweat pores, i.e. third level information, from ridge structures, i.e. macroscopic first level information.

Publications (2)

Publication Number Publication Date
GB0716021D0 GB0716021D0 (en) 2007-09-26
GB2451888A true GB2451888A (en) 2009-02-18

Family

ID=38566520

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0716021A Withdrawn GB2451888A (en) 2007-08-17 2007-08-17 Processing fingerprints to identify sweat pores, i.e. third level information, from ridge structures, i.e. macroscopic first level information.

Country Status (2)

Country Link
GB (1) GB2451888A (en)
WO (1) WO2009024811A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105993023A (en) * 2014-02-14 2016-10-05 韩国科泰高科株式会社 Electronic device comprising minimum sensing area and fingerprint information processing method therefor
RU2683979C2 (en) * 2016-02-17 2019-04-03 Бейджин Сяоми Мобайл Софтвэар Ко., Лтд. Method and device for detecting pressure

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110796638B (en) * 2019-09-29 2023-03-24 合肥方程式电子科技有限公司 Pore detection method
CN112766395B (en) * 2021-01-27 2023-11-28 中国地质大学(北京) Image matching method and device, electronic equipment and readable storage medium
CN113657145B (en) * 2021-06-30 2023-07-14 深圳市人工智能与机器人研究院 Fingerprint retrieval method based on sweat pore characteristics and neural network

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999006942A1 (en) * 1997-07-29 1999-02-11 Smarttouch, Inc. Identification of individuals from association of finger pores and macrofeatures
US6049621A (en) * 1997-08-22 2000-04-11 International Business Machines Corporation Determining a point correspondence between two points in two respective (fingerprint) images
WO2005022446A1 (en) * 2003-08-29 2005-03-10 Koninklijke Philips Electronics N.V. Biometrical identification device


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105993023A (en) * 2014-02-14 2016-10-05 韩国科泰高科株式会社 Electronic device comprising minimum sensing area and fingerprint information processing method therefor
EP3107037A4 (en) * 2014-02-14 2017-11-08 Canvasbio Co., Ltd. Electronic device comprising minimum sensing area and fingerprint information processing method therefor
CN105993023B (en) * 2014-02-14 2019-05-10 韩国科泰高科株式会社 Electronic device comprising minimum sensing region and its finger print information processing method
RU2683979C2 (en) * 2016-02-17 2019-04-03 Бейджин Сяоми Мобайл Софтвэар Ко., Лтд. Method and device for detecting pressure
US10402619B2 (en) 2016-02-17 2019-09-03 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for detecting pressure

Also Published As

Publication number Publication date
WO2009024811A1 (en) 2009-02-26
GB0716021D0 (en) 2007-09-26

Similar Documents

Publication Publication Date Title
EP2174261B1 (en) Fingerprint matching method and apparatus
Umer et al. Iris recognition using multiscale morphologic features
JP2815045B2 (en) Image feature extraction device, image feature analysis device, and image matching system
Badrinath et al. Palmprint based recognition system using phase-difference information
Zhao et al. Fingerprint image synthesis based on statistical feature models
US20080279416A1 (en) Print matching method and system using phase correlation
US20080013803A1 (en) Method and apparatus for determining print image quality
Doublet et al. Robust grayscale distribution estimation for contactless palmprint recognition
Garg et al. Biometric authentication using finger nail surface
Ribaric et al. Palmprint recognition based on local Haralick features
GB2451888A (en) Processing fingerprints to identify sweat pores, i.e. third level information, from ridge structures, i.e. macroscopic first level information.
Wu Advanced feature extraction algorithms for automatic fingerprint recognition systems
Giachetti Effective characterization of relief patterns
Gnanasivam et al. An efficient algorithm for fingerprint preprocessing and feature extraction
Kuban et al. A NOVEL MODIFICATION OF SURF ALGORITHM FOR FINGERPRINT MATCHING.
Turroni Fingerprint Recognition: Enhancement, Feature Extraction and Automatic Evaluation of Algorithms
Alotaibi et al. Increasing the Efficiency of Iris Recognition Systems by Using Multi-Channel Frequencies of Gabor Filter
Yao et al. Fingerprint quality assessment with multiple segmentation
Brasileiro et al. A novel method for fingerprint image segmentation based on adaptative gabor filters
Mohamed-Abdul-Cader et al. Minutiae Triangle Graphs: A New Fingerprint Representation with Invariance Properties
Sahu et al. Fingerprint Identification System using Tree based Matching
Pillai et al. Fingerprint liveness detection with feature level fusion techniques using svm and deep neural network
Yao Digital Fingerprint Quality Assessment
Olsen Fingerprint image quality: predicting biometric performance
Nezhadian et al. Inner-knuckle-print for human authentication by using ring and middle fingers

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)