CN109325455B - Iris positioning and feature extraction method and system - Google Patents
- Publication number
- CN109325455B (granted from application CN201811139801.0A)
- Authority
- CN
- China
- Prior art keywords
- iris
- area
- gray
- function
- threshold
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
Abstract
The embodiment of the application provides an iris positioning and feature extraction method and system, wherein the method comprises the following steps: s1, selecting iris region segmentation threshold values based on a pre-constructed local gray distribution statistical model; s2, screening the area where the iris is located by using a double-threshold coupling classifier constructed based on iris area segmentation thresholds to obtain a pupil area image and an iris area image; s3, determining an effective iris area by using a boundary detector constructed based on the pupil area image and the iris area image; s4, coding effective iris pixels of the effective iris area by using a local wavelet high-frequency energy tower-type transition model constructed based on the effective iris area to obtain iris feature codes. The scheme overcomes the influence of noise interference and unstable characteristics in low-quality iris images, thereby being beneficial to improving the accuracy and robustness of the iris recognition system.
Description
Technical Field
The application relates to the field of iris biometric recognition, and in particular to a method and system for accurate iris positioning and stable feature extraction that combine a local wavelet high-frequency energy tower-type transition model with the determination of an effective iris region.
Background
Iris recognition, with its notable advantages of accuracy, stability, security, and non-contact operation, has become a key research direction and development trend in the field of biometric recognition. However, because the iris is small and easily disturbed by noise and by the user's posture, the iris region in an acquired image is often contaminated and deformed, so accurate localization and stable feature extraction of the iris in low-quality images are the key difficulties of iris recognition. Typical current iris feature extraction and matching methods have the following shortcomings:
1. There is no iris localization fitting model with strong adaptability: for low-quality images, iris localization is heavily affected by noise interference and becomes inaccurate, which makes stable iris feature extraction even more difficult;
2. There is no iris feature measurement operator with strong robustness: for low-quality images, feature extraction is heavily affected by iris deformation, stable iris features are difficult to extract, and the accuracy of iris recognition drops sharply;
3. For low-quality images, to avoid reduced recognition accuracy, the current image is simply discarded and reacquired, which severely reduces the efficiency of the iris recognition system.
Disclosure of Invention
In order to solve one of the problems, the application provides an iris positioning and feature extraction method, which solves the problems of accurate iris positioning and stable feature extraction of low-quality images, thereby effectively enhancing the accuracy and robustness of iris recognition.
According to a first aspect of the embodiments of the present application, there is provided an iris positioning and feature extraction method, including:
s1, selecting iris region segmentation threshold values based on a pre-constructed local gray distribution statistical model;
s2, screening the area where the iris is located by using a double-threshold coupling classifier constructed based on iris area segmentation thresholds to obtain a pupil area image and an iris area image;
s3, determining an effective iris area by using a boundary detector constructed based on the pupil area image and the iris area image;
s4, coding effective iris pixels of the effective iris area by using a local wavelet high-frequency energy tower-type transition model constructed based on the effective iris area to obtain iris feature codes.
According to a second aspect of the embodiments of the present application, there is provided an iris positioning and feature extraction system, including:
the threshold selecting module is used for selecting iris region segmentation thresholds based on a pre-constructed local gray distribution statistical model;
the image screening module is used for screening the area where the iris is located by using a double-threshold coupling classifier constructed based on iris area segmentation thresholds to obtain a pupil area image and an iris area image;
the effective area determining module is used for determining an effective iris area by utilizing a boundary detector constructed based on the pupil area image and the iris area image;
and the characteristic extraction module is used for coding effective iris pixels of the effective iris area by utilizing a local wavelet high-frequency energy tower-type transition model constructed based on the effective iris area to obtain iris characteristic codes.
According to the technical scheme, iris region segmentation thresholds are selected in a self-adaptive mode according to specific local gray scale statistical distribution of an iris image, then effective iris regions are accurately screened and positioned by combining a double-threshold coupling classifier and a finely designed iris boundary detector, and finally stable iris features are extracted and coded by constructing a local wavelet high-frequency energy tower type transition model. The scheme overcomes the influence of noise interference and unstable characteristics in low-quality iris images, thereby being beneficial to improving the accuracy and robustness of the iris recognition system.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 shows a schematic diagram of an iris localization and feature extraction method according to the present application.
Detailed Description
To make the technical solutions and advantages of the embodiments of the present application clearer, exemplary embodiments are described below in further detail with reference to the accompanying drawings. The described embodiments are only a part of the embodiments of the present application, not an exhaustive list. It should be noted that, in the absence of conflict, the embodiments of the present application and the features therein may be combined with each other.
The core idea of the scheme is to select an iris region segmentation threshold from an iris image, determine an effective iris region through a constructed classifier and a boundary detector, and finally extract stable iris characteristics and encode through constructing a local wavelet high-frequency energy tower-type transition model. By the method, the adaptability of low-quality image iris positioning and the robustness of feature extraction can be effectively enhanced, so that the accuracy and the recognition efficiency of an iris recognition system are improved.
The scheme discloses an iris accurate positioning and stable feature extraction method, which can overcome the influence of noise interference and unstable features in low-quality iris images, thereby being beneficial to improving the accuracy and the robustness of an iris recognition system. The present solution is described in detail below by a specific set of examples. The method comprises the following steps:
First, the iris region segmentation thresholds are selected adaptively by constructing a local gray distribution statistical model.
In order to realize the adaptive iris region segmentation, iris region segmentation thresholds, including a speckle detection high gray threshold and a pupil detection low gray threshold, need to be adaptively determined first.
The local gray-level mean statistical operator FM(num) is built by two-dimensional expansion: repmat denotes a two-dimensional expansion function, and num is the row-and-column expansion count. When num = H, the high-gray-threshold local gray statistical operator FM(H) is obtained, with statistical step size S_H; when num = L, the low-gray-threshold local gray statistical operator FM(L) is obtained, with statistical step size S_L. Specifically, H = 7, L = 11, and S_H = S_L = 5.
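The operator's defining formula was an image in the source and is not recoverable, so the following is only a hedged sketch: it assumes FM(num) is a num × num uniform averaging kernel obtained by repmat-style two-dimensional expansion of one normalized weight. The names `repmat` and `local_mean_operator`, and the uniform-weight assumption, are illustrative.

```python
def repmat(value, num):
    """Two-dimensional expansion: tile a scalar into a num x num matrix."""
    return [[value] * num for _ in range(num)]

def local_mean_operator(num):
    """Assumed form of the local gray-mean statistical operator FM(num):
    a num x num uniform averaging kernel whose weights sum to 1."""
    return repmat(1.0 / (num * num), num)

# FM(H) with H = 7 for the high-gray threshold,
# FM(L) with L = 11 for the low-gray threshold
FM_H = local_mean_operator(7)
FM_L = local_mean_operator(11)
```

Convolving an image with such a kernel replaces each pixel by its local mean, which is what the subsequent gray-distribution statistics operate on.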
Let the iris image be IR_{M×N}, where M×N is the pixel resolution. The high-gray-threshold local gray statistical model is
ST_H = hist(FM(H) ⊗ IR_{M×N})
where hist denotes the gray-distribution statistics function, ⊗ denotes the convolution operation, and S_H is the sliding step. The low-gray-threshold local gray statistical model is
ST_L = hist(FM(L) ⊗ IR_{M×N})
where S_L is the sliding step.
ST_H and ST_L are the high- and low-gray-threshold local gray statistical distribution sequences, respectively. The median of the h highest gray values corresponding to the distribution ST_H is taken as the high gray threshold:
T_H = median(gs(ST_H(end-h+1:end))) (4)
where median denotes the median function and gs denotes the gray-value function; specifically, h = 7. The median of the first l lowest gray values corresponding to the distribution ST_L is taken as the low gray threshold:
T_L = median(gs(ST_L(1:l))) (5)
where, specifically, l = 9. This yields the high gray threshold T_H and the low gray threshold T_L for iris region segmentation.
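Equations (4) and (5) reduce to taking medians over the extremes of the observed gray levels. A minimal sketch, under the simplifying assumption that ST_H and ST_L both amount to the ascending list of gray levels actually present in the (locally averaged) image; the helper name `segmentation_thresholds` is illustrative:

```python
from statistics import median

def segmentation_thresholds(gray_values, h=7, l=9):
    """Sketch of Eqs. (4)-(5): T_H is the median of the h highest
    occupied gray levels, T_L the median of the l lowest ones."""
    occupied = sorted(set(gray_values))   # gray levels present in the image
    T_H = median(occupied[-h:])           # speckle-detection high threshold
    T_L = median(occupied[:l])            # pupil-detection low threshold
    return T_H, T_L

# toy pixel sample: a dark pupil cluster and a bright speckle cluster
pixels = [10, 12, 14, 15, 16, 18, 20, 22, 25,
          200, 210, 215, 220, 230, 240, 250, 252]
T_H, T_L = segmentation_thresholds(pixels)
```

Using medians rather than the raw extremes makes both thresholds robust to a few outlier pixels, which matches the patent's emphasis on low-quality images.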
Second, the region where the iris lies is screened out using the dual-threshold coupled classifier.
A dual-threshold coupled classifier is constructed from the high and low gray thresholds of iris region segmentation:
IS = repmat(isinner(T_L, T_H), S) (6)
where repmat denotes a two-dimensional expansion function and S denotes the row-and-column size of the classifier IS; specifically, S = 21. isinner denotes a gray-interval relation operator: it returns 0 when the current gray value lies in the interval (T_L, T_H), -1 when it lies in [0, T_L], and 1 when it lies in [T_H, 255].
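The isinner operator of Eq. (6) is a simple ternary test per pixel. A sketch (the helper `classify_block` is illustrative; the S × S repmat tiling is represented here by element-wise mapping over an image block):

```python
def isinner(gray, T_L, T_H):
    """Gray-interval relation operator of Eq. (6): -1 for [0, T_L]
    (pupil-dark), 1 for [T_H, 255] (speckle-bright), 0 for the
    open interval (T_L, T_H)."""
    if gray <= T_L:
        return -1
    if gray >= T_H:
        return 1
    return 0

def classify_block(block, T_L, T_H):
    """Element-wise application of the dual-threshold coupled classifier."""
    return [[isinner(g, T_L, T_H) for g in row] for row in block]

labels = classify_block([[5, 100], [240, 30]], T_L=40, T_H=200)
```

The -1/0/1 labels let the later counting step (numel/find) tally dark and bright responses in one pass.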
The dual-threshold coupled classifier IS is used to segment the region where the pupil lies out of the iris image:
where numel denotes a non-zero-element count function, find denotes a relation-matching function, ⊗ denotes the convolution operation, and n_h and n_l denote the high and low gray detection-count thresholds, respectively; specifically, n_h = 20 and n_l = 240. Ω denotes the pupil region detected from the iris image IR, and IP is the pupil-region image block.
The iris region is expanded by using IP:
where size denotes a function taking the two-dimensional size of an image matrix, getcore denotes a function taking the central pixel block of an image, Φ denotes the iris region partitioned from the iris image IR, IB is the iris-region image block, and m and n denote the numbers of row and column pixels of IB, respectively; specifically, m = 240 and n = 320.
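The formula for the expansion was an image lost from this text; the sketch below only illustrates one plausible reading of the getcore function as a plain m × n center crop (in the patent the expansion window would be anchored by the pupil block IP, which is not modeled here):

```python
def getcore(image, m, n):
    """Take the m x n central pixel block of an image: a hypothetical
    reading of the getcore function named in the text."""
    rows, cols = len(image), len(image[0])
    r0, c0 = (rows - m) // 2, (cols - n) // 2
    return [row[c0:c0 + n] for row in image[r0:r0 + m]]

# 6 x 6 toy image whose pixel value encodes its (row, col) position
image = [[r * 10 + c for c in range(6)] for r in range(6)]
core = getcore(image, 2, 4)   # 2 x 4 central block
```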
Third, the effective iris region is accurately located by constructing the iris inner and outer boundary detectors.
Iris inner and outer boundary detectors are established within the segmented pupil-region image block IP and iris-region image block IB, respectively, to precisely locate the iris boundaries and obtain the effective iris region.
An inner boundary detector is constructed in the iris region:
where rot45 denotes a counterclockwise rotation function and α denotes a detection-direction weight vector; specifically, α = [3/8, 1/8, 3/8, 1/8], and EI is the iris inner boundary detector. An outer boundary detector is constructed in the iris region:
where rot45 denotes a counterclockwise rotation function and β denotes a detection-direction weight vector; specifically, β = [3/8, 1/8, 3/8, 1/8], and EO is the iris outer boundary detector.
The iris inner boundary is located:
where sum denotes the summation function, ⊗ denotes the convolution operation, and e_p denotes the iris inner-boundary neighborhood gray-gradient jump threshold; Ω is the set of iris inner-boundary pixels, and EP is the iris inner boundary. The iris outer boundary is located:
where sum denotes the summation function, ⊗ denotes the convolution operation, and e_r denotes the iris outer-boundary neighborhood gray-gradient jump threshold; specifically, e_r = 128; Ψ is the set of iris outer-boundary pixels, and ER is the iris outer boundary.
This yields the effective iris region IC = (EP ∪ ER) ∩ IB.
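The EI/EO detector kernels of the preceding formulas were images lost from this text and cannot be reproduced, so the following 1-D stand-in only illustrates the underlying idea: a boundary is declared where the gray-gradient magnitude along a radial profile exceeds the jump threshold (e plays the role of e_p or e_r; the helper name is illustrative):

```python
def boundary_jumps(profile, e):
    """Mark boundary positions along a 1-D radial gray profile:
    index i is a boundary if |profile[i+1] - profile[i]| >= e.
    A stand-in for the EI/EO detectors, whose kernels are not
    recoverable from this text."""
    return [i for i in range(len(profile) - 1)
            if abs(profile[i + 1] - profile[i]) >= e]

# dark pupil -> mid-gray iris -> bright sclera along one radius
profile = [20, 22, 21, 120, 125, 122, 126, 250, 252]
jumps = boundary_jumps(profile, e=90)   # pupil/iris and iris/sclera transitions
```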
Fourth, the stable features of the effective iris region are extracted and encoded by constructing a local wavelet high-frequency energy tower-type transition model.
The effective iris region IC is transformed into an effective iris pixel block RB by radial and arc-wise sampling. With r radial and c arc-wise sampling points, the row and column pixel resolution of RB is r × c; specifically, r = 24 and c = 128.
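The sampling rule itself is not spelled out in the text; the sketch below assumes a standard Daugman-style rubber-sheet unwrapping with nearest-neighbor sampling. The center (cx, cy) and radii r_in, r_out are inputs the patent would obtain from the located inner and outer boundaries; the defaults r_pts=24, c_pts=128 mirror the stated r and c.

```python
import math

def sample_iris(image, cx, cy, r_in, r_out, r_pts=24, c_pts=128):
    """Radial/arc-wise sampling (assumed rubber-sheet form): unwrap the
    annulus between radii r_in and r_out into an r_pts x c_pts block RB
    using nearest-neighbor lookups, clamped to the image bounds."""
    rows, cols = len(image), len(image[0])
    RB = []
    for ri in range(r_pts):
        r = r_in + (r_out - r_in) * ri / (r_pts - 1)
        row = []
        for ci in range(c_pts):
            theta = 2 * math.pi * ci / c_pts
            x = min(cols - 1, max(0, int(round(cx + r * math.cos(theta)))))
            y = min(rows - 1, max(0, int(round(cy + r * math.sin(theta)))))
            row.append(image[y][x])
        RB.append(row)
    return RB

# uniform toy image: every sample must come back with the same value
image = [[7] * 64 for _ in range(64)]
RB = sample_iris(image, cx=32, cy=32, r_in=5, r_out=20)
```

Unwrapping to a fixed r × c grid normalizes away pupil dilation, which is what makes the later block-wise wavelet coding comparable across images.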
RB is partitioned at three resolution scales, lv × lv, 2lv × 2lv, and 4lv × 4lv (specifically, lv = 6); 1-, 2-, and 3-level wavelet decompositions are then applied to the sub-blocks at the different resolution scales, respectively, and a local wavelet high-frequency energy tower is constructed:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function taking the wavelet high-frequency subband coefficients, and j denotes the wavelet high-frequency subband level.
A local wavelet high-frequency energy tower-type transition model is constructed from ET:
where mean denotes the averaging function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function taking the wavelet high-frequency subband coefficients, and j denotes the wavelet high-frequency subband level. isgreater denotes a coefficient energy relation operator: it returns 0 when the energy of the current wavelet high-frequency coefficient is smaller than the mean energy of the high-frequency subband coefficients, and 1 otherwise.
RB is encoded with EM to obtain the iris feature code array RC:
where map denotes a coordinate mapping function from RC to EM and (x, y) denotes a coordinate pair of RC; specifically, the row-and-column size of RC is 3 × 2736, and the feature code template occupies 1 kB.
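A concrete reading of the energy model and the isgreater coding, with a single-level Haar transform standing in for dwt2/db_i (the patent's normalization, subband ordering, and three-scale tower are simplified to one scale here; all helper names are illustrative):

```python
def haar_dwt2(block):
    """One-level 2-D Haar wavelet transform of an even-sized block;
    returns the (LL, LH, HL, HH) subbands."""
    rows, cols = len(block), len(block[0])
    LL, LH, HL, HH = [], [], [], []
    for r in range(0, rows, 2):
        ll, lh, hl, hh = [], [], [], []
        for c in range(0, cols, 2):
            a, b = block[r][c], block[r][c + 1]
            d, e = block[r + 1][c], block[r + 1][c + 1]
            ll.append((a + b + d + e) / 2)
            lh.append((a - b + d - e) / 2)   # horizontal detail
            hl.append((a + b - d - e) / 2)   # vertical detail
            hh.append((a - b - d + e) / 2)   # diagonal detail
        LL.append(ll); LH.append(lh); HL.append(hl); HH.append(hh)
    return LL, LH, HL, HH

def encode_block(block):
    """Sketch of the isgreater coding: square each high-frequency
    coefficient (its energy) and emit 1 where that energy reaches
    the mean energy of its subband, else 0."""
    _, LH, HL, HH = haar_dwt2(block)
    bits = []
    for band in (LH, HL, HH):
        energies = [v * v for row in band for v in row]
        mean_e = sum(energies) / len(energies)
        bits.extend(1 if e >= mean_e else 0 for e in energies)
    return bits

# 4 x 4 toy sub-block with a single bright pixel driving the detail energy
block = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 8, 0],
    [0, 0, 0, 0],
]
code = encode_block(block)
```

Comparing each coefficient's energy against its subband mean yields a binary code that is insensitive to global illumination scaling, which is the stability property the transition model targets.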
To support implementation of the above method, the present application further provides an iris positioning and feature extraction system, which includes a threshold selection module, an image screening module, an effective area determination module, and a feature extraction module. The threshold selection module selects iris region segmentation thresholds using a pre-constructed local gray distribution statistical model; the image screening module screens the region where the iris lies using a dual-threshold coupled classifier constructed from the iris region segmentation thresholds, obtaining a pupil-region image and an iris-region image; the effective area determination module determines the effective iris region using boundary detectors constructed from the pupil-region image and the iris-region image; and the feature extraction module encodes the effective iris pixels of the effective iris region using a local wavelet high-frequency energy tower-type transition model constructed from the effective iris region, obtaining the iris feature code.
In this scheme, the threshold selection module includes a model construction unit and a threshold determination unit. The model construction unit obtains the high-gray-threshold and low-gray-threshold local gray statistical models from the set local gray-mean statistical operator and the pixel resolution of the iris image. Using the high-gray-threshold local gray statistical model ST_H and the low-gray-threshold local gray statistical model ST_L, the threshold determination unit determines, respectively:
the light-spot detection high gray threshold: T_H = median(gs(ST_H(end-h+1:end))), where median denotes the median function, gs denotes the gray-value function, h is the number of highest gray values taken from ST_H, and end is the index of the last element;
the pupil detection low gray threshold: T_L = median(gs(ST_L(1:l))), where l is the number of lowest gray values taken from ST_L.
In this scheme, the image screening module includes a classifier construction unit, a segmentation unit, and an expansion unit. The classifier construction unit constructs the dual-threshold coupled classifier from the high and low gray thresholds of iris region segmentation: IS = repmat(isinner(T_L, T_H), S), where repmat denotes a two-dimensional expansion function, S denotes the row-and-column size of the classifier IS, and isinner denotes a gray-interval relation operator. The segmentation unit segments the region where the pupil lies out of the iris image using the dual-threshold coupled classifier:
where numel denotes a non-zero-element count function, find denotes a relation-matching function, ⊗ denotes the convolution operation, n_h and n_l denote the high and low gray detection-count thresholds, respectively, Ω denotes the pupil region detected from the iris image IR, and IP is the pupil-region image block.
The expansion unit expands the iris region using the pupil-region image block IP:
where size denotes a function taking the two-dimensional size of an image matrix, getcore denotes a function taking the central pixel block of an image, Φ denotes the iris region partitioned from the iris image IR, IB is the iris-region image block, and m and n denote the numbers of row and column pixels of IB, respectively.
In this scheme, the effective area determination module includes a boundary constructor unit and a positioning unit.
The boundary constructor unit constructs an iris inner boundary detector and an iris outer boundary detector within the pupil-region image block IP and the iris-region image block IB, respectively:
the iris region inner boundary detector, where rot45 denotes a counterclockwise rotation function, α denotes a detection-direction weight vector, and EI is the iris inner boundary detector;
the iris region outer boundary detector, where rot45 denotes a counterclockwise rotation function, β denotes a detection-direction weight vector, and EO is the iris outer boundary detector.
Based on the iris region inner and outer boundary detectors, the positioning unit locates the iris inner boundary and the iris outer boundary, where sum denotes the summation function, ⊗ denotes the convolution operation, e_p denotes the iris inner-boundary neighborhood gray-gradient jump threshold, Ω is the set of iris inner-boundary pixels, EP is the iris inner boundary, e_r denotes the iris outer-boundary neighborhood gray-gradient jump threshold, Ψ is the set of iris outer-boundary pixels, and ER is the iris outer boundary.
From the iris inner and outer boundaries, the effective iris region is determined as IC = (EP ∪ ER) ∩ IB.
In this scheme, the feature extraction module includes a transformation unit, an energy tower construction unit, a transition model construction unit, and an encoding unit. The transformation unit transforms the effective iris region IC into the effective iris pixel block RB by radial and arc-wise sampling. The energy tower construction unit partitions RB at three resolution scales, lv × lv, 2lv × 2lv, and 4lv × 4lv, then applies 1-, 2-, and 3-level wavelet decompositions to the sub-blocks at the different resolution scales, respectively, and constructs the local wavelet high-frequency energy tower:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function taking the wavelet high-frequency subband coefficients, and j denotes the wavelet high-frequency subband level.
The transition model construction unit constructs the local wavelet high-frequency energy tower-type transition model from the local wavelet high-frequency energy tower:
where mean denotes the averaging function and isgreater denotes a coefficient energy relation operator; the information fusion precision is determined by the maximum value of the fusion error.
The encoding unit encodes the effective iris pixel block RB using the local wavelet high-frequency energy tower-type transition model to obtain the iris feature code array RC.
The method according to the present solution may also be stored, in the form of instructions, in the memory of an electronic device. The electronic device includes a memory and one or more processors, connected by a communication bus; the processor executes the instructions in the memory, and the stored instructions perform the steps of the iris positioning and feature extraction method.
The method according to the present solution may also be stored in a computer-readable storage medium in the form of a program, which when executed by a processor implements the steps of the iris localization and feature extraction method.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The present invention is not limited to the above embodiments; any modifications, equivalent replacements, or improvements made within the spirit and principles of the present invention fall within the scope of the claims of the present application.
Claims (8)
1. An iris positioning and feature extraction method is characterized by comprising the following steps:
s1, selecting iris region segmentation threshold values based on a pre-constructed local gray distribution statistical model;
s2, screening the area where the iris is located by using a double-threshold coupling classifier constructed based on iris area segmentation thresholds to obtain a pupil area image and an iris area image;
s3, determining an effective iris area by using a boundary detector constructed based on the pupil area image and the iris area image;
s4, coding effective iris pixels of the effective iris area by using a local wavelet high-frequency energy tower-type transition model constructed based on the effective iris area to obtain iris feature codes;
wherein the step S4 includes:
the effective iris area IC is sampled in the radial direction and the arc direction and is converted into an effective iris pixel block RB;
partitioning RB on three resolution scales of lv x lv, 2lv x 2lv and 4lv x 4lv respectively, then performing 1, 2 and 3-level wavelet decomposition on subblocks with different resolution scales respectively, and constructing a local wavelet high-frequency energy tower:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function taking the wavelet high-frequency subband coefficients, and j denotes the wavelet high-frequency subband level;
according to the local wavelet high-frequency energy tower, constructing a local wavelet high-frequency energy tower type transition model:
where mean denotes the averaging function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function taking the wavelet high-frequency subband coefficients, j denotes the wavelet high-frequency subband level, and isgreater denotes a coefficient energy relation operator;
and coding the effective iris pixel block RB by using a local wavelet high-frequency energy tower-type transition model to obtain an iris characteristic coding array RC.
2. The method according to claim 1, wherein the step S1 includes:
respectively obtaining a high gray threshold local gray statistical model and a low gray threshold local gray statistical model according to the set local gray mean statistical operator and the pixel resolution of the iris image;
local gray statistical model ST using high gray thresholdHAnd low-gray threshold local gray statistic model STLRespectively determining:
the light-spot detection high gray threshold: T_H = median(gs(ST_H(end-h+1:end))), where median denotes the median function, gs denotes the gray-value function, h is the number of highest gray values taken from ST_H, and end is the index of the last element;
the pupil detection low gray threshold: T_L = median(gs(ST_L(1:l))), where l is the number of lowest gray values taken from ST_L.
3. The method according to claim 2, wherein the step S2 includes:
constructing a dual-threshold coupled classifier based on the high and low gray thresholds of iris region segmentation: IS = repmat(isinner(T_L, T_H), S), where repmat denotes a two-dimensional expansion function, S denotes the row-and-column size of the classifier IS, and isinner denotes a gray-interval relation operator;
segmenting the region where the pupil lies out of the iris image using the dual-threshold coupled classifier:
where numel denotes a non-zero-element count function, find denotes a relation-matching function, ⊗ denotes the convolution operation, n_h and n_l denote the high and low gray detection-count thresholds, respectively, Ω denotes the pupil region detected from the iris image IR, and IP is the pupil-region image block;
expanding the iris region using the pupil-region image block IP:
where size denotes a function taking the two-dimensional size of an image matrix, getcore denotes a function taking the central pixel block of an image, Φ denotes the iris region partitioned from the iris image IR, IB is the iris-region image block, and m and n denote the numbers of row and column pixels of IB, respectively.
4. The method according to claim 3, wherein the step S3 includes:
respectively constructing iris inner and outer boundary detectors in the pupil area image block IP and the iris area image block IB;
an iris-region inner boundary detector, where rot45 denotes the counterclockwise rotation function, α denotes the detection-direction weight vector, and EI is the iris inner boundary detector;
an iris-region outer boundary detector, where rot45 denotes the counterclockwise rotation function, β denotes the detection-direction weight vector, and EO is the iris outer boundary detector;
based on the iris-region inner boundary detector and the iris-region outer boundary detector, locating the iris inner boundary and the iris outer boundary, where sum denotes the summation function, ⊗ denotes the convolution operation, e_p denotes the iris inner-boundary neighborhood gray-gradient jump threshold, ω is an iris inner-boundary pixel, EP is the iris inner boundary, e_r denotes the iris outer-boundary neighborhood gray-gradient jump threshold, ψ is an iris outer-boundary pixel, and ER is the iris outer boundary;
according to the iris inner and outer boundaries, determining the effective iris area IC as the annular region bounded by the inner boundary EP and the outer boundary ER within the iris-region image block IB.
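The boundary-location idea of claim 4 (gray-gradient jumps at the inner and outer boundaries) can be illustrated with a simplified radial scan. This stands in for, and is not equivalent to, the rot45/α/β detectors: `radial_boundaries`, the 360-ray sampling, and the `gap` heuristic are assumptions; `jump_inner` and `jump_outer` play the roles of e_p and e_r.

```python
import numpy as np

def radial_boundaries(gray, center, r_max, jump_inner=30.0, jump_outer=10.0):
    # Sample the image along 360 rays from the pupil center and average
    # the gray value at each radius; a large jump in this profile marks
    # a boundary (jump_inner ~ e_p, jump_outer ~ e_r).
    cy, cx = center
    thetas = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
    radii = np.arange(1, r_max)
    ys = np.clip((cy + np.outer(radii, np.sin(thetas))).round().astype(int),
                 0, gray.shape[0] - 1)
    xs = np.clip((cx + np.outer(radii, np.cos(thetas))).round().astype(int),
                 0, gray.shape[1] - 1)
    profile = gray[ys, xs].mean(axis=1)
    grad = np.diff(profile)
    # Inner boundary: first radius where the gradient exceeds jump_inner.
    inner = int(np.argmax(grad > jump_inner)) + 1
    # Outer boundary: next jump past a small gap (skips the inner transition).
    gap = 3
    rest = grad[inner + gap:]
    outer = inner + gap + int(np.argmax(rest > jump_outer)) + 1
    return inner, outer
```

On a synthetic eye (dark disc, mid-gray annulus, bright surround) the two returned radii land near the pupil and limbus edges.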
5. An iris localization and feature extraction system, comprising:
the threshold selecting module is used for selecting iris region segmentation thresholds based on a pre-constructed local gray distribution statistical model;
the image screening module is used for screening the area where the iris is located by using a double-threshold coupling classifier constructed based on iris area segmentation thresholds to obtain a pupil area image and an iris area image;
the effective area determining module is used for determining an effective iris area by utilizing a boundary detector constructed based on the pupil area image and the iris area image;
the characteristic extraction module is used for coding effective iris pixels of the effective iris area by utilizing a local wavelet high-frequency energy tower-type transition model constructed based on the effective iris area to obtain iris characteristic codes;
wherein the feature extraction module comprises:
the transformation unit is used for transforming the effective iris area IC into an effective iris pixel block RB through radial and arc sampling;
the energy tower construction unit is used for partitioning RB at three resolution scales, lv×lv, 2lv×2lv and 4lv×4lv, then performing 1-, 2- and 3-level wavelet decomposition on the sub-blocks at the different resolution scales, respectively, and constructing a local wavelet high-frequency energy tower:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function extracting wavelet high-frequency subband coefficients, and j denotes the wavelet high-frequency subband level;
the transition model building unit is used for building a local wavelet high-frequency energy tower type transition model according to the local wavelet high-frequency energy tower:
where mean denotes the mean function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, db_i denotes the wavelet basis, i denotes the wavelet decomposition level, gethfb denotes the function extracting wavelet high-frequency subband coefficients, j denotes the wavelet high-frequency subband level, and isgreater denotes the coefficient-energy relation operator; the information fusion precision is determined by the maximum value of the fusion error;
and the coding unit is used for coding the effective iris pixel block RB by using a local wavelet high-frequency energy tower-type transition model to obtain an iris characteristic coding array RC.
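The wavelet high-frequency energy coding of claim 5 can be sketched with a hand-rolled Haar (db1) transform. The tiling scales and the parent/child energy comparison standing in for the `isgreater` operator are assumptions, and `iris_code`, `hf_energy` and `haar_dwt2` are illustrative names, not the patented model.

```python
import numpy as np

def haar_dwt2(block):
    # One level of the 2-D Haar (db1) transform: returns (LL, (LH, HL, HH)).
    a = (block[0::2, :] + block[1::2, :]) / 2.0   # row averages
    d = (block[0::2, :] - block[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, (lh, hl, hh)

def hf_energy(block, levels):
    # Sum of squared high-frequency subband coefficients over `levels`
    # of decomposition (the square + gethfb steps of the claim's notation).
    energy, ll = 0.0, np.asarray(block, dtype=float)
    for _ in range(levels):
        ll, (lh, hl, hh) = haar_dwt2(ll)
        energy += float((lh ** 2 + hl ** 2 + hh ** 2).sum())
    return energy

def iris_code(rb, lv=4):
    # Tile RB at the 2lv scale; each 2lv tile contributes four bits, one
    # per enclosed lv tile: 1 if the lv tile's 1-level high-frequency
    # energy exceeds a quarter of the parent's 2-level energy (a simple
    # stand-in for the isgreater energy-transition operator).
    rows, cols = rb.shape
    bits = []
    for r in range(0, rows - 2 * lv + 1, 2 * lv):
        for c in range(0, cols - 2 * lv + 1, 2 * lv):
            parent = hf_energy(rb[r:r + 2 * lv, c:c + 2 * lv], 2) / 4.0
            for dr in (0, lv):
                for dc in (0, lv):
                    child = hf_energy(rb[r + dr:r + dr + lv,
                                         c + dc:c + dc + lv], 1)
                    bits.append(1 if child > parent else 0)
    return np.array(bits, dtype=np.uint8)
```

A 16×16 block with lv = 4 yields four parent tiles and therefore a 16-bit code; a constant block has zero high-frequency energy at every level.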
6. The system of claim 5, wherein the threshold selection module comprises:
the model construction unit is used for obtaining a high-gray-threshold local gray statistical model and a low-gray-threshold local gray statistical model, respectively, according to the preset local gray-mean statistical operator and the pixel resolution of the iris image;
the threshold determination unit uses the high-gray-threshold local gray statistical model ST_H and the low-gray-threshold local gray statistical model ST_L to respectively determine:
the light-spot detection high gray threshold: T_H = median(gs(ST_H(end-h+1:end))), where median denotes the median function, gs denotes the gray-value function, h is the number of highest gray values taken from the end of ST_H, and end is the index of the last element;
the pupil detection low gray threshold: T_L = median(gs(ST_L(1:l))), where median denotes the median function, gs denotes the gray-value function, and l is the number of lowest gray values taken from the start of ST_L.
7. The system of claim 6, wherein the image filtering module comprises:
the classifier construction unit constructs a dual-threshold coupling classifier from the high gray threshold and the low gray threshold of the iris region segmentation: IS = isinner(repmat((T_L, T_H), S)), where repmat denotes the two-dimensional replication function, S denotes the row and column sizes of the classifier IS, and isinner denotes the gray-interval relation operator;
the segmentation unit is used for segmenting the region where the pupil is located from the iris image by using the dual-threshold coupling classifier:
where numel denotes the nonzero-element count function, find denotes the relation matching function, ⊗ denotes the convolution operation, n_h and n_l denote the high and low gray detection-count thresholds, respectively, Ω denotes the region where the pupil is detected in the iris image IR, and IP is the pupil-region image block;
an expansion unit which expands the iris area by using the pupil area image block IP:
where size denotes the function returning the two-dimensional size of an image matrix, getcore denotes the function extracting the central pixel block of an image, Φ denotes the iris region segmented from the iris image IR, IB is the iris-region image block, and m and n denote the numbers of row and column pixels of IB, respectively.
8. The system of claim 7, wherein the active area determination module comprises:
a boundary detector construction unit, which constructs an iris inner boundary detector and an iris outer boundary detector in the pupil-region image block IP and the iris-region image block IB, respectively;
an iris-region inner boundary detector, where rot45 denotes the counterclockwise rotation function, α denotes the detection-direction weight vector, and EI is the iris inner boundary detector;
an iris-region outer boundary detector, where rot45 denotes the counterclockwise rotation function, β denotes the detection-direction weight vector, and EO is the iris outer boundary detector;
a positioning unit that locates the iris inner boundary and the iris outer boundary based on the iris-region inner boundary detector and the iris-region outer boundary detector, where sum denotes the summation function, ⊗ denotes the convolution operation, e_p denotes the iris inner-boundary neighborhood gray-gradient jump threshold, ω is an iris inner-boundary pixel, EP is the iris inner boundary, e_r denotes the iris outer-boundary neighborhood gray-gradient jump threshold, ψ is an iris outer-boundary pixel, and ER is the iris outer boundary;
according to the iris inner and outer boundaries, the effective iris area IC is determined as the annular region bounded by the inner boundary EP and the outer boundary ER within the iris-region image block IB.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811139801.0A CN109325455B (en) | 2018-09-28 | 2018-09-28 | Iris positioning and feature extraction method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109325455A CN109325455A (en) | 2019-02-12 |
CN109325455B true CN109325455B (en) | 2021-11-30 |
Family
ID=65265985
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811139801.0A Active CN109325455B (en) | 2018-09-28 | 2018-09-28 | Iris positioning and feature extraction method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109325455B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113837993B (en) * | 2021-07-29 | 2024-01-30 | 天津中科智能识别产业技术研究院有限公司 | Lightweight iris image segmentation method and device, electronic equipment and storage medium |
CN113903077A (en) * | 2021-10-22 | 2022-01-07 | 深圳市集虹鼎源科技有限公司 | Colorful iris ring positioning and extracting method based on pupil positioning |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101266645A (en) * | 2008-01-24 | 2008-09-17 | 电子科技大学中山学院 | Iris positioning method based on multi-resolutions analysis |
CN107844736A (en) * | 2016-09-19 | 2018-03-27 | 北京眼神科技有限公司 | iris locating method and device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2929487A4 (en) * | 2012-12-10 | 2016-08-10 | Stanford Res Inst Int | Iris biometric matching system |
Non-Patent Citations (3)
Title |
---|
Image Compression Based on Compressed Sensing Theory and Wavelet Packet Analysis; Huijie Guo et al.; 2011 Cross Strait Quad-Regional Radio Science and Wireless Technology Conference; 2011-10-10; full text *
Iris image segmentation method based on a variational level set model; Zhang Heping et al.; Computer Engineering; 2013-10-31; full text *
A survey of iris recognition technology; Han Yiliang et al.; 2015 National Defense Radio & Electrical Metrology and Testing Academic Exchange Conference; 2015-12-31; full text *
Also Published As
Publication number | Publication date |
---|---|
CN109325455A (en) | 2019-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113591967B (en) | Image processing method, device, equipment and computer storage medium | |
US9799115B2 (en) | Apparatus and method for automatically registering landmarks in three-dimensional medical image | |
CN102541954B (en) | Method and system for searching trademarks | |
CN109961446B (en) | CT/MR three-dimensional image segmentation processing method, device, equipment and medium | |
CN109190742B (en) | Decoding method of coding feature points based on gray feature | |
CN109325455B (en) | Iris positioning and feature extraction method and system | |
Won et al. | Automatic object segmentation in images with low depth of field | |
CN101999231A (en) | System and method for enhancing the visibility of an object in a digital picture | |
CN108960280B (en) | Picture similarity detection method and system | |
CN104463814A (en) | Image enhancement method based on local texture directionality | |
CN105335952A (en) | Matching cost calculation method and apparatus, and parallax value calculation method and equipment | |
KR20170040983A (en) | Method and apparatus of image denoising using multi-scale block region detection | |
CN111160477A (en) | Image template matching method based on feature point detection | |
CN114241444A (en) | Lane line recognition method and apparatus, storage medium, and electronic apparatus | |
CN112258532B (en) | Positioning and segmentation method for callus in ultrasonic image | |
CN112258449A (en) | Rapid nodule matching method based on nodule characteristics | |
US9659227B2 (en) | Detecting object from image data using feature quantities | |
CN113822818B (en) | Speckle extraction method, device, electronic device, and storage medium | |
CN111986078B (en) | Multi-scale core CT image fusion reconstruction method based on guide data | |
CN113538483B (en) | Coding and decoding method and measuring method of high-precision close-range photogrammetry mark | |
CN112258534B (en) | Method for positioning and segmenting small brain earthworm parts in ultrasonic image | |
CN111753723B (en) | Fingerprint identification method and device based on density calibration | |
JP4560434B2 (en) | Change region extraction method and program of the method | |
CN112950652A (en) | Robot and hand image segmentation method and device thereof | |
Halheit et al. | Rigid Image Registration using Mutual Information and Wavelet Transform
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||