CN109410160A - Infrared polarization image fusion method driven by multiple features and feature differences - Google Patents


Info

Publication number
CN109410160A
CN109410160A (application CN201811180813.8A); granted publication CN109410160B
Authority
CN
China
Prior art keywords
image
minutia
polarization
dark
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811180813.8A
Other languages
Chinese (zh)
Other versions
CN109410160B (en)
Inventor
冉骏 (Ran Jun)
宋斌 (Song Bin)
陈蓉 (Chen Rong)
Current Assignee
Hunan Source Letter Photoelectric Polytron Technologies Inc
Original Assignee
Hunan Source Letter Photoelectric Polytron Technologies Inc
Priority date
Filing date
Publication date
Application filed by Hunan Source Letter Photoelectric Polytron Technologies Inc
Priority claimed from application CN201811180813.8A
Publication of CN109410160A
Application granted
Publication of CN109410160B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G06T 2207/10048: Infrared image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20172: Image enhancement details
    • G06T 2207/20192: Edge enhancement; Edge preservation
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides an infrared polarization image fusion method driven by multiple features and feature differences, comprising the steps of: representing the polarization of light with the Stokes vector and computing the degree-of-polarization image P and the angle-of-polarization image R; linearly weighting the angle-of-polarization image R with U to obtain image R'; computing the exclusive part of each of images R', I and P that remains after removing their common part, denoted R1, I1 and P1; mapping P1, I1 and R1 to the R, G and B channels of RGB space to obtain an RGB image, converting the RGB image to a YUV image and extracting the luminance component Y; fusing images I1 and P1 by the method based on multi-feature separation to obtain F; replacing Y with F and applying the inverse transform to the resulting YUV image to obtain an RGB image, which is the polarization image fusion result. Multiple polarization quantities of the infrared polarization image are fused, so the fused image scene is richer, which helps identify camouflaged targets. The invention applies to the field of computer vision.

Description

Infrared polarization image fusion method driven by multiple features and feature differences
Technical field
The present invention relates to the field of computer vision, and in particular to an infrared polarization image fusion method driven by multiple features and feature differences.
Background technique
As infrared imaging detection develops rapidly in military, medical, security, earth-observation and other application fields, traditional infrared detection appears increasingly inadequate in the face of rising precision requirements, complex environments and camouflage. A traditional infrared imaging system mainly images the infrared radiation intensity of a scene, which is chiefly related to the temperature and emissivity of the objects in it. When a noise source with the same temperature is placed around a target, an existing thermal imager cannot distinguish the target, so infrared imaging faces serious limitations and challenges.
Compared with traditional infrared imaging, polarization imaging of light can reduce the degradation effects of complex scenes while also providing structural and range information about the scene. Infrared polarization imaging can detect both the infrared intensity information and the polarization information of a scene, significantly improving the contrast between a target object and its natural background and better rendering object contours and details. Polarization measurements can therefore be used to improve the quality of infrared images and to detect signals against complex backgrounds.
Polarization is a basic property of light that cannot be observed directly by the human eye, so polarization information must be presented in some form that the eye can perceive or a computer can process. The polarization state of light is represented by the Stokes vector, whose four Stokes parameters describe the polarization state and intensity of light; they are time averages of the light intensity, have the dimension of intensity, and can be detected directly by a detector. With four linear-polarizer orientations the Stokes vector S can be written as
S = (I, Q, U, V)^T, with I = (I1 + I2 + I3 + I4)/2, Q = I1 - I3, U = I2 - I4,
where I1, I2, I3 and I4 are the intensity images collected at polarization directions of 0°, 45°, 90° and 135° respectively; I is the total intensity of the light; Q is the intensity difference between horizontal and vertical polarization; U is the intensity difference between the 45° and 135° polarization directions; and V is the intensity difference between the left- and right-circularly polarized components of the light.
Among these, the angle-of-polarization image describes different surface orientations well, while the degree-of-polarization image contains the polarization information of objects, characterizes man-made targets well and improves the contrast between target and background; the total-intensity image reflects the intensity information of the scene. Existing polarization image fusion methods that consider only a single difference feature cannot effectively describe all of the uncertain, randomly varying features in an image, so valuable information is lost during fusion and both fusion and recognition fail; at the same time, such methods find it difficult to balance contrast, bright features and edge-detail features during fusion.
Summary of the invention
To address the prior-art problem that fusion algorithms considering only a single difference feature cannot effectively describe all of the uncertain, randomly varying features in an image, the object of the present invention is to provide an infrared polarization image fusion method driven by multiple features and feature differences. It fuses multiple polarization quantities of the infrared polarization image (image Q, image U, image V, the total-intensity image I, the degree-of-polarization image P and the angle-of-polarization image R), so that the fused image scene is richer, and it combines several uncertain, randomly varying image features so that fusion takes multiple image features into account, enhances edge-detail information and improves contrast; the fusion result helps identify camouflaged targets. In addition, by extracting the exclusive part of each polarization image during fusion, the information redundancy among the polarization quantities is effectively resolved.
The technical solution adopted by the present invention is as follows:
An infrared polarization image fusion method driven by multiple features and feature differences, specifically comprising the following steps:
S1. Represent the polarization of light with the Stokes vector, i.e. S = (I, Q, U, V), and compute the degree-of-polarization image P and the angle-of-polarization image R from S;
S2. Linearly weight the angle-of-polarization image R with U to obtain image R';
S3. Compute the exclusive part of each of image R', the total-intensity image I and the degree-of-polarization image P that remains after removing their common part, denoted R1, I1 and P1 respectively;
S4. Map images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image, convert the RGB image to a YUV image and extract the luminance component Y;
S5. Fuse images I1 and P1 by the method based on multi-feature separation to obtain the fusion result F;
S6. Replace the luminance component Y of step S4 with the fusion result F of step S5, then apply the inverse transform to the resulting YUV image to obtain an RGB image, which is the final polarization image fusion result.
As a further improvement of the above technical solution, step S1 specifically comprises:
S11. Compute the degree-of-polarization image P:
P = sqrt(Q^2 + U^2)/I
where Q is the intensity difference between horizontal and vertical polarization, U is the intensity difference between the 45° and 135° polarization directions, and I is the total-intensity image;
S12. Compute the angle-of-polarization image R:
R = (1/2)·arctan(U/Q).
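Steps S11 and S12 can be sketched in a few lines of NumPy; the function name and the small eps guard against division by zero are illustrative assumptions, not part of the patent:

```python
import numpy as np

def stokes_images(I0, I45, I90, I135, eps=1e-6):
    """Stokes parameters from four linear-polarizer intensity images,
    then the degree-of-polarization image P and angle image R (S11-S12)."""
    I = 0.5 * (I0 + I45 + I90 + I135)   # total intensity
    Q = I0 - I90                        # horizontal vs vertical difference
    U = I45 - I135                      # 45 deg vs 135 deg difference
    P = np.sqrt(Q**2 + U**2) / (I + eps)
    R = 0.5 * np.arctan2(U, Q)          # arctan2 avoids the Q = 0 singularity
    return I, Q, U, P, R
```

Using `arctan2` rather than a plain `arctan(U/Q)` keeps the angle well defined where Q vanishes.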
As a further improvement of the above technical solution, step S3 specifically comprises:
S31. Compute the common part Co among image R', the total-intensity image I and the degree-of-polarization image P:
Co = R' ∩ I ∩ P = min{R', I, P};
S32. Compute the exclusive parts R1, I1 and P1 of image R', the total-intensity image I and the degree-of-polarization image P:
R1 = R' - Co, I1 = I - Co, P1 = P - Co.
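The common-part removal of S31 and S32 is a pixelwise minimum followed by subtraction; a minimal NumPy sketch (function name assumed):

```python
import numpy as np

def exclusive_parts(Rp, I, P):
    """S31-S32: the common part Co is the pixelwise minimum of the three
    images; subtracting it leaves each image's exclusive part."""
    Co = np.minimum(np.minimum(Rp, I), P)
    return Rp - Co, I - Co, P - Co   # R1, I1, P1
```

At every pixel, exactly the image(s) achieving the minimum end up with an exclusive value of zero there, so the redundancy shared by all three images is removed once rather than three times.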
As a further improvement of the above technical solution, step S4 specifically comprises:
S41. Map images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image;
S42. Convert the RGB image to a YUV image;
S43. Extract the luminance component Y:
Y = 0.299R + 0.587G + 0.114B.
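Steps S41 to S43, together with the luminance replacement of step S6, amount to a round trip through an RGB/YUV matrix pair. The sketch below assumes the standard BT.601 coefficients (the Y row matches the formula Y = 0.299R + 0.587G + 0.114B); the function name is illustrative:

```python
import numpy as np

# BT.601-style RGB -> YUV matrix; the first row produces the luminance Y.
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.147, -0.289,  0.436],
                    [ 0.615, -0.515, -0.100]])

def fuse_luminance(P1, I1, R1, F):
    """S41-S43 and S6: map P1/I1/R1 to the R/G/B channels, convert to
    YUV, replace the luminance Y by the fusion result F, and invert."""
    rgb = np.stack([P1, I1, R1], axis=-1)     # P1->R, I1->G, R1->B
    yuv = rgb @ RGB2YUV.T
    yuv[..., 0] = F                            # replace Y with F
    return yuv @ np.linalg.inv(RGB2YUV).T      # inverse transform to RGB
```

If F equals the original Y, the round trip reproduces the input RGB image, which is a convenient sanity check.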
As a further improvement of the above technical solution, step S5 specifically comprises:
S51. Perform multi-feature separation on images I1 and P1 to obtain the bright, dark and detail feature images of image I1 and the bright, dark and detail feature images of image P1;
S52. Fuse the bright feature image of I1 with the bright feature image of P1 to obtain the bright-feature fusion result FL;
S53. Fuse the dark feature image of I1 with the dark feature image of P1 to obtain the dark-feature fusion result FD;
S54. Fuse the detail feature image of I1 with the detail feature image of P1 to obtain the detail-feature fusion result FDIF;
S55. Fuse FL, FD and FDIF to obtain the fusion result F.
As a further improvement of the above technical solution, in step S51 the multi-feature separation method based on the dark channel prior is applied to images I1 and P1 respectively, specifically comprising:
S511. Compute the dark channel images of I1 and P1:
I1_dark(x) = min_{y∈N(x)} ( min_C (I1)_C(y) ), P1_dark(x) = min_{y∈N(x)} ( min_C (P1)_C(y) )
where I1_dark is the dark channel image of I1, P1_dark is the dark channel image of P1, C ranges over the three color channels R, G and B of I1 or P1, N(x) is the window centered on pixel x, and (I1)_C(y) and (P1)_C(y) denote a single color-channel image of I1 and P1 respectively;
S512. Invert images I1 and P1 to obtain images I1' and P1'; fuse the dark channel images I1_dark and P1_dark with I1' and P1' respectively under the absolute-value-take-small rule to obtain the dark feature image D_I1 of I1 and the dark feature image D_P1 of P1;
S513. Subtract the dark feature images D_I1 and D_P1 from the corresponding dark channel images I1_dark and P1_dark to obtain the bright feature image L_I1 of I1 and the bright feature image L_P1 of P1;
S514. Subtract the dark channel images I1_dark and P1_dark from images I1 and P1 respectively to obtain the detail feature image T_I1 of I1 and the detail feature image T_P1 of P1.
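For grayscale feature images the channel minimum in S511 reduces to a spatial minimum filter; the sketch below makes that assumption, and the window size and function names are illustrative:

```python
import numpy as np

def dark_channel(img, t=2):
    """Local minimum over a (2t+1) x (2t+1) window: for a grayscale
    image the channel minimum of S511 reduces to this spatial minimum."""
    p = np.pad(img, t, mode='edge')
    out = np.empty_like(img)
    H, W = img.shape
    for m in range(H):
        for n in range(W):
            out[m, n] = p[m:m + 2*t + 1, n:n + 2*t + 1].min()
    return out

def multi_feature_split(img, t=2):
    """S511-S514: dark channel, then dark / bright / detail features,
    for an image normalized to [0, 1]."""
    dc = dark_channel(img, t)
    inv = 1.0 - img                             # inverted image (S512)
    dark = np.minimum(np.abs(dc), np.abs(inv))  # absolute-value-take-small
    bright = dc - dark                          # S513
    detail = img - dc                           # S514
    return dark, bright, detail
```

Since the dark channel never exceeds the original pixel values, the detail image is non-negative, and because the dark feature never exceeds the dark channel, the bright image is non-negative as well.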
As a further improvement of the above technical solution, in step S52 the bright feature images of I1 and P1 are fused by a matching method based on local-area energy features, specifically comprising:
S521. Compute the Gaussian-weighted local energy of the bright feature images L_I1 and L_P1:
E_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·[L_k(m+i, n+j)]^2
where k = I1 or P1, E_k(m, n) is the Gaussian-weighted local energy of bright feature image L_k centered at point (m, n), w(i, j) is the Gaussian filter matrix, N is the region size and t = (N-1)/2;
S522. Compute the Gaussian-weighted local energy matching degree of the bright feature images:
M_E(m, n) = 2·Σ_{i,j} w(i, j)·L_I1(m+i, n+j)·L_P1(m+i, n+j) / [E_I1(m, n) + E_P1(m, n)]
where M_E(m, n) is the matching degree and E_I1(m, n) and E_P1(m, n) are the Gaussian-weighted local energies of L_I1 and L_P1 centered at point (m, n);
S523. Fuse the bright feature images by Gaussian-weighted local energy and matching degree: F_L(m, n) is the fusion result of L_I1 and L_P1 and Tl is the threshold for judging bright-feature similarity; if M_E(m, n) < Tl, the regions of L_I1 and L_P1 centered at (m, n) are dissimilar and the fusion result takes the pixel of the image with the larger Gaussian-weighted local energy; otherwise the fusion result is a coefficient-weighted average of the two.
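A sketch of S521 to S523 using a standard local-energy matching degree; where the patent leaves the averaging coefficients unspecified, an equal-weight average is assumed, and the window size and threshold are illustrative:

```python
import numpy as np

def gauss_kernel(N=3, sigma=1.0):
    """Normalized Gaussian filter matrix w(i, j) of size N x N."""
    t = (N - 1) // 2
    y, x = np.mgrid[-t:t+1, -t:t+1]
    w = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return w / w.sum()

def fuse_bright(A, B, Tl=0.4, N=3):
    """S521-S523: Gaussian-weighted local energy, energy matching degree,
    then choose-max below Tl and an (assumed) equal-weight average above."""
    t = (N - 1) // 2
    w = gauss_kernel(N)
    pA = np.pad(A, t, mode='edge'); pB = np.pad(B, t, mode='edge')
    H, W = A.shape
    EA = np.empty_like(A); EB = np.empty_like(B); M = np.empty_like(A)
    for m in range(H):
        for n in range(W):
            a = pA[m:m+N, n:n+N]; b = pB[m:m+N, n:n+N]
            EA[m, n] = (w * a**2).sum()
            EB[m, n] = (w * b**2).sum()
            M[m, n] = 2 * (w * a * b).sum() / (EA[m, n] + EB[m, n] + 1e-12)
    choose_max = np.where(EA >= EB, A, B)
    return np.where(M < Tl, choose_max, 0.5 * (A + B))
```

Identical inputs give a matching degree near 1 everywhere, so the average branch is taken and the input is returned unchanged.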
As a further improvement of the above technical solution, in step S53 the dark feature images of I1 and P1 are fused by a matching method based on local-area weighted variance features, specifically comprising:
S531. Compute the local-area weighted variance energy of the dark feature images D_I1 and D_P1:
V_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·[D_k(m+i, n+j) - μ_k(m, n)]^2
where k = I1 or P1, V_k(m, n) is the local-area weighted variance energy of dark feature image D_k centered at point (m, n), w(i, j) is the Gaussian filter matrix, N is the region size, t = (N-1)/2, and μ_k(m, n) is the local-area average centered at point (m, n);
S532. Compute the local-area weighted variance energy matching degree of the dark feature images:
M_V(m, n) = 2·Σ_{i,j} w(i, j)·[D_I1(m+i, n+j) - μ_I1(m, n)]·[D_P1(m+i, n+j) - μ_P1(m, n)] / [V_I1(m, n) + V_P1(m, n)]
where M_V(m, n) is the matching degree and V_I1(m, n) and V_P1(m, n) are the local-area weighted variance energies of D_I1 and D_P1 centered at point (m, n);
S533. Fuse the two dark feature images by local-area weighted variance energy and its matching degree: F_D(m, n) is the fusion result of D_I1 and D_P1 and Th is the threshold for judging dark-feature similarity; if M_V(m, n) < Th, the regions of D_I1 and D_P1 centered at (m, n) are dissimilar and the fusion result takes the pixel of the image with the larger local-area weighted variance energy; otherwise the fusion result is a coefficient-weighted average of the two.
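S531 to S533 follow the same pattern with variance in place of energy; this sketch assumes uniform window weights and an equal-weight average above the threshold:

```python
import numpy as np

def fuse_dark(A, B, Th=0.7, N=3, w=None):
    """S531-S533: local weighted variance energy per dark feature image,
    a variance matching degree, choose-max when the match is below Th,
    and an (assumed) equal-weight average otherwise."""
    t = (N - 1) // 2
    if w is None:
        w = np.ones((N, N)) / (N * N)   # uniform weights as a stand-in
    pA = np.pad(A, t, mode='edge'); pB = np.pad(B, t, mode='edge')
    H, W = A.shape
    VA = np.empty_like(A); VB = np.empty_like(B); M = np.empty_like(A)
    for m in range(H):
        for n in range(W):
            a = pA[m:m+N, n:n+N]; b = pB[m:m+N, n:n+N]
            da, db = a - a.mean(), b - b.mean()
            VA[m, n] = (w * da**2).sum()
            VB[m, n] = (w * db**2).sum()
            M[m, n] = 2 * (w * da * db).sum() / (VA[m, n] + VB[m, n] + 1e-12)
    choose_max = np.where(VA >= VB, A, B)
    return np.where(M < Th, choose_max, 0.5 * (A + B))
```

Variance matching is less sensitive to overall brightness than energy matching, which suits the dark feature images: they carry the contours rather than the bright regions.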
As a further improvement of the above technical solution, in step S54 the detail feature images of I1 and P1 are fused using fuzzy logic and feature-difference driving, specifically comprising:
S541. Compute the local gradients of the detail feature images T_I1 and T_P1:
T_k(m, n) = sqrt(Gx_k(m, n)^2 + Gy_k(m, n)^2)
where k = I1 or P1, T_k(m, n) is the local gradient of the detail feature image at pixel (m, n), and Gx_k(m, n) and Gy_k(m, n) are the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical templates of the Sobel operator;
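The Sobel-based local gradient of S541 can be sketched as follows (a direct per-pixel correlation; the function name is assumed):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T

def local_gradient(img):
    """S541: horizontal/vertical Sobel responses and their magnitude
    as the local gradient T(m, n)."""
    img = np.asarray(img, dtype=float)
    p = np.pad(img, 1, mode='edge')
    H, W = img.shape
    gx = np.empty_like(img); gy = np.empty_like(img)
    for m in range(H):
        for n in range(W):
            win = p[m:m+3, n:n+3]
            gx[m, n] = (win * SOBEL_X).sum()
            gy[m, n] = (win * SOBEL_Y).sum()
    return np.sqrt(gx**2 + gy**2)
```

The magnitude is unaffected by the convolution-versus-correlation sign convention, so either orientation of the Sobel templates gives the same T(m, n).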
S542. Compute the local-area weighted variance energy of the detail feature images:
V_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·[T_k(m+i, n+j) - μ_k(m, n)]^2
where k = I1 or P1, V_k(m, n) is the local-area weighted variance energy of the detail feature image centered at point (m, n), w(i, j) is the Gaussian filter matrix, N is the region size, t = (N-1)/2, and μ_k(m, n) is the local-area average centered at point (m, n);
S543. Compute the local difference gradient ΔT(m, n) = T_I1(m, n) - T_P1(m, n) and the local difference variance ΔV(m, n) = V_I1(m, n) - V_P1(m, n) of the detail feature images, together with the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V1(m, n), the matching degrees being defined analogously to steps S522 and S532; here T_I1(m, n) and T_P1(m, n) are the local gradients of the two detail feature images at pixel (m, n), and V_I1(m, n) and V_P1(m, n) are their local-area weighted variance energies centered at (m, n);
S544. From the local difference gradient and local difference variance, construct the pixel-based decision map PDG(m, n); from the local gradient matching degree and local weighted variance matching degree, construct the feature-difference-degree decision map DDG(m, n). Here g1 to g9 denote decision maps whose value at pixel (m, n) is 1 where the corresponding condition holds and 0 at all other pixel positions, and d1 and d2 are decision maps defined likewise;
S545. From the pixel-based decision map PDG(m, n) and the feature-difference-degree decision map DDG(m, n), partition the detail feature images T_I1 and T_P1 into a determined region and an uncertain region: g1 to g8 belong to the determined region and g9 belongs to the uncertain region;
S546. Fuse the determined region of the detail feature images T_I1 and T_P1 by feature-difference driving:
DIF(m, n) = ΔT(m, n)·ΔV(m, n)
where DIF(m, n) is the determined-region fusion driving factor, the result is the determined-region fused image, and "*" denotes the product of the values at corresponding pixel positions of two matrices;
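The determined-region fusion of S546 is driven by the factor DIF; the patent's per-case rules g1 to g8 are not reproduced here, so a simple vote on the signs of the local differences stands in (an assumption, as are the function and argument names):

```python
import numpy as np

def fuse_determined(TA, TB, VA, VB, A, B):
    """S546 sketch: local difference gradient/variance and the driving
    factor DIF = dT * dV; in the determined region the source with the
    stronger combined gradient/variance evidence wins (assumed rule)."""
    dT = TA - TB                 # local difference gradient
    dV = VA - VB                 # local difference variance
    DIF = np.abs(dT * dV)        # determined-region fusion driving factor
    prefer_A = (dT + dV) >= 0    # assumed vote on which source dominates
    return np.where(prefer_A, A, B), DIF
```

When both differences point the same way the choice is unambiguous; the magnitude of DIF indicates how strongly the evidence drives that choice.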
S547. Fuse the uncertain region of the detail feature images T_I1 and T_P1 using fuzzy logic theory:
μ_{T∩V}(P_k(m, n)) = min[μ_T(P_k(m, n)), μ_V(P_k(m, n))]
where the result is the uncertain-region fused image, "*" denotes the product and "/" the quotient of the values at corresponding pixel positions of two matrices, μ_T(P_k(m, n)) is the membership function of the proposition "the local gradient of the detail feature image is large", μ_V(P_k(m, n)) is the membership function of "the local weighted variance of the detail feature image is large", the minimum of the two memberships expresses the importance of each pixel to the uncertain-region fused image, and k = I1 or P1;
S548. Fuse the determined-region and uncertain-region results to obtain the fusion result F_DIF(m, n) of the detail feature images T_I1 and T_P1;
S549. Apply a consistency check to F_DIF(m, n):
Move a window of size 3 × 3 over image F_DIF(m, n) and verify each center pixel against its surrounding pixels; if the center pixel comes from one of the two detail feature images while s (4 < s < 8) of its surrounding pixels all come from the other image, the center pixel value is changed to the other image's pixel value at that position. Traversing the whole image F_DIF(m, n) with the window yields the corrected F_DIF(m, n).
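The consistency check of S549 can be sketched as follows; identifying a pixel's source by value equality is a simplification that assumes the two source values differ at each pixel:

```python
import numpy as np

def consistency_check(F, A, B, s=5):
    """S549: slide a 3x3 window; if the centre pixel came from one
    source but more than s of its 8 neighbours came from the other,
    flip the centre to the other source's value."""
    src = (F == A)               # True where the fused pixel came from A
    out = F.copy()
    H, W = F.shape
    for m in range(1, H - 1):
        for n in range(1, W - 1):
            win = src[m-1:m+2, n-1:n+2]
            neigh_A = win.sum() - win[1, 1]   # neighbours taken from A
            if src[m, n] and (8 - neigh_A) > s:
                out[m, n] = B[m, n]
            elif (not src[m, n]) and neigh_A > s:
                out[m, n] = A[m, n]
    return out
```

A practical implementation would track the decision map from the fusion step instead of recovering it by comparison, but the flipping rule is the same.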
As a further improvement of the above technical solution, in step S55 the fusion result F is obtained as
F = α·FL + β·FD + γ·FDIF
where α, β and γ are fusion weight coefficients.
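Step S55 is a plain weighted combination; the weight values below are illustrative, since the patent leaves α, β and γ as tunable coefficients:

```python
import numpy as np

def combine(FL, FD, FDIF, alpha=0.4, beta=0.3, gamma=0.3):
    """S55: F = alpha*FL + beta*FD + gamma*FDIF with tunable weights."""
    return alpha * FL + beta * FD + gamma * FDIF
```

Choosing weights that sum to 1 keeps the fused luminance in the same range as the inputs, which matters when F later replaces the Y channel.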
Advantageous effects of the invention:
1. The method fuses multiple polarization quantities of the infrared polarization image (image Q, image U, image V, the total-intensity image I, the degree-of-polarization image P and the angle-of-polarization image R), so the fused image scene is richer and camouflaged targets are easier to identify; at the same time, by removing the common part among the polarization images and keeping each image's exclusive part during fusion, the information redundancy among the polarization quantities is effectively resolved.
2. The method separates multiple features of the image and combines several uncertain, randomly varying image features, so that fusion takes multiple image features into account, enhances the edge-detail information of the image and improves its contrast.
Detailed description of the invention
Fig. 1 is the flow chart of the infrared polarization image fusion method driven by multiple features and feature differences of this embodiment;
Fig. 2 is the flow chart of the fusion method based on multi-feature separation described in this embodiment.
Specific embodiment
To facilitate implementation of the invention, it is further described below with reference to specific examples.
As shown in Fig. 1, an infrared polarization image fusion method driven by multiple features and feature differences specifically comprises the following steps:
S1. Represent the polarization of light with the Stokes vector, i.e. S = (I, Q, U, V), and compute the degree-of-polarization image P and the angle-of-polarization image R from S.
In practice a phase retarder is usually not used, and the Stokes parameters are obtained only by rotating a linear polarizer. The degree-of-polarization image P and angle-of-polarization image R of the polarized light can therefore be expressed as
P = sqrt(Q^2 + U^2)/I, R = (1/2)·arctan(U/Q).
S2. Linearly weight the angle-of-polarization image R with U to obtain image R':
R' = (R + U)/2.
S3. Compute the exclusive part of each of image R', the total-intensity image I and the degree-of-polarization image P that remains after removing their common part, denoted R1, I1 and P1 respectively, specifically comprising:
S31. Redundant and complementary information exists among image R', the total-intensity image I and the degree-of-polarization image P; compute their common part Co by the formula
Co = R' ∩ I ∩ P = min{R', I, P};
S32. Compute the exclusive parts R1, I1 and P1 of image R', the total-intensity image I and the degree-of-polarization image P:
R1 = R' - Co, I1 = I - Co, P1 = P - Co.
S4. Map images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image, convert the RGB image to a YUV image (U and V are the chrominance components, Y the luminance component) and extract the luminance component Y:
S41. Map image P1 to the R channel, image I1 to the G channel and image R1 to the B channel of RGB space to obtain the RGB image;
S42. Convert the RGB image to a YUV image;
S43. Extract the luminance component Y:
Y = 0.299R + 0.587G + 0.114B.
S5. With reference to Fig. 2, fuse images I1 and P1 by the method based on multi-feature separation to obtain the fusion result F. The method first performs multi-feature separation on images I1 and P1, then fuses the corresponding features of the two images, and finally fuses the per-feature fusion results, completing the fusion of I1 and P1. In this embodiment the method separates the dark features, bright features and detail features of the images and combines several uncertain, randomly varying image characteristics: local-area energy features, local-area variance features and local-area gradients. Because the relationships between image pixels are taken into account, the fusion balances bright and dark features, enhances the edge-detail information of the image and improves its contrast. It specifically comprises:
S51. Apply the multi-feature separation method based on the dark channel prior to images I1 and P1 respectively, obtaining the bright, dark and detail feature images of I1 and of P1. The dark channel was used by He et al. to estimate transmittance in the atmospheric scattering model and achieve fast dehazing of natural images: in a natural image, the regions visibly affected by haze are usually the brightest pixels of the dark channel, while haze-free regions have very low dark channel values. For a grayscale image the dark channel therefore contains the bright regions of the original image and embodies its low-frequency part, retaining the regions where gray levels vary gently so that the bright-dark feature difference becomes more prominent, while losing the local regions with sharper gray-level variation and high contrast, especially edge details. The computation specifically comprises:
S511. Compute the dark channel images of I1 and P1:
I1_dark(x) = min_{y∈N(x)} ( min_C (I1)_C(y) ), P1_dark(x) = min_{y∈N(x)} ( min_C (P1)_C(y) )
where I1_dark is the dark channel image of I1, P1_dark is the dark channel image of P1, C ranges over the three color channels R, G and B of I1 or P1, N(x) is the window centered on pixel x, and (I1)_C(y) and (P1)_C(y) denote a single color-channel image of I1 and P1 respectively;
S512. Invert images I1 and P1 to obtain images I1' and P1'; fuse the dark channel images I1_dark and P1_dark with I1' and P1' respectively under the absolute-value-take-small rule to obtain the dark feature image D_I1 of I1 and the dark feature image D_P1 of P1;
S513. Subtract the dark feature images D_I1 and D_P1 from the corresponding dark channel images I1_dark and P1_dark to obtain the bright feature image L_I1 of I1 and the bright feature image L_P1 of P1;
S514. Subtract the dark channel images I1_dark and P1_dark from images I1 and P1 respectively to obtain the detail feature image T_I1 of I1 and the detail feature image T_P1 of P1.
S52. Fuse the bright feature images of I1 and P1 by the matching method based on local-area energy features to obtain the bright-feature fusion result FL. The bright feature images concentrate the bright regions of the source images and embody their low-frequency components. The computation specifically comprises:
S521. Compute the Gaussian-weighted local energy of the bright feature images L_I1 and L_P1:
E_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·[L_k(m+i, n+j)]^2
where k = I1 or P1, E_k(m, n) is the Gaussian-weighted local energy of bright feature image L_k centered at point (m, n), w(i, j) is the Gaussian filter matrix, N is the region size and t = (N-1)/2;
S522. Compute the Gaussian-weighted local energy matching degree M_E(m, n) of the bright feature images L_I1 and L_P1, where E_I1(m, n) and E_P1(m, n) are their Gaussian-weighted local energies centered at point (m, n);
S523. Fuse the bright feature images by Gaussian-weighted local energy and matching degree: F_L(m, n) is the fusion result of L_I1 and L_P1 and Tl is the threshold for judging bright-feature similarity, with a value between 0 and 0.5; if M_E(m, n) < Tl, the regions of L_I1 and L_P1 centered at (m, n) are dissimilar and the fusion result takes the pixel of the image with the larger Gaussian-weighted local energy; otherwise the fusion result is a coefficient-weighted average of the two.
S53. Fuse the dark feature images of I1 and P1 by the matching method based on local-area weighted variance features to obtain the dark-feature fusion result FD. The dark feature images lack the bright regions of the source images but can still be regarded as approximations of them: they contain the main energy of the image and embody its basic contours. The computation specifically comprises:
S531. Compute the local-area weighted variance energy of the dark feature images D_I1 and D_P1:
V_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·[D_k(m+i, n+j) - μ_k(m, n)]^2
where k = I1 or P1, V_k(m, n) is the local-area weighted variance energy of dark feature image D_k centered at point (m, n), w(i, j) is the Gaussian filter matrix, N is the region size, t = (N-1)/2, and μ_k(m, n) is the local-area average centered at point (m, n);
S532. Compute the local-area weighted variance energy matching degree M_V(m, n) of the dark feature images D_I1 and D_P1, where V_I1(m, n) and V_P1(m, n) are their local-area weighted variance energies centered at point (m, n);
S533. Fuse the two dark feature images by local-area weighted variance energy and its matching degree: F_D(m, n) is the fusion result of D_I1 and D_P1 and Th is the threshold for judging dark-feature similarity, with a value between 0.5 and 1; if M_V(m, n) < Th, the regions of D_I1 and D_P1 centered at (m, n) are dissimilar and the fusion result takes the pixel of the image with the larger local-area weighted variance energy; otherwise the fusion result is a coefficient-weighted average of the two.
S54, partial gradient and local variance can be well reflected the detailed information of image, express the clear of image Degree.In order to retain the detailed information of minutia image as much as possible, clarity is promoted, is driven using fuzzy logic and feature difference Dynamic blending image I1Minutia image and image P1Minutia image, obtain minutia fusion results FDIF, calculate Process specifically includes:
S541, compute the local gradients of the two detail feature images:
In the formula, k = I1 or P1; the gradient term denotes the local gradient of the detail feature image of I1 or P1 at pixel (m, n), and the two edge terms denote the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical Sobel templates, respectively;
S542, compute the local weighted variance energy of the two detail feature images:
In the formula, k = I1 or P1; the energy term denotes the local weighted variance energy of the detail feature image of I1 or P1 centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the region size; t = (N - 1)/2; and the mean term denotes the local average centered at point (m, n);
S543, compute the local difference gradient ΔT(m, n), the local difference variance ΔV(m, n), the local gradient matching degree MT(m, n) and the local weighted variance matching degree MV1(m, n) of the two detail feature images:
In the formula, the gradient terms denote the local gradients of the detail feature images of I1 and P1 at pixel (m, n), and the energy terms denote the local weighted variance energies of the detail feature images of I1 and P1 centered at point (m, n);
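The local gradient of step S541 is the magnitude of the two Sobel edge images. A minimal NumPy version is sketched below; reflect padding at the borders is an assumption, since the patent does not specify border handling.

```python
import numpy as np

SOBEL_H = np.array([[-1., 0., 1.],
                    [-2., 0., 2.],
                    [-1., 0., 1.]])   # horizontal-edge Sobel template
SOBEL_V = SOBEL_H.T                   # vertical-edge Sobel template

def conv3_same(img, k):
    """3x3 'same'-size correlation with reflect padding."""
    pad = np.pad(img, 1, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def local_gradient(img):
    """Gradient magnitude from the horizontal and vertical edge images (S541)."""
    gh = conv3_same(img, SOBEL_H)
    gv = conv3_same(img, SOBEL_V)
    return np.hypot(gh, gv)
```

On a vertical step edge the response peaks at the transition columns and is zero in flat regions, which is the behaviour the detail-fusion rules rely on.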
S544, construct the pixel-based decision map from the local difference gradient and the local difference variance, and the feature-difference-degree decision map from the local gradient matching degree and the local weighted variance matching degree:
In the formula, PDG(m, n) is the pixel-based decision map; g1~g9 denote decision maps whose value is 1 at a pixel (m, n) that satisfies the corresponding condition and 0 elsewhere; DDG(m, n) is the feature-difference-degree decision map; d1 and d2 denote decision maps whose value is 1 at a pixel (m, n) that satisfies the corresponding condition and 0 elsewhere;
S545, determine the determined regions and the uncertain region of the two detail feature images from the pixel-based decision map PDG(m, n) and the feature-difference-degree decision map DDG(m, n):
Here a determined region is one in which at least one of the decision maps PDG(m, n) and DDG(m, n) indicates which pixel's gray value is to be retained in the fused image; the uncertain region is one in which neither decision map indicates which pixel's gray value is to be retained. Specifically:
In cases g1 and g2, the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) both indicate whether the gray value of the corresponding pixel is retained in the fused image, so g1 and g2 belong to the determined region;
In cases g3 and g4, the feature-difference-degree decision map DDG determines which of the two local features differs more between the images; the feature with the larger difference degree indicates whether the gray value of the corresponding pixel is retained in the fused image, so g3 and g4 belong to the determined region;
In cases g5, g6, g7 and g8, one of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) indicates whether the gray value of the corresponding pixel is retained in the fused image, so g5, g6, g7 and g8 belong to the determined region;
In case g9, neither decision map PDG nor DDG indicates whether the gray value of the corresponding pixel is retained in the fused image, so g9 belongs to the uncertain region.
S546, fuse the determined regions of the two detail feature images, driven by the feature difference:
DIF(m, n) = ΔT(m, n) · ΔV(m, n)
In the formula, the fusion term denotes the fused image of the determined regions of the two detail feature images; DIF(m, n) denotes the determined-region fusion driving factor, expressed as the product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n); "*" denotes the element-wise product of matrices;
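The g1 to g9 conditions and the determined-region fusion formula are given as figures in the patent and are not reproduced in this excerpt, so the sketch below is a hedged reading rather than the exact rule: in the determined region the driving factor DIF = ΔT · ΔV is positive when both difference measures favour the same source image, so that source's pixel is kept.

```python
import numpy as np

def fuse_determined(a, b, dT, dV):
    """Hedged sketch of step S546 for the determined region.

    dT: local difference gradient (gradient of a minus gradient of b).
    dV: local difference variance (variance of a minus variance of b).
    Keep the pixel from the image that both measures favour."""
    dif = dT * dV                    # determined-region driving factor
    take_a = (dT > 0) & (dV > 0)     # both measures favour image a
    return np.where(take_a, a, b), dif
```

Pixels where the two measures disagree fall into the uncertain region and are handled by the fuzzy-logic rule of step S547 instead.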
S547, fuse the uncertain region of the two detail feature images using fuzzy logic theory.
For the two detail feature images, one must consider whether the local gradient of a detail feature image is large or its local weighted variance is large. For this pair of relations, construct membership functions for the detail feature images. Suppose the membership functions of "the local gradient of the detail feature image is large" are μT for I1 and P1, and the membership functions of "the local weighted variance of the detail feature image is large" are μV for I1 and P1; then:
In the formula, k = I1 or P1.
Using the intersection rule of fuzzy logic, the membership of the pixel value at position (m, n) of each detail feature image to the importance in the uncertain-region fused image can be computed:
μT∩V(Pk(m, n)) = min[μT(Pk(m, n)), μV(Pk(m, n))]
In the formula, k = I1 or P1.
The fused image of the uncertain region of the two detail feature images is:
In the formula, the fusion term denotes the fused image of the uncertain region of the two detail feature images; "*" denotes the element-wise product of matrices and "/" the element-wise quotient; the membership terms denote, for the detail feature images of I1 and P1 respectively, the membership of the pixel value at position (m, n) to the importance in the uncertain-region fused image;
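Step S547 can be sketched as follows. The patent's membership functions are figures, so an S-shaped (logistic) membership for "is large" is an assumption here; the fuzzy AND is the min rule from the text, and the final pixel is a membership-weighted average consistent with the element-wise product/quotient description.

```python
import numpy as np

def mu_large(x, k=8.0, c=0.5):
    """Assumed S-shaped membership function for 'x is large' on features
    normalized to [0, 1]; the patent's exact functions are not reproduced."""
    return 1.0 / (1.0 + np.exp(-k * (x - c)))

def fuse_uncertain(a, b, grad_a, grad_b, var_a, var_b):
    """S547 sketch: fuzzy intersection (min) of the 'gradient is large' and
    'variance is large' memberships, then a membership-weighted average."""
    mu_a = np.minimum(mu_large(grad_a), mu_large(var_a))  # fuzzy AND = min
    mu_b = np.minimum(mu_large(grad_b), mu_large(var_b))
    return (mu_a * a + mu_b * b) / (mu_a + mu_b + 1e-12)
```

The image whose gradient and variance are both large dominates the average, so strong detail survives in the uncertain region without a hard binary choice.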
S548, combine the determined-region and uncertain-region fused images to obtain the fusion result of the two detail feature images:
In the formula, FDIF(m, n) denotes the fused image of the two detail feature images;
S549, perform a consistency check on FDIF(m, n):
Move a window of size 3 × 3 over the image FDIF(m, n) and verify the center pixel against its surrounding pixels: if the center pixel comes from one of the two detail feature images while s (4 < s < 8) of its surrounding pixels all come from the other image, the center pixel value is replaced by the pixel value of the other image at that position. Traversing the whole image FDIF(m, n) with the window yields the corrected FDIF(m, n).
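The 3 × 3 consistency check of step S549 can be sketched over a binary provenance map (1 means the fused pixel came from the first detail feature image, 0 from the second). Taking s = 5 is one admissible choice of the threshold 4 < s < 8, and edge-replicated padding at the borders is an assumption.

```python
import numpy as np

def consistency_check(choice, s=5):
    """S549 sketch: flip a pixel's provenance if at least s of its eight
    3x3 neighbours came from the other image."""
    h, w = choice.shape
    out = choice.copy()
    pad = np.pad(choice, 1, mode="edge")
    for i in range(h):
        for j in range(w):
            ones = pad[i:i + 3, j:j + 3].sum() - choice[i, j]  # neighbours from image 1
            if choice[i, j] == 1 and 8 - ones >= s:
                out[i, j] = 0   # isolated image-1 pixel in an image-2 region
            elif choice[i, j] == 0 and ones >= s:
                out[i, j] = 1   # isolated image-2 pixel in an image-1 region
    return out
```

After correcting the provenance map, the flipped pixels take the other image's gray value, which removes isolated selection noise from the detail fusion result.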
S55, fuse FL, FD and FDIF to obtain the fusion result F:
F = αFL + βFD + γFDIF
In the formula, α, β and γ are fusion weight coefficients with values in the range [0, 1]. In this embodiment, to reduce oversaturation of the fused image and improve contrast, α = 1, β = 0.3 and γ = 1.
S6, replace the luminance component Y of step S4 with the fusion result F of step S5 to obtain the replaced YUV image, then apply the inverse transform to the replaced YUV image to obtain the RGB image, i.e. the final polarization image fusion result:
The luminance contrast of the image is reduced during the RGB color mapping, so gray-level enhancement of the luminance component is needed. In this embodiment the gray-level fused image is used to replace the luminance component and thereby enhance brightness, i.e. the fusion result F of step S5 replaces the luminance component Y, yielding the final polarization image fusion result.
The above describes preferred embodiments of the present invention in order to explain its technical features in detail; it is not intended to limit the invention to the specific forms described in the embodiments, and other modifications and variations made according to the spirit of the invention are also protected by this patent. The scope of the invention is defined by the claims rather than by the specific description of the embodiments.

Claims (10)

1. An infrared polarization image fusion method driven by multiple features and feature difference, characterized in that it specifically comprises the following steps:
S1, represent the polarization of light by the Stokes vector, i.e. S = (I, Q, U, V), and compute the degree-of-polarization image P and the angle-of-polarization image R from the S vector;
S2, linearly weight the angle-of-polarization image R with U to obtain the image R';
S3, compute the exclusive parts of the image R', the total intensity image I and the degree-of-polarization image P that remain after removing their common part, denoted R1, I1 and P1, respectively;
S4, map the images P1, I1 and R1 to the R channel, G channel and B channel of RGB space, respectively, to obtain an RGB image; convert the RGB image to a YUV image and extract the luminance component Y;
S5, fuse the images I1 and P1 by the method based on multi-feature separation to obtain the fusion result F;
S6, replace the luminance component Y of step S4 with the fusion result F of step S5 to obtain the replaced YUV image, then apply the inverse transform to the replaced YUV image to obtain the RGB image, i.e. the final polarization image fusion result.
2. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 1, characterized in that step S1 specifically comprises:
S11, compute the degree-of-polarization image P:
In the formula, Q denotes the intensity difference between horizontal and vertical polarization, U denotes the intensity difference between the 45° and 135° polarization directions, and I denotes the total intensity image;
S12, compute the angle-of-polarization image R:
3. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 1, characterized in that step S3 specifically comprises:
S31, compute the common part Co of the image R', the total intensity image I and the degree-of-polarization image P:
Co = R' ∩ I ∩ P = min{R', I, P};
S32, compute the exclusive parts R1, I1 and P1 of the image R', the total intensity image I and the degree-of-polarization image P:
4. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 1, characterized in that step S4 specifically comprises:
S41, map the images P1, I1 and R1 to the R channel, G channel and B channel of RGB space, respectively, to obtain an RGB image;
S42, convert the RGB image to a YUV image:
S43, extract the luminance component Y:
Y = 0.299R + 0.587G + 0.114B.
5. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 1, characterized in that step S5 specifically comprises:
S51, perform multi-feature separation on the images I1 and P1 to obtain the bright feature image, dark feature image and detail feature image of I1 and the bright feature image, dark feature image and detail feature image of P1;
S52, fuse the bright feature image of I1 with the bright feature image of P1 to obtain the bright-feature fusion result FL;
S53, fuse the dark feature image of I1 with the dark feature image of P1 to obtain the dark-feature fusion result FD;
S54, fuse the detail feature image of I1 with the detail feature image of P1 to obtain the detail-feature fusion result FDIF;
S55, fuse FL, FD and FDIF to obtain the fusion result F.
6. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 5, characterized in that, in step S51, multi-feature separation is performed on the images I1 and P1 respectively using a multi-feature separation method based on the dark channel prior, specifically comprising:
S511, compute the dark channel images of I1 and P1:
In the formula, the two dark-channel terms denote the dark channel image of I1 and the dark channel image of P1; C indexes the three color channels R, G, B of I1 or P1; N(x) is the set of pixels in the window centered at pixel x; and (I1)C(y) and (P1)C(y) denote one color-channel image of I1 and P1, respectively;
S512, invert the images I1 and P1 respectively to obtain their negative images; fuse each dark channel image with the corresponding negative image by the minimum-absolute-value rule to obtain the dark feature image of I1 and the dark feature image of P1;
S513, subtract the corresponding dark feature image from each dark channel image to obtain the bright feature image of I1 and the bright feature image of P1;
S514, subtract the corresponding dark channel image from each of the images I1 and P1 to obtain the detail feature image of I1 and the detail feature image of P1.
7. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 5, characterized in that, in step S52, the bright feature image of I1 and the bright feature image of P1 are fused using a matching method based on local energy features, specifically comprising:
S521, compute the Gaussian weighted local energy of the two bright feature images:
In the formula, k = I1 or P1; the energy term denotes the Gaussian weighted local energy of the bright feature image of I1 or P1 centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the region size; t = (N - 1)/2;
S522, compute the matching degree of the Gaussian weighted local energies of the two bright feature images:
In the formula, ME(m, n) denotes the matching degree between the Gaussian weighted local energies of the two bright feature images, and the two energy terms denote the Gaussian weighted local energy of the bright feature image of I1 and of P1, respectively, centered at point (m, n);
S523, fuse the two bright feature images according to the Gaussian weighted local energy and its matching degree:
In the formula, FL(m, n) is the fusion result of the two bright feature images, and Tl is the similarity-decision threshold for bright-feature fusion. If ME(m, n) < Tl, the regions of the two bright feature images centered at point (m, n) are dissimilar, and the fusion result takes the pixel from the image with the larger Gaussian weighted local energy; otherwise, the fusion result is a coefficient-weighted average.
8. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 5, characterized in that, in step S53, the dark feature image of I1 and the dark feature image of P1 are fused using a matching method based on local weighted variance features, specifically comprising:
S531, compute the local weighted variance energy of the two dark feature images:
In the formula, k = I1 or P1; the energy term denotes the local weighted variance energy of the dark feature image of I1 or P1 centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the region size; t = (N - 1)/2; and the mean term denotes the local average centered at point (m, n);
S532, compute the matching degree of the local weighted variance energies of the two dark feature images:
In the formula, MV(m, n) denotes the matching degree between the local weighted variance energies of the two dark feature images, and the two energy terms denote the local weighted variance energy of the dark feature image of I1 and of P1, respectively, centered at point (m, n);
S533, fuse the two dark feature images according to the local weighted variance energy and its matching degree:
In the formula, FD(m, n) is the fusion result of the two dark feature images, and Th is the similarity-decision threshold for dark-feature fusion. If MV(m, n) < Th, the regions of the two dark feature images centered at point (m, n) are dissimilar, and the fusion result takes the pixel from the image with the larger local weighted variance energy; otherwise, the fusion result is a coefficient-weighted average of the two images.
9. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 5, characterized in that, in step S54, the detail feature image of I1 and the detail feature image of P1 are fused using fuzzy logic and feature-difference driving, specifically comprising:
S541, compute the local gradients of the two detail feature images:
In the formula, k = I1 or P1; the gradient term denotes the local gradient of the detail feature image of I1 or P1 at pixel (m, n), and the two edge terms denote the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical Sobel templates, respectively;
S542, compute the local weighted variance energy of the two detail feature images:
In the formula, k = I1 or P1; the energy term denotes the local weighted variance energy of the detail feature image of I1 or P1 centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the region size; t = (N - 1)/2; and the mean term denotes the local average centered at point (m, n);
S543, compute the local difference gradient ΔT(m, n), the local difference variance ΔV(m, n), the local gradient matching degree MT(m, n) and the local weighted variance matching degree MV1(m, n) of the two detail feature images:
In the formula, the gradient terms denote the local gradients of the detail feature images of I1 and P1 at pixel (m, n), and the energy terms denote the local weighted variance energies of the detail feature images of I1 and P1 centered at point (m, n);
S544, construct the pixel-based decision map from the local difference gradient and the local difference variance, and the feature-difference-degree decision map from the local gradient matching degree and the local weighted variance matching degree:
In the formula, PDG(m, n) is the pixel-based decision map; g1~g9 denote decision maps whose value is 1 at a pixel (m, n) that satisfies the corresponding condition and 0 elsewhere; DDG(m, n) is the feature-difference-degree decision map; d1 and d2 denote decision maps whose value is 1 at a pixel (m, n) that satisfies the corresponding condition and 0 elsewhere;
S545, determine the determined regions and the uncertain region of the two detail feature images from the pixel-based decision map PDG(m, n) and the feature-difference-degree decision map DDG(m, n): g1, g2, g3, g4, g5, g6, g7 and g8 belong to the determined region, and g9 belongs to the uncertain region;
S546, fuse the determined regions of the two detail feature images, driven by the feature difference:
DIF(m, n) = ΔT(m, n) · ΔV(m, n)
In the formula, the fusion term denotes the fused image of the determined regions of the two detail feature images; DIF(m, n) denotes the determined-region fusion driving factor; "*" denotes the element-wise product of matrices;
S547, fuse the uncertain region of the two detail feature images using fuzzy logic theory;
μT∩V(Pk(m, n)) = min[μT(Pk(m, n)), μV(Pk(m, n))]
In the formula, the fusion term denotes the fused image of the uncertain region of the two detail feature images; "*" denotes the element-wise product of matrices and "/" the element-wise quotient; the membership terms denote, for the detail feature images of I1 and P1 respectively, the membership of the pixel value at position (m, n) to the importance in the uncertain-region fused image; μT(Pk(m, n)) is the membership function of "the local gradient of the detail feature image is large", μV(Pk(m, n)) is the membership function of "the local weighted variance of the detail feature image is large", and k = I1 or P1;
S548, combine the determined-region and uncertain-region fused images to obtain the fusion result of the two detail feature images:
In the formula, FDIF(m, n) denotes the fused image of the two detail feature images;
S549, perform a consistency check on FDIF(m, n):
Move a window of size 3 × 3 over the image FDIF(m, n) and verify the center pixel against its surrounding pixels: if the center pixel comes from one of the two detail feature images while s (4 < s < 8) of its surrounding pixels all come from the other image, the center pixel value is replaced by the pixel value of the other image at that position; traversing the whole image FDIF(m, n) with the window yields the corrected FDIF(m, n).
10. The infrared polarization image fusion method driven by multiple features and feature difference according to claim 5, characterized in that, in step S55, the fusion result F is obtained as:
F = αFL + βFD + γFDIF
In the formula, α, β and γ are fusion weight coefficients.
CN201811180813.8A 2018-10-09 2018-10-09 Infrared polarization image fusion method based on multi-feature and feature difference driving Active CN109410160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811180813.8A CN109410160B (en) 2018-10-09 2018-10-09 Infrared polarization image fusion method based on multi-feature and feature difference driving


Publications (2)

Publication Number Publication Date
CN109410160A true CN109410160A (en) 2019-03-01
CN109410160B CN109410160B (en) 2020-09-22

Family

ID=65467599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811180813.8A Active CN109410160B (en) 2018-10-09 2018-10-09 Infrared polarization image fusion method based on multi-feature and feature difference driving

Country Status (1)

Country Link
CN (1) CN109410160B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110293179A1 (en) * 2010-05-31 2011-12-01 Mert Dikmen Systems and methods for illumination correction of an image
CN102682443A (en) * 2012-05-10 2012-09-19 合肥工业大学 Rapid defogging algorithm based on polarization image guide
CN104835113A (en) * 2015-04-30 2015-08-12 北京环境特性研究所 Polarization image fusion method based on super-resolution image reconstruction
CN104978724A (en) * 2015-04-02 2015-10-14 中国人民解放军63655部队 Infrared polarization fusion method based on multi-scale transformation and pulse coupled neural network
CN105139347A (en) * 2015-07-10 2015-12-09 中国科学院西安光学精密机械研究所 Polarized image defogging method combined with dark channel prior principle
CN105279747A (en) * 2015-11-25 2016-01-27 中北大学 Infrared polarization and light intensity image fusing method guided by multi-feature objective function

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111292279A (en) * 2020-01-17 2020-06-16 中国科学院上海技术物理研究所 Polarization image visualization method based on color image fusion
CN111292279B (en) * 2020-01-17 2022-07-29 中国科学院上海技术物理研究所 Polarization image visualization method based on color image fusion
WO2024031643A1 (en) * 2022-08-10 2024-02-15 天津恒宇医疗科技有限公司 Ps-oct visibility improvement method and system based on polarization multi-parameter fusion
CN116091361A (en) * 2023-03-23 2023-05-09 长春理工大学 Multi-polarization parameter image fusion method, system and terrain exploration monitor

Also Published As

Publication number Publication date
CN109410160B (en) 2020-09-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant