CN107451984B - Infrared and visible light image fusion algorithm based on mixed multi-scale analysis - Google Patents

Infrared and visible light image fusion algorithm based on mixed multi-scale analysis

Info

Publication number
CN107451984B
CN107451984B (application number CN201710621620.0A)
Authority
CN
China
Prior art keywords
image, frequency sub-band
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710621620.0A
Other languages
Chinese (zh)
Other versions
CN107451984A (en)
Inventor
江泽涛
吴辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guilin University of Electronic Technology filed Critical Guilin University of Electronic Technology
Priority to CN201710621620.0A priority Critical patent/CN107451984B/en
Publication of CN107451984A publication Critical patent/CN107451984A/en
Application granted granted Critical
Publication of CN107451984B publication Critical patent/CN107451984B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G06T2207/10052 Images from lightfield camera
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20064 Wavelet transform [DWT]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an infrared and visible light image fusion algorithm based on mixed multi-scale analysis, which comprises the following steps: step 1: performing NSCT decomposition on the infrared and visible light images to obtain a low-frequency sub-band and high-frequency sub-bands; step 2: applying the stationary wavelet transform to the low-frequency sub-band to obtain one low-frequency sub-band and three high-frequency sub-bands, then fusing the low-frequency sub-band by combining local energy with the absolute-value maximum, and fusing the high-frequency sub-bands with the compressed sensing theory; step 3: judging the sharpness of the images to be fused and selecting the number of enhancement layers of the LSCN according to a decision criterion; step 4: fusing the highest-layer high-frequency sub-band with an absolute-maximum rule, and fusing the remaining sub-bands with an improved PCNN model; step 5: performing the inverse NSCT on the fusion result to obtain the final fused image. The fused image obtained by the method has prominent edges, high contrast and a prominent target, and the algorithm scores higher than the prior art on indexes such as the average gradient and the spatial frequency.

Description

Infrared and visible light image fusion algorithm based on mixed multi-scale analysis
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to an infrared and visible light image fusion algorithm based on mixed multi-scale analysis.
Background
The image fusion method based on the wavelet transform is a classic fusion algorithm, but wavelets can only represent isotropic features and are not an ideal representation tool for structures such as lines and edges in an image. The Contourlet transform is widely used in image fusion: through multi-scale, multi-directional decomposition it captures the detail features of an image well, making up for this shortcoming of wavelets. However, because the Contourlet transform uses down-sampling, it is not shift-invariant and is prone to pseudo-Gibbs artifacts in image processing.
The nonsubsampled Contourlet transform (NSCT) proposed by A. L. da Cunha et al. is shift-invariant, fully retains the effective information of an image and yields a better fusion effect, but it suffers from problems such as poor sparsity of the low-frequency component, which hinders feature extraction.
Disclosure of Invention
Aiming at the defects of the prior art, the invention addresses problems such as low contrast and insufficient preservation of edge information in the fusion of infrared and visible light images.
To solve these technical problems, the invention adopts an infrared and visible light image fusion algorithm based on mixed multi-scale analysis, which comprises the following steps:
Step 1: perform NSCT decomposition on the infrared image and the visible light image respectively to obtain a low-frequency sub-band L_J(x, y) and high-frequency sub-bands H_{j,r}(x, y), where J is the number of decomposition layers and j and r denote the decomposition scale and the direction number.
Step 2: and performing static wavelet transformation on the low-frequency sub-band to obtain a low-frequency sub-band and three high-frequency sub-bands, fusing the low-frequency sub-band and the high-frequency sub-band by respectively adopting local energy and absolute value combination and a compressive sensing theory, and performing wavelet inverse transformation to obtain the NSCT reconstructed low-frequency sub-band.
The low-frequency sub-bands are fused by combining local energy with the absolute-value maximum; the specific fusion rule is:
[fusion rule, rendered as an image in the original]
where EN is the local area energy, which is defined as:
[definition of EN, rendered as an image in the original]
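Since the fusion rule and the energy definition above are images in the original, the following sketch uses a common instantiation: local energy as the windowed sum of squared coefficients (the 3 x 3 window is an assumption), with the absolute-value maximum as the tie-breaker.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def fuse_low_frequency(a: np.ndarray, b: np.ndarray, win: int = 3) -> np.ndarray:
        """Choose-max fusion of two low-frequency sub-bands by local area energy."""
        a, b = a.astype(float), b.astype(float)
        en_a = uniform_filter(a * a, size=win) * win * win  # windowed sum of squares
        en_b = uniform_filter(b * b, size=win) * win * win
        # Take the coefficient from the band with larger local energy; fall back to
        # the larger absolute value where the energies are equal.
        choose_a = np.where(en_a == en_b, np.abs(a) >= np.abs(b), en_a > en_b)
        return np.where(choose_a, a, b)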
The high-frequency sub-bands are fused using the compressed sensing theory; the specific steps are as follows:
1) decompose the infrared and visible high-frequency sub-band images of size m × n (their symbols are rendered as images in the original) into non-overlapping sub-blocks of the same size, where j = 1, 2, 3, and sparsify each sub-block image using the sym8 wavelet basis;
2) design a measurement matrix Φ and use it to measure and sample the input high-frequency sub-band coefficients, obtaining the measurement vectors of the infrared and visible sub-bands (symbols rendered as images in the original), where k = 1, 2, …, m×n;
3) compute the standard deviation SD_k and the sharpness EAV_k of the two measurement vectors, and obtain the fused measurement vector with a fusion rule based on the combination of the regional standard deviation, the regional sharpness and an S-function, namely:
[fusion rule, rendered as an image in the original]
The image standard deviation formula is:
[formula, together with the definition of the mean it uses, rendered as images in the original]
The image sharpness formula is:
[formula, rendered as an image in the original]
The weighting coefficient ω is obtained from an S-function:
[the S-function and its argument, rendered as images in the original]
where f is the contraction factor of the S-function; f ≥ 1, and f = 5 is used here;
4) perform sparse reconstruction of the fused measurement vector, using orthogonal matching pursuit (OMP) as the reconstruction algorithm, to obtain the high-frequency sub-bands of the fused image.
The obtained SW_F and the fused high-frequency sub-bands are then subjected to stationary wavelet reconstruction to obtain the low-frequency sub-band finally used for NSCT reconstruction.
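The following sketch illustrates steps 1) to 4) on a single sub-band pair, with several declared simplifications: the whole band is measured at once instead of block-wise after sym8 sparsifying, Gaussian random measurements stand in for the patent's measurement matrix Φ, simple stand-ins replace the SD and EAV formulas shown as images above, and scikit-learn's OrthogonalMatchingPursuit performs the OMP reconstruction.

    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    def s_function(d: float, f: float = 5.0) -> float:
        """Logistic S-function with contraction factor f, mapping d to a weight in (0, 1)."""
        return 1.0 / (1.0 + np.exp(-f * d))

    def fuse_high_frequency_cs(h_ir: np.ndarray, h_vi: np.ndarray,
                               rate: float = 0.5, seed: int = 0) -> np.ndarray:
        """Fuse one pair of high-frequency sub-bands in the measurement domain."""
        n = h_ir.size
        m = max(4, int(rate * n))
        rng = np.random.default_rng(seed)
        phi = rng.standard_normal((m, n)) / np.sqrt(m)       # measurement matrix
        y_ir, y_vi = phi @ h_ir.ravel(), phi @ h_vi.ravel()  # measurement vectors

        # Features of each measurement vector: standard deviation SD_k and a
        # gradient-style sharpness proxy standing in for EAV_k.
        sd_ir, sd_vi = y_ir.std(), y_vi.std()
        eav_ir, eav_vi = np.abs(np.diff(y_ir)).mean(), np.abs(np.diff(y_vi)).mean()
        w = s_function((sd_ir + eav_ir) - (sd_vi + eav_vi))
        y_f = w * y_ir + (1.0 - w) * y_vi                    # fused measurement vector

        # Sparse reconstruction of the fused vector by orthogonal matching pursuit.
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=max(1, m // 4),
                                        fit_intercept=False)
        omp.fit(phi, y_f)
        return omp.coef_.reshape(h_ir.shape)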
Step 3: judge the sharpness of the images to be fused and select the number of enhancement layers of the LSCN according to a decision criterion; the specific method is as follows. The image sharpness formula is:
[formula (8), rendered as an image in the original]
Compute the image sharpness according to formula (8), compare it with the threshold λ, and determine the number of layers whose high-frequency coefficients are enhanced according to the comparison result, namely:
[decision rule, rendered as an image in the original]
where J is the number of decomposition layers, S is the combined sharpness of the source images, and α1 = α2 = 0.5, λ = 27.
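A hedged sketch of this decision step follows. Formula (8) and the branch structure of the selection rule are images in the original, so the gradient-magnitude sharpness and the two-way branch below are assumptions; α1 = α2 = 0.5 and λ = 27 are taken from the text.

    import numpy as np

    def sharpness(img: np.ndarray) -> float:
        """Mean gradient magnitude, standing in for the sharpness formula (8)."""
        gy, gx = np.gradient(img.astype(float))
        return float(np.mean(np.hypot(gx, gy)))

    def enhancement_layers(ir: np.ndarray, vi: np.ndarray, J: int,
                           alpha1: float = 0.5, alpha2: float = 0.5,
                           lam: float = 27.0) -> int:
        """Pick the number of high-frequency layers to enhance from the combined sharpness S."""
        S = alpha1 * sharpness(ir) + alpha2 * sharpness(vi)
        # Assumed branch: sharper sources need fewer enhanced layers.
        return J - 1 if S >= lam else J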
Step 4: fuse the highest-layer high-frequency sub-band with the rule that the larger absolute value wins, and fuse the remaining sub-bands with an improved PCNN model. The specific fusion rule is:
[fusion rule, rendered as an image in the original]
To improve the visual quality of the image, the sub-bands other than the highest-layer high-frequency sub-band n are fused with the improved PCNN model, and the fusion coefficient is determined by comparing the sums of the firing amplitudes of the PCNN neurons, namely:
[selection rule, rendered as an image in the original]
where M_ij(n) is the sum of the pulse firing amplitudes output by the PCNN, j = 1, 2, …, n−1, ε is a user-defined threshold, and ε = 0.002.
Because the output of the traditional PCNN uses a hard-limiting function and cannot reflect differences in the firing amplitudes of neurons, the invention adopts a Sigmoid function as the output of the PCNN, which better characterizes the differences in firing amplitude when synchronous pulses are emitted. The output of the PCNN is defined as:
[formula, rendered as an image in the original]
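A compact, simplified PCNN iteration with the Sigmoid output stage is sketched below; the linking kernel, the decay and threshold constants, and the iteration count are assumptions, not the patent's parameters. The accumulated output m plays the role of the firing-amplitude sum M_ij(n) used in the selection rule above.

    import numpy as np
    from scipy.ndimage import convolve

    def pcnn_fire_amplitude(stimulus: np.ndarray, link, iters: int = 50,
                            beta: float = 0.2, a_t: float = 0.2,
                            v_t: float = 20.0) -> np.ndarray:
        """Accumulate Sigmoid firing amplitudes of a simplified PCNN."""
        s = stimulus.astype(float)
        kernel = np.array([[0.5, 1.0, 0.5],
                           [1.0, 0.0, 1.0],
                           [0.5, 1.0, 0.5]])
        y = np.zeros_like(s)          # previous pulse output
        t = np.ones_like(s)           # dynamic threshold
        m = np.zeros_like(s)          # accumulated firing amplitude
        for _ in range(iters):
            lk = convolve(y, kernel, mode="nearest")  # linking input from neighbours
            u = s * (1.0 + beta * link * lk)          # internal activity
            y = 1.0 / (1.0 + np.exp(-(u - t)))        # Sigmoid output, not a hard limiter
            m += y
            t = np.exp(-a_t) * t + v_t * (y > 0.5)    # leaky threshold with refractory boost
        return m

With the external input set to the modified Laplacian of a sub-band and the linking strength derived from its spatial frequency (both sketched after the definitions below), the coefficient whose amplitude sum is larger is selected, with averaging when the sums differ by less than ε.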
in order to better represent the edge information of the image, modified laplacian energy (SML) and local spatial frequency are selected as external input and linking coefficients of the PCNN, respectively. SML is defined as follows:
Figure BDA0001361836150000043
the spatial frequency is:
Figure BDA0001361836150000044
wherein RF, CF, MDF and SDF represent row frequency, column frequency, main diagonal frequency and sub diagonal frequency, respectively, and the formula is as follows:
Figure BDA0001361836150000045
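Because the SML and frequency formulas above are images in the original, the sketch below uses the standard textbook definitions (step size 1, whole-image frequencies, no diagonal weighting); the patent's local, windowed variants would follow the same pattern.

    import numpy as np

    def modified_laplacian(img: np.ndarray) -> np.ndarray:
        """Per-pixel modified Laplacian; a windowed sum of it gives the SML."""
        p = np.pad(img.astype(float), 1, mode="edge")
        return (np.abs(2 * p[1:-1, 1:-1] - p[1:-1, :-2] - p[1:-1, 2:]) +
                np.abs(2 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1]))

    def spatial_frequency(img: np.ndarray) -> float:
        """SF from the row, column, main-diagonal and secondary-diagonal frequencies."""
        f = img.astype(float)
        rf = np.sqrt(np.mean((f[:, 1:] - f[:, :-1]) ** 2))      # row frequency RF
        cf = np.sqrt(np.mean((f[1:, :] - f[:-1, :]) ** 2))      # column frequency CF
        mdf = np.sqrt(np.mean((f[1:, 1:] - f[:-1, :-1]) ** 2))  # main diagonal MDF
        sdf = np.sqrt(np.mean((f[1:, :-1] - f[:-1, 1:]) ** 2))  # secondary diagonal SDF
        return float(np.sqrt(rf ** 2 + cf ** 2 + mdf ** 2 + sdf ** 2))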
Step 5: perform the inverse NSCT on the fused low-frequency and high-frequency sub-bands to obtain the final fused image.
The fused image obtained by the technical scheme of the invention has prominent edges, higher contrast and brightness, and a prominent target; the algorithm's average gradient, spatial frequency, standard deviation and information entropy are higher than those of prior-art methods, so it effectively retains the infrared target, effectively captures the spatial-domain information of the source images, and achieves a better fusion effect.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is the source infrared image of Embodiment 1;
FIG. 3 is the source visible light image of Embodiment 1;
FIG. 4 is the fused image obtained by the method of Document 6 in Embodiment 1;
FIG. 5 is the fused image obtained by the method of Document 8 in Embodiment 1;
FIG. 6 is the fused image obtained by the method of Document 12 in Embodiment 1;
FIG. 7 is the fused image obtained by the algorithm of the invention in Embodiment 1;
FIG. 8 is the source infrared image of Embodiment 2;
FIG. 9 is the source visible light image of Embodiment 2;
FIG. 10 is the fused image obtained by the method of Document 6 in Embodiment 2;
FIG. 11 is the fused image obtained by the method of Document 8 in Embodiment 2;
FIG. 12 is the fused image obtained by the method of Document 12 in Embodiment 2;
FIG. 13 is the fused image obtained by the algorithm of the invention in Embodiment 2.
Detailed Description
The following description will be made with reference to the accompanying drawings and examples, but the present invention is not limited thereto.
FIG. 1 shows the flow of the invention, an infrared and visible light image fusion algorithm based on mixed multi-scale analysis, which comprises the following steps (an end-to-end sketch in code follows step 5 below):
Step 1: perform NSCT decomposition on the infrared image and the visible light image respectively to obtain a low-frequency sub-band L_J(x, y) and high-frequency sub-bands H_{j,r}(x, y), where J is the number of decomposition layers and j and r denote the decomposition scale and the direction number.
Step 2: and performing static wavelet transformation on the low-frequency sub-band to obtain a low-frequency sub-band and three high-frequency sub-bands, fusing the low-frequency sub-band and the high-frequency sub-band by respectively adopting local energy and absolute value combination and a compressive sensing theory, and performing wavelet inverse transformation to obtain the NSCT reconstructed low-frequency sub-band.
The low-frequency sub-bands are fused by combining local energy with the absolute-value maximum; the specific fusion rule is:
[fusion rule, rendered as an image in the original]
where EN is the local area energy, which is defined as:
[definition of EN, rendered as an image in the original]
The high-frequency sub-bands are fused using the compressed sensing theory; the specific steps are as follows:
1) decompose the infrared and visible high-frequency sub-band images of size m × n (their symbols are rendered as images in the original) into non-overlapping sub-blocks of the same size, where j = 1, 2, 3, and sparsify each sub-block image using the sym8 wavelet basis;
2) design a measurement matrix Φ and use it to measure and sample the input high-frequency sub-band coefficients, obtaining the measurement vectors of the infrared and visible sub-bands (symbols rendered as images in the original), where k = 1, 2, …, m×n;
3) compute the standard deviation SD_k and the sharpness EAV_k of the two measurement vectors, and obtain the fused measurement vector with a fusion rule based on the combination of the regional standard deviation, the regional sharpness and an S-function, namely:
[fusion rule, rendered as an image in the original]
The image standard deviation formula is:
[formula, together with the definition of the mean it uses, rendered as images in the original]
The image sharpness formula is:
[formula, rendered as an image in the original]
The weighting coefficient ω is obtained from an S-function:
[the S-function and its argument, rendered as images in the original]
where f is the contraction factor of the S-function; f ≥ 1, and f = 5 is used here;
4) perform sparse reconstruction of the fused measurement vector, using orthogonal matching pursuit (OMP) as the reconstruction algorithm, to obtain the high-frequency sub-bands of the fused image.
The obtained SW_F and the fused high-frequency sub-bands are then subjected to stationary wavelet reconstruction to obtain the low-frequency sub-band finally used for NSCT reconstruction.
Step 3: judge the sharpness of the images to be fused and select the number of enhancement layers of the LSCN according to a decision criterion; the specific method is as follows. The image sharpness formula is:
[formula (8), rendered as an image in the original]
Compute the image sharpness according to formula (8), compare it with the threshold λ, and determine the number of layers whose high-frequency coefficients are enhanced according to the comparison result, namely:
[decision rule, rendered as an image in the original]
where J is the number of decomposition layers, S is the combined sharpness of the source images, and α1 = α2 = 0.5, λ = 27.
Step 4: fuse the highest-layer high-frequency sub-band with the rule that the larger absolute value wins, and fuse the remaining sub-bands with an improved PCNN model. The specific fusion rule is:
[fusion rule, rendered as an image in the original]
To improve the visual quality of the image, the sub-bands other than the highest-layer high-frequency sub-band n are fused with the improved PCNN model, and the fusion coefficient is determined by comparing the sums of the firing amplitudes of the PCNN neurons, namely:
[selection rule, rendered as an image in the original]
where M_ij(n) is the sum of the pulse firing amplitudes output by the PCNN, j = 1, 2, …, n−1, ε is a user-defined threshold, and ε = 0.002.
Because the output of the traditional PCNN uses a hard-limiting function and cannot reflect differences in the firing amplitudes of neurons, the invention adopts a Sigmoid function as the output of the PCNN, which better characterizes the differences in firing amplitude when synchronous pulses are emitted. The output of the PCNN is defined as:
[formula, rendered as an image in the original]
in order to better represent the edge information of the image, modified laplacian energy (SML) and local spatial frequency are selected as external input and linking coefficients of the PCNN, respectively. SML is defined as follows:
Figure BDA0001361836150000084
the spatial frequency is:
Figure BDA0001361836150000085
wherein RF, CF, MDF and SDF represent row frequency, column frequency, main diagonal frequency and sub diagonal frequency, respectively, and the formula is as follows:
Figure BDA0001361836150000091
Step 5: perform the inverse NSCT on the fused low-frequency and high-frequency sub-bands to obtain the final fused image.
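Putting the steps together, the following end-to-end sketch assumes hypothetical nsct_decompose and nsct_reconstruct helpers (no standard Python NSCT implementation is assumed; the NSCT toolbox originally published for MATLAB, or a port of it, would supply them) and reuses the helper functions sketched in the Disclosure section. Directional sub-bands are flattened to one array per scale, the sharpness-driven enhancement of high-frequency layers and the ε-band averaging are omitted for brevity, and the global spatial frequency stands in for the local linking strength.

    import numpy as np

    def fuse(ir: np.ndarray, vi: np.ndarray, levels: int = 3) -> np.ndarray:
        # Step 1: NSCT decomposition of both sources (hypothetical helpers).
        low_ir, highs_ir = nsct_decompose(ir, levels)
        low_vi, highs_vi = nsct_decompose(vi, levels)

        # Step 2: SWT on the low bands; fuse low (local energy) and high (CS); invert.
        a_ir, d_ir = swt_decompose(low_ir)
        a_vi, d_vi = swt_decompose(low_vi)
        a_f = fuse_low_frequency(a_ir, a_vi)
        d_f = tuple(fuse_high_frequency_cs(x, y) for x, y in zip(d_ir, d_vi))
        low_f = swt_reconstruct(a_f, d_f)

        # Steps 3-4: absolute-max rule on the highest layer, PCNN elsewhere.
        highs_f = []
        for j, (h_ir, h_vi) in enumerate(zip(highs_ir, highs_vi)):
            if j == len(highs_ir) - 1:
                highs_f.append(np.where(np.abs(h_ir) >= np.abs(h_vi), h_ir, h_vi))
            else:
                m_ir = pcnn_fire_amplitude(modified_laplacian(h_ir), spatial_frequency(h_ir))
                m_vi = pcnn_fire_amplitude(modified_laplacian(h_vi), spatial_frequency(h_vi))
                highs_f.append(np.where(m_ir >= m_vi, h_ir, h_vi))

        # Step 5: inverse NSCT of the fused sub-bands (hypothetical helper).
        return nsct_reconstruct(low_f, highs_f)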
An infrared image records the infrared radiation of target objects and has strong discrimination ability for low-illumination or camouflaged targets, but it is insensitive to brightness changes. A visible light image is affected by illumination to a greater extent but provides detailed information about the target scene. Fusing the infrared and visible light images therefore combines the advantages of both, yielding a complementary image with a clear background and a prominent target, so that an observer can describe the scene more accurately and comprehensively.
The experimental data for Embodiments 1 and 2 are as follows:
FIG. 2 is the source infrared image of Embodiment 1 and FIG. 3 its source visible light image; the data of FIG. 4 correspond to the Document 6 row of Table 1, those of FIG. 5 to Document 8, those of FIG. 6 to Document 12, and those of FIG. 7 to the algorithm of the invention.
FIG. 8 is the source infrared image of Embodiment 2 and FIG. 9 its source visible light image; the data of FIG. 10 correspond to the Document 6 row of Table 2, those of FIG. 11 to Document 8, those of FIG. 12 to Document 12, and those of FIG. 13 to the algorithm of the invention.
Objective evaluation and analysis: as can be seen from Tables 1 and 2, every evaluation index of the method provided by these embodiments is superior to those of the other methods, from which it follows that the fusion results of the embodiments better conform to human visual perception.
Table 1, the evaluation of the first set of image fusion results, and Table 2, the evaluation of the second set, are rendered as images in the original document.
The fused image obtained by the technical scheme of the invention has prominent edges, higher contrast and brightness, and a prominent target; the algorithm's average gradient, spatial frequency, standard deviation and information entropy are higher than those of prior-art methods, so it effectively retains the infrared target, effectively captures the spatial-domain information of the source images, and achieves a better fusion effect.
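For reference, the objective indexes named above can be computed as follows; these are the standard definitions (8-bit grey levels are assumed for the entropy), since the tables themselves are images in the original, and the spatial frequency was already sketched with the PCNN helpers.

    import numpy as np

    def information_entropy(img: np.ndarray, bins: int = 256) -> float:
        """Shannon entropy of the grey-level histogram (8-bit range assumed)."""
        hist, _ = np.histogram(img, bins=bins, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def standard_deviation(img: np.ndarray) -> float:
        return float(img.astype(float).std())

    def average_gradient(img: np.ndarray) -> float:
        """Mean magnitude of the forward differences in x and y."""
        f = img.astype(float)
        gx = f[:-1, 1:] - f[:-1, :-1]
        gy = f[1:, :-1] - f[:-1, :-1]
        return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))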
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the described embodiments. It will be apparent to those skilled in the art that various changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention.

Claims (1)

1. An infrared and visible light image fusion algorithm based on mixed multi-scale analysis, characterized by comprising the following steps:
step 1: performing NSCT decomposition on the infrared image and the visible light image respectively to obtain a low-frequency sub-band L_J(x, y) and high-frequency sub-bands H_{j,r}(x, y), where J is the number of decomposition layers and j and r denote the decomposition scale and the direction number;
step 2: applying the stationary wavelet transform to the low-frequency sub-band to obtain one low-frequency sub-band and three high-frequency sub-bands, fusing the low-frequency sub-band by combining local energy with the absolute-value maximum and the high-frequency sub-bands with the compressed sensing theory, and then applying the inverse wavelet transform to obtain the NSCT-reconstructed low-frequency sub-band;
step 3: judging the sharpness of the images to be fused and selecting the number of enhancement layers of the LSCN according to a decision criterion;
step 4: fusing the highest-layer high-frequency sub-band with the rule that the larger absolute value wins, and fusing the remaining sub-bands with an improved PCNN model;
step 5: performing the inverse NSCT on the fused low-frequency and high-frequency sub-bands to obtain the final fused image;
wherein in step 2, the low-frequency sub-bands are fused by combining the local energy and the absolute value, the specific method being:
[fusion rule, rendered as an image in the original]
where EN is the local area energy, which is defined as:
[definition of EN, rendered as an image in the original]
in step 2, the high-frequency sub-bands are fused using the compressed sensing theory, specifically comprising the following steps:
1) decomposing the infrared and visible high-frequency sub-band images of size m × n (their symbols are rendered as images in the original) into non-overlapping sub-blocks of the same size, where j = 1, 2, 3, and sparsifying each sub-block image using the sym8 wavelet basis;
2) designing a measurement matrix Φ and using it to measure and sample the input high-frequency sub-band coefficients, obtaining the measurement vectors of the infrared and visible sub-bands (symbols rendered as images in the original), where k = 1, 2, …, m×n;
3) calculating the standard deviation SD_k and the sharpness EAV_k of the two measurement vectors, and obtaining the fused measurement vector with a fusion rule based on the combination of the regional standard deviation, the regional sharpness and an S-function, namely:
[fusion rule, rendered as an image in the original]
the image standard deviation formula being:
[formula, together with the definition of the mean it uses, rendered as images in the original]
the image sharpness formula being:
[formula, rendered as an image in the original]
the weighting coefficient ω being obtained from an S-function:
[the S-function and its argument, rendered as images in the original]
where f is the contraction factor of the S-function, f ≥ 1, and f = 5 is used here;
4) performing sparse reconstruction of the fused measurement vector, using orthogonal matching pursuit (OMP) as the reconstruction algorithm, to obtain the high-frequency sub-bands of the fused image; the obtained SW_F and the fused high-frequency sub-bands then being subjected to stationary wavelet reconstruction to obtain the low-frequency sub-band finally used for NSCT reconstruction;
in step 3, the specific method is as follows: the image sharpness formula is:
[formula (8), rendered as an image in the original]
the image sharpness is calculated according to formula (8) and compared with the threshold λ, and the number of layers whose high-frequency coefficients are enhanced is determined according to the comparison result, namely:
[decision rule, rendered as an image in the original]
where J is the number of decomposition layers, S is the combined sharpness of the source images, and α1 = α2 = 0.5, λ = 27;
In step 4, the specific fusion rule is:
[fusion rule, rendered as an image in the original]
to improve the visual quality of the image, the sub-bands other than the highest-layer high-frequency sub-band n are fused with the improved PCNN model, and the fusion coefficient is determined by comparing the sums of the firing amplitudes of the PCNN neurons, namely:
[selection rule, rendered as an image in the original]
where M_ij(n) is the sum of the pulse firing amplitudes output by the PCNN, j = 1, 2, …, n−1, ε is a user-defined threshold, and ε = 0.002;
because the output of the traditional PCNN uses a hard-limiting function, it cannot reflect differences in the firing amplitudes of neurons; a Sigmoid function is therefore adopted as the output of the PCNN, which better characterizes the differences in firing amplitude when synchronous pulses are emitted, the output of the PCNN being defined as:
[formula, rendered as an image in the original]
to better represent the edge information of the image, the sum-modified Laplacian SML and the local spatial frequency are used as the external input and the linking coefficient of the PCNN, respectively;
SML is defined as follows:
[formula, rendered as an image in the original]
the spatial frequency is:
[formula, rendered as an image in the original]
where RF, CF, MDF and SDF denote the row frequency, the column frequency, the main-diagonal frequency and the secondary-diagonal frequency, respectively, given by:
[formulas, rendered as images in the original]
CN201710621620.0A 2017-07-27 2017-07-27 Infrared and visible light image fusion algorithm based on mixed multi-scale analysis Active CN107451984B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710621620.0A CN107451984B (en) 2017-07-27 2017-07-27 Infrared and visible light image fusion algorithm based on mixed multi-scale analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710621620.0A CN107451984B (en) 2017-07-27 2017-07-27 Infrared and visible light image fusion algorithm based on mixed multi-scale analysis

Publications (2)

Publication Number Publication Date
CN107451984A CN107451984A (en) 2017-12-08
CN107451984B true CN107451984B (en) 2021-06-22

Family

ID=60489702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710621620.0A Active CN107451984B (en) 2017-07-27 2017-07-27 Infrared and visible light image fusion algorithm based on mixed multi-scale analysis

Country Status (1)

Country Link
CN (1) CN107451984B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108389158A (en) * 2018-02-12 2018-08-10 河北大学 A kind of infrared and visible light image interfusion method
CN108648174A (en) * 2018-04-04 2018-10-12 上海交通大学 A kind of fusion method of multilayer images and system based on Autofocus Technology
CN109410157B (en) * 2018-06-19 2022-02-08 昆明理工大学 Image fusion method based on low-rank sparse decomposition and PCNN
CN109118460B (en) * 2018-06-27 2020-08-11 河海大学 Method and system for synchronously processing light-splitting polarization spectrum information
CN109166088B (en) * 2018-07-10 2022-01-28 南京理工大学 Dual-waveband gray molten pool image fusion method based on non-downsampling wavelet transform
CN109035189B (en) * 2018-07-17 2021-07-23 桂林电子科技大学 Infrared and weak visible light image fusion method based on Cauchy fuzzy function
CN109191417A (en) * 2018-09-11 2019-01-11 中国科学院长春光学精密机械与物理研究所 It is detected based on conspicuousness and improves twin-channel method for self-adaption amalgamation and device
CN109345788A (en) * 2018-09-26 2019-02-15 国网安徽省电力有限公司铜陵市义安区供电公司 A kind of monitoring early-warning system of view-based access control model feature
CN109242815B (en) * 2018-09-28 2022-03-18 合肥英睿***技术有限公司 Infrared light image and visible light image fusion method and system
CN109360182A (en) * 2018-10-31 2019-02-19 广州供电局有限公司 Image interfusion method, device, computer equipment and storage medium
CN109978802B (en) * 2019-02-13 2023-01-17 中山大学 NSCT (non-subsampled Contourlet transform) and PCNN (pulse coupled neural network) -based high-dynamic-range image fusion method in compressed sensing domain
CN110110786B (en) * 2019-05-06 2023-04-14 电子科技大学 Infrared and visible light image fusion method based on NSCT and DWT
CN111861957B (en) * 2020-07-02 2024-03-08 Tcl华星光电技术有限公司 Image fusion method and device
CN112734683B (en) * 2021-01-07 2024-02-20 西安电子科技大学 Multi-scale SAR and infrared image fusion method based on target enhancement
CN114359687B (en) * 2021-12-07 2024-04-09 华南理工大学 Target detection method, device, equipment and medium based on multi-mode data double fusion
CN114757895B (en) * 2022-03-25 2023-05-02 国网浙江省电力有限公司电力科学研究院 Method and system for judging direct sunlight interference of infrared image of composite insulator
CN116403057B (en) * 2023-06-09 2023-08-18 山东瑞盈智能科技有限公司 Power transmission line inspection method and system based on multi-source image fusion

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254314A (en) * 2011-07-17 2011-11-23 西安电子科技大学 Visible-light/infrared image fusion method based on compressed sensing
WO2016050290A1 (en) * 2014-10-01 2016-04-07 Metaio Gmbh Method and system for determining at least one property related to at least part of a real environment
CN105719263A (en) * 2016-01-22 2016-06-29 昆明理工大学 Visible light and infrared image fusion algorithm based on NSCT domain bottom layer visual features
CN106600572A (en) * 2016-12-12 2017-04-26 长春理工大学 Adaptive low-illumination visible image and infrared image fusion method
CN106981057A (en) * 2017-03-24 2017-07-25 中国人民解放军国防科学技术大学 A kind of NSST image interfusion methods based on RPCA

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1873693B (en) * 2006-06-27 2010-05-12 上海大学 Method based on Contourlet transformation, modified type pulse coupling neural network, and image amalgamation
US9299130B2 (en) * 2013-03-15 2016-03-29 Trustees Of Tufts College Methods and apparatus for image processing and analysis
CN106327459B (en) * 2016-09-06 2019-03-12 四川大学 Visible light and infrared image fusion method based on UDCT and PCNN

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254314A (en) * 2011-07-17 2011-11-23 西安电子科技大学 Visible-light/infrared image fusion method based on compressed sensing
WO2016050290A1 (en) * 2014-10-01 2016-04-07 Metaio Gmbh Method and system for determining at least one property related to at least part of a real environment
CN105719263A (en) * 2016-01-22 2016-06-29 昆明理工大学 Visible light and infrared image fusion algorithm based on NSCT domain bottom layer visual features
CN106600572A (en) * 2016-12-12 2017-04-26 长春理工大学 Adaptive low-illumination visible image and infrared image fusion method
CN106981057A (en) * 2017-03-24 2017-07-25 中国人民解放军国防科学技术大学 A kind of NSST image interfusion methods based on RPCA

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A fusion algorithm for infrared and visible images based on RDU-PCNN and ICA-bases in NSST domain; Zhanwen Liu et al.; Infrared Physics & Technology; 31 Oct. 2016; pp. 183-190 *
An infrared and visible light image fusion algorithm based on NSST and dictionary learning (一种基于NSST和字典学习的红外和可见光图像融合算法); Liu Zhanwen et al.; Journal of Northwestern Polytechnical University (西北工业大学学报); June 2017; Vol. 35, No. 3; pp. 408-413 *

Also Published As

Publication number Publication date
CN107451984A (en) 2017-12-08

Similar Documents

Publication Publication Date Title
CN107451984B (en) Infrared and visible light image fusion algorithm based on mixed multi-scale analysis
Adu et al. Image fusion based on nonsubsampled contourlet transform for infrared and visible light image
CN106846289B (en) A kind of infrared light intensity and polarization image fusion method
CN104268847B (en) A kind of infrared and visible light image fusion method based on interaction non-local mean filtering
CN109886908B (en) Infrared image and visible light image fusion method
CN104123705B (en) A kind of super-resolution rebuilding picture quality Contourlet territory evaluation methodology
CN107730482A (en) A kind of sparse blending algorithm based on region energy and variance
CN105976346A (en) Infrared and visible light image fusion method based on robust principal component sparse decomposition
Dharejo et al. A deep hybrid neural network for single image dehazing via wavelet transform
Yadav et al. A review on image fusion methodologies and applications
Sharma et al. Satellite image contrast and resolution enhancement using discrete wavelet transform and singular value decomposition
Zhang et al. Infrared and visible image fusion using joint convolution sparse coding
CN114298950A (en) Infrared and visible light image fusion method based on improved GoDec algorithm
Xu et al. AACNet: Asymmetric attention convolution network for hyperspectral image dehazing
Kaur et al. A comparative study of various digital image fusion techniques: A review
Pai et al. Medical color image enhancement using wavelet transform and contrast stretching technique
Indira et al. Pixel based medical image fusion techniques using discrete wavelet transform and stationary wavelet transform
CN105096274A (en) Infrared image noise reduction method based on non-subsampled contourlet domain mixed statistical model
Zhang et al. Visible and infrared image fusion using convolutional dictionary learning with consensus auxiliary-auxiliary coupling
Amro et al. General shearlet pansharpening method using Bayesian inference
CN114549379B (en) Infrared and visible light image fusion method under non-downsampled shear wave transformation domain
Zhu et al. Multifocus image fusion based on uniform discrete curvelet transform
Yuan et al. An efficient method for traffic image denoising
Sangeetha et al. Performance analysis of exemplar based image inpainting algorithms for natural scene image completion
CN112508829B (en) Pan-sharpening method based on shear wave transformation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20171208

Assignee: Guangxi Yanze Information Technology Co.,Ltd.

Assignor: GUILIN University OF ELECTRONIC TECHNOLOGY

Contract record no.: X2023980046249

Denomination of invention: A fusion algorithm for infrared and visible light images based on hybrid multiscale analysis

Granted publication date: 20210622

License type: Common License

Record date: 20231108
