CN111709904B - Image fusion method and device - Google Patents

Image fusion method and device

Info

Publication number
CN111709904B
CN111709904B (application CN202010460793.0A)
Authority
CN
China
Prior art keywords
pyramid
image
images
fusion
laplacian
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010460793.0A
Other languages
Chinese (zh)
Other versions
CN111709904A (en)
Inventor
童佳文
何旋
杨祥勇
乔丽静
苗应亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maxvision Technology Corp
Original Assignee
Maxvision Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maxvision Technology Corp filed Critical Maxvision Technology Corp
Priority to CN202010460793.0A priority Critical patent/CN111709904B/en
Publication of CN111709904A publication Critical patent/CN111709904A/en
Application granted granted Critical
Publication of CN111709904B publication Critical patent/CN111709904B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/14Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image fusion method and device, wherein the method comprises the following steps: calculating the fusion coefficient of each pixel in two images respectively, and calculating the fusion coefficient Gaussian pyramid of each image according to the fusion coefficients of its pixels; calculating the Laplacian pyramids of the two images respectively; fusing the Laplacian pyramids of the two images according to the fusion coefficient Gaussian pyramids to obtain a fused Laplacian pyramid; and reconstructing a fused image according to the fused Laplacian pyramid. The technical scheme of the invention achieves a good image fusion effect with a simple algorithm.

Description

Image fusion method and device
Technical Field
The present invention relates to the field of image capturing, and in particular, to an image fusion method and apparatus.
Background
Imaging techniques are widely used in many fields. To improve the definition of captured images, many photographing devices use two cameras to capture images separately and then fuse the images captured by the two cameras so as to obtain a clearer picture.
However, existing image fusion schemes face a trade-off: a simple fusion algorithm yields a poor fusion effect, while a complex algorithm improves the fusion effect but takes so long that the image pickup device cannot output images in real time.
Disclosure of Invention
The invention aims to provide an image fusion method and device with a good image fusion effect and a simple algorithm.
In an embodiment of the present invention, an image fusion method is provided, which includes:
respectively calculating fusion coefficients of all pixel points in the two images, and respectively calculating a Gaussian pyramid of the fusion coefficients of the two images according to the fusion coefficients of all pixel points in the two images;
respectively calculating Laplacian pyramids of the two images;
fusing the Laplacian pyramids of the two images according to the fusion coefficient Gaussian pyramids of the two images to obtain a fused Laplacian pyramid;
reconstructing a fused image according to the fused Laplacian pyramid.
In the embodiment of the invention, the fusion coefficient of each pixel point in the two images is calculated as follows:
G1(i)=W1(i)/(W1(i)+W2(i));
G2(i)=W2(i)/(W1(i)+W2(i));
W(i)=C(i)*S(i);
wherein W(i) represents the weight value of the i-th pixel of the image, C(i) represents the contrast value of the i-th pixel, S(i) represents the saturation value of the i-th pixel; W1(i) and W2(i) represent the weight values of the i-th pixel in the two images respectively, and G1(i) and G2(i) represent the fusion coefficients of the i-th pixel in the two images respectively.
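The normalization above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the patent's code; the small `eps` term guarding against a zero weight sum is an assumption, since the text does not say how W1(i)+W2(i) = 0 is handled:

```python
import numpy as np

def fusion_coefficients(W1, W2, eps=1e-12):
    """Per-pixel fusion coefficients G1, G2 from the weight maps W1, W2,
    where W(i) = C(i) * S(i) (contrast times saturation) is computed by
    the caller. eps avoids division by zero (assumption)."""
    W1 = np.asarray(W1, dtype=np.float64)
    W2 = np.asarray(W2, dtype=np.float64)
    total = W1 + W2 + eps
    return W1 / total, W2 / total
```

By construction G1 + G2 is (almost exactly) 1 at every pixel, so the later blend is a convex combination of the two images' detail layers.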
In the embodiment of the invention, the Laplacian pyramid of the two images is fused by adopting the following formula:
Ln=LPn(1)*Gn(W1)+LPn(2)*Gn(W2),
wherein: ln represents the fused Laplacian pyramid, n represents the number of layers of the pyramid, LPn (1) represents the Laplacian pyramid of the first image LPn (2) represents the laplacian pyramid of the second image, gn (W1) represents the gaussian pyramid of the fusion coefficient of the first image, and Gn (W2) represents the gaussian pyramid of the fusion coefficient of the second image.
In the embodiment of the invention, reconstructing the fused image according to the fused Laplacian pyramid comprises the following steps:
deriving the corresponding Gaussian pyramid from the fused Laplacian pyramid;
and taking the data of the bottom layer (layer 0) of the derived Gaussian pyramid as the fused image data.
In the embodiment of the invention, the formula for deducing the corresponding Gaussian pyramid according to the fused Laplacian pyramid is as follows:
when n=n, G N =LP N
When 0 is less than or equal to n<N, G n =LP n +G * n+1 ,
Wherein Gn represents the N-th Gaussian pyramid data, LPn represents the N-th Laplacian pyramid data, N represents the number of layers of the pyramid top layer, G * n+1 Representing the data obtained by upsampling the n+1 layer gaussian pyramid.
In the embodiment of the invention, the image fusion method further comprises the following steps:
the number of layers of the Gaussian pyramid and the Laplacian pyramid is set according to the quality of the image.
In the embodiment of the invention, the Gaussian pyramid of the input image is generated in the following way:
G(n) = F_downsampling(G(n-1)),
wherein n is an integer and n > 0; G(n) represents the n-th layer data of the Gaussian pyramid; when n = 0, G(0) is the input image data; F_downsampling(G(n-1)) denotes applying convolution and downsampling to the data of the (n-1)-th Gaussian pyramid layer.
In the embodiment of the invention, the Laplacian pyramid of the input image is generated in the following way:
LP(n) = G(n) - F_upsampling(G(n+1)),
wherein n is an integer and n >= 0; LP(n) represents the n-th layer data of the Laplacian pyramid; G(n) represents the n-th layer data of the Gaussian pyramid of the input image; F_upsampling(G(n+1)) denotes applying upsampling and convolution to the data of the (n+1)-th Gaussian pyramid layer.
In an embodiment of the present invention, there is also provided an image fusion apparatus, including:
the fusion coefficient calculation module is used for calculating the fusion coefficient of each pixel point in the two images respectively;
the Gaussian pyramid generation module is used for respectively calculating the fusion coefficient Gaussian pyramid of the two images according to the fusion coefficient of each pixel point in the two images;
the Laplacian pyramid generation module is used for respectively calculating Laplacian pyramids of the two images;
the image fusion module is used for fusing the Laplacian pyramids of the two images according to the fusion coefficient Gaussian pyramids of the two images to obtain a fused Laplacian pyramid;
and the image reconstruction module is used for reconstructing the fused image according to the fused Laplacian pyramid.
An embodiment of the present invention provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image fusion method as provided in the first aspect described above.
Compared with the prior art, the image fusion method and device build the Laplacian pyramids of the two images and the Gaussian pyramids of their fusion coefficients, fuse the two Laplacian pyramids according to the fusion coefficient Gaussian pyramids to obtain a fused Laplacian pyramid, and then reconstruct the fused image from the fused Laplacian pyramid; the algorithm is simple, and the image fusion effect is good.
Drawings
Fig. 1 is a flowchart of an image fusion method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of an image obtained in an image fusion process of the image fusion method according to the embodiment of the present invention.
Fig. 3 is a schematic diagram of generating a laplacian pyramid according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a reconstructed image according to a fused laplacian pyramid, provided by an embodiment of the present invention.
Fig. 5 is a flowchart of an image fusion method according to an embodiment of the present invention.
Fig. 6 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The implementation of the present invention is described in detail below in connection with specific embodiments.
Fig. 1 is a flowchart of an image fusion method according to an embodiment of the present invention, and fig. 2 is a schematic diagram of an image obtained in an image fusion process of the image fusion method, where the method includes:
s1, respectively calculating fusion coefficients of all pixel points in two images;
s2, respectively calculating fusion coefficient Gaussian pyramids of the two images according to the fusion coefficients of all pixel points in the two images;
s3, respectively calculating Laplacian pyramids of the two images;
s4, fusing the Laplacian pyramid of the two images according to the Gaussian pyramid of the fusion coefficient of the two images to obtain a fused Laplacian pyramid;
s5, reconstructing a fused image according to the fused Laplacian pyramid. Specifically, in step S1, the fusion coefficients of each pixel point in the two images are calculated respectively, and the calculation formula is as follows:
G1(i)=W1(i)/(W1(i)+W2(i));
G2(i)=W2(i)/(W1(i)+W2(i));
W(i)=C(i)*S(i);
wherein W(i) represents the weight value of the i-th pixel of the image, C(i) represents the contrast value of the i-th pixel, S(i) represents the saturation value of the i-th pixel; W1(i) and W2(i) represent the weight values of the i-th pixel in the two images respectively, and G1(i) and G2(i) represent the fusion coefficients of the i-th pixel in the two images respectively.
It should be noted that the two images used for fusion may be two images captured by one camera, or images captured by two cameras respectively; the two images have the same specification and size. The higher the contrast and saturation of a pixel, the more information it carries; therefore, the contrast and saturation of each pixel are calculated in order to obtain the image fusion coefficients.
When calculating the contrast, each image first undergoes grayscale processing to generate a grayscale image; the grayscale image is then convolved, and the absolute value of the filter response is taken as the contrast of each pixel in the image.
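A sketch of this contrast measure, assuming grayscale conversion by the usual luma weights and a 3x3 Laplacian filter — the patent names neither the grayscale formula nor the specific kernel, so both choices here are assumptions:

```python
import numpy as np

def contrast_map(rgb):
    """Per-pixel contrast: convert to gray, convolve with a (assumed)
    3x3 Laplacian kernel, take the absolute filter response.
    rgb: array of shape (H, W, 3)."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)
    padded = np.pad(gray, 1, mode='edge')  # replicate-edge border (assumption)
    h, w = gray.shape
    out = np.zeros((h, w))
    for dy in range(3):          # accumulate the 3x3 convolution
        for dx in range(3):
            out += k[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return np.abs(out)
```

A flat region has zero Laplacian response and hence zero contrast, which matches the intuition that it carries little detail information.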
The saturation is calculated as follows:
first, the pixel average M = (R + G + B)/3 is calculated;
then, the index corresponding to saturation is calculated: index = ((R-M)^2 + (G-M)^2 + (B-M)^2)/3;
finally, the saturation value corresponding to the index is obtained by table lookup.
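The index computation can be written directly from the two formulas above (only the index is computed here; the lookup table mapping index to saturation is not given in the patent):

```python
def saturation_index(r, g, b):
    """Index corresponding to saturation for one pixel: the mean of the
    squared deviations of R, G, B from their average M = (R+G+B)/3."""
    m = (r + g + b) / 3.0
    return ((r - m) ** 2 + (g - m) ** 2 + (b - m) ** 2) / 3.0
```

A gray pixel (R = G = B) has index 0, while a pure color like (255, 0, 0) gives a large index, consistent with using the index as a saturation proxy.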
After the fusion coefficient of each pixel point in the two images is obtained, the fusion coefficients can respectively form two fusion coefficient images, and a fusion coefficient Gaussian pyramid of the two images can be obtained according to the two fusion coefficient images.
In step S2, a gaussian pyramid of the input image is generated as follows:
G(n) = F_downsampling(G(n-1)),
wherein n is an integer and n > 0; G(n) represents the n-th layer data of the Gaussian pyramid; when n = 0, G(0) is the input image data; F_downsampling(G(n-1)) denotes applying convolution and downsampling to the data of the (n-1)-th Gaussian pyramid layer.
Specifically, the original image is taken as the bottommost layer G0 (layer 0 of the Gaussian pyramid) and convolved with a 3x3 Gaussian kernel; the result is downsampled to obtain the next layer G1. Taking G1 as input, the convolution and downsampling operations are repeated to obtain successive higher layers; iterating several times forms a pyramid-shaped image data structure, namely the Gaussian pyramid.
G0, G1, ..., GN form the Gaussian pyramid, where G0 is the bottom layer (identical to the original image) and GN is the top. The current layer of the Gaussian pyramid is generated by Gaussian low-pass filtering the previous layer and then downsampling by 2, dropping every other row and column; each layer is therefore 4 times the size of the layer above it.
The convolution operation and downsampling process are exemplified as follows:
the convolution operation coefficients are as follows:
{1,2,1,
2,4,2,
1,2,1};
the convolution operation is firstly carried out on the image data, and the value range of the operation result is [0,255].
Sampling: assuming the width and height of the original picture are W and H respectively, the width and height of the sampled picture are:
NW = (W+1)/2
NH = (H+1)/2
sampled image data (x, y) = original image data (2x, 2y), for x = 0 ... NW-1, y = 0 ... NH-1 (the parentheses indicate the pixel coordinate position).
An example of the convolved image is shown below:
V00 V01 V02 V03 V04
V10 V11 V12 V13 V14
V20 V21 V22 V23 V24
V30 V31 V32 V33 V34
V40 V41 V42 V43 V44
The sampled image is as follows:
V00 V02 V04
V20 V22 V24
V40 V42 V44
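The reduce step — 3x3 convolution with the {1,2,1; 2,4,2; 1,2,1} kernel normalized by its sum 16, then keeping every other row and column — might be sketched as follows; the replicate-edge border handling is an assumption, since the patent does not specify it:

```python
import numpy as np

def pyr_down(img):
    """One Gaussian-pyramid reduction: 3x3 smoothing with the kernel
    {1,2,1; 2,4,2; 1,2,1}/16, then sampled(x, y) = smoothed(2x, 2y),
    giving the (W+1)//2 by (H+1)//2 size from the text."""
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=np.float64) / 16.0
    padded = np.pad(img.astype(np.float64), 1, mode='edge')  # border: assumption
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(3):          # accumulate the 3x3 convolution
        for dx in range(3):
            out += k[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out[::2, ::2]         # keep even rows and columns only
```

Because the kernel weights sum to 1, a constant image stays constant through the reduction, which is a quick sanity check on the normalization.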
In step S3, a laplacian pyramid of the input image is generated as follows:
LP(n) = G(n) - F_upsampling(G(n+1)),
wherein n is an integer and n >= 0; LP(n) represents the n-th layer data of the Laplacian pyramid; G(n) represents the n-th layer data of the Gaussian pyramid of the input image; F_upsampling(G(n+1)) denotes applying upsampling and convolution to the data of the (n+1)-th Gaussian pyramid layer.
It should be noted that during the construction of the Gaussian pyramid, part of the high-frequency detail information is lost through the convolution and downsampling operations. To describe this high-frequency information, the Laplacian pyramid is defined. Subtracting from each layer of the Gaussian pyramid the prediction obtained by upsampling and Gaussian-convolving the next higher (smaller) layer yields a series of difference images, namely the Laplacian pyramid decomposition.
The steps of convolution and upsampling are exemplified as follows:
the convolution operation coefficients are as follows:
{1,2,1,
2,4,2,
1,2,1};
A. First, the original picture (width W, height H) is enlarged to the required size (width NW, height NH): enlarged image data (2x, 2y) = original image data (x, y), for x = 0 ... W-1, y = 0 ... H-1, where (x, y) indicates the pixel coordinate position; positions that receive no value are set to 0;
B. the convolution operation is performed on the enlarged image;
C. the obtained convolution values are multiplied by 4 to obtain the final result.
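Steps A-C can be sketched as follows (again with replicate-edge borders as an assumption); multiplying by 4 compensates for the three quarters of samples that were zero-filled in step A:

```python
import numpy as np

def pyr_up(img, out_shape):
    """Expand step: insert zeros so that enlarged(2x, 2y) = img(x, y),
    smooth with the {1,2,1; 2,4,2; 1,2,1}/16 kernel, multiply by 4."""
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=np.float64) / 16.0
    nh, nw = out_shape
    big = np.zeros((nh, nw))
    big[::2, ::2] = img[:(nh + 1) // 2, :(nw + 1) // 2]  # step A: zero insertion
    padded = np.pad(big, 1, mode='edge')                 # border: assumption
    out = np.zeros((nh, nw))
    for dy in range(3):          # step B: 3x3 convolution
        for dx in range(3):
            out += k[dy, dx] * padded[dy:dy + nh, dx:dx + nw]
    return 4.0 * out             # step C: restore brightness
```

In the interior, a constant image is reproduced exactly: at every output position the nonzero kernel taps sum to 4/16, and the final factor of 4 brings the result back to the original value.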
As shown in fig. 3, let N be the layer number of the top of the Laplacian pyramid, and LPl the l-th layer image of the Laplacian decomposition. The pyramid formed by LP0, LP1, LP2, ..., LPN is the Laplacian pyramid. Each layer is the difference between the corresponding Gaussian pyramid layer and the upsampled prediction from the layer above it (for example, LP0 = G0 - upsample(G1)); this process is equivalent to band-pass filtering, so the Laplacian pyramid is also called a band-pass pyramid decomposition.
In step S4, the laplacian pyramids of the two images are fused using the following formula:
Ln=LPn(1)*Gn(W1)+LPn(2)*Gn(W2),
wherein: ln represents the fused Laplacian pyramid, n represents the number of layers of the pyramid, LPn (1) represents the Laplacian pyramid of the first image LPn (2) represents the laplacian pyramid of the second image, gn (W1) represents the gaussian pyramid of the fusion coefficient of the first image, and Gn (W2) represents the gaussian pyramid of the fusion coefficient of the second image.
In step S5, the corresponding Gaussian pyramid is derived from the fused Laplacian pyramid, and the data of the bottom layer (layer 0) of the derived Gaussian pyramid is taken as the fused image data.
It should be noted that, since the Laplacian pyramid is derived from the Gaussian pyramid, the corresponding Gaussian pyramid can in turn be recovered from the Laplacian pyramid. Specifically, as shown in fig. 4, the formulas for deriving the corresponding Gaussian pyramid from the fused Laplacian pyramid are as follows:
when n=n, G N =LP N
When 0 is less than or equal to n<N, G n =LP n +G * n+1 ,
Wherein Gn represents the N-th layer Gaussian pyramid data, LPn represents the N-th layer Laplacian pyramid data, and N represents the layer number of the pyramid top layer,G * n+1 Representing the data obtained by upsampling the n+1 layer gaussian pyramid.
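The reconstruction recurrence can be sketched generically; here `upsample(coarse, finer)` stands for any expand function (such as a zero-insertion-and-convolution step) that brings `coarse` to the size of `finer`, and is passed in rather than fixed:

```python
def collapse(laplacian, upsample):
    """Reconstruct the image from a Laplacian pyramid (bottom-to-top list):
    start from the top level (G(N) = LP(N)), then repeatedly apply
    G(n) = LP(n) + upsample(G(n+1)); the level-0 result is the image."""
    g = laplacian[-1]
    for lp in reversed(laplacian[:-1]):
        g = lp + upsample(g, lp)  # second argument gives the target size
    return g
```

With an exact expand function, collapsing a pyramid built as LP(n) = G(n) - upsample(G(n+1)) recovers G(0) exactly, since the subtraction and addition cancel level by level.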
In the image fusion method of the embodiment of the invention, before the Laplacian pyramids and fusion coefficient Gaussian pyramids of the two images are calculated, the number of layers of the Gaussian and Laplacian pyramids must be set according to the quality of the images, so that the fusion coefficient Gaussian pyramids and the Laplacian pyramids of the two images have the same number of layers.
As shown in fig. 5, the layer numbers of the gaussian pyramid and the laplacian pyramid are set as follows:
first, the maximum layer number maxLevel is calculated from the image width and height:
maxLevel=log(min(width,height))/log2;
the pyramid layer number LayerNum is then set; it can be customized, with value range [1, maxLevel] and default value 4;
a threshold Threshold1 is set, with value range [0, 0.1] and default value 0.008. The number Count of pixels whose weight value is 4095 in the darker of the two images is counted; if Count <= width*height*Threshold1, the pyramid layer number is set to LayerNum;
a threshold Threshold2 is set, with value range [0, 0.1] and default value 0.01. When width*height*Threshold1 < Count <= width*height*Threshold2, the pyramid layer number is set to LayerNum*2-1;
otherwise, the pyramid layer number is set to the maximum layer number maxLevel.
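The adaptive layer selection reads as follows in outline; the `LayerNum*2-1` middle branch and the clamping to maxLevel reflect one reading of the original text and are assumptions:

```python
import math

def pyramid_levels(width, height, count, layer_num=4, t1=0.008, t2=0.01):
    """Choose the pyramid layer number. `count` is the number of pixels
    with weight value 4095 in the darker of the two images."""
    max_level = int(math.log(min(width, height)) / math.log(2))
    layer_num = max(1, min(layer_num, max_level))  # value range [1, maxLevel]
    if count <= width * height * t1:
        return layer_num
    if count <= width * height * t2:
        return min(layer_num * 2 - 1, max_level)  # assumed reading of "LayerNum 2-1"
    return max_level
```

The effect is that the more saturated (weight-4095) pixels the darker image contains, the more pyramid layers are used, up to log2 of the smaller image dimension.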
As shown in fig. 6, corresponding to the above image fusion method, in an embodiment of the present invention, there is further provided an image fusion apparatus, which includes:
the fusion coefficient calculation module 61 is configured to calculate fusion coefficients of each pixel point in the two images respectively;
the gaussian pyramid generation module 62 is configured to calculate a gaussian pyramid of a fusion coefficient of the two images according to the fusion coefficient of each pixel point in the two images;
the laplacian pyramid generation module 63 is used for respectively calculating laplacian pyramids of the two images;
the image fusion module 64 is configured to fuse the laplacian pyramids of the two images according to the fusion coefficient gaussian pyramid of the two images, and obtain a fused laplacian pyramid;
the image reconstruction module 65 is configured to reconstruct a fused image according to the fused laplacian pyramid.
In the above image fusion device, the information interaction and execution processes between the modules are based on the same concept as the method embodiments; their specific implementation and technical effects are described in detail in the image fusion method above and are not repeated here.
The present embodiment provides a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the methods provided by the above-described method embodiments, for example, including: respectively calculating fusion coefficients of all pixel points in the two images, and respectively calculating a Gaussian pyramid of the fusion coefficients of the two images according to the fusion coefficients of all pixel points in the two images; respectively calculating Laplacian pyramids of the two images; according to the fusion coefficient Gaussian pyramid of the two images and the Laplacian pyramid of the two images, fusion is carried out to obtain a fused Laplacian pyramid; reconstructing a fused image according to the fused Laplacian pyramid.
In summary, the image fusion method and device provided by the invention build the Laplacian pyramids of the two images and the Gaussian pyramids of their fusion coefficients, fuse the two Laplacian pyramids according to the fusion coefficient Gaussian pyramids to obtain a fused Laplacian pyramid, and then reconstruct the fused image from the fused Laplacian pyramid; the algorithm is simple, and the image fusion effect is good.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

1. A method of image fusion, the method comprising:
respectively calculating fusion coefficients of all pixel points in the two images, and respectively calculating a Gaussian pyramid of the fusion coefficients of the two images according to the fusion coefficients of all pixel points in the two images;
respectively calculating Laplacian pyramids of the two images;
fusing the Laplacian pyramids of the two images according to the Gaussian pyramid of the fusion coefficient of the two images to obtain the fused Laplacian pyramids;
reconstructing a fused image according to the fused Laplacian pyramid,
the layer number setting process of the Gaussian pyramid and the Laplacian pyramid is as follows:
first, the maximum layer number maxLevel is calculated from the image width and height:
maxLevel=log(min(width,height))/log2;
setting an initial pyramid layer number LayerNum, wherein the value range is [1, maxLevel ];
setting a threshold Threshold1 = 0.008, counting the number Count of pixels of the darker of the two images, and if Count <= width*height*Threshold1, setting the pyramid layer number to LayerNum;
setting a threshold Threshold2 = 0.01, and setting the pyramid layer number to LayerNum*2-1 when width*height*Threshold1 < Count <= width*height*Threshold2;
otherwise, the layer number of the pyramid is set to be the maximum layer number maxLevel.
2. The image fusion method of claim 1, wherein the fusion coefficients of each pixel in the two images are calculated as follows:
G1(i)=W1(i)/(W1(i)+W2(i));
G2(i)=W2(i)/(W1(i)+W2(i));
W(i)=C(i)*S(i);
wherein W (i) represents the weight value of the ith pixel point of the image, C (i) represents the contrast value of the ith pixel point of the image, S (i) represents the saturation value of the ith pixel point of the image, W1 (i), W2 (i) respectively represents the weight values of the ith pixel points in the two images, and G1 (i), G2 (i) respectively represent the fusion coefficients of the ith pixel points in the two images.
3. The image fusion method of claim 1, wherein the laplacian pyramids of the two images are fused using the formula:
Ln=LPn(1)*Gn(W1)+LPn(2)*Gn(W2),
wherein: ln represents the number of layers of the fused Laplacian pyramid, n represents the Laplacian pyramid of the first image, LPn (1) represents the Laplacian pyramid of the second image, gn (W1) represents the Gaussian pyramid of the fusion coefficient of the first image, and Gn (W2) represents the Gaussian pyramid of the fusion coefficient of the second image.
4. The method of image fusion according to claim 1, wherein reconstructing the fused image from the fused laplacian pyramid comprises:
deducing a corresponding Gaussian pyramid according to the fused Laplacian pyramid;
and taking the data of the derived Gaussian pyramid first layer as fused image data.
5. The image fusion method of claim 4, wherein the formula for deriving the corresponding gaussian pyramid from the fused laplacian pyramid is as follows:
when n=n, G N =LP N
When 0 is less than or equal to n<N, G n =LP n +G * n+1 ,
Wherein Gn represents an nth layer Gao Sijin wordTower data, LPn represents the N-th Laplacian pyramid data, N represents the number of layers at the top of the pyramid, G * n+1 Representing the data obtained by upsampling the n+1 layer gaussian pyramid.
6. The image fusion method of claim 1, further comprising:
the number of layers of the Gaussian pyramid and the Laplacian pyramid is set according to the quality of the image.
7. The image fusion method of claim 1, wherein the fusion coefficient gaussian pyramid of the input image is generated by:
G(n) = F_downsampling(G(n-1)),
wherein n is an integer and n > 0; G(n) represents the n-th layer data of the Gaussian pyramid; when n = 0, G(0) is the fusion coefficient data of the input image; F_downsampling(G(n-1)) denotes applying convolution and downsampling to the data of the (n-1)-th Gaussian pyramid layer.
8. The image fusion method of claim 1, wherein the laplacian pyramid of the input image is generated by:
LP(n) = G(n) - F_upsampling(G(n+1)),
wherein n is an integer and n >= 0; LP(n) represents the n-th layer data of the Laplacian pyramid; G(n) represents the n-th layer data of the Gaussian pyramid of the input image; F_upsampling(G(n+1)) denotes applying upsampling and convolution to the data of the (n+1)-th Gaussian pyramid layer.
9. An image fusion apparatus, comprising:
the fusion coefficient calculation module is used for calculating the fusion coefficient of each pixel point in the two images respectively;
the Gaussian pyramid generation module is used for respectively calculating the fusion coefficient Gaussian pyramid of the two images according to the fusion coefficient of each pixel point in the two images;
the Laplacian pyramid generation module is used for respectively calculating Laplacian pyramids of the two images;
the image fusion module is used for fusing the Laplacian pyramids of the two images according to the fusion coefficient Gaussian pyramids of the two images to obtain fused Laplacian pyramids;
an image reconstruction module for reconstructing a fused image according to the fused Laplacian pyramid,
the layer number setting process of the Gaussian pyramid and the Laplacian pyramid is as follows:
first, the maximum layer number maxLevel is calculated from the image width and height:
maxLevel=log(min(width,height))/log2;
setting an initial pyramid layer number LayerNum, wherein the value range is [1, maxLevel ];
setting a threshold Threshold1 = 0.008, counting the number Count of pixels of the darker of the two images, and if Count <= width*height*Threshold1, setting the pyramid layer number to LayerNum;
setting a threshold Threshold2 = 0.01, and setting the pyramid layer number to LayerNum*2-1 when width*height*Threshold1 < Count <= width*height*Threshold2;
otherwise, the layer number of the pyramid is set to be the maximum layer number maxLevel.
10. A non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor performs the steps of the image fusion method according to any of claims 1 to 8.
CN202010460793.0A 2020-05-27 2020-05-27 Image fusion method and device Active CN111709904B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010460793.0A CN111709904B (en) 2020-05-27 2020-05-27 Image fusion method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010460793.0A CN111709904B (en) 2020-05-27 2020-05-27 Image fusion method and device

Publications (2)

Publication Number Publication Date
CN111709904A (en) 2020-09-25
CN111709904B (en) 2023-12-26

Family

ID=72537977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010460793.0A Active CN111709904B (en) 2020-05-27 2020-05-27 Image fusion method and device

Country Status (1)

Country Link
CN (1) CN111709904B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112561792B (en) * 2020-12-25 2023-10-03 北京百度网讯科技有限公司 Image style migration method and device, electronic equipment and storage medium
CN112561909B (en) * 2020-12-28 2024-05-28 南京航空航天大学 Fusion variation-based image countermeasure sample generation method
CN114549377B (en) * 2022-01-11 2024-02-02 上海应用技术大学 Medical image fusion method
CN116563190B (en) * 2023-07-06 2023-09-26 深圳市超像素智能科技有限公司 Image processing method, device, computer equipment and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011008239A1 (en) * 2009-06-29 2011-01-20 Thomson Licensing Contrast enhancement
US8885976B1 (en) * 2013-06-20 2014-11-11 Cyberlink Corp. Systems and methods for performing image fusion
CN106709898A (en) * 2017-03-13 2017-05-24 微鲸科技有限公司 Image fusing method and device
CN107767330A (en) * 2017-10-17 2018-03-06 中电科新型智慧城市研究院有限公司 A kind of image split-joint method
CN107845128A (en) * 2017-11-03 2018-03-27 安康学院 A kind of more exposure high-dynamics image method for reconstructing of multiple dimensioned details fusion
CN110852982A (en) * 2019-11-19 2020-02-28 常州工学院 Self-adaptive exposure adjustment multi-scale entropy fusion underwater image enhancement method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160148359A1 (en) * 2014-11-20 2016-05-26 Siemens Medical Solutions Usa, Inc. Fast Computation of a Laplacian Pyramid in a Parallel Computing Environment



Similar Documents

Publication Publication Date Title
CN111709904B (en) Image fusion method and device
CN111539879B (en) Video blind denoising method and device based on deep learning
CN111275626B (en) Video deblurring method, device and equipment based on ambiguity
EP2164040B1 (en) System and method for high quality image and video upscaling
CN109978774B (en) Denoising fusion method and device for multi-frame continuous equal exposure images
TWI665916B (en) Method, apparatus, and circuitry of noise reduction
JP2007188493A (en) Method and apparatus for reducing motion blur in motion blur image, and method and apparatus for generating image with reduced motion blur by using a plurality of motion blur images each having its own blur parameter
KR101685885B1 (en) Image processing device, image processing method, and image processing program
CN107133923B (en) Fuzzy image non-blind deblurring method based on adaptive gradient sparse model
KR101633397B1 (en) Image restoration device, image restoration method and image restoration system
CN111192226B (en) Image fusion denoising method, device and system
CN110418065B (en) High dynamic range image motion compensation method and device and electronic equipment
US8587703B2 (en) Systems and methods for image restoration
JP2009130537A (en) Image processing device and image processing method
Zheng et al. Wavelet based nonlocal-means super-resolution for video sequences
KR101362011B1 (en) Method for blur removing ringing-atifactless
JP4544336B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
Jeong et al. Multi-frame example-based super-resolution using locally directional self-similarity
CN116847209B (en) Log-Gabor and wavelet-based light field full-focusing image generation method and system
CN110111261B (en) Adaptive balance processing method for image, electronic device and computer readable storage medium
CN113793272B (en) Image noise reduction method and device, storage medium and terminal
CN111353982B (en) Depth camera image sequence screening method and device
JP6221333B2 (en) Image processing apparatus, image processing circuit, and image processing method
KR101464743B1 (en) Signal dependent noise estimation apparatus and method on camera module
CN112106352A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant