CN106651800B - MS and PAN image fusion method based on PAN modulation and multivariate linear regression - Google Patents

MS and PAN image fusion method based on PAN modulation and multivariate linear regression

Info

Publication number: CN106651800B (application number CN201611208080.5A)
Authority: CN (China)
Prior art keywords: image, PAN, formula, band, original
Legal status: Expired - Fee Related
Other languages: Chinese (zh)
Other versions: CN106651800A
Inventors: 李慧 (Li Hui), 荆林海 (Jing Linhai)
Current and original assignee: Institute of Remote Sensing and Digital Earth of CAS
Application CN201611208080.5A filed by the Institute of Remote Sensing and Digital Earth of CAS; application published as CN106651800A, patent granted as CN106651800B

Classifications

    • G06T 5/73: Image enhancement or restoration - Deblurring; Sharpening
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G06T 2207/10036: Multispectral image; Hyperspectral image
    • G06T 2207/20221: Image fusion; Image merging
    • G06T 2207/30168: Image quality inspection

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for fusing MS and PAN images based on PAN modulation and multivariate linear regression, which comprises the following steps: the original low-resolution MS image I_L is upsampled to the resolution of the original PAN image P by cubic convolution to obtain an upsampled MS image I; the original PAN image is downsampled to the spatial resolution of the MS image by averaging to obtain the PAN image P_L at MS spatial resolution; a PAN image P_S at MS spatial resolution is synthesized as a weighted sum of the individual MS bands; the haze values H_P and H_i of the PAN band and of the ith MS band are calculated; a contrast-enhanced PAN image P_F is computed; the upsampled MS image I, the synthesized PAN image P_S, and the contrast-enhanced PAN image P_F are fused. The invention has the beneficial effects that: the spectral distortion of the fused image is significantly reduced, fused images with different degrees of spatial-detail enhancement can be provided, and, in particular, when registration errors exist between MS and PAN pixels, the fused image exhibits less spectral distortion and a better visual effect.

Description

MS and PAN image fusion method based on PAN modulation and multivariate linear regression
Technical Field
The invention relates to the technical field of remote sensing image processing, in particular to a method for fusing MS and PAN images based on PAN modulation and multivariate linear regression.
Background
Currently, a large number of on-orbit satellites (e.g., Landsat-7 ETM+, IKONOS, QuickBird, SPOT-5, and WorldView-2/3) provide both high-spatial-resolution panchromatic (PAN) and low-spatial-resolution multispectral (MS) remote sensing images. Since many applications require multispectral images with high spatial resolution, the MS and PAN images must be fused to obtain spatially enhanced MS images for applications such as remote sensing image interpretation, land-cover classification, and target detection. Researchers at home and abroad have developed a large number of remote sensing fusion techniques in recent years to fuse MS and PAN images into a high-spatial-resolution MS image.
PAN-modulation-based fusion (for example, the PANSHARP method in PCI Geomatica software) is one of the fusion approaches in common use. MS and PAN fusion based on PAN modulation rests on the assumption that the ratio of the fused MS band to the original MS band equals the ratio of the original PAN image to the synthesized PAN image. Since the calculation formula implies a spectral-distortion-minimization model, the fused image has a better visual effect and less spectral distortion. Fusion algorithms based on PAN modulation mainly include the Brovey transform, Pradines' method, Synthetic Variable Ratio (SVR), Smoothing Filter-based Intensity Modulation (SFIM), PANSHARP (PS), Haze- and Ratio-based (HR), and the like. PAN-modulation-based fusion has the advantages of simple computation and strong robustness, and is widely applied to the fusion of satellite data. The registration accuracy of the images to be fused, the influence of atmospheric radiation, and the mixed-pixel problem are key issues affecting the quality of the fused image.
The Haze- and Ratio-based (HR) method proposed by Jing and Cheng (2009) uses the original PAN image for fusion. The P_S used in the HR method is generated as follows: the original PAN image is first reduced in spatial resolution by averaging and then upsampled back to the original PAN spatial resolution. This approach does not improve the result when a small spatial misalignment exists between the MS and PAN images.
An effective solution to the problems in the related art has not been proposed yet.
Disclosure of Invention
Aiming at the technical problems in the related art, the invention provides an MS and PAN image fusion method based on PAN modulation and multiple linear regression, which can reduce the influence of small spatial misalignments between the MS and PAN images on the quality of the fused image.
The purpose of the invention is realized by the following technical scheme:
a MS and PAN image fusion method based on PAN modulation and multivariate linear regression comprises the following steps:
S1: the original low-resolution MS image I_L is upsampled to the resolution of the original PAN image P by cubic convolution, obtaining the upsampled MS image I;
S2: the original PAN image is downsampled to the spatial resolution of the MS image by averaging, obtaining the PAN image P_L at MS spatial resolution;
S3: a PAN image P_S at MS spatial resolution is synthesized as a weighted sum of the individual MS bands;
S4: the haze values H_P and H_i of the PAN band and of the ith MS band are calculated, where H_P = min(P) and H_i = min(MS_i);
S5: a contrast-enhanced PAN image P_F is obtained according to the spatial-information enhancement coefficient k;
S6: according to the value of the spatial-information enhancement coefficient k, the upsampled MS image I, the synthesized PAN image P_S, and the contrast-enhanced PAN image P_F are fused.
Further, in step S3, with P_L as the dependent variable and the bands of I_L as the independent variables, the regression coefficients a_i and the constant b of the MS bands are solved by least squares from formula (1):

$$P_L(m,n) = \sum_{i=1}^{N} a_i\, I_i^L(m,n) + b \qquad (1)$$

The weighted sum of the bands of the upsampled MS image is then computed using formula (2) to synthesize the PAN image P_S:

$$P_S(m,n) = \sum_{i=1}^{N} a_i\, I_i(m,n) + b \qquad (2)$$

In formulas (1) and (2), N is the number of bands of the MS image, I_i^L is the ith band of the original MS image, I_i is the ith band of the upsampled MS image, P_S is the synthesized PAN image, and a_i and b are the coefficient of the ith band and the constant term, respectively.
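As a concrete illustration, formulas (1) and (2) reduce to an ordinary least-squares fit followed by a weighted band sum. The following Python sketch (NumPy-based; the array shapes and the helper name synthesize_pan are assumptions for illustration, not part of the patent) shows one way to realize step S3:

    import numpy as np

    def synthesize_pan(I_L, I, P_L):
        """Formulas (1)-(2): fit P_L as a weighted sum of the original MS
        bands by least squares, then apply the same weights to the
        upsampled MS bands to synthesize P_S.

        I_L : (N, h, w) original low-resolution MS bands
        I   : (N, H, W) upsampled MS bands
        P_L : (h, w)    PAN image reduced to MS resolution
        """
        N, h, w = I_L.shape
        # Design matrix: one column per MS band plus a constant column for b.
        A = np.column_stack([band.ravel() for band in I_L] + [np.ones(h * w)])
        coeffs, *_ = np.linalg.lstsq(A, P_L.ravel(), rcond=None)
        a, b = coeffs[:N], coeffs[N]
        # Formula (2): P_S = sum_i a_i * I_i + b at the full PAN resolution.
        P_S = np.tensordot(a, I, axes=1) + b
        return P_S, a, b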
Further, in step S5, the contrast-enhanced image P_F is obtained by formula (3), and the value of the threshold T (used in step S6) is set to the variance of P_F:

$$P_F = P + k \cdot P_E \qquad (3)$$

In formula (3), P_E is the edge-detail image obtained by filtering the original PAN image with a Laplacian filter. Further, the edge-detail image P_E is generated with the following Laplacian filter:

[Formula (4): the Laplacian filter kernel g]

In formula (4), g is the Laplacian filter.
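A minimal sketch of formulas (3) and (4) follows, assuming the common 3x3 eight-neighbour Laplacian kernel; the patent's exact kernel coefficients appear only as an image in the source and are not recoverable, so the kernel here is an assumption:

    import numpy as np
    from scipy.ndimage import convolve

    def enhance_pan(P, k):
        """Formula (3): P_F = P + k * P_E, with P_E the Laplacian edge-detail
        image. The kernel g below is the usual eight-neighbour Laplacian,
        assumed here for illustration."""
        g = np.array([[-1.0, -1.0, -1.0],
                      [-1.0,  8.0, -1.0],
                      [-1.0, -1.0, -1.0]])
        P_E = convolve(P.astype(float), g, mode="nearest")
        P_F = P + k * P_E
        T = np.var(P_F)  # threshold T set to the variance of P_F, as stated
        return P_F, T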
Further, in step S6:

For pixels (m, n) whose gray level in the PAN image is greater than or equal to the threshold T, the fused spectrum F_i is calculated by formula (5):

$$F_i(m,n) = \bigl(I_i(m,n) - H_i\bigr)\,\frac{P_F(m,n) - H_P}{P_S(m,n) - H_P} + H_i \qquad (5)$$

For pixels (m, n) whose gray level in the PAN image is less than the threshold T, relatively low haze values H_P^d and H_i^d are calculated according to formulas (6) and (7):

$$H_P^{d} = p \cdot H_P \qquad (6)$$

$$H_i^{d} = p \cdot H_i \qquad (7)$$

In formulas (6) and (7), 0.5 ≤ p ≤ 1, and H_P^d and H_i^d are the haze values of the PAN band and of the ith band for the dark pixels in the image.

The fused spectrum F_i(m,n) is then calculated by formula (8):

$$F_i(m,n) = \bigl(I_i(m,n) - H_i^{d}\bigr)\,\frac{P_F(m,n) - H_P^{d}}{P_S(m,n) - H_P^{d}} + H_i^{d} \qquad (8)$$

In formulas (5) and (8), the spatial-information enhancement parameter k enters through P_F; P_S is the synthesized PAN image, and P_F is the contrast-enhanced image of the original PAN band.
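The bright/dark branching of formulas (5)-(8) can be sketched as below. The ratio form of the modulation follows the HR method on which the patent builds, and the choice of comparing P_F against T is an assumption, so treat this as an illustration rather than the patent's exact computation:

    import numpy as np

    def fuse(I, P_S, P_F, H_P, H_i, T, p=0.7):
        """Formulas (5)-(8): haze-corrected PAN modulation.

        I   : (N, H, W) upsampled MS bands
        P_S : (H, W) synthesized PAN;  P_F : (H, W) contrast-enhanced PAN
        H_P : scalar PAN haze value;   H_i : (N,) per-band MS haze values
        T   : bright/dark threshold;   p   : haze-reduction factor, 0.5 <= p <= 1
        """
        bright = P_F >= T            # pixels handled by formula (5)
        eps = 1e-12                  # guards against division by zero
        # Modulation ratios with full haze (bright) and reduced haze (dark).
        ratio_b = (P_F - H_P) / (P_S - H_P + eps)
        ratio_d = (P_F - p * H_P) / (P_S - p * H_P + eps)
        F = np.empty_like(I, dtype=float)
        for i in range(I.shape[0]):
            F[i] = np.where(bright,
                            (I[i] - H_i[i]) * ratio_b + H_i[i],          # (5)
                            (I[i] - p * H_i[i]) * ratio_d + p * H_i[i])  # (8)
        return F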
The invention has the beneficial effects that: the spectral distortion of the fused image is significantly reduced; fused images with different degrees of spatial-detail enhancement can be provided; and, in particular, when registration errors exist between MS and PAN pixels, the fused image exhibits less spectral distortion and a better visual effect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present invention; other drawings can be derived from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a flowchart of an MS and PAN image fusion method based on PAN modulation and multiple linear regression according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. The described embodiments are only a part of the embodiments of the present invention, not all of them; all other embodiments derived by those of ordinary skill in the art from the embodiments given herein without creative effort fall within the scope of the present invention.
As shown in fig. 1, a method for fusing an MS and a PAN image based on PAN modulation and multiple linear regression according to an embodiment of the present invention includes the following steps:
S1: the original low-resolution MS image I_L is upsampled to the resolution of the original PAN image P by cubic convolution, obtaining the upsampled MS image I;
S2: P is downsampled to the spatial resolution of the MS image by averaging, obtaining the PAN image P_L at MS spatial resolution;
S3: with P_L as the dependent variable and the bands of I_L as the independent variables, the regression coefficients a_i and the constant b of the MS bands are solved by least squares from formula (1):

$$P_L(m,n) = \sum_{i=1}^{N} a_i\, I_i^L(m,n) + b \qquad (1)$$

After solving for the regression coefficients a_i and b, the synthesized PAN image P_S is generated using formula (2):

$$P_S(m,n) = \sum_{i=1}^{N} a_i\, I_i(m,n) + b \qquad (2)$$

In formulas (1) and (2), N is the number of bands of the MS image, I_i^L is the ith band of the original MS image, I_i is the ith band of the upsampled MS image, P_S is the synthesized PAN image, and a_i and b are the coefficient of the ith band and the constant term, respectively.
S4: calculating fog values H of the PAN wave band and the ith wave band of the MS imagePAnd HiRespectively determined by the minimum value of the I wave band of the PAN wave band and the MS image, namely the fog value of the PAN wave band is HPMin (P), the fog value of the ith wave band of the MS image is Hi=min(MSi)。
S5: obtaining a contrast enhanced PAN image P according to the spatial information enhancement degree coefficient kF. Contrast enhanced image PFObtained by the following formula, the value of the threshold value T is set to PFThe variance of (a);
Figure BDA0001190413580000054
p in formula (3)EFor Laplacian (pull) of the original PAN imagePlausibility) filtered edge detail image, generating an edge detail image PEThe following Laplacian filters were employed:
Figure BDA0001190413580000055
in the formula (4), g is a Laplacian filter.
S6: according to the value of the spatial-information enhancement coefficient k, the upsampled MS image I, the synthesized PAN image P_S, and the contrast-enhanced PAN image P_F are fused.
For pixels (m, n) whose gray level in the PAN image is greater than or equal to the threshold T, the fused spectrum F_i is calculated by formula (5):

$$F_i(m,n) = \bigl(I_i(m,n) - H_i\bigr)\,\frac{P_F(m,n) - H_P}{P_S(m,n) - H_P} + H_i \qquad (5)$$

For pixels (m, n) whose gray level in the PAN image is less than the threshold T, relatively low haze values H_P^d and H_i^d are calculated according to formulas (6) and (7):

$$H_P^{d} = p \cdot H_P \qquad (6)$$

$$H_i^{d} = p \cdot H_i \qquad (7)$$

In formulas (6) and (7), 0.5 ≤ p ≤ 1, and H_P^d and H_i^d are the haze values of the PAN band and of the ith band for the dark pixels in the image. The fused spectrum F_i(m,n) is then calculated by formula (8):

$$F_i(m,n) = \bigl(I_i(m,n) - H_i^{d}\bigr)\,\frac{P_F(m,n) - H_P^{d}}{P_S(m,n) - H_P^{d}} + H_i^{d} \qquad (8)$$

In formulas (5) and (8), the spatial-information enhancement parameter k enters through P_F; P_S is the synthesized PAN image, and P_F is the contrast-enhanced image of the original PAN band.
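Putting the pieces together, the whole of S1-S6 composes as follows; this is a sketch reusing the hypothetical helpers upsample_ms, downsample_pan, synthesize_pan, haze_values, enhance_pan, and fuse sketched above, not the patent's reference implementation:

    def pansharpen(I_L, P, r, k=0.0, p=0.7):
        """End-to-end sketch of steps S1-S6."""
        I = upsample_ms(I_L, r)                   # S1: cubic upsampling
        P_L = downsample_pan(P, r)                # S2: average downsampling
        P_S, _, _ = synthesize_pan(I_L, I, P_L)   # S3: regression synthesis
        H_P, H_i = haze_values(P, I_L)            # S4: haze values
        P_F, T = enhance_pan(P, k)                # S5: contrast enhancement
        return fuse(I, P_S, P_F, H_P, H_i, T, p)  # S6: PAN-modulation fusion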
In order to facilitate understanding of the above-described embodiments of the present invention, the embodiments are described in further detail below by way of specific application examples.
The following involves two comparative experiments. The first experiment was performed with no registration error between the MS and PAN images, while the second was performed with different degrees of registration error between the MS and PAN images.
In the first comparison experiment, the experimental data comprise 3 high-resolution remote sensing images, from the WorldView-2, Pleiades, and IKONOS sensors respectively. The comparison methods are five widely recognized fusion algorithms: Mode 1 (hereinafter GS1) and Mode 2 (hereinafter GS2) of the Gram-Schmidt (GS) method, adaptive GS (GSA), the Generalized Laplacian Pyramid (GLP), the "à trous" wavelet transform (ATWT), and the additive wavelet luminance proportional method (AWLP). The fused-image quality evaluation indices are the relative average spectral error (RASE), the relative dimensionless global error in synthesis (ERGAS), the spectral angle (SAM), the comprehensive quality indices Q4/Q8, and the spatial correlation coefficient (SCC). RASE reflects the deviation between the fused image and the reference image; the smaller the value, the better the fusion. ERGAS reflects the global spectral-radiometric distortion between the fused image and the reference image; smaller is better. SAM reflects the difference between the fused and reference spectra; the smaller the value, the better the fusion. Q4 and Q8 are comprehensive quality indices that simultaneously consider local mean deviation, contrast change, and loss of correlation between the fused and reference images; larger is better. SCC is an index of the spatial-detail correlation between the fused image and the PAN image; larger is better. Statistics of the fused-image quality evaluation indices for the 3 experimental images are shown in Tables 1-3.
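For reference, the SAM and ERGAS indices used below can be computed roughly as follows; these are the definitions common in the pansharpening literature, and the exact variants used in the experiments are an assumption:

    import numpy as np

    def sam(F, R):
        """Mean spectral angle in degrees between fused F and reference R,
        both shaped (N, H, W); smaller is better."""
        num = (F * R).sum(axis=0)
        den = np.linalg.norm(F, axis=0) * np.linalg.norm(R, axis=0) + 1e-12
        return np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0))).mean()

    def ergas(F, R, ratio):
        """ERGAS = 100/ratio * sqrt(mean_i(RMSE_i^2 / mu_i^2)), where ratio
        is the PAN/MS resolution ratio; smaller is better."""
        rmse2 = ((F - R) ** 2).mean(axis=(1, 2))
        mu2 = R.mean(axis=(1, 2)) ** 2
        return 100.0 / ratio * np.sqrt((rmse2 / mu2).mean())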
TABLE 1 statistics of quality evaluation indexes of fused images of WorldView-2 satellite images
[Table 1 data omitted]
TABLE 2 quality evaluation index statistics for fused images of Pleiades satellite images
[Table 2 data omitted]
TABLE 3 IKONOS satellite image fusion image quality evaluation index statistics
[Table 3 data omitted]
As can be seen from the statistical indices in Tables 1-3, when the spatial-detail adjustment parameter k is zero, the method of the present invention is superior to the other fusion methods in terms of both the spectral quality indices (RASE, ERGAS, SAM, Q4/Q8, etc.) and the spatial quality index (SCC). This indicates that, without deliberate spatial-detail enhancement (i.e., k = 0), the fused image of the present method is significantly superior to those of GS1, GS2, GSA, GLP, ATWT, AWLP, and similar methods in both spectral and spatial indices. Although the RASE, ERGAS, SAM, Q4/Q8, and SCC indices of the fused image are not uniformly superior to those of the other methods when k is 1 or 2, the fused image shows no obvious spectral distortion in visual comparison and its spatial details are better enhanced. This indicates that the method of the present invention is a good choice when fused images are used for remote sensing mapping, visual interpretation, and similar tasks.
In the second comparative experiment, the experimental data are the IKONOS data from the first experiment. The procedure is to shift the upsampled MS image by different amounts, fuse with the present method and with GSA, GLP, ATWT, and AWLP, and finally evaluate the quality of the fused images; ERGAS, SAM, Q4, and SCC were selected for the evaluation. In this experiment, offsets of (0, 1), (1, 1), (2, 2), (3, 3), (4, 3), and (4, 4) were used. Statistics of the fused-image quality evaluation indices generated in this experiment are shown in Table 4.
As can be seen from the statistical indices in Table 4, under the different offsets the fused image of the present method is superior to those of GSA, GLP, ATWT, and AWLP in the spectral quality indices ERGAS, SAM, and Q4 and in the spatial quality index SCC. This shows that the present method has significant advantages over the other methods when the MS and PAN images are spatially misaligned. In visual comparison, the fused images of the present method show smaller spectral distortion and clearer boundaries.
The two experiments show that, compared with similar methods, the present method further reduces the spectral distortion of the fused image and enhances its spatial detail; in particular, when spatial misalignment exists between the MS and PAN images, the fused image is superior to the comparison methods in spectral and spatial quality indices as well as in visual effect.
TABLE 4 IKONOS satellite image fusion image quality evaluation index statistics under different offset conditions
[Table 4 data omitted]
The PAN image P_S at MS spatial resolution used by the fusion method of the present invention is obtained as a weighted sum of the MS bands, where the weights are the optimal coefficients of the multivariate linear regression between the PAN band and the MS bands. When a slight spatial misalignment exists between MS and PAN, these optimal coefficients better reflect the relationship between the PAN band and each MS band, so the influence of the misalignment on the quality of the fused image can be reduced.
The present method is a PAN-modulation fusion method that takes the haze influence into account. A PAN-modulation fusion method applies a spectral-distortion-minimization model during spatial-detail injection, which limits the spectral distortion of the fused image and preserves its visual quality. Owing to atmospheric path radiance, the haze value of each band affects the direction of the spectral vector of a fused pixel and hence the degree of spectral distortion of the fused image; removing the haze effect before PAN modulation therefore reduces the spectral distortion. In the implementation of the method, a slightly lower haze value is adopted for relatively dark pixels, which alleviates the spectral distortion of fused pixels such as water and shadow.
By setting the spatial-detail enhancement coefficient k, the method can enhance the contrast of the original PAN image to different degrees and then fuse the contrast-enhanced PAN image. This lets the user control the degree of spatial-detail enhancement of the fused image according to the application, obtaining fused images that meet different application requirements.
In summary, by means of the above technical solution, the present invention applies PAN modulation while accounting for the influence of haze: the spectral distortion of the fused image is significantly reduced, and fused images with different degrees of spatial-detail enhancement can be provided. In particular, when registration errors exist between MS and PAN pixels, the fused image of the method shows less spectral distortion and a better visual effect.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (1)

1. An MS and PAN image fusion method based on PAN modulation and multiple linear regression, characterized by comprising the following steps:
S1: the original low-resolution MS image I_L is upsampled to the resolution of the original PAN image P by cubic convolution, obtaining the upsampled MS image I;
S2: P is downsampled to the spatial resolution of the MS image by averaging, obtaining the PAN image P_L at MS spatial resolution;
S3: with P_L as the dependent variable and the bands of I_L as the independent variables, the regression coefficients a_i and the constant b of the MS bands are solved by least squares from formula (1):

$$P_L(m,n) = \sum_{i=1}^{N} a_i\, I_i^L(m,n) + b \qquad (1)$$

After solving for the regression coefficients a_i and b, the synthesized PAN image P_S is generated using formula (2):

$$P_S(m,n) = \sum_{i=1}^{N} a_i\, I_i(m,n) + b \qquad (2)$$

In formulas (1) and (2), N is the number of bands of the MS image, I_i^L is the ith band of the original MS image, I_i is the ith band of the upsampled MS image, P_S is the synthesized PAN image, and a_i and b are the coefficient of the ith band and the constant term, respectively;
s4: calculating fog values H of the PAN wave band and the ith wave band of the MS imagePAnd HiRespectively determined by the minimum value of the I wave band of the PAN wave band and the MS image, namely the fog value of the PAN wave band is HpMin (P), the fog value of the ith wave band of the MS image is Hi=min(Ii);
S5: obtaining a contrast enhanced PAN image P according to the spatial information enhancement degree coefficient kFContrast enhanced image PFObtained by the following formula, the value of the threshold value T is set to PFThe variance of (a);
Figure FDA0002261096360000014
p in formula (3)EK is a spatial information enhancement degree parameter to generate an edge detail image P for the edge detail image after Laplacian (Laplacian) filtering is carried out on the original PAN imageEThe following Laplacian filters were employed:
Figure FDA0002261096360000021
in formula (4), g is a Laplacian filter;
S6: according to the value of the spatial-information enhancement coefficient k, the upsampled MS image I, the synthesized PAN image P_S, and the contrast-enhanced PAN image P_F are fused;
for the pixel (m, n) with the gray scale larger than or equal to the threshold value T in the PAN image, the fused spectrum F is calculated by using the following formulai
Figure FDA0002261096360000022
For pixels (m, n) in the PAN image with a gray level less than a threshold value T, a relatively low fog value is calculated according to the following formula
Figure FDA0002261096360000023
And
Figure FDA0002261096360000024
Figure FDA0002261096360000025
Figure FDA0002261096360000026
in the formula (6) and the formula (7), p is more than or equal to 0.5 and less than or equal to 1,
Figure FDA0002261096360000027
and
Figure FDA0002261096360000028
respectively calculating the fog values of the PAN wave band and the ith wave band of the dark pixel in the image by using the following formulai(m,n):
Figure FDA0002261096360000029
In the formula (5) and the formula (8), k is a spatial information enhancement degree parameter, Ps is a synthetic PAN image, and P isFThe PAN band contrast enhanced image is obtained.
CN201611208080.5A 2016-12-23 2016-12-23 MS and PAN image fusion method based on PAN modulation and multivariate linear regression Expired - Fee Related CN106651800B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611208080.5A CN106651800B (en) 2016-12-23 2016-12-23 MS and PAN image fusion method based on PAN modulation and multivariate linear regression


Publications (2)

Publication Number Publication Date
CN106651800A CN106651800A (en) 2017-05-10
CN106651800B true CN106651800B (en) 2020-05-22

Family

ID=58826868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611208080.5A Expired - Fee Related CN106651800B (en) 2016-12-23 2016-12-23 MS and PAN image fusion method based on PAN modulation and multivariate linear regression

Country Status (1)

Country Link
CN (1) CN106651800B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110070518B (en) * 2019-03-15 2023-05-23 南京航空航天大学 Hyperspectral image super-resolution mapping method based on dual-path support
CN110533600B (en) * 2019-07-10 2022-07-19 宁波大学 Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1877636A (en) * 2006-07-03 2006-12-13 中国科学院遥感应用研究所 Method for fusion generation of high-resolution multi-spectral image
CN101246594A (en) * 2008-02-22 2008-08-20 华南师范大学 Optimized amalgamation remote sensing image processing method based on gradient field
CN102446351A (en) * 2010-10-15 2012-05-09 江南大学 Multispectral and high-resolution full-color image fusion method study
WO2014183259A1 (en) * 2013-05-14 2014-11-20 中国科学院自动化研究所 Full-color and multi-spectral remote sensing image fusion method
CN104933690A (en) * 2015-06-04 2015-09-23 中国科学院遥感与数字地球研究所 Remote sensing multi-spectral and panchromatic image fusion method based on mixed sub-pixel un-mixing

Also Published As

Publication number Publication date
CN106651800A (en) 2017-05-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200522