CN109447922B - Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system - Google Patents

Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system

Info

Publication number
CN109447922B
CN109447922B
Authority
CN
China
Prior art keywords
image
new
panchromatic
ihs
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810749669.9A
Other languages
Chinese (zh)
Other versions
CN109447922A (en)
Inventor
易维
曾湧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Center for Resource Satellite Data and Applications CRESDA
Original Assignee
China Center for Resource Satellite Data and Applications CRESDA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Center for Resource Satellite Data and Applications CRESDA filed Critical China Center for Resource Satellite Data and Applications CRESDA
Priority to CN201810749669.9A priority Critical patent/CN109447922B/en
Publication of CN109447922A publication Critical patent/CN109447922A/en
Application granted granted Critical
Publication of CN109447922B publication Critical patent/CN109447922B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/40 Image enhancement or restoration using histogram techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10036 Multispectral image; Hyperspectral image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10041 Panchromatic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An improved IHS transform remote sensing image fusion method and system comprises the steps of: (1) resampling a multispectral image to the resolution of a panchromatic image; (2) performing an IHS transformation on the resampled multispectral image to obtain the I, H and S components; (3) simulating a new panchromatic image; (4) taking the I-component image histogram as a reference, performing histogram matching on the new panchromatic image to obtain a matched panchromatic image I_new; (5) replacing the I component with the histogram-matched panchromatic image I_new and performing an inverse IHS transformation, thereby completing the remote sensing image fusion. The invention improves the IHS transform fusion rule so that the spectral range of the panchromatic image is consistent with that of the multispectral image, and the brightness and contrast of the fusion result are not lower than those before fusion.

Description

Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system
Technical Field
The invention relates to an improved IHS transformation remote sensing image fusion method and system, and belongs to the technical field of remote sensing image processing.
Background
The remote sensing image is one of the main means of obtaining spatial information, and spatial resolution and spectral resolution are two important indexes for measuring its quality. Limited by the data storage equipment on board remote sensing satellites, current domestic (Chinese) satellite optical cameras generally adopt a configuration of high-spatial-resolution panchromatic images and low-spatial-resolution multispectral images. A panchromatic image cannot display the colors of ground objects and is therefore harder to interpret visually, while a multispectral image can assign RGB colors to different bands to produce a color image, but its resolution is low and often cannot meet application requirements. To obtain a high-resolution color image, the high-spatial-resolution panchromatic image and the low-spatial-resolution multispectral image need to be fused, which benefits both visual interpretation and computer processing. According to the degree of abstraction of the image data and the application level of the fusion, remote sensing image fusion can be divided into three levels: pixel-level fusion, feature-level fusion and decision-level fusion; the fusion algorithms and application ranges differ at each level.
The IHS transform is a pixel-level fusion algorithm by which a color image with red (R), green (G) and blue (B) bands can be transformed into an intensity (I), hue (H) and saturation (S) space. However, this algorithm has two significant drawbacks: first, it can only process the information of the three bands red, green and blue, whereas domestic satellites generally carry 4 or more multispectral bands; second, if the spectral coverage of the panchromatic band is inconsistent with that of the multispectral bands, the spectrum of the fused image suffers large distortion.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the defects of the prior art, the invention provides an improved IHS transform remote sensing image fusion method that improves the IHS transform algorithm and obtains a fusion result superior to that of the traditional IHS transform algorithm.
The technical solution of the invention is as follows:
an improved IHS transformation remote sensing image fusion method comprises the following steps:
(1) resampling the multispectral image to achieve the resolution of a panchromatic image;
(2) performing IHS transformation on the resampled multispectral image to respectively obtain I, H, S components;
(3) simulating a new panchromatic image;
(4) taking the I-component image histogram as a reference, performing histogram matching on the new panchromatic image obtained in step (3) to obtain a matched panchromatic image I_new;
(5) replacing the I component with the histogram-matched panchromatic image I_new and performing an inverse IHS transformation, thereby completing the remote sensing image fusion.
The multispectral image is resampled using a bilinear interpolation method.
Performing IHS transformation on the resampled multispectral image in the step (2) to respectively obtain I, H, S components, specifically:
(2.1) converting the two-dimensional gray values of the resampled multispectral image into one-dimensional gray values;
(2.2) obtaining I, H, S components by the following formula:
$$\begin{bmatrix} I \\ V_1 \\ V_2 \end{bmatrix} = \begin{bmatrix} 1/3 & 1/3 & 1/3 \\ -\sqrt{2}/6 & -\sqrt{2}/6 & 2\sqrt{2}/6 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$

$$H = \arctan\!\left(\frac{V_2}{V_1}\right)$$

$$S = \sqrt{V_1^{2} + V_2^{2}}$$
where R, G and B are the image gray matrices of the red, green and blue bands; the transformed I reflects the spatial detail of the image; H and S reflect the spectral information of the image; and V_1 and V_2 are intermediate variables.
The step (3) of simulating a new full-color image is carried out in the following way:
[Formula rendered as an image in the original document; it defines PAN_new in terms of PAN and IR.]
where PAN_new is the image gray matrix of the new panchromatic band, PAN is the image gray matrix of the original panchromatic band, and IR is the image gray matrix of the near-infrared band.
The step (4) takes the I component image histogram as a reference, and performs histogram matching on the new panchromatic image, specifically:
(4.1) first, computing the probability densities P(l) and p(k) of the I-component and new-panchromatic-image histograms;
(4.2) obtaining the cumulative probability densities S(l) and V(k) of the I component and the new panchromatic image;
(4.3) generating a lookup table based on the cumulative probability densities S(l) and V(k), replacing the DN value k of each pixel in the new panchromatic image with l or l+1, thereby completing the histogram matching of the new panchromatic image and obtaining the matched panchromatic image I_new.
The probability density P(l) of the I-component DN value l is
$$P(l) = \frac{m_l}{N}$$
The probability density p(k) of the new-panchromatic-image DN value k is
$$p(k) = \frac{m_k}{N}$$
where k = 0, 1, 2, ..., 1023; m_l is the number of pixels with I-component DN value l; m_k is the number of pixels with new-panchromatic-image DN value k; and N is the total number of pixels.
The cumulative probability density S(l) of the I-component DN value l is
$$S(l) = \sum_{i=0}^{l} P(i)$$
The cumulative probability density V(k) of the new-panchromatic-image DN value k is
$$V(k) = \sum_{i=0}^{k} p(i)$$
In step (4.3), the DN value k of each pixel in the new panchromatic image is replaced with l or l+1 as follows:
for any DN value k of the new panchromatic image, find l such that S(l) ≤ V(k) ≤ S(l+1);
if |V(k) - S(l)| - |V(k) - S(l+1)| ≤ 0, the DN value k in the new panchromatic image is replaced with l;
if |V(k) - S(l)| - |V(k) - S(l+1)| > 0, the DN value k in the new panchromatic image is replaced with l+1.
In step (5), the histogram-matched panchromatic image I_new replaces the I component and an inverse IHS transformation is performed, specifically:
(5.1) transforming the IHS space to the RGB space:
$$\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} = \begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2} \\ 1 & -1/\sqrt{2} & -1/\sqrt{2} \\ 1 & \sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} I_{new} \\ V_1 \\ V_2 \end{bmatrix}$$
where R_new, G_new and B_new are the one-dimensional red, green and blue images after image fusion, respectively;
and (5.2) converting the gray value of the one-dimensional image into the gray value of the two-dimensional image, thereby completing the remote sensing image fusion.
In the step (1), the resolution of the multispectral image before resampling is 3.2 meters, and the resolution of the panchromatic image is 0.8 meters.
An improved IHS transform remote sensing image fusion system comprises:
a resampling module: resampling the multispectral image to achieve the resolution of a panchromatic image;
an IHS conversion module: performing IHS transformation on the resampled multispectral image to respectively obtain I, H, S components;
the new full-color image generation module: simulating a new panchromatic image;
a histogram matching module: taking the I-component image histogram as a reference, performing histogram matching on the new panchromatic image generated by the new panchromatic image generation module to obtain a matched panchromatic image I_new;
an IHS inverse transformation module: replacing the I component with the panchromatic image I_new produced by the histogram matching module and performing an inverse IHS transformation, thereby completing the remote sensing image fusion.
Compared with the prior art, the invention has the beneficial effects that:
(1) the information of four wave bands of a domestic satellite can be fully utilized, and the waste of the information is prevented;
(2) according to the spectral band configuration, a new panchromatic image is regenerated, which better meets the requirements of the fusion algorithm and avoids spectral distortion after fusion;
(3) the histogram matching technique keeps the brightness and contrast of the fused image basically consistent with those of the image before fusion, thereby reducing the information loss of image fusion.
(4) The brightness component is replaced by the regenerated full-color image, and the detail information of the image is effectively maintained.
(5) Data fusion experiments carried out by the method prove that the improved IHS fusion method can improve the correlation coefficient of the fusion result and the original multispectral image and reduce the spectral distortion.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a schematic diagram of resampling according to the present invention.
Detailed Description
The IHS transform fusion method is commonly used in pixel-level image fusion. Owing to the characteristics of this fusion algorithm, a good fusion result can be obtained only when the spectral range of the panchromatic image is consistent with that of the multispectral image. At present, the multispectral bands of domestic satellite remote sensing images are set to four bands (near-infrared, red, green and blue), one near-infrared band more than the red, green and blue required by the IHS transform, so the spectral ranges of the panchromatic image and the multispectral image are inconsistent when the IHS transform is applied directly. The invention aims to improve the IHS transform fusion rule so that the spectral range of the panchromatic image is consistent with that of the multispectral image, and the brightness and contrast of the fusion result are not lower than those before fusion.
Referring to the attached fig. 1, the steps of the improved IHS transform remote sensing image fusion method are as follows:
the method comprises the following steps: resampling the multispectral image to achieve the resolution of a panchromatic image;
the multispectral image has low resolution, and the panchromatic image has high resolution, and the multispectral image and the panchromatic image need to have uniform resolution. Taking the resolution setting of the current domestic remote sensing satellite, the resolution of multispectral images of most satellites is 3-4 times of that of panchromatic images, taking a high-resolution secondary satellite remote sensing image as an example, the resolution of the panchromatic images is 0.8 meter, and the resolution of the multispectral images is 3.2 meters, so that resampling is needed.
The invention uses bilinear interpolation for resampling; for a GF-2 image, each multispectral pixel is resampled into 16 (4 × 4) pixels. The calculation of each pixel value after resampling is explained with reference to fig. 2: the four points Q11, Q12, Q22 and Q21 are four adjacent pixels before resampling, and 16 new pixels now need to be generated between Q11 and Q22; point P is taken as an example to illustrate how its gray value is calculated.
The coordinates of Q11, Q12, Q22, Q21, R1, R2 and P are marked in the figure, and f(·) denotes the gray value of a pixel. Linear interpolation is first performed in the x direction to obtain:
$$f(R_1) \approx \frac{x_2 - x}{x_2 - x_1} f(Q_{11}) + \frac{x - x_1}{x_2 - x_1} f(Q_{21})$$

$$f(R_2) \approx \frac{x_2 - x}{x_2 - x_1} f(Q_{12}) + \frac{x - x_1}{x_2 - x_1} f(Q_{22})$$
Linear interpolation is then performed in the y direction to obtain:
$$f(P) \approx \frac{y_2 - y}{y_2 - y_1} f(R_1) + \frac{y - y_1}{y_2 - y_1} f(R_2)$$
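The bilinear resampling above can be sketched in a few lines of Python. This is a minimal illustration only, not the patented implementation; the function name, the use of NumPy, and the 4x factor for GF-2 are assumptions based on the surrounding text.

```python
import numpy as np

def bilinear_resample(band: np.ndarray, factor: int = 4) -> np.ndarray:
    """Upsample one multispectral band by `factor` using bilinear interpolation
    (e.g. factor=4 takes a 3.2 m GF-2 band to the 0.8 m panchromatic grid)."""
    rows, cols = band.shape
    # Output-grid coordinates expressed in input-pixel units
    y = np.linspace(0, rows - 1, rows * factor)
    x = np.linspace(0, cols - 1, cols * factor)
    y0 = np.floor(y).astype(int); y1 = np.minimum(y0 + 1, rows - 1)
    x0 = np.floor(x).astype(int); x1 = np.minimum(x0 + 1, cols - 1)
    wy = (y - y0)[:, None]   # interpolation weight along y, one per output row
    wx = (x - x0)[None, :]   # interpolation weight along x, one per output column
    # Interpolate along x first (f(R1), f(R2)), then along y (f(P))
    top = (1 - wx) * band[np.ix_(y0, x0)] + wx * band[np.ix_(y0, x1)]
    bottom = (1 - wx) * band[np.ix_(y1, x0)] + wx * band[np.ix_(y1, x1)]
    return (1 - wy) * top + wy * bottom
```

In practice an equivalent library routine such as scipy.ndimage.zoom(band, 4, order=1) could be used instead.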
Step two: carrying out IHS transformation on the multispectral image to respectively obtain I, H, S components;
the calculation formula for transforming the RGB space to the IHS space is as follows:
$$\begin{bmatrix} I \\ V_1 \\ V_2 \end{bmatrix} = \begin{bmatrix} 1/3 & 1/3 & 1/3 \\ -\sqrt{2}/6 & -\sqrt{2}/6 & 2\sqrt{2}/6 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$

$$H = \arctan\!\left(\frac{V_2}{V_1}\right)$$

$$S = \sqrt{V_1^{2} + V_2^{2}}$$
where R, G and B are the image gray matrices of the red, green and blue bands, respectively; the transformed I reflects the spatial detail of the image; H and S reflect the spectral information of the image; and V_1 and V_2 are intermediate variables.
Before applying the formula, the two-dimensional gray values of each band are converted into one-dimensional gray values: if the image has m rows and n columns, then R, G and B in the formula are gray vectors of 1 row and m × n columns; likewise I, V_1, V_2, H and S are all vectors of 1 row and m × n columns, where V_1 and V_2 have no physical meaning and only represent intermediate results.
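A compact sketch of this forward transform in Python follows. The transform matrix is the standard linear IHS matrix used in the IHS fusion literature (consistent with the inverse matrix given in step five); the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

# Standard linear IHS forward transform matrix
FWD = np.array([
    [1 / 3,            1 / 3,            1 / 3],
    [-np.sqrt(2) / 6,  -np.sqrt(2) / 6,  2 * np.sqrt(2) / 6],
    [1 / np.sqrt(2),   -1 / np.sqrt(2),  0.0],
])

def ihs_forward(r: np.ndarray, g: np.ndarray, b: np.ndarray):
    """R, G, B are 2-D bands of size m x n; returns I, V1, V2, H, S as 1-D vectors of length m*n."""
    rgb = np.vstack([r.ravel(), g.ravel(), b.ravel()])  # 3 x (m*n), one band per row
    i, v1, v2 = FWD @ rgb
    h = np.arctan2(v2, v1)          # hue (arctan2 avoids division by zero in V2/V1)
    s = np.sqrt(v1**2 + v2**2)      # saturation
    return i, v1, v2, h, s
```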
Step three: simulating a new full-color image according to a wave band setting rule;
the new panchromatic image is calculated as follows:
[Formula rendered as an image in the original document; it defines PAN_new in terms of PAN and IR.]
where PAN_new is the image gray matrix of the new panchromatic band, PAN is the image gray matrix of the original panchromatic band, and IR is the image gray matrix of the near-infrared band.
The spectral range of a panchromatic image generally covers the whole visible range. Taking the GF-2 satellite remote sensing image as an example, the panchromatic band covers 0.45 to 0.90 μm, while the blue band of the multispectral image covers 0.45 to 0.52 μm, the green band 0.52 to 0.59 μm, the red band 0.63 to 0.69 μm, and the near-infrared band 0.77 to 0.89 μm. The IHS transform in step two uses only the R, G and B spectral information and does not use the IR information, yet the actual panchromatic band covers all four bands, so directly replacing the I component with the original panchromatic image in the subsequent inverse IHS transform would inevitably cause spectral distortion. Since the IHS transform can use only three spectral bands, this step regenerates the panchromatic band so that it contains, as far as possible, only the information of the three visible bands.
Step four: taking the I component image histogram as a reference, and performing histogram matching on the new panchromatic image;
(1) first, the probability density of the I component and the new panchromatic image histogram is found.
The probability density P(l) of the I-component DN value l is
$$P(l) = \frac{m_l}{N}$$
The probability density p(k) of the new-panchromatic-image DN value k is
$$p(k) = \frac{m_k}{N}$$
where k = 0, 1, 2, ..., 1023; m_l is the number of pixels with I-component DN value l; m_k is the number of pixels with new-panchromatic-image DN value k; and N is the total number of pixels.
(2) The cumulative probability density of the I component and the new panchromatic image is found.
The cumulative probability density S(l) of the I-component DN value l is
$$S(l) = \sum_{i=0}^{l} P(i)$$
The cumulative probability density V(k) of the new-panchromatic-image DN value k is
$$V(k) = \sum_{i=0}^{k} p(i)$$
(3) A look-up table is generated.
For any DN value k of the new panchromatic image, an l can always be found in the I component such that S(l) ≤ V(k) ≤ S(l+1). If |V(k) - S(l)| - |V(k) - S(l+1)| ≤ 0, l replaces the DN value k in the new panchromatic image; if |V(k) - S(l)| - |V(k) - S(l+1)| > 0, l+1 replaces the DN value k in the new panchromatic image.
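A sketch of this lookup-table construction in Python, assuming the 10-bit DN range 0 to 1023 stated above and integer-valued DN inputs; the function name and the rounding of the I component are illustrative assumptions.

```python
import numpy as np

def match_histogram(pan_new: np.ndarray, i_comp: np.ndarray, levels: int = 1024) -> np.ndarray:
    """Remap the DN values of the new panchromatic image so its histogram matches the I component."""
    # Cumulative probability densities V(k) of the new panchromatic image and S(l) of the I component
    v = np.cumsum(np.bincount(pan_new.ravel(), minlength=levels) / pan_new.size)
    s = np.cumsum(np.bincount(np.rint(i_comp).astype(int).ravel(), minlength=levels) / i_comp.size)
    lut = np.empty(levels, dtype=np.int64)
    for k in range(levels):
        # Largest l with S(l) <= V(k), clamped so that l+1 stays in range
        l = int(np.searchsorted(s, v[k], side="right")) - 1
        l = max(0, min(l, levels - 2))
        # Pick l or l+1, whichever cumulative density is closer to V(k)
        lut[k] = l if abs(v[k] - s[l]) <= abs(v[k] - s[l + 1]) else l + 1
    return lut[pan_new]  # I_new: matched panchromatic image
```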
Step five: and replacing the I component with the new full-color image after histogram matching, and performing IHS inverse transformation.
The calculation formula for transforming the IHS space to the RGB space is as follows:
$$\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} = \begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2} \\ 1 & -1/\sqrt{2} & -1/\sqrt{2} \\ 1 & \sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} I_{new} \\ V_1 \\ V_2 \end{bmatrix}$$
After the calculation, the one-dimensional gray values of each band are converted back into two-dimensional gray values; if the image has m rows and n columns, R_new, G_new and B_new in the formula are all gray vectors of 1 row and m × n columns.
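A matching sketch of this inverse step, paired with the forward transform sketch above; the matrix is the inverse of the standard linear IHS matrix, and the names are again illustrative.

```python
import numpy as np

# Inverse of the standard linear IHS transform matrix
INV = np.array([
    [1.0, -1 / np.sqrt(2),  1 / np.sqrt(2)],
    [1.0, -1 / np.sqrt(2), -1 / np.sqrt(2)],
    [1.0,  np.sqrt(2),      0.0],
])

def ihs_inverse(i_new: np.ndarray, v1: np.ndarray, v2: np.ndarray, shape: tuple):
    """Replace I with the matched panchromatic image I_new and transform back to RGB.
    i_new, v1, v2 are 1-D vectors of length m*n; shape = (m, n) restores the 2-D images."""
    r_new, g_new, b_new = INV @ np.vstack([i_new, v1, v2])
    return r_new.reshape(shape), g_new.reshape(shape), b_new.reshape(shape)
```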
Based on the remote sensing image fusion method, the invention also provides an improved IHS transformation remote sensing image fusion system, which comprises the following steps:
a resampling module: resampling the multispectral image to achieve the resolution of a panchromatic image;
an IHS conversion module: performing IHS transformation on the resampled multispectral image to respectively obtain I, H, S components;
the new full-color image generation module: simulating a new panchromatic image;
a histogram matching module: taking the I-component image histogram as a reference, performing histogram matching on the new panchromatic image generated by the new panchromatic image generation module to obtain a matched panchromatic image I_new;
an IHS inverse transformation module: replacing the I component with the panchromatic image I_new produced by the histogram matching module and performing an inverse IHS transformation, thereby completing the remote sensing image fusion.
Example:
taking high-resolution two-color panchromatic and multispectral images as an example to perform a fusion experiment, adopting a traditional HIS fusion method and an improved HIS fusion method provided by the invention to perform a comparison experiment, and adopting average gradient, entropy, correlation coefficient and spectral torsion as objective evaluation standards. The results are evaluated in Table 1, and the technical effects of the present invention can be seen.
TABLE 1 fusion evaluation results
[Table 1 is rendered as an image in the original document and is not reproduced here.]
As can be seen from the table, the images produced by the traditional IHS and the improved IHS fusion methods differ little in mean gradient and entropy, i.e., the amount of information and detail are close; however, the improved method raises the correlation coefficient from 0.58 to 0.90 and reduces the spectral distortion from 18.68 to 6.76.
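The patent does not spell out the formulas behind these criteria; the following sketch shows how two of them, the correlation coefficient and the mean gradient, are commonly computed per band in remote sensing practice (assumed definitions, not taken from the patent).

```python
import numpy as np

def correlation_coefficient(fused_band: np.ndarray, ms_band: np.ndarray) -> float:
    """Pearson correlation between a fused band and the corresponding original multispectral band."""
    return float(np.corrcoef(fused_band.ravel(), ms_band.ravel())[0, 1])

def mean_gradient(band: np.ndarray) -> float:
    """Average gradient magnitude, a common proxy for the spatial detail of an image."""
    gy, gx = np.gradient(band.astype(float))
    return float(np.mean(np.sqrt((gx**2 + gy**2) / 2.0)))
```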

Claims (8)

1. An improved IHS transformation remote sensing image fusion method is characterized by comprising the following steps:
(1) resampling the multispectral image to achieve the resolution of a panchromatic image;
(2) performing IHS transformation on the resampled multispectral image to respectively obtain I, H, S components; the method specifically comprises the following steps:
(2.1) converting the two-dimensional gray values of the resampled multispectral image into one-dimensional gray values;
(2.2) obtaining I, H, S components by the following formula:
$$\begin{bmatrix} I \\ V_1 \\ V_2 \end{bmatrix} = \begin{bmatrix} 1/3 & 1/3 & 1/3 \\ -\sqrt{2}/6 & -\sqrt{2}/6 & 2\sqrt{2}/6 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$

$$H = \arctan\!\left(\frac{V_2}{V_1}\right)$$

$$S = \sqrt{V_1^{2} + V_2^{2}}$$
wherein R, G and B are the image gray matrices of the red, green and blue bands; the transformed I reflects the spatial detail of the image; H and S reflect the spectral information of the image; and V_1 and V_2 are intermediate variables;
(3) simulating a new panchromatic image;
the method comprises the following steps:
[Formula rendered as an image in the original document; it defines PAN_new in terms of PAN and IR.]
where PAN_new is the image gray matrix of the new panchromatic band, PAN is the image gray matrix of the original panchromatic band, and IR is the image gray matrix of the near-infrared band;
(4) taking the I-component image histogram as a reference, performing histogram matching on the new panchromatic image obtained in step (3) to obtain a matched panchromatic image I_new;
(5) replacing the I component with the histogram-matched panchromatic image I_new and performing an inverse IHS transformation, thereby completing the remote sensing image fusion; specifically:
(5.1) transforming the IHS space to the RGB space:
$$\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} = \begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2} \\ 1 & -1/\sqrt{2} & -1/\sqrt{2} \\ 1 & \sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} I_{new} \\ V_1 \\ V_2 \end{bmatrix}$$
wherein R_new, G_new and B_new are the one-dimensional red, green and blue images after image fusion, respectively;
and (5.2) converting the gray value of the one-dimensional image into the gray value of the two-dimensional image, thereby completing the remote sensing image fusion.
2. The improved IHS transform remote sensing image fusion method according to claim 1, wherein: and resampling the multispectral image by adopting a bilinear interpolation method.
3. The improved IHS transform remote sensing image fusion method according to claim 1, wherein: the step (4) takes the I component image histogram as a reference, and performs histogram matching on the new panchromatic image, specifically:
(4.1) first, computing the probability densities P(l) and p(k) of the I-component and new-panchromatic-image histograms;
(4.2) obtaining the cumulative probability densities S(l) and V(k) of the I component and the new panchromatic image;
(4.3) generating a lookup table based on the cumulative probability densities S(l) and V(k), replacing the DN value k of each pixel in the new panchromatic image with l or l+1, thereby completing the histogram matching of the new panchromatic image and obtaining the matched panchromatic image I_new.
4. The improved IHS transform remote sensing image fusion method according to claim 3, wherein the probability density P(l) of the I-component DN value l is
$$P(l) = \frac{m_l}{N}$$
The probability density p(k) of the new-panchromatic-image DN value k is
$$p(k) = \frac{m_k}{N}$$
where k = 0, 1, 2, ..., 1023; m_l is the number of pixels with I-component DN value l; m_k is the number of pixels with new-panchromatic-image DN value k; and N is the total number of pixels.
5. The improved IHS transform remote sensing image fusion method according to claim 4, wherein the cumulative probability density S(l) of the I-component DN value l is
$$S(l) = \sum_{i=0}^{l} P(i)$$
The cumulative probability density V(k) of the new-panchromatic-image DN value k is
$$V(k) = \sum_{i=0}^{k} p(i)$$
6. The improved IHS transform remote sensing image fusion method according to claim 4, wherein step (4.3) of replacing the DN value k of each pixel in the new panchromatic image with l or l+1 specifically comprises:
for any DN value k of the new panchromatic image, finding l such that S(l) ≤ V(k) ≤ S(l+1);
if |V(k) - S(l)| - |V(k) - S(l+1)| ≤ 0, replacing the DN value k in the new panchromatic image with l;
if |V(k) - S(l)| - |V(k) - S(l+1)| > 0, replacing the DN value k in the new panchromatic image with l+1.
7. the improved IHS transform remote sensing image fusion method according to claim 1, wherein: in the step (1), the resolution of the multispectral image before resampling is 3.2 meters, and the resolution of the panchromatic image is 0.8 meters.
8. An improved IHS transform remote sensing image fusion system is characterized by comprising:
a resampling module: resampling the multispectral image to achieve the resolution of a panchromatic image;
an IHS conversion module: performing IHS transformation on the resampled multispectral image to respectively obtain I, H, S components; the method specifically comprises the following steps:
converting the two-dimensional gray values of the resampled multispectral image into one-dimensional gray values;
the I, H, S components are obtained by the following formula:
$$\begin{bmatrix} I \\ V_1 \\ V_2 \end{bmatrix} = \begin{bmatrix} 1/3 & 1/3 & 1/3 \\ -\sqrt{2}/6 & -\sqrt{2}/6 & 2\sqrt{2}/6 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$

$$H = \arctan\!\left(\frac{V_2}{V_1}\right)$$

$$S = \sqrt{V_1^{2} + V_2^{2}}$$
wherein R, G and B are the image gray matrices of the red, green and blue bands; the transformed I reflects the spatial detail of the image; H and S reflect the spectral information of the image; and V_1 and V_2 are intermediate variables;
the new full-color image generation module: simulating a new panchromatic image;
the method comprises the following steps:
[Formula rendered as an image in the original document; it defines PAN_new in terms of PAN and IR.]
where PAN_new is the image gray matrix of the new panchromatic band, PAN is the image gray matrix of the original panchromatic band, and IR is the image gray matrix of the near-infrared band;
a histogram matching module: taking the I-component image histogram as a reference, performing histogram matching on the new panchromatic image generated by the new panchromatic image generation module to obtain a matched panchromatic image I_new;
an IHS inverse transformation module: replacing the I component with the panchromatic image I_new produced by the histogram matching module and performing an inverse IHS transformation, thereby completing the remote sensing image fusion;
the method specifically comprises the following steps:
transforming IHS space to RGB space:
$$\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} = \begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2} \\ 1 & -1/\sqrt{2} & -1/\sqrt{2} \\ 1 & \sqrt{2} & 0 \end{bmatrix} \begin{bmatrix} I_{new} \\ V_1 \\ V_2 \end{bmatrix}$$
wherein R_new, G_new and B_new are the one-dimensional red, green and blue images after image fusion, respectively;
and converting the gray value of the one-dimensional image into the gray value of the two-dimensional image, thereby completing the fusion of the remote sensing images.
CN201810749669.9A 2018-07-10 2018-07-10 Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system Active CN109447922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810749669.9A CN109447922B (en) 2018-07-10 2018-07-10 Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810749669.9A CN109447922B (en) 2018-07-10 2018-07-10 Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system

Publications (2)

Publication Number Publication Date
CN109447922A CN109447922A (en) 2019-03-08
CN109447922B true CN109447922B (en) 2021-02-12

Family

ID=65532618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810749669.9A Active CN109447922B (en) 2018-07-10 2018-07-10 Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system

Country Status (1)

Country Link
CN (1) CN109447922B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111476746A (en) * 2020-03-19 2020-07-31 航天信德智图(北京)科技有限公司 Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics
CN112330581B (en) * 2020-11-02 2022-07-12 燕山大学 Fusion method and system of SAR and multispectral image
CN112837268A (en) * 2021-01-11 2021-05-25 黄河勘测规划设计研究院有限公司 Collapse damage house disaster body information extraction-oriented multi-source remote sensing data fusion method
CN113343871B (en) * 2021-06-17 2022-03-18 哈尔滨工业大学 Poisson fusion and histogram matching-based ship target simulation method and system in high-resolution four-number multispectral remote sensing image
CN113870110B (en) * 2021-09-10 2023-06-13 深圳市魔方卫星科技有限公司 Image fusion method and device of remote sensing image, electronic equipment and storage medium
CN114331936B (en) * 2021-12-24 2024-04-16 郑州信大先进技术研究院 Remote sensing image fusion method based on wavelet decomposition and IHS algorithm improvement
CN116309497B (en) * 2023-03-26 2023-10-03 湖南医药学院 Image recognition-based auxiliary analysis method for cancer cell counting and prognosis prediction
CN117058053B (en) * 2023-07-18 2024-04-05 珠江水利委员会珠江水利科学研究院 IHS space-spectrum fusion method, system, equipment and medium based on mean value filtering

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186893A (en) * 2012-12-19 2013-07-03 中国科学院对地观测与数字地球科学中心 Universal high-resolution remote sensing image fusion method
US20140301659A1 (en) * 2013-04-07 2014-10-09 Bo Li Panchromatic Sharpening Method of Spectral Image Based on Fusion of Overall Structural Information and Spatial Detail Information
CN105160647A (en) * 2015-10-28 2015-12-16 中国地质大学(武汉) Panchromatic multi-spectral image fusion method
US9342760B1 (en) * 2011-04-22 2016-05-17 Exelis Inc. System and method for combining color information with spatial information in multispectral images
CN106327455A (en) * 2016-08-18 2017-01-11 中国科学院遥感与数字地球研究所 Improved method for fusing remote-sensing multispectrum with full-color image
US20170076456A1 (en) * 2015-09-16 2017-03-16 Raytheon Company Systems and methods for digital elevation map filters for three dimensional point clouds
US20170084008A1 (en) * 2015-09-17 2017-03-23 Raytheon Company Systems and methods for sharpening multi-spectral imagery
CN107958450A (en) * 2017-12-15 2018-04-24 武汉大学 Panchromatic multispectral image fusion method and system based on adaptive Gaussian mixture model

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9342760B1 (en) * 2011-04-22 2016-05-17 Exelis Inc. System and method for combining color information with spatial information in multispectral images
CN103186893A (en) * 2012-12-19 2013-07-03 中国科学院对地观测与数字地球科学中心 Universal high-resolution remote sensing image fusion method
US20140301659A1 (en) * 2013-04-07 2014-10-09 Bo Li Panchromatic Sharpening Method of Spectral Image Based on Fusion of Overall Structural Information and Spatial Detail Information
US20170076456A1 (en) * 2015-09-16 2017-03-16 Raytheon Company Systems and methods for digital elevation map filters for three dimensional point clouds
US20170084008A1 (en) * 2015-09-17 2017-03-23 Raytheon Company Systems and methods for sharpening multi-spectral imagery
CN105160647A (en) * 2015-10-28 2015-12-16 中国地质大学(武汉) Panchromatic multi-spectral image fusion method
CN106327455A (en) * 2016-08-18 2017-01-11 中国科学院遥感与数字地球研究所 Improved method for fusing remote-sensing multispectrum with full-color image
CN107958450A (en) * 2017-12-15 2018-04-24 武汉大学 Panchromatic multispectral image fusion method and system based on adaptive Gaussian mixture model

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"A Fast Intensity–Hue–Saturation Fusion Technique With Spectral Adjustment for IKONOS Imagery";Te-Ming Tu等;《IEEE GEOSCIENCE AND REMOTE SENSING LETTERS,》;20041031;第1卷(第4期);全文 *
"CBERS-01/02卫星CCD图像相对辐射校正研究";郭建宁等;《中国科学 E辑 信息科学》;20051231;第15-16页 *
"基于IHS变换与直方图匹配法的遥感影像融合";伍娟等;《武汉理工大学学报》;20040229;第28卷(第1期);全文 *
"基于IHS变换的多光谱和全色图像融合算法研究";王华伟;《中国优秀硕士学位论文全文数据库 信息科技辑》;20170115;第8-13页 *

Also Published As

Publication number Publication date
CN109447922A (en) 2019-03-08

Similar Documents

Publication Publication Date Title
CN109447922B (en) Improved IHS (intensity-hue-saturation) transformation remote sensing image fusion method and system
KR100944462B1 (en) Satellite image fusion method and system
CN107123089B (en) Remote sensing image super-resolution reconstruction method and system based on depth convolution network
Imai et al. High-resolution multi-spectral image archives: a hybrid approach
CN104867124B (en) Multispectral and panchromatic image fusion method based on the sparse Non-negative Matrix Factorization of antithesis
CN109754375B (en) Image processing method, system, computer device, storage medium and terminal
CN102446351A (en) Multispectral and high-resolution full-color image fusion method study
CN107958450B (en) Panchromatic multispectral image fusion method and system based on self-adaptive Gaussian filtering
CN108961325A (en) Method for registering between more/high-spectrum remote sensing wave band
CN107016641B (en) A kind of panchromatic and hyperspectral image fusion method based on improvement ratio transformation
Bi et al. Haze removal for a single remote sensing image using low-rank and sparse prior
CN112104847B (en) SONY-RGBW array color reconstruction method based on residual error and high-frequency replacement
CN106875370B (en) Fusion method and device for full-color image and multispectral image
CN109859153B (en) Multispectral image fusion method based on adaptive spectrum-spatial gradient sparse regularization
CN111563866B (en) Multisource remote sensing image fusion method
Kumar et al. A Robust Approach for Image Super-Resolution using Modified Very Deep Convolution Networks
CN108765361A (en) A kind of adaptive PAN and multi-spectral image interfusion method
CN114897706A (en) Full-color multispectral image fusion green vegetation enhancement method
CN111476746A (en) Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics
CN110084774B (en) Method for minimizing fusion image by enhanced gradient transfer and total variation
CN116205958A (en) Feature coupling-based visible and medium wave infrared image registration and fusion method
CN110163830A (en) Image interfusion method based on Riesz-Lap transformation and PCNN
KR20210096925A (en) Flexible Color Correction Method for Massive Aerial Orthoimages
CN113284067A (en) Hyperspectral panchromatic sharpening method based on depth detail injection network
Zheng A channel-based color fusion technique using multispectral images for night vision enhancement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant