CN114418854A - Unsupervised remote sensing image super-resolution reconstruction method based on image recursion - Google Patents

Unsupervised remote sensing image super-resolution reconstruction method based on image recursion

Info

Publication number
CN114418854A
Authority
CN
China
Prior art keywords
image
resolution
super
low
resolution image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210079653.8A
Other languages
Chinese (zh)
Inventor
张浩鹏
梅寒
姜志国
谢凤英
赵丹培
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202210079653.8A
Publication of CN114418854A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4053 Scaling of whole images or parts thereof, e.g. expanding or contracting, based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an unsupervised remote sensing image super-resolution reconstruction method based on image recursion. The method acquires an original low-resolution image; extracts a degradation kernel from the original low-resolution image; down-samples the original low-resolution image with the degradation kernel to obtain a down-sampled low-resolution image; trains a super-resolution reconstruction network model based on the original low-resolution image and the down-sampled low-resolution image; and tests the low-resolution image to be reconstructed with the super-resolution reconstruction network model to obtain the corresponding super-resolution reconstruction result. The method applies super-resolution reconstruction to remote sensing images, improves the accuracy of the paired data obtained from remote sensing images, and enables the network to achieve a good reconstruction effect under the down-sampling actually used to synthesize the training pairs.

Description

Unsupervised remote sensing image super-resolution reconstruction method based on image recursion
Technical Field
The invention belongs to the technical field of digital image processing, and particularly relates to an unsupervised remote sensing image super-resolution reconstruction method based on image recursion.
Background
Super-resolution reconstruction is a method that reconstructs a low-resolution image with few details into a high-resolution image with rich texture details. Single-frame image super-resolution reconstruction refers to the image processing task in which the input is non-sequential data and a single low-resolution frame is processed into a high-resolution image with rich texture details and good visual quality. At present, remote sensing image super-resolution reconstruction mostly adopts deep learning, and supervised deep-learning methods are trained on image pairs consisting of a low-resolution image and a high-resolution image of the same region. Because an imaging sensor on a satellite cannot simultaneously acquire a low-resolution image and a high-resolution image of the same area, collecting a paired remote sensing image data set is very difficult. Some supervised super-resolution reconstruction methods therefore train on synthesized image pairs generated with a fixed down-sampling method, such as bicubic down-sampling; since the degradation of an actual low-resolution remote sensing image can differ greatly from that fixed down-sampling, the trained model performs poorly when reconstructing real low-resolution remote sensing images.
Therefore, how to apply super-resolution reconstruction to remote sensing images, improve the accuracy of the paired data obtained from remote sensing images, and enable the network to reconstruct well under the down-sampling actually used to synthesize the training pairs has become a key problem of current research.
Disclosure of Invention
In view of the above problems, the present invention provides an unsupervised remote sensing image super-resolution reconstruction method based on image recursion, which solves at least some of the above technical problems. With this method, super-resolution reconstruction can be applied to remote sensing images, the accuracy of the paired data obtained from remote sensing images is improved, and the network achieves a good reconstruction effect under the down-sampling used to synthesize the training pairs.
The embodiment of the invention provides an unsupervised remote sensing image super-resolution reconstruction method based on image recursion, which comprises the following steps:
S1, acquiring an original low-resolution image;
S2, extracting a degradation kernel from the original low-resolution image, and down-sampling the original low-resolution image with the degradation kernel to obtain a down-sampled low-resolution image;
S3, training a super-resolution reconstruction network model based on the original low-resolution image and the down-sampled low-resolution image;
and S4, testing the low-resolution image to be tested with the super-resolution reconstruction network model to obtain the corresponding super-resolution reconstruction result.
Further, in S2, the constraint condition of the degradation kernel is expressed as:
$$ K^{*} = \arg\min_{K}\; \big\| (I_{LR} * K)\!\downarrow_{s} - I_{LR}\!\downarrow_{s} \big\| + \Big| 1 - \sum_{i,j} K_{i,j} \Big| + \Big\| \frac{\sum_{i,j} K_{i,j}\,(i,j)}{\sum_{i,j} K_{i,j}} - (x_{0},y_{0}) \Big\| + \Big| \sum_{i,j} K_{i,j}\, m_{i,j} \Big| + \Big| 1 - D\big( (I_{LR} * K)\!\downarrow_{s} \big) \Big| \qquad (1) $$
wherein argmin denotes taking the minimum of the whole objective; i and j denote the abscissa and ordinate of a position; m denotes a constant weight; K denotes the degradation kernel; K_{i,j} denotes the value of the degradation kernel at position (i, j); m_{i,j} denotes the constant weight at the corresponding position of the original low-resolution image; * denotes that the degradation kernel acts on the image; (I_LR * K)↓_s denotes the original low-resolution image down-sampled with the degradation kernel; I_LR denotes the original low-resolution image; ↓ denotes image down-sampling; s denotes the down-sampling ratio; D denotes the discriminator; and (x_0, y_0) denotes the center index of the kernel.
Further, S3 is specifically expressed as:
$$ F_{LLR \to LR} = \arg\min_{F}\; \big\| F(I_{LLR}) - I_{LR} \big\| + \lambda\,\Phi(I_{HR}), \qquad I_{LLR} = (I_{LR} * K)\!\downarrow_{s},\;\; I_{HR} = F(I_{LR}) \qquad (2) $$
wherein F_{LLR→LR} denotes the mapping function from the down-sampled low-resolution image to the original low-resolution image, i.e. the content learned when training the super-resolution reconstruction network model; I_LR denotes the original low-resolution image; I_LLR denotes the down-sampled low-resolution image; ↓ denotes image down-sampling; s denotes the down-sampling ratio; λ denotes a trade-off parameter; Φ(I_HR) denotes a regularization term; and I_HR denotes the high-resolution image.
Further,

$$ F_{LLR \to LR} \approx F_{LR \to HR} \qquad (3) $$

wherein F_{LLR→LR} denotes the mapping function from the down-sampled low-resolution image to the original low-resolution image, i.e. the content learned when training the super-resolution reconstruction network model, and F_{LR→HR}, whose inverse corresponds to the degradation process that produced the down-sampled low-resolution image, denotes the mapping function from the original low-resolution image to the high-resolution image.
Further, in S4, when the resolution of the low-resolution image to be tested is consistent with the resolution of the original low-resolution image, the super-resolution reconstruction result obtained is optimal.
Further, the total loss function of the super-resolution reconstruction network model is represented as:
L_G_total = λ_1·L_cb + λ_per·L_per + λ_adv·L_G_adv    (4)
wherein λ_1 denotes the pixel-loss coefficient; L_cb denotes the pixel loss; λ_per denotes the perceptual-loss coefficient; L_per denotes the perceptual loss; λ_adv denotes the adversarial-loss coefficient; and L_G_adv denotes the adversarial loss.
Further, the super-resolution reconstruction network model comprises a discriminator and a generator;
the discriminator is used for assisting the generator to train;
the generator is used for generating a super-resolution reconstruction result.
Further, an SAF module is included in the generator;
the SAF module is a remote sensing image super-resolution reconstruction module based on spatial attention fusion and is used for fusing channel attention and spatial attention so as to improve the performance of the super-resolution reconstruction network model.
Compared with the prior art, the unsupervised remote sensing image super-resolution reconstruction method based on image recursion has the following beneficial effects:
and (3) constructing a constraint relation by utilizing the recursion of the image, realizing the super-resolution reconstruction of the unsupervised remote sensing image, and realizing the super-resolution reconstruction only by using the low-resolution image.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of an unsupervised remote sensing image super-resolution reconstruction method based on image recursion provided by an embodiment of the present invention.
Fig. 2 is a schematic diagram of a generator network structure in a super-resolution reconstruction network model according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating the comparison between the method of the present invention and the prior art.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, an embodiment of the present invention provides an unsupervised remote sensing image super-resolution reconstruction method based on image recursion, which specifically includes the following steps:
S1, acquiring an original low-resolution image;
S2, extracting a degradation kernel from the original low-resolution image, and down-sampling the original low-resolution image with the degradation kernel to obtain a down-sampled low-resolution image;
S3, training a super-resolution reconstruction network model based on the original low-resolution image and the down-sampled low-resolution image;
and S4, testing the low-resolution image to be tested with the super-resolution reconstruction network model to obtain the corresponding super-resolution reconstruction result.
The above steps will be described in detail below.
In step S2 above, unsupervised degradation kernel extraction is performed on the original low-resolution image I_LR, and the extracted degradation kernel is then used to down-sample I_LR to an even lower resolution, giving the down-sampled low-resolution image I_LLR. The constraint condition of the degradation kernel is expressed as:
$$ K^{*} = \arg\min_{K}\; \big\| (I_{LR} * K)\!\downarrow_{s} - I_{LR}\!\downarrow_{s} \big\| + \Big| 1 - \sum_{i,j} K_{i,j} \Big| + \Big\| \frac{\sum_{i,j} K_{i,j}\,(i,j)}{\sum_{i,j} K_{i,j}} - (x_{0},y_{0}) \Big\| + \Big| \sum_{i,j} K_{i,j}\, m_{i,j} \Big| + \Big| 1 - D\big( (I_{LR} * K)\!\downarrow_{s} \big) \Big| \qquad (1) $$
In the first term ‖(I_LR * K)↓_s − I_LR↓_s‖, argmin denotes taking the minimum of the whole objective; * denotes that the degradation kernel acts on the image; (I_LR * K)↓_s denotes the original low-resolution image down-sampled with the degradation kernel; I_LR denotes the original low-resolution image; ↓ denotes image down-sampling; s denotes the down-sampling ratio; and I_LR↓_s denotes the low-resolution image down-sampled with an ideal kernel. This term makes the lower-resolution image retain the important low-frequency information of the original real low-resolution image; here the ideal kernel is an ideal down-sampling kernel, which can be regarded as the down-sampling kernel consistent with the one that formed the low-resolution image. The second term |1 − Σ K_{i,j}| constrains the values of K to sum to 1, where i denotes the abscissa, j denotes the ordinate, K denotes the degradation kernel and K_{i,j} denotes its value at position (i, j). The third term ‖Σ K_{i,j}·(i,j)/Σ K_{i,j} − (x_0,y_0)‖ centers the centroid of K, where (x_0, y_0) is the center index. The fourth term |Σ K_{i,j}·m_{i,j}| constrains the boundaries of the estimated degradation kernel, where m denotes a constant weight and m_{i,j} denotes the constant weight at the corresponding position. The fifth term |1 − D((I_LR * K)↓_s)| lets the discriminator ensure that the domain of the degraded image stays consistent with the domain of the original image.
This stage is the kernel estimation stage; after it, the original low-resolution image I_LR generates, under the constraint of equation (1), the down-sampled low-resolution image I_LLR.
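For reference, a minimal sketch of how the five constraint terms of equation (1) could be assembled into a differentiable loss on an estimated kernel K (the ideal down-sampler, the discriminator D, the boundary mask m and the equal weighting of the terms are assumptions; the patent does not disclose these implementation details):

```python
import torch
import torch.nn.functional as F

def kernel_constraint_loss(k, i_lr, downsample, discriminator, mask_m, center):
    """Sum of the five constraint terms described for equation (1).

    k             -- estimated degradation kernel, shape (1, 1, kh, kw)
    i_lr          -- original low-resolution image, shape (1, C, H, W)
    downsample    -- callable implementing ideal s-fold down-sampling (e.g. bicubic)
    discriminator -- network D judging whether a degraded image matches the LR domain
    mask_m        -- constant weights m_ij penalising kernel mass near the boundary, (kh, kw)
    center        -- (x0, y0), index at which the kernel centroid should sit
    """
    c = i_lr.shape[1]
    # Term 1: down-sampling with k should preserve the low frequencies of i_lr.
    weight = k.repeat(c, 1, 1, 1)                       # same kernel for every channel
    blurred = F.conv2d(i_lr, weight, padding=k.shape[-1] // 2, groups=c)
    term_fidelity = (downsample(blurred) - downsample(i_lr)).abs().mean()

    # Term 2: kernel values should sum to one.
    term_sum_to_one = (1.0 - k.sum()).abs()

    # Term 3: kernel centroid should sit at the center index (x0, y0).
    kh, kw = k.shape[-2:]
    ys, xs = torch.meshgrid(torch.arange(kh, dtype=k.dtype),
                            torch.arange(kw, dtype=k.dtype), indexing="ij")
    mass = k.sum().clamp(min=1e-8)
    centroid_x = (k.squeeze() * xs).sum() / mass
    centroid_y = (k.squeeze() * ys).sum() / mass
    term_centroid = (centroid_x - center[0]).abs() + (centroid_y - center[1]).abs()

    # Term 4: penalise kernel mass under the boundary mask m_ij.
    term_boundary = (k.squeeze() * mask_m).sum().abs()

    # Term 5: the degraded image should be indistinguishable from the real LR domain.
    term_adv = (1.0 - discriminator(downsample(blurred))).abs().mean()

    return term_fidelity + term_sum_to_one + term_centroid + term_boundary + term_adv
```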
Step S3 above is the training phase: the output {I_LR, I_LLR} of the kernel estimation stage is used to train the super-resolution reconstruction network model, and the super-resolution reconstruction network learns the mapping from the lower-resolution image to the low-resolution image, as in equation (2).
$$ F_{LLR \to LR} = \arg\min_{F}\; \big\| F(I_{LLR}) - I_{LR} \big\| + \lambda\,\Phi(I_{HR}), \qquad I_{LLR} = (I_{LR} * K)\!\downarrow_{s},\;\; I_{HR} = F(I_{LR}) \qquad (2) $$
Wherein the content of the first and second substances,
Figure BDA0003485580690000062
representing a mapping function from the low-resolution image after down sampling to the original low-resolution image, namely training content of a training super-resolution reconstruction network model; i isLRRepresenting the original low resolution image; i isLLRRepresenting the downsampled low resolution image; ↓ represents image downsampling; s represents the ratio of down-sampling; λ represents a trade-off parameter; phi (I)HR) Representing a regularization term; i isHRRepresenting a high resolution image.
In the embodiment of the invention, the main goal of the super-resolution reconstruction network model is to learn the mapping function F_{LLR→LR}. Since the degradation kernel is learned from the real low-resolution image through the constraint of equation (1), the degradation process generating I_LLR is approximately the same as that of the real low-resolution image; by the recursion property of images, the two mapping functions are therefore approximately equal, as in equation (3).
$$ F_{LLR \to LR} \approx F_{LR \to HR} \qquad (3) $$
Wherein the content of the first and second substances,
Figure BDA0003485580690000065
representing a mapping function from the low-resolution image after down sampling to the original low-resolution image, namely training content of a training super-resolution reconstruction network model;
Figure BDA0003485580690000066
indicating lower miningAnd (3) a degradation process of the sampled low-resolution image, namely a mapping function from the original low-resolution image to the high-resolution image.
It follows from equation (3) that the trained image super-resolution reconstruction model can obtain a reliable super-resolution reconstruction result at the target resolution by testing directly on the original low-resolution images themselves. An effective unsupervised super-resolution reconstruction strategy is thereby constructed.
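In code, the recursion strategy of equations (2) and (3) amounts to training the network on the pair (I_LLR, I_LR) and then applying the same network directly to I_LR at test time. A minimal sketch under that reading (the model, optimiser and the bare L1 pixel term are placeholders; the full loss of equation (4) is described further below):

```python
import torch
import torch.nn.functional as F

def train_step(model, optimizer, i_lr, i_llr):
    """One step of learning the mapping I_LLR -> I_LR (equation (2))."""
    optimizer.zero_grad()
    i_lr_pred = model(i_llr)              # super-resolve the doubly degraded image
    loss = F.l1_loss(i_lr_pred, i_lr)     # pixel term only; the full loss is eq. (4)
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def test(model, i_lr):
    """By the recursion assumption (eq. (3)), the same mapping lifts I_LR to I_SR."""
    return model(i_lr)
```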
Step S4 is the testing stage: when the resolution of the low-resolution image to be tested is consistent with the resolution of the original low-resolution image, the super-resolution reconstruction result obtained is optimal.
The super-resolution reconstruction network used in the embodiment of the invention is a generative adversarial network (GAN); its generator is shown in Fig. 2. The basic module of the network combines a dense block with a channel attention (CA) module. No batch normalization (BN) layer is used anywhere in the network, because adding BN layers easily produces artificial artifacts and harms the generalization ability of the model. The SAF module is added to the generator of the embodiment of the invention to further enhance the representational capacity of the model.
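A minimal sketch of the kind of BN-free basic module the description refers to, i.e. a dense block followed by channel attention with a residual connection (the channel count, growth rate and number of layers are illustrative assumptions, not the disclosed configuration):

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.att(x)

class DenseCABlock(nn.Module):
    """Dense block followed by channel attention; no BatchNorm anywhere."""
    def __init__(self, channels=64, growth=32, layers=4):
        super().__init__()
        convs, in_ch = [], channels
        for _ in range(layers):
            convs.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, 3, padding=1),
                nn.LeakyReLU(0.2, inplace=True),
            ))
            in_ch += growth                       # dense connectivity grows the input
        self.convs = nn.ModuleList(convs)
        self.fuse = nn.Conv2d(in_ch, channels, 1)  # compress back to the trunk width
        self.ca = ChannelAttention(channels)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        out = self.fuse(torch.cat(feats, dim=1))
        return x + self.ca(out)                    # residual connection around the block
```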
In the embodiment of the present invention, a patch discriminator (also called a Markovian discriminator), which has been widely used in GANs for unsupervised image-to-image translation, is selected as the discriminator network.
The patch discriminator also addresses two drawbacks of the VGG-128 discriminator used in ESRGAN: 1) VGG-128 fixes the generated image size to 128, which makes multi-scale training difficult; 2) VGG-128 has a deep structure ending in a fixed fully connected layer, which makes the discriminator focus on global features and ignore local ones. The patch discriminator used by IRSR is a fully convolutional network whose three down-sampling layers correspond to a 70 × 70 patch, so that each output value of the discriminator depends only on a fixed local region; this is fed back to the generator through the patch loss to optimize the gradients of local details, and the final error is the average of all local errors, which ensures global consistency.
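A minimal sketch of a fully convolutional 70×70 patch (Markovian) discriminator of the kind referred to above (the channel widths, normalisation choice and kernel sizes follow the common PatchGAN layout and are assumptions, not values taken from the patent):

```python
import torch.nn as nn

def patch_discriminator(in_ch=3, base=64):
    """Fully convolutional discriminator; each output value scores one local patch."""
    def block(cin, cout, stride):
        return nn.Sequential(
            nn.Conv2d(cin, cout, 4, stride=stride, padding=1),
            nn.InstanceNorm2d(cout),
            nn.LeakyReLU(0.2, inplace=True),
        )
    return nn.Sequential(
        nn.Conv2d(in_ch, base, 4, stride=2, padding=1),
        nn.LeakyReLU(0.2, inplace=True),
        block(base, base * 2, 2),
        block(base * 2, base * 4, 2),
        block(base * 4, base * 8, 1),
        nn.Conv2d(base * 8, 1, 4, stride=1, padding=1),  # one score per local patch
    )
```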
In the unsupervised super-resolution reconstruction method, the loss function constrains the original input low-resolution image and the reconstructed image so that the two are as consistent as possible.
The overall loss function of the generator is the following equation:
L_G_total = λ_1·L_cb + λ_per·L_per + λ_adv·L_G_adv    (4)
wherein λ_1 denotes the pixel-loss coefficient; L_cb denotes the pixel loss; λ_per denotes the perceptual-loss coefficient; L_per denotes the perceptual loss; λ_adv denotes the adversarial-loss coefficient; and L_G_adv denotes the adversarial loss.
The total loss consists of the pixel loss L_cb, the perceptual loss L_per and the adversarial loss L_G_adv. The Charbonnier loss is chosen as the pixel-loss constraint between the reconstructed image I_SR and the input low-resolution image, as in equation (5):
$$ L_{cb} = \sqrt{\big\| I_{SR} - I_{LR} \big\|^{2} + \varepsilon^{2}} \qquad (5) $$
where ε is a very small constant (1 × 10^-6 is chosen here). For image-generation tasks the l_2 loss easily causes image blurring, so most researchers adopt the l_1 loss; compared with the l_1 loss, the Charbonnier loss is still differentiable at 0, which avoids the instability that the l_1 loss, being non-differentiable at 0, produces when generated values are close to 0. The perceptual loss is computed with a pretrained VGG-19; converting the perceptual loss into the feature space effectively resolves the sparsity of activated-layer features, improves the performance of the whole network, and helps strengthen low-frequency features such as edges. The adversarial loss, given by the following formula, is used to enhance the texture details of the generated image and make it more realistic.
$$ L_{G\_adv} = -\,\mathbb{E}\big[\log D(I_{SR})\big], \qquad L_{D\_adv} = -\,\mathbb{E}\big[\log D(I_{LR})\big] - \mathbb{E}\big[\log\big(1 - D(I_{SR})\big)\big] \qquad (6) $$
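A minimal sketch of the generator's total loss of equation (4) with the Charbonnier pixel term of equation (5) (the loss weights, the VGG-19 feature-extractor interface and the non-saturating form of the adversarial term are assumptions, not values disclosed in the patent):

```python
import torch
import torch.nn.functional as F

def charbonnier(pred, target, eps=1e-6):
    """Charbonnier (differentiable L1-like) pixel loss, equation (5)."""
    return torch.sqrt((pred - target) ** 2 + eps ** 2).mean()

def generator_total_loss(pred, target, vgg_features, d_fake,
                         lam_cb=1.0, lam_per=1.0, lam_adv=0.005):
    """L_G_total = lam_cb*L_cb + lam_per*L_per + lam_adv*L_G_adv (equation (4)).

    vgg_features -- callable returning pretrained VGG-19 feature maps
    d_fake       -- raw discriminator logits on the generated image
    """
    l_cb = charbonnier(pred, target)
    l_per = F.l1_loss(vgg_features(pred), vgg_features(target))
    l_adv = F.binary_cross_entropy_with_logits(
        d_fake, torch.ones_like(d_fake))    # non-saturating GAN loss for the generator
    return lam_cb * l_cb + lam_per * l_per + lam_adv * l_adv
```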
Finally, the technical effect of the method is illustrated by applying the method of the embodiment of the invention to remote sensing images of buildings.
The embodiment of the invention adopts unsupervised super-resolution reconstruction based on image recursion to carry out super-resolution reconstruction on the remote sensing image. The method only uses the original low-resolution remote sensing image for training, and realizes the unsupervised super-resolution reconstruction of the remote sensing image through a three-stage network.
The images used in the embodiment of the invention come from the public remote sensing data set Inria. The evaluation indexes are PSNR, SSIM, ERGAS and NIQE: the higher the PSNR and SSIM, the higher the image quality; the lower the ERGAS and NIQE, the higher the quality of the reconstructed image.
Table 1. Comparison of the proposed method with other methods
Method PSNR(dB) SSIM ERGAS NIQE
IBP[32] 25.18 0.6667 3.141 20.21
BDB[81] 24.19 0.6243 3.589 21.98
GPR[89] 24.89 0.6311 3.232 20.97
FSR[90] 23.79 0.5586 3.897 22.98
EUSR[69] 25.21 0.6798 3.132 18.79
UGSR[68] 18.13 0.3987 6.153 25.89
ZSSR[58] 27.02 0.7001 3.025 18.16
IRSR(ours) 27.66 0.7220 2.679 16.73
As can be seen from the table, the method of the embodiment of the invention achieves the best values of all indexes. The indexes are improved compared with the deep-learning-based unsupervised remote sensing methods EUSR and UGSR, and also compared with ZSSR, which achieves the best performance in unsupervised super-resolution reconstruction of natural images; the improvement over non-deep-learning unsupervised super-resolution reconstruction methods such as IBP, BDB and GPR is even larger. The four indexes used in the experiments comprehensively reflect the effectiveness of IRSR, which achieves the best results; the visualization results are shown in Fig. 3.
As can be seen from Fig. 3, the image reconstructed by the embodiment of the invention has the finest texture details. Referring in particular to panel i), the test image contains relatively complex building textures; compared with the images reconstructed by the other methods, the texture details and edge features of the buildings are more consistent with those of the high-resolution ground truth. For example, for the tower on the right, only the image reconstructed by the method of the invention preserves edge details consistent with the ground-truth image, while the other methods all introduce varying degrees of deformation.
The embodiment of the invention provides an unsupervised remote sensing image super-resolution reconstruction method based on image recursion. A generative adversarial network is adopted to realize image super-resolution reconstruction, and both the generator and the discriminator are improved, which raises the network performance. The structure of the conventional convolutional neural network for super-resolution reconstruction is improved; the modified network reconstructs better and yields higher reconstructed-image quality.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. An unsupervised remote sensing image super-resolution reconstruction method based on image recursion is characterized by comprising the following steps:
S1, acquiring an original low-resolution image;
S2, extracting a degradation kernel from the original low-resolution image, and down-sampling the original low-resolution image with the degradation kernel to obtain a down-sampled low-resolution image;
S3, training a super-resolution reconstruction network model based on the original low-resolution image and the down-sampled low-resolution image;
and S4, testing the low-resolution image to be tested with the super-resolution reconstruction network model to obtain the corresponding super-resolution reconstruction result.
2. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 1, wherein in S2, the constraint condition of the degradation kernel is expressed as:
$$ K^{*} = \arg\min_{K}\; \big\| (I_{LR} * K)\!\downarrow_{s} - I_{LR}\!\downarrow_{s} \big\| + \Big| 1 - \sum_{i,j} K_{i,j} \Big| + \Big\| \frac{\sum_{i,j} K_{i,j}\,(i,j)}{\sum_{i,j} K_{i,j}} - (x_{0},y_{0}) \Big\| + \Big| \sum_{i,j} K_{i,j}\, m_{i,j} \Big| + \Big| 1 - D\big( (I_{LR} * K)\!\downarrow_{s} \big) \Big| \qquad (1) $$
wherein argmin denotes taking the minimum of the whole objective; i and j denote the abscissa and ordinate of a position; m denotes a constant weight; K denotes the degradation kernel; K_{i,j} denotes the value of the degradation kernel at position (i, j); m_{i,j} denotes the constant weight at the corresponding position of the original low-resolution image; * denotes that the degradation kernel acts on the image; (I_LR * K)↓_s denotes the original low-resolution image down-sampled with the degradation kernel; I_LR denotes the original low-resolution image; ↓ denotes image down-sampling; s denotes the down-sampling ratio; D denotes the discriminator; and (x_0, y_0) denotes the center index of the kernel.
3. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 2, wherein said S3 is specifically expressed as:
$$ F_{LLR \to LR} = \arg\min_{F}\; \big\| F(I_{LLR}) - I_{LR} \big\| + \lambda\,\Phi(I_{HR}), \qquad I_{LLR} = (I_{LR} * K)\!\downarrow_{s},\;\; I_{HR} = F(I_{LR}) \qquad (2) $$
wherein F_{LLR→LR} denotes the mapping function from the down-sampled low-resolution image to the original low-resolution image, i.e. the content learned when training the super-resolution reconstruction network model; I_LR denotes the original low-resolution image; I_LLR denotes the down-sampled low-resolution image; ↓ denotes image down-sampling; s denotes the down-sampling ratio; λ denotes a trade-off parameter; Φ(I_HR) denotes a regularization term; and I_HR denotes the high-resolution image.
4. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 3, wherein

$$ F_{LLR \to LR} \approx F_{LR \to HR} \qquad (3) $$

where F_{LLR→LR} denotes the mapping function from the down-sampled low-resolution image to the original low-resolution image, i.e. the content learned when training the super-resolution reconstruction network model, and F_{LR→HR}, whose inverse corresponds to the degradation process that produced the down-sampled low-resolution image, denotes the mapping function from the original low-resolution image to the high-resolution image.
5. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 1, wherein in S4, when the resolution of the low-resolution image to be tested is consistent with the resolution of the original low-resolution image, the super-resolution reconstruction result obtained is optimal.
6. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 1, wherein the total loss function of the super-resolution reconstruction network model is represented as:
L_G_total = λ_1·L_cb + λ_per·L_per + λ_adv·L_G_adv    (4)
wherein λ_1 denotes the pixel-loss coefficient; L_cb denotes the pixel loss; λ_per denotes the perceptual-loss coefficient; L_per denotes the perceptual loss; λ_adv denotes the adversarial-loss coefficient; and L_G_adv denotes the adversarial loss.
7. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 1, wherein the super-resolution reconstruction network model comprises a discriminator and a generator;
the discriminator is used for assisting the generator to train;
the generator is used for generating a super-resolution reconstruction result.
8. The unsupervised remote sensing image super-resolution reconstruction method based on image recursion as claimed in claim 7, wherein the generator comprises an SAF module;
the SAF module is a remote sensing image super-resolution reconstruction module based on spatial attention fusion and is used for fusing channel attention and spatial attention so as to improve the performance of the super-resolution reconstruction network model.
CN202210079653.8A 2022-01-24 2022-01-24 Unsupervised remote sensing image super-resolution reconstruction method based on image recursion Pending CN114418854A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210079653.8A CN114418854A (en) 2022-01-24 2022-01-24 Unsupervised remote sensing image super-resolution reconstruction method based on image recursion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210079653.8A CN114418854A (en) 2022-01-24 2022-01-24 Unsupervised remote sensing image super-resolution reconstruction method based on image recursion

Publications (1)

Publication Number Publication Date
CN114418854A true CN114418854A (en) 2022-04-29

Family

ID=81276704

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210079653.8A Pending CN114418854A (en) 2022-01-24 2022-01-24 Unsupervised remote sensing image super-resolution reconstruction method based on image recursion

Country Status (1)

Country Link
CN (1) CN114418854A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116469047A (en) * 2023-03-20 2023-07-21 南通锡鼎智能科技有限公司 Small target detection method and detection device for laboratory teaching

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110363704A (en) * 2019-05-29 2019-10-22 西北大学 Merge the image super-resolution rebuilding model construction and method for reconstructing of form and color
CN113344793A (en) * 2021-08-04 2021-09-03 深圳市安软科技股份有限公司 Image super-resolution reconstruction method, device, equipment and storage medium
CN113643182A (en) * 2021-08-20 2021-11-12 中国地质大学(武汉) Remote sensing image super-resolution reconstruction method based on dual learning graph network
WO2021233008A1 (en) * 2020-05-21 2021-11-25 腾讯科技(深圳)有限公司 Super-resolution reconstruction method and related device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110363704A (en) * 2019-05-29 2019-10-22 西北大学 Merge the image super-resolution rebuilding model construction and method for reconstructing of form and color
WO2021233008A1 (en) * 2020-05-21 2021-11-25 腾讯科技(深圳)有限公司 Super-resolution reconstruction method and related device
CN113344793A (en) * 2021-08-04 2021-09-03 深圳市安软科技股份有限公司 Image super-resolution reconstruction method, device, equipment and storage medium
CN113643182A (en) * 2021-08-20 2021-11-12 中国地质大学(武汉) Remote sensing image super-resolution reconstruction method based on dual learning graph network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JINLONG LIU et al.: "Single image super-resolution using feature adaptive learning and global structure sparsity", SIGNAL PROCESSING, 5 June 2021 (2021-06-05) *
张强 et al.: "基于自训练字典学习的单幅图像的超分辨率重建" (Super-resolution reconstruction of a single image based on self-training dictionary learning), 红外技术 (Infrared Technology), vol. 37, no. 9, 30 September 2015 (2015-09-30), pages 2 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116469047A (en) * 2023-03-20 2023-07-21 南通锡鼎智能科技有限公司 Small target detection method and detection device for laboratory teaching

Similar Documents

Publication Publication Date Title
CN111754403B (en) Image super-resolution reconstruction method based on residual learning
CN109886871B (en) Image super-resolution method based on channel attention mechanism and multi-layer feature fusion
CN111192200A (en) Image super-resolution reconstruction method based on fusion attention mechanism residual error network
CN109214989B (en) Single image super resolution ratio reconstruction method based on Orientation Features prediction priori
CN111047515A (en) Cavity convolution neural network image super-resolution reconstruction method based on attention mechanism
CN112132959B (en) Digital rock core image processing method and device, computer equipment and storage medium
CN110415199B (en) Multispectral remote sensing image fusion method and device based on residual learning
Xu et al. High quality remote sensing image super-resolution using deep memory connected network
Yang et al. Image super-resolution based on deep neural network of multiple attention mechanism
CN111402138A (en) Image super-resolution reconstruction method of supervised convolutional neural network based on multi-scale feature extraction fusion
CN104899835A (en) Super-resolution processing method for image based on blind fuzzy estimation and anchoring space mapping
CN115511767B (en) Self-supervised learning multi-modal image fusion method and application thereof
CN111640067B (en) Single image super-resolution reconstruction method based on three-channel convolutional neural network
CN109118428B (en) Image super-resolution reconstruction method based on feature enhancement
CN113538246A (en) Remote sensing image super-resolution reconstruction method based on unsupervised multi-stage fusion network
CN114140442A (en) Deep learning sparse angle CT reconstruction method based on frequency domain and image domain degradation perception
CN117788295B (en) Super-resolution reconstruction method, system and medium for remote sensing image
CN114418854A (en) Unsupervised remote sensing image super-resolution reconstruction method based on image recursion
Zhu et al. Super resolution reconstruction method for infrared images based on pseudo transferred features
CN112184552B (en) Sub-pixel convolution image super-resolution method based on high-frequency feature learning
CN117333750A (en) Spatial registration and local global multi-scale multi-modal medical image fusion method
CN117593199A (en) Double-flow remote sensing image fusion method based on Gaussian prior distribution self-attention
CN116612009A (en) Multi-scale connection generation countermeasure network medical image super-resolution reconstruction method
CN116029908A (en) 3D magnetic resonance super-resolution method based on cross-modal and cross-scale feature fusion
CN115293983A (en) Self-adaptive image super-resolution restoration method fusing multi-level complementary features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination