CN112419193A - Method and system for removing uneven fog of remote sensing image - Google Patents

Method and system for removing uneven fog of remote sensing image

Info

Publication number
CN112419193A
CN112419193A CN202011332445.1A
Authority
CN
China
Prior art keywords
image
remote sensing
atmospheric
sensing image
dark channel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011332445.1A
Other languages
Chinese (zh)
Other versions
CN112419193B (en)
Inventor
崔光茫
杨顺杰
赵巨峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202011332445.1A
Publication of CN112419193A
Application granted
Publication of CN112419193B
Active legal status
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for removing non-uniform fog from a remote sensing image, comprising the following steps: acquiring an RGB remote sensing image and calculating its dark channel prior map and saturation prior map; correcting the dark channel prior map with the saturation prior to form a dark channel-saturation prior map; calculating the atmospheric illumination intensity from the dark channel-saturation prior map; calculating the transmittance, the object optical thickness T0, and the atmospheric optical thickness T1 of the remote sensing image; calculating the atmospheric point spread function (APSF); and restoring the RGB remote sensing image through a foggy-day image degradation model. Because the method and system correct the remote sensing image with the dark channel-saturation prior, defogging is more effective in foggy areas and no color distortion is introduced in fog-free areas.

Description

Method and system for removing uneven fog of remote sensing image
Technical Field
The invention relates to the field of image defogging, in particular to a method and a system for removing uneven fog of a remote sensing image.
Background
Remote sensing image defogging remains a very challenging task. Remote sensing images are usually acquired from a long distance, and the dark channel prior commonly used in the prior art to remove haze from outdoor images produces color distortion in fog-free areas when applied to remote sensing images. In addition, existing neural network algorithms perform poorly on non-uniform fog in remote sensing images, the prior art does not account for the image blur caused by multiple atmospheric scattering, and existing defogging algorithms over-enhance fog-free regions, losing dark image details and introducing color distortion.
Disclosure of Invention
One of the purposes of the invention is to provide a method and a system for removing uneven fog of a remote sensing image, wherein the method and the system adopt dark channel-saturation prior to correct the remote sensing image, so that the defogging can be more effectively carried out in a foggy area, and the phenomenon of image color distortion cannot be caused in a fogless area.
One of the purposes of the invention is to provide a method and a system for removing uneven fog of a remote sensing image, wherein the method and the system adopt a point spread function, which is added into the atmospheric image degradation model, so that an image degraded by atmospheric multiple scattering becomes clearer.
One of the purposes of the invention is to provide a method and system for removing uneven fog from a remote sensing image, wherein the method is suited to non-uniform fog in remote sensing images and achieves a better defogging effect than traditional outdoor defogging methods.
In order to achieve at least one of the above objects, the present invention further provides a method for removing uneven fog from a remote sensing image, including:
acquiring an RGB remote sensing image, and calculating a dark channel prior map and a saturation prior map of the RGB remote sensing image;
correcting the dark channel prior image according to saturation prior to form a dark channel-saturation prior image;
calculating the atmospheric illumination intensity according to the dark channel-saturation prior map;
calculating the transmittance, the object optical thickness T0, and the atmospheric optical thickness T1 of the remote sensing image;
Calculating an Atmospheric Point Spread Function (APSF);
and restoring the RGB remote sensing image through a foggy-day image degradation model.
According to a preferred embodiment of the present invention, the remote sensing image is cut into a plurality of window images and the dark channel image of each window image is calculated as follows: within each window image area, the minimum value over the RGB channels is computed, yielding the dark channel prior map (whose values approach zero in fog-free weather).
According to another preferred embodiment of the present invention, the maximum saturation of the window image is calculated according to the following formula: J_s(x) = W · max_{y∈Ω(x)} S(y);
S(y) = 1 - min_{c∈(r,g,b)} J_c(y) / max_{c∈(r,g,b)} J_c(y);
wherein J_r(y), J_g(y), J_b(y) are the three channel values of a pixel, min_{c∈(r,g,b)} J_c(y) denotes taking the minimum of the RGB three channels, W is the intensity level of the image, S(y) is the pixel saturation value, and J_s(x) is the saturation prior map value; the dark channel prior is corrected according to the saturation prior to obtain a dark channel-saturation prior map.
According to another preferred embodiment of the present invention, the method for correcting the dark channel prior map comprises: and calculating the difference value between the dark channel prior map value and the saturation prior map value, and acquiring the maximum value between the difference value and a zero value, wherein the maximum value is recorded as the dark channel-saturation prior map value.
According to another preferred embodiment of the present invention, the calculation method of the atmospheric illumination intensity comprises: and calculating the pixel brightness in the dark channel-saturation prior image, selecting a position corresponding to 0.1% of pixels in front of the brightness value as an atmospheric light calculation candidate region, and calculating the average value of the brightness values of all pixels in the candidate region as the atmospheric illumination intensity.
According to another preferred embodiment of the present invention, the transmittance estimation formula is derived from the atmospheric scattering imaging equation, and the medium transmittance t is calculated by combining the dark channel-saturation prior map as follows:
t(x) = 1 - ω · max( min_{y∈Ω(x)} min_{c∈(r,g,b)} I_c(y)/A_c - J_s(y), 0 );
wherein A_c is the atmospheric light, ω is a constant parameter adjusting the overall haze level of the image, I_c(y) is the original RGB three-channel image, and J_s(y) is the saturation prior map; the transmission map is further refined by guided filtering.
According to another preferred embodiment of the invention, the object optical thickness T0 and the atmospheric optical thickness T1 are calculated from the medium transmittance t according to: T0 = -ln t; T1 = -ln(1 - t).
According to another preferred embodiment of the invention, the object optical thickness T0 or the atmospheric optical thickness T1 is mapped by a hierarchical function with at least 3 levels to a forward scattering parameter value q, each level having a determined and distinct value q and corresponding to a range of object or atmospheric optical thickness; the forward scattering coefficient σ is obtained from the forward scattering parameter value q, wherein
[Formula: the relation between the forward scattering coefficient σ and the forward scattering parameter q; rendered as an image in the original]
Further calculating an atmospheric point spread function APSF, wherein the calculation formula is as follows:
[Formulas: the generalized-Gaussian approximation of the APSF and the blur kernel U(i, j); rendered as images in the original]
wherein Γ () represents a gamma function, T is the optical thickness, k is a tuning coefficient, σ refers to the forward scattering coefficient, exp is an exponential function with e as the base, U is the blur kernel, i is the abscissa of the blur kernel, and j is the ordinate of the blur kernel.
According to another preferred embodiment of the invention, an image restoration formula is derived from the foggy-day image degradation model, and the remote sensing image is restored into a clear fog-free image according to the restoration formula, which is as follows:
[Formula: the image restoration formula; rendered as an image in the original]
wherein h_o is the atmospheric point spread function of the object-reflected light, h_o = APSF(i, j; σ, T0); h_a is the atmospheric point spread function of the atmospheric reflected light, h_a = APSF(i, j; σ, T1); ⊗ denotes convolution; f_o is the object-reflected light; f_a is the atmospheric reflected light; deconv denotes deconvolution; and max(t, t0) sets a lower bound on the transmittance.
In order to achieve at least one of the above objects, the present invention further provides a remote sensing image non-uniform fog removing system, which adopts the above remote sensing image non-uniform fog removing method.
Drawings
FIG. 1 is a schematic flow chart showing a method for removing uneven fog from a remote sensing image according to the present invention;
FIG. 2a is a schematic diagram showing the dark channel effect obtained by the conventional defogging method;
FIG. 2b is a schematic diagram showing the effect of the dark channel in the defogging method of the present invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be in a particular orientation, constructed and operated in a particular orientation, and thus the above terms are not to be construed as limiting the present invention.
It is understood that the terms "a" and "an" should be interpreted as meaning that a number of one element or element is one in one embodiment, while a number of other elements is one in another embodiment, and the terms "a" and "an" should not be interpreted as limiting the number.
Referring to fig. 1, fig. 2a and fig. 2b, the invention discloses a method for removing uneven fog from a remote sensing image, which is mainly applied to defogging processing of the remote sensing image, wherein the method adopts a saturation prior to correct an original dark channel prior to form a dark channel-saturation prior image, and defogging operation is performed according to the dark channel prior image, so that the problem of blurring of the remote sensing image can be effectively solved, and the removal effect of the uneven fog can be improved.
Specifically, the method comprises the following steps: an RGB remote sensing image is obtained and cut into at least one window image, each smaller than the original RGB remote sensing image (for example, the original image can be cut into 4-by-4-pixel window images), and the dark channel prior map of each window image is calculated in turn. It needs to be noted that, under fog-free conditions, the dark channel prior map is an image in which at least one channel of each pixel is 0 or close to 0.
The dark channel prior is computed over each window image by taking the minimum gray value of each channel within the window, where the channels are the R, G, and B (red, green, and blue) channels; the dark channel prior map records the darkest channel value of each pixel. Specifically, the dark channel prior map is obtained according to the following formula:
J_d(x) = min_{y∈Ω(x)} ( min_{c∈(r,g,b)} J_c(y) );
wherein J_d(x) is the dark channel image gray value, J_c(y) is the gray value of the original image, min_{c∈(r,g,b)} J_c(y) denotes taking the minimum gray value of the RGB three channels, and Ω(x) is the local window image centered at x; y ∈ Ω(x) indicates that the pixel y lies within the local window image Ω(x). The window slides over the complete original image to form different window images, and the dark channel prior is computed for each window image to obtain the dark channel prior map.
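For illustration only (a reading of the description above, not the patentee's reference implementation), the windowed dark-channel computation can be sketched in NumPy; the `window` size is an assumed parameter:

```python
import numpy as np

def dark_channel(img, window=4):
    """Dark channel prior: per-pixel minimum over the RGB channels,
    followed by a minimum filter over a sliding window (Omega(x))."""
    h, w, _ = img.shape
    min_rgb = img.min(axis=2)                    # min over r, g, b channels
    pad = window // 2
    padded = np.pad(min_rgb, pad, mode='edge')   # replicate border pixels
    out = np.empty_like(min_rgb)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + window, j:j + window].min()
    return out
```

In a fog-free image the result is expected to be close to zero almost everywhere, which is the property the prior exploits.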
Further, after the window image is obtained, the saturation prior map of the window image is calculated. It should be noted that image saturation is the purity of color: the higher the saturation, the brighter the color, the less the relative fog, and the clearer the image. The value of the pixel with the maximum saturation in the window image is taken as the saturation value of that window image, yielding the saturation prior map; the specific formulas are:
J_s(x) = W · max_{y∈Ω(x)} S(y);
S(y) = 1 - min_{c∈(r,g,b)} J_c(y) / max_{c∈(r,g,b)} J_c(y);
where W is the intensity level of the image, S(y) is the pixel saturation value, J_s(x) is the saturation prior map, and J_r(y), J_g(y), J_b(y) are the three channel values of the pixel; min_{c∈(r,g,b)} J_c(y) takes the minimum of the RGB three channels. The saturation priors of all window images are calculated.
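A matching NumPy sketch of the saturation prior (a hypothetical helper; W defaults to 1.0 on the assumption of images normalized to [0, 1]):

```python
import numpy as np

def saturation_prior(img, window=4, W=1.0):
    """Saturation prior: S = 1 - min(RGB)/max(RGB) per pixel, then the
    window-wise maximum of S, scaled by the intensity level W."""
    h, w, _ = img.shape
    mx = img.max(axis=2)
    mn = img.min(axis=2)
    # Guard against division by zero on pure-black pixels.
    S = np.where(mx > 0, 1.0 - mn / np.maximum(mx, 1e-12), 0.0)
    pad = window // 2
    padded = np.pad(S, pad, mode='edge')
    out = np.empty_like(S)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + window, j:j + window].max()
    return W * out
```

A gray image has zero saturation everywhere, while a pure-color image saturates the prior at W.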
Correcting the dark channel prior map according to the saturation prior map obtained by calculation, wherein the correction formula is as follows:
J_ds(x) = max(J_d(x) - J_s(x), 0);
wherein J_ds(x) denotes the dark channel-saturation prior map. Because the dark channel value J_d(x) of high-saturation regions in remote sensing images is not always close to 0, max(J_d(x) - J_s(x), 0) ensures that the prior value cannot fall below 0. In remote sensing images, the corrected prior map has very low values in clear areas.
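The correction itself is a single clamped subtraction; a minimal sketch:

```python
import numpy as np

def dark_sat_prior(J_d, J_s):
    """Dark channel-saturation prior: J_ds = max(J_d - J_s, 0).
    The clamp keeps the prior non-negative in high-saturation regions."""
    return np.maximum(J_d - J_s, 0.0)
```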
The atmospheric light is then estimated from the dark channel-saturation prior map. The estimate is related to the pixel brightness in the prior map and is obtained as follows: collect the brightness of all pixels in the dark channel-saturation prior map, locate the region containing the top 0.1% brightest pixels, and take the average brightness of that region as the atmospheric light value.
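A sketch of this estimate, simplified to a single scalar atmospheric light (the description leaves open whether it is computed per channel, so the per-channel case is omitted here):

```python
import numpy as np

def atmospheric_light(img, prior):
    """Average the image brightness over the pixels whose prior-map
    values fall in the top 0.1% (at least one pixel is always kept)."""
    flat = prior.ravel()
    n = max(1, int(round(0.001 * flat.size)))
    idx = np.argsort(flat)[-n:]          # brightest 0.1% of the prior map
    gray = img.mean(axis=2).ravel()      # per-pixel brightness of the image
    return gray[idx].mean()
```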
The transmittance estimation formula is derived from the atmospheric scattering imaging equation of the prior art, and the image transmittance is then calculated from the previously computed dark channel-saturation prior map:
t(x) = 1 - ω · max( min_{y∈Ω(x)} min_{c∈(r,g,b)} I_c(y)/A_c - J_s(y), 0 );
wherein A_c is the atmospheric light, ω is a constant parameter adjusting the overall haze level of the image, I_c(y) is the original three-channel image, J_s(y) is the saturation prior map, and t is the medium transmittance.
In some preferred embodiments of the present invention, the original transmission map can be refined with a guided filter, which avoids local image distortion.
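Combining the pieces above, the transmittance step might look as follows in NumPy. The exact placement of the J_s term is a reconstruction (the patent's formula appears only as an image), omega=0.95 is an assumed default, and the guided-filter refinement is omitted:

```python
import numpy as np

def transmission(img, A, J_s, omega=0.95, window=4):
    """Medium transmittance t = 1 - omega * max(dark(I/A) - J_s, 0),
    where dark(.) is the windowed dark channel of the normalized image."""
    h, w, _ = img.shape
    norm_min = (img / A).min(axis=2)             # min_c I_c(y) / A_c
    pad = window // 2
    padded = np.pad(norm_min, pad, mode='edge')
    dark = np.empty_like(norm_min)
    for i in range(h):
        for j in range(w):
            dark[i, j] = padded[i:i + window, j:j + window].min()
    t = 1.0 - omega * np.maximum(dark - J_s, 0.0)
    return np.clip(t, 0.0, 1.0)
```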
The optical thickness T is obtained from the medium transmittance t, where T comprises the object optical thickness T0 and the atmospheric optical thickness T1, calculated as:
T0=-lnt;
T1=-ln(1-t);
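These two logarithms translate directly into code; the clipping of t away from 0 and 1 is an added numerical guard, not part of the patent text:

```python
import numpy as np

def optical_thickness(t, eps=1e-6):
    """T0 = -ln(t) (object), T1 = -ln(1 - t) (atmosphere)."""
    t = np.clip(t, eps, 1.0 - eps)   # guard against log(0)
    T0 = -np.log(t)
    T1 = -np.log(1.0 - t)
    return T0, T1
```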
To address the loss of image clarity caused by multiple scattering of light in the atmosphere, a generalized Gaussian distribution is further introduced to approximate the atmospheric point spread function APSF:
[Formulas: the generalized-Gaussian approximation of the APSF and the blur kernel U(i, j); rendered as images in the original]
where Γ () represents the gamma function, T is the optical thickness calculated above, k is a conditioning coefficient, σ is the forward scattering coefficient, exp is an exponential function with e as the base, U is the blur kernel, i is the abscissa of the blur kernel, and j refers to the ordinate of the blur kernel. And eliminating the influence caused by atmospheric diffusion according to the atmospheric point diffusion function.
It is worth mentioning that, because atmospheric scattering differs greatly between the fog-free and foggy regions of a remote sensing image, different forward scattering coefficients σ are adopted for different fog concentrations, derived from the different optical thicknesses T; that is, the object optical thickness T0 and the atmospheric optical thickness T1 each yield their own forward scattering coefficient σ. A forward scattering parameter q is further defined, wherein:
[Formula: the relation between the forward scattering coefficient σ and the forward scattering parameter q; rendered as an image in the original]
To balance computational cost against clarity, the invention establishes a hierarchical function for the forward scattering parameter q, which classifies q numerically according to the optical thickness value:
[Formula: the hierarchical function assigning a forward scattering parameter value q to each optical thickness range; rendered as an image in the original]
note that T in the hierarchical function can be taken as a valueT0Or T1When taking the value T0According to T0The corresponding forward scattering coefficient sigma can be obtained by the layering function value corresponding to the value, and the atmospheric point diffusion function h of the reflected light of the object can be calculated by substituting the atmospheric point diffusion function formula into the above atmospheric point diffusion function formula0I.e. h0=APSF(i,j;σ,T0) When the value is T1Then, the atmospheric point spread function h of the atmospheric reflected light can be calculatedaI.e. ha=APSF(i,j;σ,T1) And further acquiring the recovered clear fog-free image according to a recovery formula obtained by deducing the degradation model of the fog image, wherein the degradation model formula of the fog image is as follows:
[Formula: the foggy-day image degradation model; rendered as an image in the original]
the recovery formula is:
[Formula: the image restoration formula; rendered as an image in the original]
wherein ⊗ denotes convolution, f_o is the object-reflected light (the restored image to be obtained), f_a is the atmospheric reflected light, t is the medium transmittance, deconv denotes deconvolution, and max(t, t0) sets a lower bound on the transmission map.
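If the APSF blur terms h_o and h_a are dropped, the degradation model reduces to the familiar single-scattering model I = J·t + A·(1 - t), and the restoration step simplifies to the sketch below; the patent's full method additionally deconvolves with h_o and h_a, which is not shown here:

```python
import numpy as np

def restore(img, A, t, t0=0.1):
    """Simplified restoration without APSF deconvolution:
    J = (I - A * (1 - t)) / max(t, t0), applied per channel."""
    t = np.maximum(t, t0)[..., None]   # lower-bound the transmittance
    return np.clip((img - A * (1.0 - t)) / t, 0.0, 1.0)
```

Synthesizing a hazy image from a known scene and inverting it recovers the scene exactly, which is a quick sanity check on the algebra.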
Because the image obtained according to the restoration formula is a window image, the window image can be spliced according to the initial cutting rule to obtain a complete restoration image.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU). It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wire segments, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
In this application, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wireline, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be understood by those skilled in the art that the embodiments of the present invention described above and illustrated in the accompanying drawings are given by way of illustration only and not by way of limitation, the objects of the invention having been fully and effectively achieved, the functional and structural principles of the invention having been shown and described in the embodiments, and that the embodiments of the invention may be varied or modified in many ways without departing from said principles.

Claims (10)

1. A remote sensing image uneven fog removing method is characterized by comprising the following steps:
acquiring an RGB remote sensing image, and calculating a dark channel prior map and a saturation prior map of the RGB remote sensing image;
correcting the dark channel prior image according to saturation prior to form a dark channel-saturation prior image;
calculating the atmospheric illumination intensity according to the dark channel-saturation prior map;
calculating the transmittance, the object optical thickness T0, and the atmospheric optical thickness T1 of the remote sensing image;
Calculating an Atmospheric Point Spread Function (APSF);
and restoring the RGB remote sensing image through a foggy-day image degradation model.
2. The remote sensing image non-uniform fog removal method according to claim 1, characterized in that the remote sensing image is cut into a plurality of window images and the dark channel image of each window image is calculated as follows: within each window image area, the minimum value over the RGB channels is computed, yielding the dark channel prior map (whose values approach zero in fog-free weather).
3. The remote sensing image uneven fog removal method according to claim 2, characterized in that the maximum saturation of the window image is calculated according to the following formula: J_s(x) = W · max_{y∈Ω(x)} S(y);
S(y) = 1 - min_{c∈(r,g,b)} J_c(y) / max_{c∈(r,g,b)} J_c(y);
wherein J_r(y), J_g(y), J_b(y) are the three channel values of a pixel, min_{c∈(r,g,b)} J_c(y) denotes taking the minimum of the RGB three channels, W is the intensity level of the image, S(y) is the pixel saturation value, and J_s(x) is the saturation prior map value; the dark channel prior is corrected according to the saturation prior to obtain a dark channel-saturation prior map.
4. The method for removing the uneven fog of the remote sensing image according to claim 3, wherein the correction method of the dark channel prior image is as follows: and calculating the difference value between the dark channel prior map value and the saturation prior map value, and acquiring the maximum value between the difference value and a zero value, wherein the maximum value is recorded as the dark channel-saturation prior map value.
5. The method for removing the uneven fog of the remote sensing image according to claim 1, wherein the calculation method of the atmospheric illumination intensity is as follows: and calculating the pixel brightness in the dark channel-saturation prior image, selecting a position corresponding to 0.1% of pixels before the brightness value as an atmospheric light calculation candidate region, and calculating the average value of the brightness values of all pixels in the candidate region as atmospheric illumination intensity.
6. The remote sensing image uneven fog removal method according to claim 3, characterized in that the transmittance estimation formula is derived from the atmospheric scattering imaging equation, and the medium transmittance t is calculated by combining the dark channel-saturation prior map as follows:
t(x) = 1 - ω · max( min_{y∈Ω(x)} min_{c∈(r,g,b)} I_c(y)/A_c - J_s(y), 0 );
wherein A_c is the atmospheric light, ω is a constant parameter adjusting the overall haze level of the image, I_c(y) is the original RGB three-channel image, and J_s(y) is the saturation prior map; the transmission map is further refined by guided filtering.
7. The method for removing the uneven fog of the remote sensing image as claimed in claim 6, characterized in that the object optical thickness T0 and the atmospheric optical thickness T1 are calculated from the medium transmittance t according to: T0 = -ln t; T1 = -ln(1 - t).
8. The method for removing the uneven fog in the remote sensing image as claimed in claim 7, characterized in that the object optical thickness T0 or the atmospheric optical thickness T1 is mapped by a hierarchical function with at least 3 levels to a forward scattering parameter value q, each level having a determined and distinct value q and corresponding to a range of object or atmospheric optical thickness; the forward scattering coefficient σ is obtained from the forward scattering parameter value q, wherein
[Formula: the relation between the forward scattering coefficient σ and the forward scattering parameter q; rendered as an image in the original]
Further calculating an atmospheric point spread function APSF, wherein the calculation formula is as follows:
[Formulas: the generalized-Gaussian approximation of the APSF and the blur kernel U(i, j); rendered as images in the original]
wherein Γ () represents a gamma function, T is the optical thickness, k is a tuning coefficient, σ refers to the forward scattering coefficient, exp is an exponential function with e as the base, U is the blur kernel, i is the abscissa of the blur kernel, and j is the ordinate of the blur kernel.
9. The method for removing the uneven fog of the remote sensing image according to claim 8, wherein an image restoration formula is obtained according to a fog day image degradation model, the remote sensing image is restored into a clear fog-free image according to the restoration formula, and the image restoration formula is as follows:
[Formula: the image restoration formula; rendered as an image in the original]
wherein h isoIs the atmospheric point spread function of the light reflected by the object, h0=APSF(i,j;σ,T0),haIs the atmospheric point spread function of the atmospheric reflected light, ha=APSF(i,j;σ,T1),
⊗ denotes convolution, fo is the object-reflected light, fa is the atmosphere-reflected light, deconv denotes deconvolution, and max(t, t0) sets a lower bound on the transmittance.
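Because the restoration formula itself is only available as an image, the sketch below assumes the standard degradation model implied by the symbols above, I = (fo ⊗ ho)·t + A·(1 − t), and uses a regularized frequency-domain division as a stand-in for `deconv`; `restore` and its parameters are illustrative, not the patent's exact procedure, and it treats a single channel with the airlight-only fa term folded into A·(1 − t).

```python
import numpy as np

def restore(I, A, t, h_o, t0=0.1, eps=1e-3):
    """Assumed claim-9 restoration: invert the airlight term with the
    transmittance bounded below by max(t, t0), then deconvolve the
    object-light blur h_o via Tikhonov-damped inverse filtering."""
    # remove airlight and divide by the bounded transmittance
    f_blurred = (I - A * (1.0 - t)) / np.maximum(t, t0)
    # zero-pad the kernel to the image size and deconvolve in frequency space
    H = np.fft.fft2(h_o, s=I.shape)
    F = np.fft.fft2(f_blurred)
    J = np.real(np.fft.ifft2(F * np.conj(H) / (np.abs(H) ** 2 + eps)))
    return np.clip(J, 0.0, 1.0)
```

With a delta (identity) kernel for h_o this reduces to plain dark-channel-style dehazing, which is a useful sanity check for the deconvolution step.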
10. A system for removing uneven fog from a remote sensing image, characterized in that the system employs the method for removing uneven fog from a remote sensing image as claimed in any one of claims 1 to 9.
CN202011332445.1A 2020-11-24 2020-11-24 Method and system for removing nonuniform fog of remote sensing image Active CN112419193B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011332445.1A CN112419193B (en) 2020-11-24 2020-11-24 Method and system for removing nonuniform fog of remote sensing image


Publications (2)

Publication Number Publication Date
CN112419193A (en) 2021-02-26
CN112419193B (en) 2024-02-02

Family

ID=74778649

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011332445.1A Active CN112419193B (en) 2020-11-24 2020-11-24 Method and system for removing nonuniform fog of remote sensing image

Country Status (1)

Country Link
CN (1) CN112419193B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2568438A2 (en) * 2011-09-08 2013-03-13 Fujitsu Limited Image defogging method and system
CN106530257A (en) * 2016-11-22 2017-03-22 重庆邮电大学 Remote sensing image de-fogging method based on dark channel prior model



Similar Documents

Publication Publication Date Title
CN107767354B (en) Image defogging algorithm based on dark channel prior
WO2016206087A1 (en) Low-illumination image processing method and device
US9189830B2 (en) Image defogging method and system
US8774555B2 (en) Image defogging method and system
US9197789B2 (en) Method and system for removal of fog, mist, or haze from images and videos
Singh et al. Image dehazing using Moore neighborhood-based gradient profile prior
CN107103591B (en) Single image defogging method based on image haze concentration estimation
CN110689490A (en) Underwater image restoration method based on texture color features and optimized transmittance
TWI808406B (en) Image dehazing method and image dehazing apparatus using the same
CN115578297A (en) Generalized attenuation image enhancement method for self-adaptive color compensation and detail optimization
CN111598814B (en) Single image defogging method based on extreme scattering channel
CN115393216A (en) Image defogging method and device based on polarization characteristics and atmospheric transmission model
Ma et al. An improved color image defogging algorithm using dark channel model and enhancing saturation
CN110335210B (en) Underwater image restoration method
CN103226816A (en) Haze image medium transmission rate estimation and optimization method based on quick gaussian filtering
CN112488948A (en) Underwater image restoration method based on black pixel point estimation backscattering
Hassan et al. A cascaded approach for image defogging based on physical and enhancement models
CN111598800A (en) Single image defogging method based on space domain homomorphic filtering and dark channel prior
CN114119383B (en) Underwater image restoration method based on multi-feature fusion
CN106709876B (en) Optical remote sensing image defogging method based on dark image element principle
CN109345479B (en) Real-time preprocessing method and storage medium for video monitoring data
CN108898561B (en) Defogging method, server and system for foggy image containing sky area
CN112825189B (en) Image defogging method and related equipment
Negru et al. Exponential image enhancement in daytime fog conditions
CN112419193B (en) Method and system for removing nonuniform fog of remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant