CN112907461B - Defogging enhancement method for infrared foggy-day degraded image - Google Patents

Defogging enhancement method for infrared foggy-day degraded image

Info

Publication number
CN112907461B
Authority
CN
China
Prior art keywords
image
super pixel
value
super
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110100817.6A
Other languages
Chinese (zh)
Other versions
CN112907461A (en)
Inventor
李伟华
李范鸣
苗壮
谭畅
穆靖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Institute of Technical Physics of CAS
Original Assignee
Shanghai Institute of Technical Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Institute of Technical Physics of CAS filed Critical Shanghai Institute of Technical Physics of CAS
Priority to CN202110100817.6A priority Critical patent/CN112907461B/en
Publication of CN112907461A publication Critical patent/CN112907461A/en
Application granted granted Critical
Publication of CN112907461B publication Critical patent/CN112907461B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30192Weather; Meteorology
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a defogging enhancement method for infrared images degraded by foggy weather. The method performs image defogging based on irregular local-area information and an atmospheric physical model. According to the complexity of the infrared scene, the number of super pixels and the boundary compactness are preset, and the image is segmented by simple linear iterative clustering. The global atmospheric light value A is obtained with an improved automatic search method based on quadtree decomposition. Within each super pixel block, the optimal transmittance t' is obtained from the pixel values of the block by analyzing a visual visibility evaluation criterion together with upper and lower bound constraints on information truncation. The haze-free image J is then inverted from the input image I, the atmospheric light value A, and the optimal transmittance t'. Because the optimal parameters of the atmospheric physical model are computed from local image information, the method adapts to different fog concentrations, realizes defogging inversion of infrared images, is simple in structure and easy to implement, and effectively improves infrared imaging quality in foggy weather.

Description

Defogging enhancement method for infrared foggy-day degraded image
Technical field:
The invention belongs to the technical field of image processing and is mainly directed at an enhancement algorithm for images degraded by haze in a single-band imaging system. It is particularly suitable for correcting the high overall brightness, low contrast, and local blurring that haze causes in infrared remote-sensing images, thereby reducing the influence of haze on the imaging system and improving image quality.
Background art:
During imaging with an outdoor long-distance optical system in haze weather, an infrared imaging system is affected by the scattering, absorption and reflection of suspended atmospheric particles. The observed image therefore degrades: overall brightness rises, contrast and dynamic range drop, definition falls, and the image becomes blurred; the degradation grows stronger as the distance between the target and the observation equipment increases. Degraded images prevent the acquisition of clear, stable, high-quality images or sequences, so that subsequent recognition, tracking and segmentation algorithms cannot work effectively. Meanwhile, the rapid development of automatic and assisted driving places new requirements on imaging quality in haze weather. Current defogging techniques fall into two main categories: enhancement methods based on image processing and enhancement methods based on image restoration.
Defogging methods based on image enhancement start from the low contrast of foggy images and improve visual quality through various filters and linear or nonlinear stretching of the image contrast; mainstream methods include histogram equalization, filtering methods, wavelet transforms, the Retinex algorithm, and atmospheric modulation transfer function methods. These methods work well to some extent but have problems: for scenes with uneven illumination or complex depth variation they easily produce halos and block artifacts, their universality is poor, their computational cost is high, and because they operate only from a visual point of view, the defogged images do not fully conform to the actual physical laws.
Methods based on image restoration start from the physical process by which haze weather degrades image quality and achieve defogging by inverting that degradation process. From basic research on atmospheric particle optics, an atmospheric physical scattering model is obtained, and the haze-free image can be inverted by solving for the optimal parameters of this model. Methods of this type therefore aim at finding the optimal model parameters, so most of them require prior information beyond the image itself, such as scene depth information or auxiliary depth obtained by additional means, fog concentration information, or scene polarization information. Because such additional information is difficult to acquire and the operation is complex, defogging methods based only on image information have developed rapidly. The dark channel prior proposed by He Kaiming is a statistical prior and currently the most effective defogging prior, but unreasonable blocking makes it prone to halos and block artifacts, and eliminating these artifacts requires a large amount of computation. Moreover, the dark channel prior was obtained from statistics over a large number of three-channel color images and is not suitable for single-band infrared images, so effective constraints must be found to guide the physical restoration of infrared degraded images.
Summary of the invention:
In order to overcome the defects of the prior art, the invention provides a defogging enhancement method for infrared images degraded by foggy weather. The method is based on image restoration: super-pixel segmentation replaces rectangular blocking to eliminate artifacts, and upper and lower bound constraints guide the physical restoration process, so the computational load is small and the imaging quality is markedly improved.
The above object of the present invention is achieved by the following technical solutions:
the defogging and enhancing method for the infrared foggy-day degraded image is characterized in that after an image is segmented by utilizing a super-pixel segmentation algorithm, an improved quadtree decomposition search algorithm and upper and lower limit constraints are combined to obtain optimal parameters of an atmospheric physical model inversion equation so as to reconstruct the image, and the method comprises the following steps:
(1) The simple linear iterative clustering (SLIC) segmentation algorithm requires the number of super pixel blocks k and the boundary compactness m to be preset; k is set to 40-80 and m to 30-50;
(2) Initialize the cluster centers according to formula (1) and the super pixel number k: compute the grid interval S of the initial super pixel cluster centers and distribute the initial centers uniformly over the infrared image with spacing S. To avoid a cluster center falling on an image edge, which would prevent it from forming a closed connected region with its surroundings, the point of minimum gradient within the 3x3 neighbourhood of each initial center is taken as the initial cluster center;
S = ⌊√(N/k)⌋        (1)
where N is the total number of pixels in the image, k is the number of super pixels, and ⌊·⌋ denotes rounding down (discarding the fractional part);
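As a concrete illustration of step (2), the following NumPy sketch seeds the cluster centers on a grid of spacing S = ⌊√(N/k)⌋ and moves each seed to the lowest-gradient pixel of its 3x3 neighbourhood. The function name init_cluster_centers and the use of np.gradient for the gradient map are choices of this sketch, not taken from the patent.

```python
import numpy as np

def init_cluster_centers(img, k):
    """Seed SLIC cluster centers for a 2-D gray image `img`.
    Grid spacing S follows formula (1); each seed is then moved to the
    minimum-gradient pixel of its 3x3 neighbourhood so that no center
    starts on an edge."""
    h, w = img.shape
    S = int(np.floor(np.sqrt(h * w / k)))            # formula (1)
    gy, gx = np.gradient(img.astype(np.float64))
    grad = np.hypot(gx, gy)                          # gradient magnitude
    centers = []
    for y in range(S // 2, h, S):
        for x in range(S // 2, w, S):
            y0, y1 = max(y - 1, 0), min(y + 2, h)
            x0, x1 = max(x - 1, 0), min(x + 2, w)
            win = grad[y0:y1, x0:x1]
            dy, dx = np.unravel_index(np.argmin(win), win.shape)
            cy, cx = y0 + dy, x0 + dx
            centers.append((float(img[cy, cx]), cy, cx))   # (l, y, x) triple
    return np.array(centers, dtype=np.float64), S
```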
(3) Combine the infrared image brightness l and the spatial coordinates (x, y) into a three-dimensional space V = [l, x, y]^T and define D as the distance between a pixel and a cluster center in this space. The distance D is calculated according to formulas (2), (3) and (4), and each pixel is assigned to the super pixel block of the cluster center with the smallest D. Since the probability that a pixel belongs to a cluster center decreases rapidly with distance, the search domain is limited to the 2S x 2S neighbourhood of each pixel to increase speed. After all image pixels have been clustered, the cluster centers are updated by taking the center of each cluster block as the new cluster center; an accurately segmented super pixel map is obtained after four iterations;
d_s = √((x_i - x_j)² + (y_i - y_j)²)        (2)
d_l = |l_i - l_j|        (3)
D = √(d_l² + (d_s/S)²·m²)        (4)
where d_s is the spatial distance, i.e. the coordinate difference, d_l is the brightness distance, i.e. the gray-value difference, (x_i, y_i) and (x_j, y_j) are the coordinates of the image pixel and of the cluster center, and l_i and l_j are the gray values of the pixel and of the cluster center respectively;
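Step (3) can be sketched as follows. This is a simplified illustration (the function name slic_assign is hypothetical), using the distances of formulas (2)-(4) and the 2S x 2S search window described above.

```python
import numpy as np

def slic_assign(img, centers, S, m, n_iter=4):
    """Assign each pixel to the cluster center with the smallest joint
    distance D in (gray level, x, y) space, searching only a window of
    about 2S x 2S around each center, then update the centers; the text
    above uses four iterations."""
    h, w = img.shape
    img = img.astype(np.float64)
    labels = np.full((h, w), -1, dtype=np.int32)
    dist = np.full((h, w), np.inf)
    for _ in range(n_iter):
        dist[:] = np.inf
        for idx, (lc, cy, cx) in enumerate(centers):
            y0, y1 = int(max(cy - S, 0)), int(min(cy + S + 1, h))
            x0, x1 = int(max(cx - S, 0)), int(min(cx + S + 1, w))
            yy, xx = np.mgrid[y0:y1, x0:x1]
            d_s = np.hypot(xx - cx, yy - cy)                  # spatial distance, formula (2)
            d_l = np.abs(img[y0:y1, x0:x1] - lc)              # gray-value distance, formula (3)
            D = np.sqrt(d_l ** 2 + (d_s / S) ** 2 * m ** 2)   # joint distance, formula (4)
            better = D < dist[y0:y1, x0:x1]
            dist[y0:y1, x0:x1][better] = D[better]
            labels[y0:y1, x0:x1][better] = idx
        for idx in range(len(centers)):                       # recompute each center
            ys, xs = np.nonzero(labels == idx)
            if ys.size:
                centers[idx] = (img[ys, xs].mean(), ys.mean(), xs.mean())
    return labels
```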
(4) The atmospheric light value A is obtained with an improved quadtree decomposition search algorithm in which the regular rectangular quartering is replaced by quartering based on super pixel blocks. First, after super-pixel segmentation, each super pixel block is numbered so that every pixel carries the label of its block. The image is then divided into four equal parts and the mean gray value of each part is computed; for the part with the largest mean, the labels of the pixels it contains are collected, and the pixels belonging to those labels form a new sub-block. The coordinate area of this new sub-block is again divided into four equal parts, and the process is repeated until the part with the largest mean contains only one label. The gray values of the pixels of the super pixel block with that label are then averaged according to formula (5) to give the global atmospheric light value A. In a special case the iteration can fall into a local loop: after several iterations the new sub-block may contain no more than four super pixel blocks, so that when it is quartered the labels found in the part with the largest mean equal the labels of the sub-block itself, and the next-level sub-block is identical to its parent. This repetition is taken as the termination condition; since the sub-block then contains only a few super pixel blocks, the mean of every super pixel block in the sub-block is computed according to formula (5) and the maximum is selected as the global atmospheric light value A;
A = (1/n) Σ_{(x,y)∈Ω} I(x,y)        (5)
where I(x, y) is the gray value at coordinate (x, y) in the super pixel block Ω, and n is the total number of pixels in the super pixel block Ω;
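The super-pixel-based quadtree search of step (4) can be approximated by the sketch below. It is a simplified reading of the procedure (the function name atmospheric_light is illustrative, and the exact sub-block bookkeeping and termination test in the patent may differ): at each level it keeps the brightest quadrant, expands it to the full super pixel blocks that quadrant touches, and finally applies formula (5) to the brightest remaining block.

```python
import numpy as np

def atmospheric_light(img, labels, max_iter=20):
    """Estimate the global atmospheric light A with a super-pixel-based
    quadtree search: split the current region into four quadrants, keep
    the one with the largest mean gray value, grow it to the super pixel
    blocks it contains, and repeat until a single label remains (or the
    label set stops changing, the 'local loop' case)."""
    img = img.astype(np.float64)
    mask = np.ones_like(labels, dtype=bool)        # current search region
    prev_labels = None
    for _ in range(max_iter):
        ys, xs = np.nonzero(mask)
        y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
        ym, xm = (y0 + y1) // 2, (x0 + x1) // 2
        quads = [(slice(y0, ym), slice(x0, xm)), (slice(y0, ym), slice(xm, x1)),
                 (slice(ym, y1), slice(x0, xm)), (slice(ym, y1), slice(xm, x1))]
        best = max(quads, key=lambda q: img[q][mask[q]].mean()
                   if mask[q].any() else -np.inf)
        block_labels = np.unique(labels[best][mask[best]])
        if prev_labels is not None and set(block_labels) == set(prev_labels):
            break                                  # local loop -> terminate
        prev_labels = block_labels
        mask = np.isin(labels, block_labels)       # new sub-block = those blocks
        if len(block_labels) == 1:
            break
    # formula (5): per-block mean; take the largest among the remaining blocks
    return max(img[labels == lab].mean() for lab in prev_labels)
```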
(5) Because infrared images suffer from non-uniformity, and to avoid the maximum or minimum gray value of a super pixel block falling exactly on a blind (dead) pixel, the gray values of the pixels in each super pixel block are sorted in ascending order, and according to formulas (6) and (7) the means of the lowest 5% and of the highest 5% of the gray values are taken as the minimum and maximum of the super pixel block, recorded as min_m and max_m:
min_m = (1/(0.05·n)) Σ_{(x,y)∈Ω_min} I(x,y)        (6)
max_m = (1/(0.05·n)) Σ_{(x,y)∈Ω_max} I(x,y)        (7)
where Ω_min and Ω_max are the sets of pixels whose gray values lie in the lowest 5% and the highest 5% of the super pixel block Ω respectively, and n is the total number of pixels in the super pixel block;
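A short sketch of step (5) follows (the helper name block_min_max is illustrative):

```python
import numpy as np

def block_min_max(img, labels):
    """For each super pixel block, return the mean of the lowest 5% of gray
    values as min_m and the mean of the highest 5% as max_m (formulas (6)
    and (7)); the 5% trimming guards against blind (dead) pixels."""
    img = img.astype(np.float64)
    min_m, max_m = {}, {}
    for lab in np.unique(labels):
        vals = np.sort(img[labels == lab].ravel())
        n5 = max(1, int(round(0.05 * vals.size)))   # at least one pixel per tail
        min_m[lab] = vals[:n5].mean()
        max_m[lab] = vals[-n5:].mean()
    return min_m, max_m
```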
(6) The mean square error C_MSE (Mean-Square Error) is an effective measure of image quality and is calculated by formula (8); a higher MSE means higher contrast, in line with the visual characteristics of the human eye. From formula (8) it can be deduced that the MSE is inversely related to the transmittance t: a smaller t yields a higher MSE. For infrared images, however, a smaller t makes the gray values of the pixels more prone to data truncation and therefore to information loss; the loss caused by data truncation is defined as C_LOSS, i.e. formula (9):
C_MSE = (1/n) Σ_{x∈Ω} ((I(x) - Ī)/t)²        (8)
C_LOSS = max{(A - min_m)/t - A, (max_m - A)/t - (255 - A)}        (9)
where Ī is the mean gray value of the pixels of the original image. To prevent information loss, C_LOSS ≤ 0 must be satisfied. The transmittance that maximizes the MSE without causing data loss is the optimal transmittance t'. The upper and lower bound constraints that prevent information loss are derived from formula (9), so the optimal transmittance of each super pixel block is obtained from formula (10):
t' = ω·max{(A - min_m)/A, (max_m - A)/(255 - A)}        (10)
wherein ω is a constant for controlling the defogging degree, and is set to 0.95;
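The transmittance selection of step (6) can be sketched as follows. The 8-bit gray ceiling of 255 and the clipping range are assumptions of this sketch; the constraint itself follows formula (10) as reconstructed above: keeping the darkest value of a block non-negative requires t ≥ (A - min_m)/A, keeping the brightest value below the ceiling requires t ≥ (max_m - A)/(255 - A), and ω relaxes the bound slightly for a stronger defogging effect.

```python
import numpy as np

def optimal_transmittance(min_m, max_m, A, omega=0.95, gray_max=255.0):
    """Per-block optimal transmittance t' from the truncation constraints,
    as reconstructed in formula (10):
    t' = omega * max((A - min_m)/A, (max_m - A)/(gray_max - A))."""
    t_prime = {}
    for lab in min_m:
        lower = (A - min_m[lab]) / A if A > 0 else 0.0
        upper = (max_m[lab] - A) / (gray_max - A) if A < gray_max else 0.0
        t = omega * max(lower, upper)
        t_prime[lab] = float(np.clip(t, 0.05, 1.0))   # keep t in a workable range
    return t_prime
```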
(7) With the optimal atmospheric light value from step (4) and the optimal transmittance from step (6), the defogged reconstructed image J is obtained from the inversion formula (11) of the atmospheric physical model;
J(x) = (I(x) - A)/t'(x) + A        (11)
where J(x) and I(x) are the gray values of the pixels of the reconstructed image and of the original image respectively, and t'(x) is the optimal transmittance of the super pixel block containing that pixel;
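Step (7) is the standard inversion of the atmospheric scattering model; a minimal sketch is given below (invert_haze_model is an illustrative name, and the clipping to 8-bit output is an assumption of this sketch).

```python
import numpy as np

def invert_haze_model(img, labels, t_prime, A):
    """Invert J(x) = (I(x) - A) / t'(x) + A, with t' constant inside each
    super pixel block."""
    img = img.astype(np.float64)
    t_map = np.ones_like(img)                 # default t = 1 (no change)
    for lab, t in t_prime.items():
        t_map[labels == lab] = t
    J = (img - A) / t_map + A
    return np.clip(J, 0, 255).astype(np.uint8)
```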
compared with the prior art, the invention has the beneficial effects that:
1) The defogging process is easy to implement; no information beyond the infrared image is needed, a single frame suffices (no multi-frame information is required), and the transmittance adapts automatically to the target distance and the haze concentration;
2) Super-pixel segmentation avoids halo and blocking artifacts, so the result looks more natural and better matches the visual characteristics of the human eye.
3) The invention has few input parameters, required only in the segmentation stage; the global atmospheric light value and the optimal transmittance under the upper and lower bound constraints are obtained automatically by the improved quadtree decomposition search algorithm, so the defogging reconstruction runs without human intervention.
Drawings
FIG. 1 is a block diagram of the implementation flow of the present invention;
FIG. 2 is an original infrared image of a haze day, used as the input image in the present invention.
FIG. 3 is the output image of the present invention, i.e., the defogged reconstructed image.
Detailed Description
The technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings. Several parameters are involved that must be adjusted to the specific processing environment to achieve good performance.
The test picture used in the invention was captured with a 640x512 short-wave infrared camera developed by the Shanghai Institute of Technical Physics, Chinese Academy of Sciences.
Simulation environment: matlab2018b;
Test image: short-wave infrared image, size 640x512; the scene is an urban building;
For the test image, the super pixel number is set to k = 60, the boundary compactness to m = 40, and the defogging-degree control constant of step (6) to ω = 0.95.
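For reference, a hypothetical end-to-end run with the parameters of this embodiment (k = 60, m = 40, ω = 0.95) is sketched below. It chains the illustrative helper functions given earlier in this description (not the patent's reference implementation); the random test frame merely stands in for the 640x512 short-wave infrared image.

```python
import numpy as np

def defog_infrared(img, k=60, m=40, omega=0.95):
    centers, S = init_cluster_centers(img, k)                 # step (2)
    labels = slic_assign(img, centers, S, m, n_iter=4)        # step (3)
    A = atmospheric_light(img, labels)                        # step (4)
    min_m, max_m = block_min_max(img, labels)                 # step (5)
    t_prime = optimal_transmittance(min_m, max_m, A, omega)   # step (6)
    return invert_haze_model(img, labels, t_prime, A)         # step (7)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    foggy = rng.integers(120, 220, size=(512, 640)).astype(np.uint8)
    print(defog_infrared(foggy).shape)
```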

Claims (1)

1. The defogging and enhancing method for the infrared foggy-day degraded image is characterized by comprising the following steps of:
(1) Presetting the number k of super pixel blocks and the compactness m of the super pixel segmentation boundary in a simple linear iterative clustering super pixel segmentation algorithm, wherein the number k of the super pixel blocks is set to be 40-80, and the compactness m is set to be 30-50;
(2) Initializing cluster centers according to the number k of super pixel blocks, the initial cluster centers being uniformly distributed over the image on a grid of interval S; to avoid an initial cluster center falling exactly on an image edge, the pixel with the minimum gradient value in the 3x3 neighbourhood of the initial center point is taken as the initial cluster center, the grid size S being:
S = ⌊√(N/k)⌋        (1)
wherein N is the total number of pixels of the image, k is the number of super pixel blocks, and ⌊·⌋ denotes rounding down (discarding the fractional part);
(3) Combining the one-dimensional brightness dimension l and the two-dimensional space dimension (x, y) of the infrared image into a three-dimensional space domain V = [l, x, y]^T, defining D as the distance between a pixel and a cluster center in this three-dimensional space, assigning each pixel to the super pixel block that minimizes D, updating the cluster centers after all image pixels have been assigned, and obtaining an accurately segmented super pixel map after four iterations, wherein, to avoid excessive computation, the assignment search domain of each pixel is limited to the 2SX2S neighbourhood around the pixel:
d_s = √((x_i - x_j)² + (y_i - y_j)²)        (2)
d_l = |l_i - l_j|        (3)
D = √(d_l² + (d_s/S)²·m²)        (4)
wherein m is the compactness parameter, d_s is the spatial distance, i.e. the coordinate difference, d_l is the brightness distance, i.e. the gray-value difference, (x_i, y_i) and (x_j, y_j) are the coordinates of the image pixel and of the cluster center, and l_i and l_j are the gray values of the pixel and of the cluster center respectively;
(4) Using an improved quad-tree decomposition search algorithm based on the super pixel block, obtaining a final super pixel block when the iteration termination condition is met, and calculating an average value re_m of the super pixel block to obtain an optimal global atmospheric light value A;
re_m = (1/n) Σ_{(x,y)∈Ω} I(x,y)        (5)
wherein I(x, y) represents the gray value at coordinate (x, y) in the super pixel block Ω, and n is the total number of pixels in the super pixel block Ω;
(5) Sorting the gray values of the pixels in each super pixel block, and taking the means of the lowest 5% and of the highest 5% of the gray values in each block as the minimum value and the maximum value of the super pixel block, recorded as min_m and max_m respectively:
min_m = (1/(0.05·n)) Σ_{(x,y)∈Ω_min} I(x,y)        (6)
max_m = (1/(0.05·n)) Σ_{(x,y)∈Ω_max} I(x,y)        (7)
wherein Ω_min and Ω_max are the sets of pixels whose gray values lie in the lowest 5% and the highest 5% of the super pixel block Ω respectively, and n is the total number of pixels in the super pixel block;
(6) Obtaining the upper and lower bound constraints on the transmittance from the requirement of a higher mean square error value C_MSE, i.e. formula (8), while preventing the information truncation loss C_LOSS, i.e. formula (9); the optimal transmittance t' of each super pixel block is then obtained from the transmittance constraint formula (10) using the block's maximum and minimum mean values max_m and min_m:
C_MSE = (1/n) Σ_{x∈Ω} ((I(x) - Ī)/t)²        (8)
C_LOSS = max{(A - min_m)/t - A, (max_m - A)/t - (255 - A)}        (9)
t' = ω·max{(A - min_m)/A, (max_m - A)/(255 - A)}        (10)
wherein Ī is the mean gray value of the pixels of the original image, t is the transmittance, and ω is a constant controlling the defogging degree, set to 0.95;
(7) Substituting the obtained global atmospheric light value A and the optimal transmittance t' of each super pixel block into the inversion formula (11) of the atmospheric physical model to obtain the defogged reconstructed image:
J(x) = (I(x) - A)/t'(x) + A        (11)
wherein J(x) and I(x) are the gray values of the pixels of the reconstructed image and the original image respectively, and t'(x) is the optimal transmittance of the super pixel block corresponding to that pixel.
CN202110100817.6A 2021-01-26 2021-01-26 Defogging enhancement method for infrared foggy-day degraded image Active CN112907461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110100817.6A CN112907461B (en) 2021-01-26 2021-01-26 Defogging enhancement method for infrared foggy-day degraded image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110100817.6A CN112907461B (en) 2021-01-26 2021-01-26 Defogging enhancement method for infrared foggy-day degraded image

Publications (2)

Publication Number Publication Date
CN112907461A CN112907461A (en) 2021-06-04
CN112907461B true CN112907461B (en) 2023-05-05

Family

ID=76119218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110100817.6A Active CN112907461B (en) 2021-01-26 2021-01-26 Defogging enhancement method for infrared foggy-day degraded image

Country Status (1)

Country Link
CN (1) CN112907461B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549336B (en) * 2021-11-25 2024-05-03 湖南科技大学 Unsupervised image defogging method
CN115063404B (en) * 2022-07-27 2022-11-08 建首(山东)钢材加工有限公司 Weathering resistant steel weld joint quality detection method based on X-ray flaw detection

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111325688A (en) * 2020-02-18 2020-06-23 长安大学 Unmanned aerial vehicle image defogging method fusing morphological clustering and optimizing atmospheric light
CN111667433A (en) * 2020-06-09 2020-09-15 中国电子科技集团公司第五十四研究所 Unmanned aerial vehicle image defogging method based on simple linear iterative clustering optimization

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794688B (en) * 2015-03-12 2018-04-03 北京航空航天大学 Single image to the fog method and device based on depth information separation sky areas
CN108596849B (en) * 2018-04-23 2021-11-23 南京邮电大学 Single image defogging method based on sky region segmentation
CN110428371A (en) * 2019-07-03 2019-11-08 深圳大学 Image defogging method, system, storage medium and electronic equipment based on super-pixel segmentation
CN111598788B (en) * 2020-04-08 2023-03-07 西安理工大学 Single image defogging method based on quadtree decomposition and non-local prior
CN111899198A (en) * 2020-08-06 2020-11-06 北京科技大学 Defogging method and device for marine image

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111325688A (en) * 2020-02-18 2020-06-23 长安大学 Unmanned aerial vehicle image defogging method fusing morphological clustering and optimizing atmospheric light
CN111667433A (en) * 2020-06-09 2020-09-15 中国电子科技集团公司第五十四研究所 Unmanned aerial vehicle image defogging method based on simple linear iterative clustering optimization

Also Published As

Publication number Publication date
CN112907461A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
Wang et al. Fast image dehazing method based on linear transformation
Zhu et al. Haze removal method for natural restoration of images with sky
CN106157267B (en) Image defogging transmissivity optimization method based on dark channel prior
CN111292258B (en) Image defogging method based on dark channel prior and bright channel prior
CN103020920B (en) Method for enhancing low-illumination images
Gao et al. Sand-dust image restoration based on reversing the blue channel prior
CN111667433B (en) Unmanned aerial vehicle image defogging method based on simple linear iterative clustering optimization
CN109685045B (en) Moving target video tracking method and system
CN111598791B (en) Image defogging method based on improved dynamic atmospheric scattering coefficient function
CN112907461B (en) Defogging enhancement method for infrared foggy-day degraded image
CN111145105B (en) Image rapid defogging method and device, terminal and storage medium
CN112950589A (en) Dark channel prior defogging algorithm of multi-scale convolution neural network
CN112419163B (en) Single image weak supervision defogging method based on priori knowledge and deep learning
Fu et al. Scene-awareness based single image dehazing technique via automatic estimation of sky area
Fu et al. An anisotropic Gaussian filtering model for image de-hazing
CN110349113B (en) Adaptive image defogging method based on dark primary color priori improvement
CN109345479B (en) Real-time preprocessing method and storage medium for video monitoring data
Zhen et al. Single Image Defogging Algorithm based on Dark Channel Priority.
Li et al. DLT-Net: deep learning transmittance network for single image haze removal
CN113822816A (en) Haze removing method for single remote sensing image optimized by aerial fog scattering model
CN117611501A (en) Low-illumination image enhancement method, device, equipment and readable storage medium
CN109360169B (en) Signal processing method for removing rain and mist of single image
CN115631108A (en) RGBD-based image defogging method and related equipment
CN112598777B (en) Haze fusion method based on dark channel prior
CN111932469A (en) Significance weight quick exposure image fusion method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant