WO2020232710A1 - Haze image quality evaluation method, system, storage medium and electronic device
- Publication number: WO2020232710A1
- Application number: PCT/CN2019/088178
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- haze
- haze image
- sky area
- map
- Prior art date
Classifications
- G06T7/00—Image analysis (G—Physics; G06—Computing, calculating or counting; G06T—Image data processing or generation, in general)
- G06T7/11—Region-based segmentation (G06T7/10—Segmentation; edge detection)
Definitions
- the invention relates to the technical field of haze image quality evaluation, and more specifically, to a haze image quality evaluation method, system, storage medium and electronic equipment.
- Haze image quality evaluation has broad application prospects. Estimating the haze concentration and its impact from a haze image in time can be used to predict the air quality level in weather forecasting, to estimate visibility in highway monitoring, and to assess driving safety in the field of unmanned driving.
- Like general image quality evaluation, haze image quality evaluation is divided into two categories: subjective methods and objective methods. Subjective image quality evaluation takes a long time and is difficult to apply in real time on embedded devices, so it cannot be directly applied to the field of video surveillance. For objective haze image quality evaluation, since a haze image has no corresponding original haze-free image, research should focus on no-reference haze image quality evaluation algorithms.
- The structural similarity index (SSIM) is a full-reference image quality evaluation method: the larger the value, the more similar the two images. It considers brightness, contrast and structure. The formula is as follows:

  SSIM(x, y) = [(2μ_x μ_y + C1) / (μ_x² + μ_y² + C1)] · [(2σ_x σ_y + C2) / (σ_x² + σ_y² + C2)] · [(σ_xy + C3) / (σ_x σ_y + C3)]

- where μ_x and μ_y represent the means of images x and y (brightness information), σ_x and σ_y represent their standard deviations (contrast information), σ_xy represents their covariance, and C1, C2 and C3 are constants.
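As an illustration, a single-window SSIM of this form can be sketched in Python with NumPy. The default constants below follow the common (K·L)² convention with K1 = 0.01, K2 = 0.03 and L = 255, which is an assumption, not a value taken from this document:

```python
import numpy as np

def ssim_global(x, y, c1=6.5025, c2=58.5225):
    """Single-window SSIM combining brightness, contrast and structure.

    Uses the simplified two-constant form, which equals the three-term
    product when C3 = C2 / 2.
    """
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    mu_x, mu_y = x.mean(), y.mean()            # brightness terms
    var_x, var_y = x.var(), y.var()            # contrast terms
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()  # structure term
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

Identical images score exactly 1; the more the structures diverge, the lower the score.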
- Peak signal-to-noise ratio (PSNR) is also a reference-type image quality evaluation method; the larger the value, the better the image quality. The formula is as follows:

  PSNR = 10 · log10( L² / MSE )

- where L is the peak gray value, usually 255, and MSE represents the mean square error between the two images.
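A minimal sketch of this computation, assuming 8-bit images with peak value 255:

```python
import numpy as np

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    mse = np.mean((np.asarray(ref, np.float64) - np.asarray(img, np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```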
- The Brenner gradient method is a no-reference image quality evaluation method: the larger the value, the higher the image quality. The formula is as follows:

  D(f) = Σ_y Σ_x | f(x+2, y) − f(x, y) |²

- where f(x, y) is the gray value of the pixel at coordinate (x, y) of the image, and D(f) is the image quality evaluation result.
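A NumPy sketch of the Brenner gradient, taking D(f) as the sum of squared two-pixel differences along one axis:

```python
import numpy as np

def brenner(gray):
    """Brenner gradient: sum of squared differences f(x+2, y) - f(x, y)."""
    g = np.asarray(gray, dtype=np.float64)
    diff = g[2:, :] - g[:-2, :]      # difference across a two-pixel step
    return float(np.sum(diff ** 2))
```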
- The point sharpness evaluation method is a no-reference image quality evaluation method; the higher the value, the better the image quality. Xu et al. believe that the greater the edge gray-level change, the higher the sharpness and the lower the haze concentration, so image quality can be evaluated by accumulating point sharpness statistics. The formula is as follows:

  P = Σ | dI/dx | / ( I(b) − I(a) )

- where dI/dx represents the gray-level derivative along the edge direction, and I(b) − I(a) represents the overall gray-level change along the edge direction, with the sum taken over the pixels of the edge segment from a to b.
- This method only counts specific image areas, and this area needs to be manually selected, which is not conducive to automation.
- The entropy method is a no-reference image quality evaluation method: the greater the entropy, the better the image quality. Image entropy is based on statistical features and measures the richness of image information; it is an important indicator of the amount of information an image carries. The formula is as follows:

  E = − Σ_{i=0}^{L−1} P_i · log2( P_i )

- where P_i is the probability that a pixel with gray value i appears in the image, and L is the total number of gray levels.
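The entropy measure can be sketched as follows; the base-2 logarithm is the usual choice but is an assumption, as the document does not state the base:

```python
import numpy as np

def image_entropy(gray, levels=256):
    """Shannon entropy of the gray-level histogram: -sum(p_i * log2 p_i)."""
    hist = np.bincount(np.asarray(gray, np.int64).ravel(), minlength=levels)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))
```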
- Gray scale difference method is a non-reference image quality evaluation method.
- An image with a lower degree of haze has more high-frequency components, so the change in gray level can be used as a basis for evaluating the quality of haze images.
- The formula is as follows:

  D(f) = Σ_y Σ_x ( | f(x, y) − f(x+1, y) | + | f(x, y) − f(x, y+1) | )

- where f(x, y) represents the gray value of the pixel at coordinate (x, y) on the image.
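One common form of the gray-level difference (SMD) sums absolute differences between horizontally and vertically adjacent pixels; the exact variant the document uses is not shown, so this is a representative sketch:

```python
import numpy as np

def smd(gray):
    """Sum of absolute gray differences between neighbouring pixels."""
    g = np.asarray(gray, dtype=np.float64)
    dx = np.abs(g[1:, :] - g[:-1, :]).sum()   # vertical neighbours
    dy = np.abs(g[:, 1:] - g[:, :-1]).sum()   # horizontal neighbours
    return float(dx + dy)
```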
- In the above schemes, the image quality is evaluated by calculating the average gradient or point sharpness of the image.
- Such schemes fail to consider the physical model of the haze image degradation process.
- Moreover, images of different scenes have different average gradient or point sharpness characteristics, so such methods are difficult to generalize to quality comparison of haze images across different scenes.
- the technical problem to be solved by the present invention is to provide a haze image quality evaluation method, system, storage medium and electronic equipment in view of the above-mentioned defects of the prior art.
- The technical solution adopted by the present invention to solve its technical problem is to construct a haze image quality evaluation method, including the following steps: S1, acquiring a haze image, and obtaining a first transmittance map corresponding to the haze image based on the dark channel prior method; S2, obtaining the non-sky area of the haze image; S3, obtaining a first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area, so as to obtain the image quality of the haze image according to the first incident light attenuation rate.
- the calculating the first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area includes:
- the first incident light attenuation rate D_non_sky satisfies:

  D_non_sky = − (1 / |Ω_non_sky|) · Σ_{x∈Ω_non_sky} ln( 1 − min_{y∈Ω(x)} min_{c∈{r,g,b}} ( I^c(y) / A ) )

- where A is the global atmospheric light value, I^c(x) is the pixel value of pixel x of the haze image in channel c, {r, g, b} represents the three color channels, and Ω_non_sky represents the non-sky area of the haze image.
- the obtaining the non-sky area of the haze image includes: S21, converting the haze image into a grayscale image; S22, acquiring a gradient map of the grayscale image according to edge detection, and converting it to generate a binary image; S23, performing minimum-value filtering on the binary image to obtain the non-sky area of the haze image
- the acquiring a gradient map of the grayscale image according to edge detection and converting to generate a binary image includes: S221, converting the gradient map according to a preset gradient threshold to generate a first binary image; S222, converting the grayscale image according to a preset brightness threshold to generate a second binary image; S223, merging the first binary image and the second binary image to generate the binary image
- the gradient threshold value is the average value of the gradient of the gradient map
- the brightness threshold value is the average value of the brightness of the gray image.
- the converting the haze image into a grayscale image includes: obtaining an RGB image of the haze image, and adjusting the RGB image according to a preset RGB ratio to obtain the corresponding grayscale image
- the acquiring the gradient map of the grayscale image according to edge detection includes:
- Adopting Sobel operator, Prewitt operator or Laplacian operator to perform edge detection to obtain the initial gradient map of the gray image
- Median filtering is applied to the initial gradient image to obtain the final gradient image of the grayscale image.
- the present invention also constructs a haze image quality evaluation system, including:
- a first processing unit configured to obtain a first transmittance map corresponding to the haze image based on a dark channel prior method
- the second processing unit is used to obtain the non-sky area of the haze image
- a third processing unit configured to obtain a first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area;
- the output unit is configured to output the image quality of the haze image according to the first incident light attenuation rate.
- the present invention also constructs a computer storage medium on which a computer program is stored, and when the computer program is executed by a processor, the method for evaluating the quality of the haze image described in any one of the above is realized.
- the present invention also constructs an electronic device, including a memory and a processor
- the memory is used to store computer programs
- the processor is configured to execute the computer program to implement the haze image quality evaluation method as described in any one of the above.
- the implementation of the haze image quality evaluation method, system, storage medium and electronic equipment of the present invention has the following beneficial effects: by eliminating the sky area, the interference of the sky area on the haze image quality evaluation is reduced to optimize the image quality evaluation effect.
- FIG. 1 is a program flow chart of an embodiment of the method for evaluating haze image quality according to the present invention
- FIG. 2 is a program flowchart of another embodiment of the method for evaluating the quality of a haze image of the present invention
- FIG. 3 is a program flow chart of another embodiment of the method for evaluating haze image quality according to the present invention.
- Figure 4 is a schematic diagram of the identification process of non-sky areas in the haze image
- Figure 5 shows the NSDark values corresponding to the non-sky areas of different haze images
- Figure 6 and Figure 7 show the performance comparison of different haze images
- FIG. 8 is a logical block diagram of the first embodiment of the haze image quality evaluation system of the present invention.
- Part of the light received by the imaging device is scene light attenuated by the scattering of particles in the air, which is called direct light attenuation; the other part is atmospheric light that acts directly on suspended particles in the air, is scattered into the imaging device, and is superimposed on the target image, which is called additional scattered light.
- The lower the haze degree, the higher the proportion of direct attenuation light in the image; the higher the haze degree, the higher the proportion of additional scattered light.
- The atmospheric scattering model can be expressed as:

  I(x) = J(x) · t(x) + A · ( 1 − t(x) )

- where I(x) is the haze image, J(x) is the clear image, t(x) is the transmittance map, and A is the global atmospheric light value.
- The dark channel prior method is used to estimate the first transmittance map corresponding to the haze image, and the first transmittance map can be expressed as:

  t(x) = 1 − min_{y∈Ω(x)} min_{c∈{r,g,b}} ( I^c(y) / A )      (2)

- where A represents the global atmospheric light value, I^c(y) represents the pixel value of pixel y of the haze image in channel c, {r, g, b} represents the three color channels, and Ω(x) represents the local area of the haze image over which the minimum is taken.
- The dark channel prior holds that, in a haze-free image, almost every local area contains some pixel whose value is close to 0 in at least one of the three primary color channels; that is, the elements of the dark channel composed of the channel minima are close to 0.
- The formula is as follows:

  J_dark(x) = min_{y∈Ω(x)} min_{c∈{r,g,b}} J^c(y) ≈ 0

- where J^c(y) represents the pixel value of pixel y in channel c, {r, g, b} represents the three color channels, Ω(x) represents the local area over which the minimum is taken (the image here being a clear, haze-free image), and J_dark(x) represents the dark channel composed of the minima of the three color channels.
- Accordingly, the first transmittance map can be obtained as formula (2).
- A represents the global atmospheric light value; it can be estimated by taking the brightest 0.01% of pixels of the haze image and computing the average value of each channel over those pixels.
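A sketch of the estimation pipeline described above: the dark channel with a local minimum filter, the atmospheric light A from the brightest 0.01% of pixels, and the transmittance map of formula (2). The patch size (15), the per-channel A, and the absence of a haze-retention factor ω (common in dark-channel dehazing but not among the variables the description lists) are all assumptions:

```python
import numpy as np

def dark_channel(img, patch=15):
    """Min over the three color channels, then a patch x patch min filter."""
    mins = img.min(axis=2)
    pad = patch // 2
    p = np.pad(mins, pad, mode="edge")
    h, w = mins.shape
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = p[i:i + patch, j:j + patch].min()
    return out

def atmospheric_light(img, frac=0.0001):
    """Average the brightest 0.01% of pixels of the haze image, per channel."""
    brightness = img.mean(axis=2).ravel()
    n = max(1, int(brightness.size * frac))
    idx = np.argsort(brightness)[-n:]
    return img.reshape(-1, 3)[idx].mean(axis=0)

def transmittance(img, a, patch=15):
    """Formula (2): t(x) = 1 - min over patch and channels of I_c / A."""
    return 1.0 - dark_channel(img / a.reshape(1, 1, 3), patch)
```

A haze-free region (one channel near 0) yields t ≈ 1, while a region of pure airlight (pixels equal to A) yields t ≈ 0.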
- Obtain the non-sky area of the haze image. Specifically, the first transmittance map acquired above corresponds to the transmittance of the entire image. In some scenes, for example when the sky is overcast or cloudy, the sky area of the haze image looks too similar to haze, which interferes with judging the haze image quality. Therefore, the sky area needs to be further removed from the haze image to obtain the non-sky area of the haze image.
- The specific process of calculating the first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area includes: obtaining the second incident light attenuation rate of the haze image according to the first transmittance map, and then obtaining the first incident light attenuation rate corresponding to the non-sky area according to the second incident light attenuation rate and the non-sky area. In another embodiment, it includes: obtaining the second transmittance map corresponding to the non-sky area according to the first transmittance map and the non-sky area, and then obtaining the first incident light attenuation rate corresponding to the non-sky area according to the second transmittance map.
- The transmittance satisfies t = e^(−β(λ)·d), where λ represents the wavelength, β(λ) represents the atmospheric extinction coefficient, whose physical meaning is the relative attenuation rate of electromagnetic radiation per unit distance in the atmosphere, and d is the distance between the observation point and the target. Thus β(λ) represents the relative attenuation rate per unit distance, and β(λ)·d represents the total attenuation rate of light from the scene to the imaging device, that is, the incident light attenuation rate.
- The atmospheric extinction coefficient is a parameter related to the properties and density of aerosols; it is constant when the aerosol properties are constant and the aerosols are uniformly distributed. Therefore, in an image with constant scene depth d, the incident light attenuation rate D(x) is related to the haze density, and its formula is:

  D(x) = β(λ) · d = − ln t(x)      (7)

- where D(x) represents the incident light attenuation rate, λ represents the wavelength, β(λ) represents the atmospheric extinction coefficient, and t(x) represents the transmittance map. On this basis, the second incident light attenuation rate D(x) of the haze image can be obtained from formula (7) and converted, according to the obtained non-sky area, into the corresponding first incident light attenuation rate D_non_sky. It is also possible to first convert the first transmittance map of the haze image into the second transmittance map t_non_sky corresponding to the non-sky area. On the above basis, the converted formula can be:

  D_non_sky = − (1 / |Ω_non_sky|) · Σ_{x∈Ω_non_sky} ln( 1 − min_{y∈Ω(x)} min_{c∈{r,g,b}} ( I^c(y) / A ) )

- where A is the global atmospheric light value, I^c(y) is the pixel value of pixel y of the haze image in channel c, {r, g, b} represents the three color channels, and Ω_non_sky represents the non-sky area of the haze image. For the specific process, refer to the description above.
- The incident light attenuation rate D_non_sky of the non-sky area can be used as the haze image quality evaluation value, and this evaluation value is defined as the NSDark value.
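With the transmittance map and the non-sky mask in hand, the NSDark value is the mean incident-light attenuation rate over non-sky pixels. Reading D(x) = β(λ)d = −ln t(x) from the Beer-Lambert relation in the description is an interpretation, not the patent's literal formula:

```python
import numpy as np

def nsdark(t_map, non_sky_mask):
    """NSDark: mean incident-light attenuation rate over the non-sky area.

    Interprets D(x) = beta(lambda) * d = -ln t(x), per the relation
    t = exp(-beta * d) used in the description.
    t_map        : transmittance map, floats in (0, 1]
    non_sky_mask : boolean mask, True for non-sky pixels
    """
    d = -np.log(t_map)                 # incident light attenuation rate
    return float(d[non_sky_mask].mean())
```

A larger NSDark value corresponds to heavier haze, matching the ordering reported for Figure 5.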
- step S2 obtaining the non-sky area of the haze image includes:
- Minimum-value filtering is applied to denoise the binary image: it suppresses locally scattered noise so that the overall segmentation is more reasonable. Finally, the segmented non-sky area and sky area are obtained, and the corresponding incident light attenuation rate is calculated from the non-sky area of the haze image.
- the diameter of the minimum filter can be set to 3.
- step S22 acquiring a gradient map of a grayscale image according to edge detection, and converting to generate a binary image includes:
- Edge information may also appear in the sky area, and interference from artificial light sources may cause the brightness of some ground areas to exceed the threshold. Therefore, both a preset gradient threshold and a preset brightness threshold need to be set: the gradient map is converted into a first binary image according to the preset gradient threshold, a second binary image is generated from the brightness of the grayscale image according to the preset brightness threshold, and the two are then merged to generate the final binary image.
- the gradient threshold value is the gradient average value of the gradient map
- the brightness threshold value is the brightness average value of the grayscale image.
- the preset gradient threshold and the preset brightness threshold may be set as the gradient average value of the gradient map and the brightness average value of the grayscale image, respectively. Then, take 0 or 255 for the divided pixels to obtain a binary image. Since the goal of sky recognition is to find the average transmittance of non-sky areas, the slight error at the junction of the sky and the ground caused by the setting of the preset gradient threshold and the preset brightness threshold has little effect on the calculation results.
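The thresholding and merge step can be sketched as follows. The merge rule used here (a pixel is a non-sky candidate when its gradient is above the mean gradient or its brightness is below the mean brightness) is an assumption read from the description, which only states that the two binary images are merged:

```python
import numpy as np

def binarize_and_merge(gradient, gray):
    """Build the final binary map: 255 = candidate non-sky, 0 = sky."""
    b1 = np.where(gradient > gradient.mean(), 255, 0)  # textured areas
    b2 = np.where(gray < gray.mean(), 255, 0)          # dark areas
    return np.maximum(b1, b2).astype(np.uint8)         # merge (logical OR)
```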
- Step S21, converting the haze image into a grayscale image, includes: obtaining an RGB image of the haze image, and adjusting the RGB image according to a preset RGB ratio to obtain the corresponding grayscale image. Specifically, considering that more elaborate conversion algorithms bring only limited improvement to the sky detection effect, the grayscale conversion may simply convert the RGB image to gray proportionally. For the preset RGB ratio, the weight of the R channel can be set to 0.299, the weight of the G channel to 0.587, and the weight of the B channel to 0.114.
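These are the standard BT.601 luma weights, and the conversion is a one-line weighted sum:

```python
import numpy as np

def to_gray(rgb):
    """Weighted RGB -> gray with the 0.299 / 0.587 / 0.114 ratios."""
    weights = np.array([0.299, 0.587, 0.114])
    return np.asarray(rgb, dtype=np.float64) @ weights
```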
- The acquiring the gradient map of the grayscale image according to edge detection includes: performing edge detection with the Sobel operator, Prewitt operator or Laplacian operator to acquire the initial gradient map of the grayscale image, and applying median filtering to the initial gradient map to acquire the final gradient map of the grayscale image. Any one of these operators may be adopted, and the gradient map can be denoised so that subsequent operations are performed on the denoised gradient map. Take the Laplacian operator as an example.
- The Laplacian operator is defined as L(f) = ∂²f/∂x² + ∂²f/∂y², where L(f) represents the Laplacian detection value, f represents the gray value of a pixel on the image, and x and y represent the abscissa and ordinate of the pixel. Its discrete form is:

  L(f) = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4 · f(x, y)

- where f(x, y) represents the gray value of the pixel at coordinate (x, y) on the grayscale image.
- the sky area is bright and smooth as a whole, and it generally appears as a white area in the Laplacian edge detection image.
- the edge of the cloud, the dust on the imaging device, etc. may all cause interference noise.
- Median filtering is therefore needed to reduce noise: a window slides over the image and the value of the window's center pixel is replaced by the median of the values inside the window, which effectively eliminates noise interference.
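The edge-detection and denoising steps can be sketched with a 4-neighbour discrete Laplacian and a 3 × 3 median filter (the 3 × 3 window size is an assumption):

```python
import numpy as np

def laplacian(gray):
    """Discrete 4-neighbour Laplacian of a grayscale image."""
    g = np.pad(np.asarray(gray, np.float64), 1, mode="edge")
    return (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
            - 4.0 * g[1:-1, 1:-1])

def median3(img):
    """3 x 3 median filter: each pixel becomes the median of its window."""
    a = np.asarray(img, np.float64)
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    stack = np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)
```

A flat region gives zero Laplacian response, and an isolated noise pixel is removed entirely by the median filter.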
- FIG. 4 shows the recognition process of non-sky areas in the haze image: A is the original image; edge detection is performed on the grayscale image B with the Laplacian operator and the result is converted to obtain the binary image C; minimum-value filtering is then applied to the binary image C to obtain the final segmentation D, in which the non-sky area of the haze image corresponds to the white region.
- Figure 5 shows the incident light attenuation rate, that is, the NSDark value, corresponding to the non-sky areas of different haze images in the HID2018 database. As can be seen from Figure 5, the NSDark values of Figure 5(d) and (e) are small and those images are little affected by haze, while the NSDark values of Figure 5(g) and (h) are large and those images are seriously affected by haze. The NSDark value is consistent with the subjective evaluation results.
- Obtaining the incident light attenuation rate D_non_sky of the non-sky area as the haze image quality evaluation value, that is, the NSDark value, to evaluate the haze image quality is a no-reference image quality evaluation process.
- three commonly used non-reference image quality evaluation methods, entropy function (Entropy), gray-scale variance (SMD) and Laplacian gradient are used for performance comparison.
- the subjective score (MOS) of the haze image is used as a benchmark for comparison. Refer to the following table for details:
- The data in Table 1 is normalized to obtain the performance comparison of MOS, NSDark and Laplacian shown in Figure 6, where A1 represents the MOS value, A2 the NSDark value, and A3 the Laplacian value, and the comparison of MOS, Entropy and SMD shown in Figure 7, where B1 is the MOS value, B2 the Entropy value, and B3 the SMD value. It can be seen from Figures 6 and 7 that the overall trend of the NSDark method is consistent with the Laplacian gradient method and MOS, and the monotonicity of its change is better than that of the SMD and Entropy methods.
- the mean square error between different image quality evaluation methods and MOS values is calculated.
- the mean square error between the Laplacian value and the MOS value is 4.14
- the mean square error between the Entropy value and the MOS value is 1.13
- the mean square error between the SMD value and the MOS value is 1.91
- the mean square error between the NSDark value and the MOS value is 0.5.
- the mean square error of NSDark and MOS values of all images is the smallest, and the changes of NSDark and MOS values are relatively consistent. Therefore, the evaluation result of NSDark value is slightly better than other quality evaluation methods.
- a haze image quality evaluation system of the present invention includes:
- the obtaining unit 10 is used to obtain a haze image
- the first processing unit 20 is configured to obtain a first transmittance map corresponding to the haze image based on the dark channel prior method
- the second processing unit 30 is used to obtain the non-sky area of the haze image
- the third processing unit 40 is configured to obtain the first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area;
- the output unit 50 is configured to output the image quality of the haze image according to the first incident light attenuation rate.
- the specific coordination operation process between the units of the haze image quality evaluation system can refer to the above-mentioned haze image quality evaluation method, which will not be repeated here.
- an electronic device of the present invention includes a memory and a processor; the memory is used to store a computer program; the processor is used to execute the computer program to implement any of the above haze image quality evaluation methods.
- the process described above with reference to the flowchart can be implemented as a computer software program.
- an embodiment of the present invention includes a computer program product, which includes a computer program carried on a computer-readable medium, and the computer program contains program code for executing the method shown in the flowchart.
- When the computer program is downloaded and installed by an electronic device and executed, it performs the above-mentioned functions defined in the method of the embodiment of the present invention.
- the electronic device in the present invention can be a terminal such as a notebook, a desktop computer, a tablet computer, a smart phone, etc., or a server.
- a computer storage medium of the present invention has a computer program stored thereon, and when the computer program is executed by a processor, any one of the above haze image quality evaluation methods is realized.
- the above-mentioned computer-readable medium of the present invention may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
- the computer-readable storage medium may be, for example, but not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above.
- Computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
- the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
- a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein.
- This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
- the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
- The computer-readable signal medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device.
- the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
- the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or it may exist alone without being assembled into the electronic device.
Description
Image No. | MOS | Entropy | SMD | Laplacian | NSDark |
---|---|---|---|---|---|
1 | 3.2500 | 7.016 | 3597.809 | 563.637 | 0.159 |
2 | 2.2500 | 6.933 | 2593.533 | 224.517 | 0.295 |
3 | 4.1875 | 6.88 | 4401.999 | 796.295 | 0.103 |
4 | 5.0000 | 7.472 | 2897.599 | 1994.336 | 0.092 |
5 | 5.0000 | 7.466 | 2925.595 | 2097.067 | 0.087 |
6 | 3.7500 | 6.7 | 2746.869 | 783.437 | 0.182 |
7 | 4.0625 | 6.742 | 3252.525 | 1006.469 | 0.197 |
8 | 2.6250 | 6.933 | 2641.013 | 395.499 | 0.241 |
9 | 4.9375 | 7.459 | 2758.44 | 2078.917 | 0.113 |
10 | 3.5625 | 6.666 | 3203.094 | 756.269 | 0.113 |
11 | 5.0000 | 7.481 | 2967.521 | 2211.008 | 0.206 |
12 | 4.6250 | 7.44 | 2749.59 | 1762.722 | 0.113 |
13 | 4.9375 | 7.272 | 2265.036 | 1711.025 | 0.156 |
14 | 3.5000 | 6.467 | 2529.061 | 727.178 | 0.261 |
15 | 2.2500 | 6.636 | 2137.632 | 309.936 | 0.362 |
16 | 2.4375 | 6.682 | 2516.218 | 346.830 | 0.276 |
17 | 1.6250 | 6.818 | 2035.718 | 115.954 | 0.532 |
18 | 1.6875 | 7.072 | 2654.255 | 86.571 | 0.575 |
19 | 1.8125 | 7.096 | 2739.016 | 116.050 | 0.405 |
20 | 4.7500 | 7.29 | 2398.71 | 1701.762 | 0.15 |
21 | 4.8750 | 7.348 | 2367.826 | 1680.980 | 0.146 |
22 | 2.7500 | 6.977 | 3036.286 | 307.178 | 0.259 |
23 | 2.1250 | 6.424 | 1477.099 | 305.824 | 0.439 |
24 | 3.1250 | 6.604 | 2285.189 | 652.431 | 0.278 |
25 | 5.0000 | 7.353 | 2238.621 | 1822.243 | 0.15 |
26 | 4.6250 | 7.235 | 2917.264 | 1189.022 | 0.145 |
27 | 2.7500 | 6.642 | 2752.827 | 403.511 | 0.244 |
Claims (10)
- A haze image quality evaluation method, characterized by comprising the following steps: S1, acquiring a haze image, and acquiring a first transmittance map corresponding to the haze image based on a dark channel prior method; S2, acquiring the non-sky area of the haze image; S3, acquiring a first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area, so as to acquire the image quality of the haze image according to the first incident light attenuation rate.
- The haze image quality evaluation method according to claim 1, characterized in that, in step S3, the calculating the first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area comprises: acquiring a second incident light attenuation rate of the haze image according to the first transmittance map, and acquiring the first incident light attenuation rate corresponding to the non-sky area according to the second incident light attenuation rate and the non-sky area; or acquiring a second transmittance map corresponding to the non-sky area according to the first transmittance map and the non-sky area, and acquiring the first incident light attenuation rate corresponding to the non-sky area according to the second transmittance map.
- The haze image quality evaluation method according to claim 2, characterized in that, in step S2, the acquiring the non-sky area of the haze image comprises: S21, converting the haze image into a grayscale image; S22, acquiring a gradient map of the grayscale image according to edge detection, and converting it to generate a binary image; S23, performing minimum-value filtering on the binary image to acquire the non-sky area of the haze image.
- The haze image quality evaluation method according to claim 4, characterized in that, in step S22, the acquiring a gradient map of the grayscale image according to edge detection and converting it to generate a binary image comprises: S221, converting the gradient map according to a preset gradient threshold to generate a first binary image; S222, converting the grayscale image according to a preset brightness threshold to generate a second binary image; S223, merging the first binary image and the second binary image to generate the binary image.
- The haze image quality evaluation method according to claim 4, characterized in that the gradient threshold is the gradient average value of the gradient map, and the brightness threshold is the brightness average value of the grayscale image.
- The haze image quality evaluation method according to claim 4, characterized in that, in step S21, the converting the haze image into a grayscale image comprises: acquiring an RGB image of the haze image, and adjusting the RGB image according to a preset RGB ratio to acquire the corresponding grayscale image; and/or, in step S22, the acquiring the gradient map of the grayscale image according to edge detection comprises: performing edge detection with a Sobel operator, Prewitt operator or Laplacian operator to acquire an initial gradient map of the grayscale image, and applying median filtering to the initial gradient map to acquire a final gradient map of the grayscale image.
- A haze image quality evaluation system, characterized by comprising: an acquisition unit, configured to acquire a haze image; a first processing unit, configured to acquire a first transmittance map corresponding to the haze image based on a dark channel prior method; a second processing unit, configured to acquire the non-sky area of the haze image; a third processing unit, configured to acquire a first incident light attenuation rate corresponding to the non-sky area according to the first transmittance map and the non-sky area; and an output unit, configured to output the image quality evaluation of the haze image according to the first incident light attenuation rate.
- A computer storage medium on which a computer program is stored, characterized in that, when the computer program is executed by a processor, the haze image quality evaluation method according to any one of claims 1 to 7 is implemented.
- An electronic device, characterized by comprising a memory and a processor; the memory is configured to store a computer program, and the processor is configured to execute the computer program to implement the haze image quality evaluation method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/088178 WO2020232710A1 (zh) | 2019-05-23 | 2019-05-23 | Haze image quality evaluation method, system, storage medium and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020232710A1 true WO2020232710A1 (zh) | 2020-11-26 |
Family
ID=73459333
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/088178 WO2020232710A1 (zh) | 2019-05-23 | 2019-05-23 | 雾霾图像质量评价方法、***、存储介质及电子设备 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020232710A1 (zh) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104809707A (zh) * | 2015-04-28 | 2015-07-29 | Southwest University of Science and Technology | Single foggy image visibility estimation method |
CN105424655A (zh) * | 2015-11-04 | 2016-03-23 | Beijing Jiaotong University | Visibility detection method based on video images |
CN106709903A (zh) * | 2016-11-22 | 2017-05-24 | Nanjing University of Science and Technology | PM2.5 concentration prediction method based on image quality |
US20170206690A1 (en) * | 2016-01-20 | 2017-07-20 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and storage medium |
US10290081B2 (en) * | 2016-04-29 | 2019-05-14 | Industry Foundation Of Chonnam National University | System for image dehazing by modifying lower bound of transmittance and method therefor |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112465720A (zh) * | 2020-11-27 | 2021-03-09 | Nanjing University of Posts and Telecommunications | Image dehazing method, device and storage medium based on image sky segmentation |
CN112465720B (zh) * | 2020-11-27 | 2024-02-23 | Nanjing University of Posts and Telecommunications | Image dehazing method, device and storage medium based on image sky segmentation |
CN113658275A (zh) * | 2021-08-23 | 2021-11-16 | Shenzhen SenseTime Technology Co., Ltd. | Visibility value detection method, apparatus, device and storage medium |
CN113822816A (zh) * | 2021-09-25 | 2021-12-21 | Li Ruinan | Single remote sensing image dehazing method optimized by an aerosol scattering model |
CN115941857A (zh) * | 2022-12-30 | 2023-04-07 | Hunan University | Defogging circuit and method |
CN115941857B (zh) * | 2022-12-30 | 2024-04-02 | Hunan University | Defogging circuit and method |
CN117788336A (zh) * | 2024-02-28 | 2024-03-29 | Shandong Kunzhong Information Technology Co., Ltd. | Data optimization collection method and system in territorial space planning |
CN117788336B (zh) * | 2024-02-28 | 2024-05-24 | Shandong Kunzhong Information Technology Co., Ltd. | Data optimization collection method and system in territorial space planning |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 19929299; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: PCT application non-entry in European phase | Ref document number: 19929299; Country of ref document: EP; Kind code of ref document: A1 |
32PN | Ep: public notification in the EP bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.03.2022) |