CN113269763B - Underwater image definition recovery method based on depth map restoration and brightness estimation - Google Patents

Underwater image definition recovery method based on depth map restoration and brightness estimation

Info

Publication number
CN113269763B
CN113269763B
Authority
CN
China
Prior art keywords
image
depth
underwater
value
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110620221.9A
Other languages
Chinese (zh)
Other versions
CN113269763A (en)
Inventor
张维石
杨彤雨
周景春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Maritime University
Original Assignee
Dalian Maritime University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Maritime University
Priority to CN202110620221.9A
Publication of CN113269763A
Application granted
Publication of CN113269763B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an underwater image definition recovery method based on depth map restoration and brightness estimation. The method comprises the following steps: first, an equalization treatment is applied to the underwater image; second, the relative depth of the equalized underwater image is estimated with a monocular depth estimation model, after which the part with incorrect depth estimation is segmented with an image segmentation strategy and re-estimated; the region with re-estimated depth is then smoothed using guided filtering; the relative depth is next converted into absolute depth through a depth normalization operation; the image pixels are divided into several sections of equal size according to depth value, and the potential minimum pixel points of the degraded image are searched within each section; using an underwater image imaging model, the parameters are fitted channel by channel to estimate and remove backscattering; finally, an automatic brightness value estimation method selects the brightness parameter, the brightness of the descattered image is adjusted with the optimal parameter, and the color cast of the underwater image is removed.

Description

Underwater image definition recovery method based on depth map restoration and brightness estimation
Technical Field
The invention relates to the technical field of image processing, in particular to an underwater image definition recovery method based on depth map restoration and brightness estimation.
Background
The underwater image plays an important role in marine geological exploration, marine ecological protection, marine resource development and other fields. However, owing to the absorption and scattering of light by water and suspended particles, underwater images and videos generally suffer from low contrast, low definition and poor color. Severely degraded underwater images have little practical value. To improve underwater image quality effectively, underwater image enhancement and underwater image restoration are the two commonly used approaches. In recent years, with the rapid development of deep learning in image processing, using deep learning to restore the sharpness of underwater images has also become a popular trend.
Underwater image enhancement methods mainly use filters, histogram equalization and similar techniques to recover the color and saturation of the image. Although such methods can effectively improve the visual quality of the image, they ignore the relation between the degree of degradation and the depth of field, and cannot recover the true colors of the scene. Underwater image restoration methods instead invert the degradation process through an underwater imaging model. Such methods typically use prior knowledge to solve for the model parameters, but prior-based restoration suffers when the adopted prior does not match the target scene. Deep-learning-based methods are still not mature enough and face several problems. On the one hand, their parameter estimates are fixed after training, so they lack flexibility when processing complex underwater environments: when a new underwater image differs from the water types in the training set, the trained model may not produce a satisfactory result. On the other hand, deep learning has its own limitations, for example a large number of parameters is needed to learn the complex mapping function and a suitable training set may be hard to obtain, which limits the practical value of deep-learning methods. The proposed underwater image definition recovery method based on depth map restoration and brightness estimation not only considers the influence of the imaging range on underwater image degradation but also uses a more accurate imaging model. It can therefore obtain stable, high-quality recovery results when processing underwater images of different water types. In addition, the color correction used by the method markedly improves the brightness, chromaticity and contrast of the underwater image.
Disclosure of Invention
According to the technical problem described above, an underwater image definition recovery method based on depth map restoration and brightness estimation is provided. The invention uses the refined depth map and the minimum pixel points to estimate and remove backscattering according to the underwater physical imaging model, automatically selects the optimal brightness parameter using information entropy, and adjusts the overall brightness of the image to obtain a colorful underwater image.
The invention adopts the following technical means:
an underwater image definition recovery method based on depth map restoration and brightness estimation comprises the following steps:
step S01: stretching the contrast of the original RGB image, ensuring that the minimum value and the maximum value of pixels of the original image are 0 and 255 respectively, and estimating the relative depth map of the image after stretching the contrast by using a monocular depth estimation model;
step S02: dividing a background region error estimation part of the image subjected to contrast stretching by using a dividing method, re-estimating the depth of the part, and smoothing the re-estimation region by using a guide filter;
step S03: selecting upper and lower depth limits according to the actual depth of the scene to perform depth normalization processing so as to obtain an absolute depth map of the original RGB image;
step S04: dividing the pixel points of the original RGB image into 10 groups according to depth value from small to large, sorting each group by the sum of the RGB values of the pixel points from small to large, and taking the first 200 pixel points of each group as the potential minimum pixel points;
step S05: according to the underwater image imaging model, fitting the parameters A_c, β_c^B, J_c and β_c^D from the minimum pixel points and their depth values, wherein A_c represents the atmospheric light, β_c^B represents the backscattering coefficient, J_c represents the undegraded underwater image, and β_c^D represents the bandwidth coefficient;
step S06: estimating and removing the backscattering through the underwater image imaging model and the fitted parameter values;
step S07: on the image with backscattering removed, sorting the pixel values of the three channels separately, and taking the pixel values of the pixel points ranked between 0.5% and 2%, sampled at intervals of 0.15%, as candidate brightness parameters;
step S08: each brightness parameter corresponds to an image enhanced with that parameter; the image with the highest information entropy is selected as the final enhancement result.
Further, the image contrast stretching formula in step S01 is:
y = (x − x_min) / (x_max − x_min) × 255
wherein x_min and x_max respectively represent the minimum and maximum pixel values in the original image, x represents each pixel of the image, and y represents the contrast-stretched image.
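As an illustrative sketch (not part of the claimed method's text), the contrast stretching formula above can be implemented as follows; NumPy is assumed, and the function name is hypothetical:

```python
import numpy as np

def stretch_contrast(img):
    """Map the global pixel range [x_min, x_max] of the image to [0, 255],
    per the stretching formula y = (x - x_min) / (x_max - x_min) * 255."""
    img = img.astype(np.float64)
    x_min, x_max = img.min(), img.max()
    y = (img - x_min) / (x_max - x_min) * 255.0
    return y.astype(np.uint8)
```

The global minimum and maximum are used here, matching the text; a per-channel stretch would be a straightforward variant.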
Further, the image segmentation method in step S02 adopts the Mahalanobis distance to measure the similarity between points in RGB space; the Mahalanobis distance D(z, m) between any point z and the average color m in RGB space is given by:
D(z, m) = [(z − m)^T C^(−1) (z − m)]^(1/2)
where C represents the covariance matrix of the selected samples.
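A minimal sketch of this similarity measure follows (function names are illustrative; the average color and covariance are computed over a set of selected sample pixels):

```python
import numpy as np

def mahalanobis_rgb(z, m, C):
    """Mahalanobis distance D(z, m) = sqrt((z - m)^T C^-1 (z - m))
    between an RGB point z and the average color m of the samples."""
    d = np.asarray(z, dtype=np.float64) - np.asarray(m, dtype=np.float64)
    return float(np.sqrt(d @ np.linalg.inv(C) @ d))

def sample_stats(samples):
    """Average color m and covariance matrix C of the selected sample
    pixels (samples: array of shape (N, 3))."""
    samples = np.asarray(samples, dtype=np.float64)
    return samples.mean(axis=0), np.cov(samples, rowvar=False)
```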
Further, in step S03, each relative depth value x in the original depth map is converted into an absolute depth value y by a linear conversion, with the specific formula:
y = (x − x_min) / (x_max − x_min) × (y_max − y_min) + y_min
wherein x_min and x_max represent the minimum and maximum depth values in the original depth map, and y_min and y_max represent the minimum and maximum of the target depth range after conversion.
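The linear conversion can be sketched as follows (illustrative function name):

```python
import numpy as np

def relative_to_absolute(depth, y_min, y_max):
    """Linearly map relative depths to the absolute range [y_min, y_max]
    (e.g. meters), per the conversion formula above."""
    depth = depth.astype(np.float64)
    x_min, x_max = depth.min(), depth.max()
    return (depth - x_min) / (x_max - x_min) * (y_max - y_min) + y_min
```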
Further, the underwater image imaging model in step S05 is:
I_c(x) = J_c(x) · e^(−β_c^D · d(x)) + A_c · (1 − e^(−β_c^B · d(x))), c ∈ {R, G, B}
wherein I_c is the observed degraded image, d(x) is the scene depth at pixel x, A_c represents the atmospheric light, β_c^B represents the backscattering coefficient, J_c represents the undegraded underwater image, and β_c^D represents the bandwidth coefficient.
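One way to fit the per-channel parameters A_c, β_c^B, J_c and β_c^D from the (depth, intensity) pairs of the minimum pixel points is nonlinear least squares; the sketch below uses SciPy's curve_fit with intensities normalized to [0, 1]. The initial guesses and bounds are illustrative assumptions, not values from the patent:

```python
import numpy as np
from scipy.optimize import curve_fit

def backscatter_model(z, A, beta_B, J, beta_D):
    # I(z) = A (1 - e^{-beta_B z}) + J e^{-beta_D z}
    return A * (1.0 - np.exp(-beta_B * z)) + J * np.exp(-beta_D * z)

def fit_channel(z, intensity):
    """Fit A_c, beta_B, J_c, beta_D for one color channel from the
    minimum pixels' depths z and intensities (both 1-D arrays)."""
    p0 = [float(intensity.max()), 1.0, 0.1, 1.0]            # illustrative start
    bounds = ([0.0, 0.0, 0.0, 0.0], [1.0, 5.0, 1.0, 5.0])   # illustrative bounds
    popt, _ = curve_fit(backscatter_model, z, intensity, p0=p0, bounds=bounds)
    return popt  # A_c, beta_B, J_c, beta_D
```

Running this once per color channel yields the parameter values used in step S06 to compute and subtract the backscatter term A_c · (1 − e^(−β_c^B · d(x))).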
Further, the optimal enhanced image in step S08 is obtained automatically as:
D̂ = argmax_{r_i} H(D_{r_i})
wherein H(D_{r_i}) is the information entropy of the brightness-adjusted image D_{r_i}; the r_i are the candidate brightness parameters, i.e. after sorting the pixel values of each of the three channels, the pixel values ranked between 0.5% and 2% of each channel, sampled at intervals of 0.15%; D_c is the image after scattering removal; W_0 = 0.1 is a stabilizing constant used in the brightness adjustment; and D̂ is the optimal enhanced image. The information entropy is calculated as:
H = −Σ_i p_i · log2(p_i)
wherein i represents the gray level of a pixel and p_i represents the proportion of pixels with gray level i in the entire image.
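The entropy-based selection can be sketched as follows. The entropy formula matches the text exactly; the form of the brightness adjustment, D_{r_i} = D_c / max(r_i, W_0), is an assumption for illustration (the patent's exact adjustment formula is not reproduced here), with W_0 = 0.1 preventing division by very small parameters:

```python
import numpy as np

def entropy(gray):
    """Information entropy H = -sum_i p_i log2 p_i of an 8-bit gray image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def select_best(D_c, brightness_params, W0=0.1):
    """Enhance the descattered image D_c (float, range [0, 1]) with each
    candidate brightness parameter and keep the highest-entropy result."""
    best, best_H = None, -1.0
    for r in brightness_params:
        adjusted = np.clip(D_c / np.maximum(r, W0), 0.0, 1.0)  # assumed form
        gray = (adjusted.mean(axis=2) * 255.0).astype(np.uint8)
        H = entropy(gray)
        if H > best_H:
            best, best_H = adjusted, H
    return best
```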
Compared with the prior art, the invention has the following advantages:
1. for the problems of color distortion of an image enhancement method and large transmissivity estimation deviation of a traditional DCP-based method, the invention uses a novel underwater physical imaging model to estimate back scattering in consideration of an underwater image degradation mechanism, the scattering removal effect is obvious, and the restoration result is close to a real undegraded underwater scene.
2. The invention only needs to obtain the depth map of the image, and does not need to estimate the transmissivity and the background light of the image, and compared with the traditional restoration method, the invention has lower complexity.
3. The invention has better practicability and robustness, and can process various underwater images without the problems of over-enhancement, artifact and the like.
For the reasons, the method can be widely popularized in the fields of image processing and the like.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to the drawings without inventive effort to a person skilled in the art.
Fig. 1 is a schematic flow chart of the underwater image definition recovery method according to the present invention.
FIG. 2 is a graph of the effect of the present invention compared to other methods of underwater image restoration in an offshore scenario; wherein, FIG. 2-1 is an image artwork (marine organism) acquired underwater; FIG. 2-2 is a diagram showing the processing effect of the Li et al UWCNN method; FIGS. 2-3 are graphs showing the effects of the Peng et al GDCP method; FIGS. 2-4 are graphs showing the effect of the Peng et al IBLA method; FIGS. 2-5 are graphs showing the effect of the method according to the present invention.
FIG. 3 is a graph showing the effect of the invention compared with other underwater imaging methods in a turbid water body, wherein FIG. 3-1 is an original image (turtle) acquired underwater; FIG. 3-2 is a diagram showing the processing effect of the Li et al UWCNN method; FIG. 3-3 is a graph of the processing effect of the Peng et al GDCP method; FIGS. 3-4 are graphs showing the effect of the Peng et al IBLA method; FIGS. 3-5 are graphs showing the processing effects of the method of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
As shown in fig. 1, the invention provides an underwater image definition recovery method based on depth map restoration and brightness estimation, which comprises the following steps:
step S01: stretching the contrast of the original RGB image so that the minimum and maximum pixel values of the original image become 0 and 255 respectively, and estimating the relative depth map of the contrast-stretched image with a monocular depth estimation model; the image contrast stretching formula is:
y = (x − x_min) / (x_max − x_min) × 255
wherein x_min and x_max respectively represent the minimum and maximum pixel values in the original image, x represents each pixel of the image, and y represents the contrast-stretched image;
step S02: segmenting the erroneously estimated background region of the contrast-stretched image with a segmentation method, re-estimating the depth of that part, and smoothing the re-estimated region with a guided filter; the segmentation adopts the Mahalanobis distance to measure the similarity between points in RGB space; the Mahalanobis distance D(z, m) between any point z and the average color m in RGB space is:
D(z, m) = [(z − m)^T C^(−1) (z − m)]^(1/2)
wherein C represents the covariance matrix of the selected samples; because the background region is the region farthest from the camera in the whole image, the depth values of the sample points in this region are re-estimated from the 1% tail of the depth values of the original depth map after sorting from small to large;
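The guided-filter smoothing applied to the re-estimated region can be sketched as follows (a minimal single-channel guided filter in the style of He et al.; SciPy is assumed, and the window radius and regularization eps are illustrative choices, not values from the patent):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Edge-preserving smoothing of src (e.g. the re-estimated depth)
    guided by guide (e.g. the grayscale image); both float 2-D arrays."""
    size = 2 * radius + 1
    mean_I = uniform_filter(guide, size)
    mean_p = uniform_filter(src, size)
    corr_Ip = uniform_filter(guide * src, size)
    corr_II = uniform_filter(guide * guide, size)
    var_I = corr_II - mean_I * mean_I
    cov_Ip = corr_Ip - mean_I * mean_p
    a = cov_Ip / (var_I + eps)          # local linear coefficients
    b = mean_p - a * mean_I
    mean_a = uniform_filter(a, size)
    mean_b = uniform_filter(b, size)
    return mean_a * guide + mean_b
```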
step S03: selecting suitable approximate upper and lower depth bounds (in the range of 0-20 meters) according to the actual depth of the scene and performing depth normalization to obtain the absolute depth map of the original RGB image; each relative depth value x in the original depth map is converted into an absolute depth value y by a linear conversion, with the specific formula:
y = (x − x_min) / (x_max − x_min) × (y_max − y_min) + y_min
wherein x_min and x_max represent the minimum and maximum depth values in the original depth map, and y_min and y_max represent the minimum and maximum of the target depth range, here the realistic depth range of the image (in meters);
step S04: dividing the pixel points of the original RGB image into 10 groups according to depth value from small to large, sorting each group by the sum of the RGB values of the pixel points from small to large, and taking the first 200 pixel points of each group;
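The grouping and minimum-pixel search of step S04 can be sketched as follows (an equal-count split of the depth-sorted pixels is assumed here; equal-width depth intervals would be a minor variant):

```python
import numpy as np

def darkest_pixels_per_depth_bin(img, depth, n_bins=10, n_keep=200):
    """Split all pixels into n_bins groups by increasing depth; in each
    group keep the n_keep pixels with the smallest R+G+B sum.
    Returns a list of index arrays into the flattened image."""
    rgb = img.reshape(-1, 3).astype(np.float64)
    order = np.argsort(depth.ravel())          # pixel indices sorted by depth
    groups = np.array_split(order, n_bins)
    picks = []
    for g in groups:
        rgb_sum = rgb[g].sum(axis=1)
        picks.append(g[np.argsort(rgb_sum)[:n_keep]])
    return picks
```

The returned indices, together with the corresponding depth values, provide the (depth, intensity) samples used for the per-channel fit in step S05.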
step S05: according to the underwater image imaging model, fitting the parameters A_c, β_c^B, J_c and β_c^D from the minimum pixel points and their depth values, wherein A_c represents the atmospheric light, β_c^B represents the backscattering coefficient, J_c represents the undegraded underwater image, and β_c^D represents the bandwidth coefficient; the underwater image imaging model is:
I_c(x) = J_c(x) · e^(−β_c^D · d(x)) + A_c · (1 − e^(−β_c^B · d(x))), c ∈ {R, G, B}
wherein d(x) is the scene depth at pixel x;
step S06: estimating and removing the backscattering through the underwater image imaging model and the fitted parameter values;
step S07: on the image with backscattering removed, sorting the pixel values of the three channels separately, and taking the pixel values of the pixel points ranked between 0.5% and 2%, sampled at intervals of 0.15%, as candidate brightness parameters;
step S08: each brightness parameter corresponds to an image enhanced with that parameter, and the image with the highest information entropy is selected as the final enhancement result; the optimal enhanced image is obtained automatically as:
D̂ = argmax_{r_i} H(D_{r_i})
wherein H(D_{r_i}) is the information entropy of the brightness-adjusted image D_{r_i}; the r_i are the candidate brightness parameters, i.e. after sorting the pixel values of each of the three channels, the pixel values ranked between 0.5% and 2% of each channel, sampled at intervals of 0.15%; D_c is the image after scattering removal; W_0 = 0.1 is a stabilizing constant used in the brightness adjustment; and D̂ is the optimal enhanced image. The information entropy is calculated as:
H = −Σ_i p_i · log2(p_i)
wherein i represents the gray level of a pixel and p_i represents the proportion of pixels with gray level i in the entire image.
To verify the effectiveness of the scattering removal of the invention, underwater images of different scenes were selected as a test set, and the results were compared qualitatively and quantitatively against the UWCNN method of Li et al. (C. Li, S. Anwar, and F. Porikli, "Underwater scene prior inspired deep underwater image and video enhancement," Pattern Recognit. 98, 1-11 (2020)), the GDCP method of Peng et al. (Y. Peng, K. Cao, and P. C. Cosman, "Generalization of the Dark Channel Prior for Single Image Restoration," IEEE Trans. Image Process. 27(6), 2856-2868 (2018)), and the IBLA method of Peng et al. (Y. Peng and P. C. Cosman, "Underwater Image Restoration Based on Image Blurriness and Light Absorption," IEEE Trans. Image Process. 26(4), 1579-1594 (2017)).
As shown in fig. 2, the invention is compared with other underwater image restoration methods in an offshore scene (marine life). The comparison shows that in the result produced by the proposed method, the image contrast is clearly improved and is superior to the other methods (UWCNN of Li et al., GDCP of Peng et al., IBLA of Peng et al.). The method therefore corrects color, enhances image contrast and improves the visual effect of the image.
As shown in fig. 3, the invention is compared with the other methods in a turbid water body (turtle). Compared with the other methods (UWCNN of Li et al., GDCP of Peng et al., IBLA of Peng et al.), the color of the processed turtle is recovered noticeably better and the image definition is higher. The method therefore corrects color, enhances image contrast and improves the visual effect of the image.
To verify the robustness of the invention, a comparative analysis was performed with the no-reference image quality evaluation indexes UIQM and UCIQE; the specific data are shown in Tables 1 and 2. A larger no-reference index indicates better chromaticity, saturation and contrast of the generated image, and hence a better visual effect. Both index values of the images processed by the proposed method exceed those of the other methods, showing that the method effectively improves the color and contrast of the image.
TABLE 1 No-reference image quality evaluation index (UIQM) of the results of the proposed method and other methods
Raw image UWCNN GDCP IBLA Our
1.1217 1.2694 1.4964 1.4435 1.6065
0.6224 0.5232 0.9900 1.0983 1.4034
TABLE 2 No-reference image quality evaluation index (UCIQE) of the results of the proposed method and other methods
Raw image UWCNN GDCP IBLA Our
0.4328 0.4572 0.5594 0.5431 0.6577
0.3220 0.3126 0.3730 0.4754 0.6341
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced with equivalents; such modifications and substitutions do not depart from the spirit of the technical solutions according to the embodiments of the present invention.

Claims (5)

1. The underwater image definition recovery method based on depth map restoration and brightness estimation is characterized by comprising the following steps of:
step S01: stretching the contrast of the original RGB image, ensuring that the minimum value and the maximum value of pixels of the original image are 0 and 255 respectively, and estimating the relative depth map of the image after stretching the contrast by using a monocular depth estimation model;
step S02: dividing a background region error estimation part of the image subjected to contrast stretching by using a dividing method, re-estimating the depth of the part, and smoothing the re-estimation region by using a guide filter;
step S03: selecting upper and lower depth limits according to the actual depth of the scene to perform depth normalization processing so as to obtain an absolute depth map of the original RGB image;
step S04: dividing the pixel points of the original RGB image into 10 groups according to depth value from small to large, sorting each group by the sum of the RGB values of the pixel points from small to large, and taking the first 200 pixel points of each group;
step S05: according to the underwater image imaging model, fitting the parameters A_c, β_c^B, J_c and β_c^D from the minimum pixel points and their depth values, wherein A_c represents the atmospheric light, β_c^B represents the backscattering coefficient, J_c represents the undegraded underwater image, and β_c^D represents the bandwidth coefficient;
the underwater image imaging model in step S05 is:
I_c(x) = J_c(x) · e^(−β_c^D · d(x)) + A_c · (1 − e^(−β_c^B · d(x))), c ∈ {R, G, B}
wherein d(x) is the scene depth at pixel x;
step S06: estimating and removing the backscattering through the underwater image imaging model and the fitted parameter values;
step S07: on the image with backscattering removed, sorting the pixel values of the three channels separately, and taking the pixel values of the pixel points ranked between 0.5% and 2%, sampled at intervals of 0.15%, as candidate brightness parameters;
step S08: each brightness parameter corresponds to an image enhanced by the brightness parameter, and the image with the highest information entropy is selected as a final enhancement result.
2. The underwater image definition restoration method based on depth map restoration and brightness estimation according to claim 1, wherein the image contrast stretching formula in step S01 is:
y = (x − x_min) / (x_max − x_min) × 255
wherein x_min and x_max respectively represent the minimum and maximum pixel values in the original image, x represents each pixel of the image, and y represents the contrast-stretched image.
3. The underwater image definition restoration method based on depth map restoration and brightness estimation according to claim 1, wherein the image segmentation method in step S02 adopts the Mahalanobis distance to measure the similarity between points in RGB space; the Mahalanobis distance D(z, m) between any point z and the average color m in RGB space is given by:
D(z, m) = [(z − m)^T C^(−1) (z − m)]^(1/2)
where C represents the covariance matrix of the selected samples.
4. The underwater image definition restoration method based on depth map restoration and brightness estimation according to claim 1, wherein in step S03, each relative depth value x in the original depth map is converted into an absolute depth value y by a linear conversion, with the specific formula:
y = (x − x_min) / (x_max − x_min) × (y_max − y_min) + y_min
wherein x_min and x_max represent the minimum and maximum depth values in the original depth map, and y_min and y_max represent the minimum and maximum of the target depth range after conversion.
5. The underwater image definition restoration method based on depth map restoration and brightness estimation according to claim 1, wherein the optimal enhanced image in step S08 is obtained automatically as:
D̂ = argmax_{r_i} H(D_{r_i})
wherein H(D_{r_i}) is the information entropy of the brightness-adjusted image D_{r_i}; the r_i are the candidate brightness parameters, i.e. after sorting the pixel values of each of the three channels, the pixel values ranked between 0.5% and 2% of each channel, sampled at intervals of 0.15%; D_c is the image after scattering removal; W_0 = 0.1 is a stabilizing constant used in the brightness adjustment; and D̂ is the optimal enhanced image, wherein the information entropy is calculated as:
H = −Σ_i p_i · log2(p_i)
wherein i represents the gray level of a pixel and p_i represents the proportion of pixels with gray level i in the entire image.
CN202110620221.9A 2021-06-03 2021-06-03 Underwater image definition recovery method based on depth map restoration and brightness estimation Active CN113269763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110620221.9A CN113269763B (en) 2021-06-03 2021-06-03 Underwater image definition recovery method based on depth map restoration and brightness estimation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110620221.9A CN113269763B (en) 2021-06-03 2021-06-03 Underwater image definition recovery method based on depth map restoration and brightness estimation

Publications (2)

Publication Number Publication Date
CN113269763A CN113269763A (en) 2021-08-17
CN113269763B (en) 2023-07-21

Family

ID=77234232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110620221.9A Active CN113269763B (en) 2021-06-03 2021-06-03 Underwater image definition recovery method based on depth map restoration and brightness estimation

Country Status (1)

Country Link
CN (1) CN113269763B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115496694B (en) * 2022-09-30 2023-07-25 湖南科技大学 Method for recovering and enhancing underwater image based on improved image forming model

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017048927A1 (en) * 2015-09-18 2017-03-23 The Regents Of The University Of California Cameras and depth estimation of images acquired in a distorting medium
CN112488948A (en) * 2020-12-03 2021-03-12 大连海事大学 Underwater image restoration method based on black pixel point estimation backscattering

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017048927A1 (en) * 2015-09-18 2017-03-23 The Regents Of The University Of California Cameras and depth estimation of images acquired in a distorting medium
CN112488948A (en) * 2020-12-03 2021-03-12 大连海事大学 Underwater image restoration method based on black pixel point estimation backscattering

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于颜色衰减先验和白平衡的水下图像复原 (Underwater image restoration based on color attenuation prior and white balance); 韩辉, 周妍, 蔡晨东; 计算机与现代化 (Computer and Modernization), No. 04; full text *

Also Published As

Publication number Publication date
CN113269763A (en) 2021-08-17

Similar Documents

Publication Publication Date Title
CN111047530B (en) Underwater image color correction and contrast enhancement method based on multi-feature fusion
CN109872285B (en) Retinex low-illumination color image enhancement method based on variational constraint
CN111489303A (en) Maritime affairs image enhancement method under low-illumination environment
CN111127359B (en) Underwater image enhancement method based on selective compensation of colors and three-interval equalization
CN110458792B (en) Method and device for evaluating quality of face image
CN112488948B (en) Underwater image restoration method based on black pixel point estimation back scattering
CN110782407B (en) Single image defogging method based on sky region probability segmentation
CN109816605A (en) A kind of MSRCR image defogging method based on multichannel convolutive
CN114118144A (en) Anti-interference accurate aerial remote sensing image shadow detection method
Mostafa et al. Evaluating the effects of image filters in CT liver CAD system
CN111325688B (en) Unmanned aerial vehicle image defogging method for optimizing atmosphere light by fusion morphology clustering
CN116823686B (en) Night infrared and visible light image fusion method based on image enhancement
CN112330613B (en) Evaluation method and system for cytopathology digital image quality
CN105678245A (en) Target position identification method based on Haar features
CN115511907B (en) Scratch detection method for LED screen
CN111476744B (en) Underwater image enhancement method based on classification and atmospheric imaging model
CN113269763B (en) Underwater image definition recovery method based on depth map restoration and brightness estimation
CN114119383B (en) Underwater image restoration method based on multi-feature fusion
CN114693548B (en) Dark channel defogging method based on bright area detection
CN108596843B (en) Underwater image color recovery algorithm based on bright channel
CN117197064A (en) Automatic non-contact eye red degree analysis method
CN115908155A (en) NSST domain combined GAN and scale correlation coefficient low-illumination image enhancement and denoising method
CN116433525A (en) Underwater image defogging method based on edge detection function variation model
Dixit et al. Image contrast optimization using local color correction and fuzzy intensification
CN116797468A (en) Low-light image enhancement method based on self-calibration depth curve estimation of soft-edge reconstruction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant