CN117455824A - Image detail self-adaptive enhancement method based on bilateral filtering - Google Patents


Info

Publication number
CN117455824A
Authority
CN
China
Prior art keywords
detail
base
image
gray
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311436747.7A
Other languages
Chinese (zh)
Inventor
刘松
张瑞文
谭海
王南
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Huazhong Tianjing Tongshi Technology Co ltd
Original Assignee
Wuhan Huazhong Tianjing Tongshi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Huazhong Tianjing Tongshi Technology Co ltd filed Critical Wuhan Huazhong Tianjing Tongshi Technology Co ltd
Priority to CN202311436747.7A priority Critical patent/CN117455824A/en
Publication of CN117455824A publication Critical patent/CN117455824A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/20 - Image enhancement or restoration using local operators
    • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10048 - Infrared image
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20024 - Filtering details
    • G06T 2207/20028 - Bilateral filtering
    • G06T 2207/20212 - Image combination
    • G06T 2207/20221 - Image fusion; Image merging
    • G06T 2207/20224 - Image subtraction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a bilateral-filtering-based image detail self-adaptive enhancement method. An infrared original image is first passed through a bilateral filter to obtain a large-dynamic-range base layer; a detail map is then separated with an unsharp-mask enhancement step; truncation thresholds are computed for the base map and the detail map; when the gray levels of a low-detail layer are truncated, a frame adaptive parameter is generated from the extracted detail intensity and passed on to the layer quantization stage; finally, curve gray mapping is applied and the gray region that needs enhancement is stretched to achieve the enhancement effect. The adaptive parameters distinguish the gains applied to details and to noise, and the gray features of the detail layer are used to adaptively enhance low-detail image frames, so that the amplification of image noise is effectively weakened, over-enhancement of low-detail frames is avoided, the signal-to-noise ratio of the detail-enhanced image is higher, and, in real-time applications, low-detail frames caused by defocus and similar factors are better tolerated, improving the detail enhancement effect and broadening the application scenarios.

Description

Image detail self-adaptive enhancement method based on bilateral filtering
Technical Field
The invention belongs to the technical field of infrared image processing, relates to an image detail self-adaptive enhancement method, and in particular relates to an image detail self-adaptive enhancement method based on bilateral filtering.
Background
The image detail enhancement method based on bilateral filtering is one of the mainstream methods for digital image detail enhancement. Its core idea is to use the bilateral filter to separate the large-dynamic-range gray features of the image from its small-dynamic-range detail features, and then to apply mapping enhancement according to the gray features of each separated layer.
In practical application, however, two problems with this method have been found. First, in the spatial dimension, noise in the image is amplified along with the details. Second, in actual use, sensor defocus, zooming and the like leave some frames with low-detail layers that are then excessively enhanced, so that the gray level of the whole picture becomes abnormal; as a result, the detail enhancement method performs poorly in real application scenarios.
Disclosure of Invention
Aiming at the two problems in the prior art, the invention provides an image detail self-adaptive enhancement method based on bilateral filtering, which uses adaptive parameters to distinguish the gains applied to details and to noise, and uses the gray features of the detail layer to adaptively enhance low-detail image frames.
The technical solution adopted to solve the technical problem is as follows: an image detail self-adaptive enhancement method comprising the following steps:
s1, firstly, the infrared original image f with high bit width is processed in Inputting the two-sided filter, and obtaining a base layer f with large dynamic state through two-sided filtering base
Wherein the bilateral filter s (i-x, j-y) =g (x, y) ·r (i-x, j-y) acts as a gaussian low pass filter kernel in the spatial domainMiddle gray scale domain difference filtering kernel> x and y are the abscissa and ordinate, respectively, of a pixel point in a sliding window of size w x w, and i and j are the abscissa and ordinate, respectively, sigma, of a pixel point of the input image g Sum sigma r Variance parameters of a Gaussian low-pass filter kernel and a gray domain difference filter kernel are respectively represented;
s2, in the obtained large dynamic base layer f base Based on (a), a non-sharpening mask image enhancement method is utilized to pass through the infrared original image f in Subtracting base layer f base Then subtracting the minimum value of the detail graph to perform non-negative treatment, and then obtaining the infrared original graph f in Small dynamics, i.e. detailed diagrams, separate out:
f detail (i,j)=f in (i,j)-f base (i,j)-min{f in (u,v)-f base (u,v)},
wherein the value range of u is 0,1 and … W, the value range of v is 0,1 and … H, W and H are the width and height of the infrared original image matrix, and min is the region minimum operator;
s3, after the base image and the detail image are obtained, gray level truncation is carried out according to respective image distribution parameters, and a truncation threshold value of the base image and a truncation threshold value of the detail image are calculated according to the following formula:
f_base_min = mean_base - K_base_min·std_base,
f_base_max = mean_base + K_base_max·std_base,
f_detail_min = mean_detail - K_detail_min·std_detail,
f_detail_max = mean_detail + K_detail_max·std_detail,
where mean_base and mean_detail are the image means of the base map and the detail map, std_base and std_detail are their image standard deviations, and K_base_min and K_base_max are the standard-deviation coefficients of the base map, whose estimation method and constraint are:
histsum[ mean_base - K_base_min·std_base , mean_base + K_base_max·std_base ] = α·H·W,
K_detail_min and K_detail_max are the standard-deviation coefficients of the detail map, whose estimation method and constraint are:
histsum[ mean_detail - K_detail_min·std_detail , mean_detail + K_detail_max·std_detail ] = β·H·W,
where histsum[a, b] denotes the total number of pixels whose gray values fall within the interval [a, b], α is the percentage of pixels retained after the base map is truncated, and β is the percentage of pixels retained after the detail map is truncated;
s4, mapping curve gray scale: according to the respective gray characteristics of the base layer and the detail layer, a gray region to be enhanced is stretched through the following gray mapping function, so that an enhancement effect is achieved:
wherein x is an input normalized gray value, and b and c are adjustment parameters.
Further, between step S3 and step S4, a frame adaptive parameter K is generated during gray-level truncation of the detail layer according to the extracted detail intensity and is passed on to the layer quantization stage, where bit_sum is the number of pixel bits used to process the infrared image and γ is a constant;
the adaptive parameter K measures the extracted detail intensity of the frame and maps it non-linearly into a scaling coefficient of the quantization interval, and the gray mapping function of the base map takes the input normalized gray value x and the adjustment parameters b and c.
Further, for the base layer f_base, whose gray interval spans a large range, the transformation curve is further optimized with a sigmoid activation function, so that the gray stretching region and stretching coefficient are more reasonable and smooth;
further, for a detail layer whose gray scale interval is relatively concentrated and whose value is small, it is mapped using the following gamma transform:
f detail (x(i,j))=x(i,j) γ ·G(i,j),
wherein, the gamma value is 2, i epsilon [0, W ], j epsilon [0, H ] is pixel position coordinate;
the base layer and detail layer are quantized after curve mapping:
f_base_8bit(x) = x·255,
f_detail_8bit(x) = x·255·K,
where x is the normalized gray value obtained after each layer is mapped, so that the gray is mapped into the gray interval of an 8-bit image; after quantization the base layer and the detail layer can be linearly superimposed with a proportion p:
f_8bit(i, j) = f_base_8bit(i, j)·(1 - p) + f_detail_8bit(i, j)·p,
where p takes the value 0.4, and the final f_8bit values are the gray levels of the enhanced output layer.
Further, in step S1 a gain Gain computed from the sum of the bilateral filter weights is used as the variable measuring the dynamics of the sliding-window region: the larger the gray-value fluctuation in the sliding-window region centered at pixel (i, j) of the image, the smaller the bilateral filtering weights s become, and the larger the resulting Gain.
The beneficial effects of the invention are as follows:
the method improves the adaptability to noise on a single image and the compatibility to some image frames with less details on the time dimension by carrying out the adaptability enhancement on the detail layer on the area dimension and the time dimension, and realizes the image detail enhancement through the self-adaption capability of a time-space domain.
On the basis of a non-sharpening mask algorithm frame of the bilateral filter, the method respectively constructs the adaptive parameters of the spatial domain, and limits the noise gain of the low-gradient region of the image. Meanwhile, the invention takes the actual application scene of the algorithm into consideration, and constructs the time domain self-adaptive parameters based on the frame sequence so as to solve the problem that the few dynamic detail frames such as virtual focus, zooming and the like are excessively enhanced in actual use. Therefore, the signal-to-noise ratio of the detail enhancement image and the gain sensitivity of few detail frames are improved, and the overall detail enhancement performance and the adaptability in each scene are improved.
The method can effectively weaken the enhancement of image noise, avoid excessive enhancement of few detail frames, enable the signal to noise ratio of the image after detail enhancement to be higher, simultaneously have stronger compatibility on few detail frames caused by virtual focus and other reasons in a scene of real-time application, promote detail enhancement effect and widen application scenes.
Drawings
FIG. 1 is a flow chart of an adaptive image detail enhancement algorithm of the present invention.
Detailed Description
Embodiments of the present invention are further described below with reference to specific examples and figures.
To address two problems of bilateral-filtering-based image detail enhancement, namely the unnecessary amplification of non-detail noise and the unreasonable mapping that occurs when the gray interval of the image details is small, the invention provides a self-adaptive detail enhancement method built on the basic principle of the filter, realizing enhancement driven by adaptive parameters of the image detail layer.
As shown in FIG. 1, the bilateral-filtering-based image detail self-adaptive enhancement method disclosed by the invention takes a 14-bit or 16-bit infrared original image as data input, so that more detail is preserved, and finally outputs an enhanced 8-bit displayable image after processing; the method specifically comprises the following steps.
S1, first, the high-bit-width infrared original image f_in is input to the bilateral filter, and a large-dynamic-range base layer f_base is obtained through bilateral filtering.
The unsharp mask is built on the output of the bilateral filter. The bilateral filter kernel s(i-x, j-y) is calculated by the following formula:
s(i-x, j-y) = g(x, y)·r(i-x, j-y);
the Gaussian low-pass filter kernel g(x, y) acting in the spatial domain is calculated as:
g(x, y) = exp( -(x² + y²) / (2σ_g²) );
and the gray-domain difference filter kernel r(i-x, j-y) is calculated as:
r(i-x, j-y) = exp( -( f_in(i, j) - f_in(i-x, j-y) )² / (2σ_r²) );
where x and y are the abscissa and ordinate of a pixel within a sliding window of size w×w, i and j are the abscissa and ordinate of a pixel of the input image, and σ_g and σ_r are the variance parameters of the Gaussian low-pass filter kernel g(x, y) and of the gray-domain difference filter kernel r(i-x, j-y), respectively.
The sum of the weights computed by the bilateral filter can be used to measure the dynamics of the sliding-window region, and the gain Gain is constructed from it.
From the kernel definitions, if the gray values in the sliding window centered at pixel (i, j) fluctuate strongly, the bilateral filtering weights s become small; correspondingly, the larger the gray-value fluctuation around a point, i.e. the more detail or the larger the gradient, the larger Gain becomes, which justifies using the Gain parameter as a characterization of regional detail richness.
S2, after the infrared original image f_in is bilaterally filtered, a base layer f_base that retains the large dynamic range of the image is obtained. On this basis, the small-dynamic-range image, i.e. the detail map, is separated with an unsharp-mask image enhancement step.
The unsharp-mask separation is as follows:
f_detail(i, j) = f_in(i, j) - f_base(i, j) - min{ f_in(u, v) - f_base(u, v) },
where u ranges over 0, 1, …, W, v ranges over 0, 1, …, H, W and H are the width and height of the original image matrix, and min is the region minimum operator.
As the equation shows, the detail map is obtained by subtracting the base layer from the original image and then subtracting the minimum value of the difference, which makes the result non-negative.
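A minimal sketch of this separation step in the same notation (illustrative only, not part of the patent text):

```python
import numpy as np

def separate_detail(f_in, f_base):
    """Sketch of step S2: unsharp-mask separation of the detail map.
    The base layer is subtracted from the original frame and the global
    minimum of the difference is removed so the detail map is non-negative."""
    diff = f_in.astype(np.float64) - f_base.astype(np.float64)
    return diff - diff.min()   # min over all (u, v) in the image
```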
S3, after the base map and the detail map are obtained, gray-level truncation is carried out according to the distribution parameters of each image. The most critical part of the truncation is finding suitable truncation thresholds.
The truncation thresholds of the base map and the detail map are constructed as follows:
f_base_min = mean_base - K_base_min·std_base,
f_base_max = mean_base + K_base_max·std_base,
f_detail_min = mean_detail - K_detail_min·std_detail,
f_detail_max = mean_detail + K_detail_max·std_detail,
where mean_base and mean_detail are the image means of the base map and the detail map, and std_base and std_detail are their image standard deviations. K_base_min and K_base_max are the standard-deviation coefficients of the base map; their estimation method and constraint are:
histsum[ mean_base - K_base_min·std_base , mean_base + K_base_max·std_base ] = α·H·W,
K_detail_min and K_detail_max are the standard-deviation coefficients of the detail map; their estimation method and constraint are:
histsum[ mean_detail - K_detail_min·std_detail , mean_detail + K_detail_max·std_detail ] = β·H·W,
where histsum[a, b] denotes the total number of pixels whose gray values fall within the interval [a, b]. α is the percentage of pixels retained after the base map is truncated; the algorithm uses α = 99.8%. β is the percentage of pixels retained after the detail map is truncated; the algorithm uses β = 99.98%.
Gray-level truncation removes extreme outliers from the image so that, on the one hand, most of the effective pixel values are preserved and, on the other hand, the image carries the most information within the smallest gray interval. For the truncated detail layer, the extracted gray thresholds differ greatly between scenes. When the detail features extracted into the detail layer are few and weak, the gray values and the maximum gray difference of the detail layer are small; in that case, the subsequent curve gray mapping and quantization of the detail layer over-amplify the detail gray.
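A possible realization of the threshold search is sketched below; the patent only states the constraint histsum[...] = α·H·W (resp. β·H·W), so growing the two coefficients symmetrically in small steps is an assumption made for illustration:

```python
import numpy as np

def truncation_thresholds(layer, keep_fraction, k_step=0.05, k_limit=20.0):
    """Sketch of step S3: find the gray truncation thresholds of a layer.
    K_min and K_max are grown until [mean - K_min*std, mean + K_max*std]
    contains keep_fraction of all pixels (alpha = 0.998 for the base map,
    beta = 0.9998 for the detail map in the described embodiment)."""
    vals = layer.astype(np.float64).ravel()
    mean, std = vals.mean(), vals.std()
    target = keep_fraction * vals.size
    k_lo = k_hi = 0.0
    while k_lo < k_limit:
        lo, hi = mean - k_lo * std, mean + k_hi * std
        if np.count_nonzero((vals >= lo) & (vals <= hi)) >= target:
            break
        k_lo += k_step
        k_hi += k_step
    return mean - k_lo * std, mean + k_hi * std
```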
S4, to address the over-enhancement of the detail layer, the invention introduces a frame adaptive parameter. When the gray levels of the detail layer are truncated, the enhancement algorithm generates an adaptive parameter from the extracted detail intensity and passes it on to the layer quantization stage.
The frame adaptive parameter is generated from quantities obtained at detail-layer truncation, where bit_sum is the number of pixel bits used to process the infrared image and γ is a constant, taken as 0.15 by the algorithm.
The adaptive parameter K thus measures the extracted detail intensity of the frame and maps it non-linearly into a scaling coefficient of the quantization interval.
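Since the exact expression for K is not reproduced above, the sketch below shows one illustrative construction consistent with the description (the truncated detail range normalized by the 2^bit_sum gray range, raised to the power γ = 0.15); it is an assumption, not the patent's formula:

```python
def frame_adaptive_parameter(f_detail_min, f_detail_max, bit_sum=14, gamma=0.15):
    """Sketch of the frame adaptive parameter K generated at detail-layer
    truncation. ASSUMED form: a narrow truncated detail range (a low-detail
    frame) yields a small K, which later shrinks the detail quantization
    interval and prevents over-amplification."""
    detail_range = max(f_detail_max - f_detail_min, 1e-12)
    K = (detail_range / float(2 ** bit_sum)) ** gamma
    return min(K, 1.0)
```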
The gray mapping function of the base map takes the input normalized gray value x and the adjustment parameters b and c, with b taken as 0.45 and c taken as 4.
S5, curve gray mapping: the gray region to be enhanced is stretched according to the gray characteristics of the base map and the detail map, so as to achieve the enhancement effect.
For the base layer, whose gray interval spans a large range, the method uses a sigmoid activation function rather than a basic gamma transform to further optimize the transformation curve, so that the gray stretching region and stretching coefficient are more reasonable and smooth.
The gray interval of the detail layer, by contrast, is relatively concentrated and its values are small.
The invention maps it with a gamma transform while also taking the regional adaptive gain into account, as computed by the following formula:
f_detail(x(i, j)) = x(i, j)^γ · G(i, j),
where γ takes the value 2 and i ∈ [0, W], j ∈ [0, H] are the pixel position coordinates.
The base layer and the detail layer are quantized after curve mapping, i.e. their gray values are mapped into the gray interval of an 8-bit image, as given by the following two formulas:
f_base_8bit(x) = x·255,
f_detail_8bit(x) = x·255·K,
where x is the normalized gray value obtained after each layer is mapped and K is the frame adaptive parameter. After quantization the two layers can be linearly superimposed with a ratio p, as shown below:
f_8bit(i, j) = f_base_8bit(i, j)·(1 - p) + f_detail_8bit(i, j)·p,
The algorithm uses p = 0.4; the final f_8bit values are the gray levels of the enhanced output image.
The algorithm is written in the C language and compiled and run in Visual Studio 2017. The input is a 14-bit infrared original image, and an 8-bit displayable single-channel image is obtained after enhancement.
The above embodiments merely illustrate the principles and effects of the present invention together with some practical implementations; variations and modifications may be made by those skilled in the art without departing from the inventive concept, and all such variations fall within the scope of the present invention.

Claims (5)

1. A bilateral filtering-based image detail self-adaptive enhancement method, characterized by comprising the following steps:
S1, first, the high-bit-width infrared original image f_in is input to the bilateral filter, and a large-dynamic-range base layer f_base is obtained through bilateral filtering;
wherein the bilateral filter kernel is s(i-x, j-y) = g(x, y)·r(i-x, j-y), the product of the Gaussian low-pass filter kernel g(x, y) = exp( -(x² + y²)/(2σ_g²) ) acting in the spatial domain and the gray-domain difference filter kernel r(i-x, j-y) = exp( -( f_in(i, j) - f_in(i-x, j-y) )²/(2σ_r²) ); x and y are the abscissa and ordinate of a pixel within a sliding window of size w×w, i and j are the abscissa and ordinate of a pixel of the input image, and σ_g and σ_r are the variance parameters of the Gaussian low-pass filter kernel and of the gray-domain difference filter kernel, respectively;
s2, utilizing a non-sharpening mask image enhancement method to pass through an infrared original image f in Subtracting base layer f base Then subtracting the minimum value of the detail graph to carry out non-negative treatment, and separating the detail graph:
f detail (i,j)=f in (i,j)-f base (i,j)-min{f in (u,v)-f base (u,v)},
wherein the value range of u is 0,1 and … W, the value range of v is 0,1 and … H, W and H are the width and height of the infrared original image matrix, and min is the region minimum operator;
s3, gray level truncation is carried out according to respective image distribution parameters, and a truncation threshold of a base image layer and a truncation threshold of a detail image are calculated according to the following formula:
f_base_min = mean_base - K_base_min·std_base,
f_base_max = mean_base + K_base_max·std_base,
f_detail_min = mean_detail - K_detail_min·std_detail,
f_detail_max = mean_detail + K_detail_max·std_detail,
where mean_base and mean_detail are the image means of the base map and the detail map, std_base and std_detail are their image standard deviations, and K_base_min and K_base_max are the standard-deviation coefficients of the base map, whose estimation method and constraint are:
histsum[ mean_base - K_base_min·std_base , mean_base + K_base_max·std_base ] = α·H·W,
K_detail_min and K_detail_max are the standard-deviation coefficients of the detail map, whose estimation method and constraint are:
histsum[ mean_detail - K_detail_min·std_detail , mean_detail + K_detail_max·std_detail ] = β·H·W,
where histsum[a, b] denotes the total number of pixels whose gray values fall within the interval [a, b], α is the percentage of pixels retained after the base layer is truncated, and β is the percentage of pixels retained after the detail layer is truncated;
s4, stretching a gray scale region to be enhanced according to respective gray scale characteristics of the base layer and the detail layer through the following gray scale mapping function so as to achieve an enhancement effect:
wherein x is an input normalized gray value, and b and c are adjustment parameters.
2. The bilateral filtering-based image detail self-adaptive enhancement method according to claim 1, characterized in that between step S3 and step S4 a frame adaptive parameter K is further generated during gray-level truncation of the detail layer according to the extracted detail intensity and is passed on to the layer quantization stage, where bit_sum is the number of pixel bits used to process the infrared image and γ is a constant;
the adaptive parameter K measures the extracted detail intensity of the frame and maps it non-linearly into a scaling coefficient of the quantization interval, and the gray mapping function of the base map takes the input normalized gray value x and the adjustment parameters b and c.
3. The bilateral filtering-based image detail self-adaptive enhancement method according to claim 2, characterized in that for the base layer f_base, whose gray interval spans a large range, the transformation curve is further optimized with a sigmoid activation function, so that the gray stretching region and stretching coefficient are more reasonable and smooth.
4. The bilateral filtering-based image detail self-adaptive enhancement method according to claim 3, characterized in that a detail layer whose gray interval is relatively concentrated and whose values are small is mapped with the following gamma transform:
f_detail(x(i, j)) = x(i, j)^γ · G(i, j),
where γ takes the value 2 and i ∈ [0, W], j ∈ [0, H] are the pixel position coordinates;
the base layer and the detail layer are quantized after curve mapping:
f_base_8bit(x) = x·255,
f_detail_8bit(x) = x·255·K,
where x is the normalized gray value obtained after each layer is mapped, so that the gray is mapped into the gray interval of an 8-bit image; after quantization the base layer and the detail layer are linearly superimposed with a proportion p:
f_8bit(i, j) = f_base_8bit(i, j)·(1 - p) + f_detail_8bit(i, j)·p,
where p takes the value 0.4, and the final f_8bit values are the gray levels of the enhanced output layer.
5. The bilateral filtering-based image detail self-adaptive enhancement method according to claim 1, 2, 3 or 4, characterized in that in step S1 a gain Gain computed from the sum of the bilateral filter weights is used as the variable measuring the dynamics of the sliding-window region: the larger the gray-value fluctuation in the sliding-window region centered at pixel (i, j) of the image, the smaller the bilateral filtering weights s become, and the larger the resulting Gain.
CN202311436747.7A 2023-11-01 2023-11-01 Image detail self-adaptive enhancement method based on bilateral filtering Pending CN117455824A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311436747.7A CN117455824A (en) 2023-11-01 2023-11-01 Image detail self-adaptive enhancement method based on bilateral filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311436747.7A CN117455824A (en) 2023-11-01 2023-11-01 Image detail self-adaptive enhancement method based on bilateral filtering

Publications (1)

Publication Number Publication Date
CN117455824A true CN117455824A (en) 2024-01-26

Family

ID=89583167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311436747.7A Pending CN117455824A (en) 2023-11-01 2023-11-01 Image detail self-adaptive enhancement method based on bilateral filtering

Country Status (1)

Country Link
CN (1) CN117455824A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117689673A (en) * 2024-02-04 2024-03-12 湘潭大学 WC particle electron microscope image segmentation and particle size distribution calculation method based on watershed
CN117689673B (en) * 2024-02-04 2024-04-23 湘潭大学 WC particle electron microscope image segmentation and particle size distribution calculation method based on watershed

Similar Documents

Publication Publication Date Title
CN107408296B (en) Real-time noise for high dynamic range images is eliminated and the method and system of image enhancement
Vij et al. Enhancement of images using histogram processing techniques
Lee et al. Adaptive multiscale retinex for image contrast enhancement
CN110570374B (en) Processing method for image obtained by infrared sensor
JP4858610B2 (en) Image processing method
CN111583123A (en) Wavelet transform-based image enhancement algorithm for fusing high-frequency and low-frequency information
Ma et al. An effective fusion defogging approach for single sea fog image
CN110189281B (en) Multi-exposure infrared image fusion method
KR20140142381A (en) Method and Apparatus for removing haze in a single image
CN117455824A (en) Image detail self-adaptive enhancement method based on bilateral filtering
CN110675351B (en) Marine image processing method based on global brightness adaptive equalization
Dhariwal Comparative analysis of various image enhancement techniques
CN111563854B (en) Particle swarm optimization method for underwater image enhancement processing
CN117252773A (en) Image enhancement method and system based on self-adaptive color correction and guided filtering
Mu et al. Low and non-uniform illumination color image enhancement using weighted guided image filtering
Yang et al. Low-light image enhancement based on Retinex theory and dual-tree complex wavelet transform
CN112598612A (en) Flicker-free dim light video enhancement method and device based on illumination decomposition
Lei et al. Low-light image enhancement using the cell vibration model
WO2020107308A1 (en) Low-light-level image rapid enhancement method and apparatus based on retinex
Watanabe et al. An adaptive multi-scale retinex algorithm realizing high color quality and high-speed processing
CN110992287B (en) Method for clarifying non-uniform illumination video
CN111275642B (en) Low-illumination image enhancement method based on significant foreground content
Li et al. Saliency guided naturalness enhancement in color images
CN115456912A (en) Tone mapping method based on multi-scale WLS filtering fusion
CN116862809A (en) Image enhancement method under low exposure condition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination