CN107256539A - Image sharpening method based on local contrast - Google Patents

Image sharpening method based on local contrast

Info

Publication number
CN107256539A
CN107256539A CN201710438517.2A CN201710438517A CN107256539A CN 107256539 A CN107256539 A CN 107256539A CN 201710438517 A CN201710438517 A CN 201710438517A CN 107256539 A CN107256539 A CN 107256539A
Authority
CN
China
Prior art keywords
image
sharpening
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710438517.2A
Other languages
Chinese (zh)
Other versions
CN107256539B (en)
Inventor
赵蓝飞
王爱丽
王博
吕鑫淼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology filed Critical Harbin University of Science and Technology
Priority to CN201710438517.2A priority Critical patent/CN107256539B/en
Publication of CN107256539A publication Critical patent/CN107256539A/en
Application granted granted Critical
Publication of CN107256539B publication Critical patent/CN107256539B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The present invention relates to image sharpening methods, and specifically to an image sharpening method based on local contrast. It addresses the shortcoming of the existing unsharp mask algorithm, which over-enhances image contrast at high-contrast edges, producing severe gray-level jump regions near the edges of the sharpened image and degrading its subjective quality. The method first extracts a luminance image containing no color components via RGB-to-YUV conversion. It then applies an image sharpening model: the Y-channel data is Gaussian-filtered and the Weber contrast of the Y channel is computed, and a piecewise S-shaped function stretches the Weber contrast, enhancing the local detail and texture features of the image. Finally, YUV-to-RGB conversion combines the sharpened Y channel with the unprocessed U and V channel data into an RGB image, thereby realizing image sharpening. The present invention is applicable to image sharpening tools.

Description

Image sharpening method based on local contrast
Technical field
The present invention relates to image sharpening methods, and in particular to an image sharpening method based on local contrast.
Background technology
A real luminance image must pass through many steps before it is converted into a gray-level image that can be displayed directly. Because of imprecise electronic device design, material defects, and similar problems, these imaging steps can introduce a certain degree of blur into the generated gray-level image. Imaging steps that typically cause blur include: a CCD or CMOS image sensor integrating the number of photons over a certain area, which blurs an originally sharp analog luminance signal; and the difference approximation used in the color reconstruction process, which reduces the sharpness of image color, among others.
Because image sharpening techniques can suppress the blur introduced by the imaging process, they are increasingly applied in the field of digital image processing. The most popular image sharpening method at present is the unsharp mask (USM) algorithm. The main problem with USM is that it over-enhances contrast at high-contrast edges of the image, producing severe gray-level jump regions near the edges of the sharpened image. These regions reduce the image quality of the sharpened image and impair its subjective appearance.
Summary of the invention
The purpose of the invention is to overcome the shortcoming of the existing unsharp mask algorithm, which over-enhances image contrast at high-contrast edges and thereby produces severe gray-level jump regions near the edges of the sharpened image, degrading its subjective quality, by proposing an image sharpening method based on local contrast.
An image sharpening method based on local contrast, comprising:
Step 1: input an RGB image;
Step 2: convert the RGB image into YUV data;
Step 3: apply Gaussian filtering to the Y-channel data of the YUV data and compute the Weber contrast u(x, y) of the Y-channel data;
Step 4: build a piecewise function f(u(x, y)) used to apply a stretching transform to the Weber contrast;
Step 5: process the Y-channel data according to the following formula:
I(x, y) = ((f(u(x, y)) + 1) × ([F*G](x, y) - ω_c × F(x, y))) / (1 - ω_c × (f(u(x, y)) + 1))
where I(x, y) is the processed data; F(x, y) is the Y-channel data; ω_c is the weight of the center pixel of the Gaussian rectangular window; and G is the Gaussian kernel function, whose expression is:
G(x, y) = (1 / (2πσ²)) × exp(-(x² + y²) / (2σ²))
where σ² is the scale of the Gaussian kernel;
F*G denotes the convolution of the original image with the Gaussian kernel function;
Step 6: convert the YUV data whose Y channel has been processed in Step 5 into an RGB image and output it.
The beneficial effects of the present invention are: 1. pseudo-edges are suppressed, i.e., the sharply contrasting bright-dark transition regions that would otherwise appear near image contours are avoided; 2. the overall sharpness of the image is improved; 3. compared with the unsharp mask algorithm, the present invention improves information entropy by 6.3%, standard deviation by 4.9%, and average Euclidean distance by 9.4%.
Brief description of the drawings
Fig. 1 is the flow chart of the image sharpening method based on local contrast of the present invention;
Fig. 2 is the curve of the contrast stretching function in the present invention;
Fig. 3 is the original image in the simulation experiment of the present invention;
Fig. 4 is the image obtained in the simulation experiment by sharpening Fig. 3 with the USM algorithm;
Fig. 5 is the image obtained in the simulation experiment by sharpening Fig. 3 with the method of the present invention.
Embodiment
Embodiment One: the image sharpening method based on local contrast of the present embodiment, as shown in Fig. 1, includes:
Step 1: input an RGB image.
Step 2: convert the RGB image into YUV data.
Step 3: apply Gaussian filtering to the Y-channel data of the YUV data and compute the Weber contrast u(x, y) of the Y-channel data.
Step 4: build a piecewise function f(u(x, y)) used to apply a stretching transform to the Weber contrast.
Step 5: process the Y-channel data according to the following formula:
I(x, y) = ((f(u(x, y)) + 1) × ([F*G](x, y) - ω_c × F(x, y))) / (1 - ω_c × (f(u(x, y)) + 1))
where I(x, y) is the processed data; F(x, y) is the Y-channel data; ω_c is the weight of the center pixel of the Gaussian rectangular window; (x, y) is the plane coordinate of a pixel; and G is the Gaussian kernel function, whose expression is:
G(x, y) = (1 / (2πσ²)) × exp(-(x² + y²) / (2σ²))
where σ² is the scale of the Gaussian kernel. F*G denotes the convolution of the original image with the Gaussian kernel function. A Gaussian rectangular window function with a 5 × 5 convolution window and scale 1.5 is shown in the table below:
0.0144 0.0281 0.0351 0.0281 0.0144
0.0281 0.0547 0.0683 0.0547 0.0281
0.0351 0.0683 0.0853 0.0683 0.0351
0.0281 0.0547 0.0683 0.0547 0.0281
0.0144 0.0281 0.0351 0.0281 0.0144
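As an illustration added here (not part of the patent text), the NumPy sketch below rebuilds the 5 × 5 window above; it assumes the tabulated values are a window-normalised Gaussian with σ = 1.5, whose centre weight plays the role of ω_c in the formula of Step 5. The helper name gaussian_window is chosen for this sketch only.

```python
import numpy as np

def gaussian_window(size=5, sigma=1.5):
    """Window-normalised Gaussian kernel, proportional to exp(-(x^2 + y^2) / (2 * sigma^2))."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()  # normalise so the window weights sum to 1

G = gaussian_window()
omega_c = G[2, 2]  # centre weight, about 0.0853 as in the table
print(np.round(G, 4))
print("omega_c =", round(float(omega_c), 4))
```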
Step 6: convert the YUV data whose Y channel has been processed in Step 5 into an RGB image and output it.
As can be seen from the above steps, the present invention provides an image sharpening operator that suppresses this kind of false contour. It first extracts a luminance image (the Y channel) containing no color components via RGB-to-YUV conversion. It then applies an image sharpening model: the operator Gaussian-filters the Y-channel data and computes the Weber contrast of the Y channel, and a piecewise S-shaped function stretches the Weber contrast, thereby enhancing the local detail and texture features of the image. Finally, YUV-to-RGB conversion combines the sharpened Y channel with the unprocessed U and V channel data into an RGB image, thereby realizing image sharpening.
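The sketch below is an illustration added here, not code from the patent: it strings the above steps together for the Y channel using the formulas reconstructed from the claims, with scipy.ndimage.gaussian_filter standing in for convolution with the Gaussian window; the eps guard, the helper name sharpen_luma, and the real-valued reading sign(u)·|u|^k of the piecewise power are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen_luma(F, sigma=1.5, k=2.0, eps=1e-6):
    """Sharpen a floating-point luminance (Y-channel) image F with the local-contrast model."""
    F = np.asarray(F, dtype=np.float64)
    blurred = gaussian_filter(F, sigma=sigma)         # [F * G](x, y)
    u = F / (blurred + eps) - 1.0                     # Weber contrast (Step 3)
    f_u = np.sign(u) * np.abs(u) ** k                 # piecewise contrast stretch (Step 4)
    # omega_c: centre weight of the equivalent window-normalised 5 x 5 Gaussian window
    half = 2
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    omega_c = (g / g.sum())[half, half]
    # Step 5: I = ((f(u) + 1) * ((F*G) - omega_c * F)) / (1 - omega_c * (f(u) + 1))
    return ((f_u + 1.0) * (blurred - omega_c * F)) / (1.0 - omega_c * (f_u + 1.0))
```

A convenient sanity check of the reconstructed formula: with the identity stretch f(u) = u, the expression reduces algebraically to I(x, y) = F(x, y), i.e., the operator leaves the image unchanged.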
Embodiment Two: this embodiment differs from Embodiment One in that Step 2 is specifically:
the RGB image is converted into YUV data through the RGB-to-YUV conversion formula. For an ordinary three-channel RGB color image: first, the RGB-to-YUV model separates the luminance (Y channel) from the two color-difference signals (U channel and V channel); secondly, the Y-channel data alone is sharpened; finally, the sharpened Y channel together with the U and V channels is converted back through YUV-to-RGB to obtain the sharpened color image.
In the formula, Y denotes the Y-channel data in the YUV color coding method and has the same meaning as F(x, y); U denotes the chrominance signal U and V denotes the chrominance signal V; R, G, and B are respectively the red, green, and blue component values of the RGB color model.
Other steps and parameter are identical with embodiment one.
Embodiment Three: this embodiment differs from Embodiment One or Two in that, in Step 3, the expression of the Weber contrast u(x, y) of the Y-channel data is:
u(x, y) = F(x, y) / [F*G](x, y) - 1.
Other steps and parameters are the same as in Embodiment One or Two.
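A minimal sketch of this contrast measure, with scipy's Gaussian filter standing in for [F*G](x, y) and an eps guard (an added assumption) against division by zero in completely dark neighbourhoods:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def weber_contrast(F, sigma=1.5, eps=1e-6):
    """u(x, y) = F(x, y) / [F * G](x, y) - 1 for a floating-point luminance image F."""
    F = np.asarray(F, dtype=np.float64)
    blurred = gaussian_filter(F, sigma=sigma)
    return F / (blurred + eps) - 1.0
```

Pixels brighter than their Gaussian-weighted neighbourhood yield u > 0, darker pixels yield u in (-1, 0), and perfectly flat regions yield u = 0.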
Embodiment Four: this embodiment differs from one of Embodiments One to Three in that:
in Step 4, the expression of the piecewise function f(u(x, y)) used to apply a stretching transform to the Weber contrast is:
f(u(x, y)) = (u(x, y))^k,  when u(x, y) ≥ 0
f(u(x, y)) = -(u(x, y))^k,  when u(x, y) < 0
where k is a parameter representing the stretching scale. This piecewise function stretches the Weber contrast over its domain of definition, thereby stretching the local contrast and improving image detail and texture features. With k = 2, a schematic of the local contrast stretching function is shown in Fig. 2.
As can be seen from Fig. 2, the S-shaped local contrast stretching function strengthens the local contrast of the image. Specifically: when the contrast is close to 0 (i.e., the local contrast is smallest and the image detail is most blurred), the stretching scale of the function reaches its maximum, which appears as the steepest gradient of the curve; there the sharpening scale of the image is larger and the improvement in image detail is more noticeable. When the contrast is close to -1 or 1 (i.e., the local contrast is largest and the image detail is most prominent), the stretching scale of the function is smaller; the image is still sharpened to some extent, but the sharpening scale is smaller and the improvement in detail is less noticeable.
Other steps and parameters are the same as in one of Embodiments One to Three.
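A minimal sketch of this stretching function, with the exponent k taken as written in Claim 4; the real-valued form sign(u)·|u|^k is used so that the u(x, y) < 0 branch stays defined for non-integer k (for k = 2 it coincides with -(u(x, y))^k on that branch).

```python
import numpy as np

def stretch(u, k=2.0):
    """Odd-symmetric piecewise power stretch of the Weber contrast."""
    return np.sign(u) * np.abs(u) ** k

u = np.linspace(-1.0, 1.0, 9)
print(np.round(stretch(u, k=2.0), 3))  # output stays within [-1, 1] and keeps the sign of u
```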
Embodiment Five: this embodiment differs from one of Embodiments One to Four in that, in Step 6, the YUV data whose Y channel has been processed in Step 5 is converted into an RGB image through the following formulas:
R = I + 1.4075 × (V - 128)
G = I - 0.3455 × (U - 128) - 0.7169 × (V - 128)
B = I + 1.779 × (U - 128)
where I is the Y-channel data after processing in Step 5 and has the same meaning as I(x, y).
Other steps and parameters are the same as in one of Embodiments One to Four.
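A minimal sketch of this inverse conversion, using exactly the coefficients given above; clipping the result to [0, 255] for display is an added assumption not stated in the patent.

```python
import numpy as np

def yuv_to_rgb(I, U, V):
    """Recombine the sharpened Y channel I with the untouched U and V channels."""
    R = I + 1.4075 * (V - 128.0)
    G = I - 0.3455 * (U - 128.0) - 0.7169 * (V - 128.0)
    B = I + 1.779 * (U - 128.0)
    return np.clip(np.stack([R, G, B], axis=-1), 0.0, 255.0)
```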
<Simulation experiment>
First, the simulation environment:
1. Software environment: Matlab 2010b, Windows 7;
2. Hardware environment: quad-core Intel Core i7-6700HQ CPU, 8 GB DDR4 memory, GTX 1060 graphics card;
3. Input and output: digital images with the .bmp suffix.
Second, analysis of the simulation results:
In the simulation experiment, the free parameters were set to k = 2 and Gaussian kernel scale σ² = 1.
Image sharpening was performed on the standard digital image cameraman, and the subjective visual effects of the original image, the image sharpened by the USM algorithm, and the image sharpened by the proposed algorithm were compared, demonstrating the validity of the proposed algorithm. The experimental results are shown in Figs. 3 to 5.
As can be seen from Figs. 3 to 5, both the USM algorithm and the proposed algorithm increase the sharpness of the image and improve its detail features. The sharpened image obtained by the USM algorithm produces a region of sharp bright-dark transition near the cameraman's contour. On the brighter side of the transition region there is an extremely narrow, overly bright high-brightness strip running along the contour; on the darker side there is an extremely narrow, overly dark low-brightness strip running along the contour. This transition region severely impairs the subjective appearance of the image. The proposed algorithm suppresses this pseudo-edge phenomenon: the gray-level jumps near image edges are effectively controlled, and the overall sharpness of the image is also better than that of the USM algorithm.
Third, objective evaluation indices:
The experimental results are evaluated using three objective evaluation indices. The first is the information entropy of the image, whose formula is shown in Equation 7:
where i is a gray level, p_i is the probability that gray level i occurs in the image, and H is the entropy. The information entropy measures the amount of information contained in the image: the larger the value, the more information the image contains; the smaller the value, the less it contains. The second evaluation index is the gray-level standard deviation, whose formula is shown in Equation 8:
where N is the total number of pixels in the image and σ is the gray-level standard deviation. The gray-level standard deviation reflects the detail features of the image: the larger the standard deviation, the more drastic the gray-level variation and the richer the image detail; the smaller the standard deviation, the gentler the gray-level variation and the poorer the detail expressiveness. The third evaluation index is the average value of the Euclidean distance of the local image contrast (average Euclidean distance for short), whose formula is shown in Equation 9:
where E is the average Euclidean distance. As can be seen from Equation 9, the average Euclidean distance reflects the mean degree to which the local contrast of the sharpened image deviates from perfectly smooth contrast (i.e., a local contrast of 0). The larger the average Euclidean distance, the higher the local contrast and the more prominent the image detail; the smaller it is, the lower the local contrast and the less obvious the image detail.
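For illustration (added here, not from the patent), the sketch below computes the three indices on a grey-level image. Information entropy and the grey-level standard deviation follow their usual definitions; the average Euclidean distance is interpreted as the mean absolute deviation of the local (Weber) contrast from zero, which is an assumption based on the description above rather than a formula reproduced from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def information_entropy(img):
    """Entropy in bits of the grey-level histogram of an 8-bit image."""
    hist, _ = np.histogram(np.asarray(img, dtype=np.uint8), bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def grey_std(img):
    """Standard deviation of the grey levels over all N pixels."""
    return float(np.asarray(img, dtype=np.float64).std())

def mean_contrast_distance(img, sigma=1.5, eps=1e-6):
    """Assumed reading of Equation 9: mean |u(x, y)| of the local Weber contrast."""
    F = np.asarray(img, dtype=np.float64)
    u = F / (gaussian_filter(F, sigma=sigma) + eps) - 1.0
    return float(np.abs(u).mean())
```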
A comparison of the three objective evaluation index values for the standard digital image cameraman is shown in Table 1:
Table 1 Comparison of objective evaluation index data
As can be seen from Table 1, after processing by the image sharpening operator, the richness of the image content and of its detail is significantly improved. Specifically, the proposed algorithm outperforms the traditional USM algorithm in all three respects (information entropy, standard deviation, and average Euclidean distance), so it effectively improves the texture and detail features of the image.
The present invention may also have various other embodiments. Without departing from the spirit and essence of the invention, those skilled in the art may make various corresponding changes and modifications according to the present invention, but all such corresponding changes and modifications shall fall within the protection scope of the appended claims of the present invention.

Claims (5)

1. An image sharpening method based on local contrast, characterized by comprising:
Step 1: input an RGB image;
Step 2: convert the RGB image into YUV data;
Step 3: apply Gaussian filtering to the Y-channel data of the YUV data and compute the Weber contrast u(x, y) of the Y-channel data, where (x, y) is the plane coordinate of a pixel;
Step 4: build a piecewise function f(u(x, y)) used to apply a stretching transform to the Weber contrast;
Step 5: process the Y-channel data according to the following formula:
I(x, y) = ((f(u(x, y)) + 1) × ([F*G](x, y) - ω_c × F(x, y))) / (1 - ω_c × (f(u(x, y)) + 1))
where I(x, y) is the processed data; F(x, y) is the Y-channel data; ω_c is the weight of the center pixel of the Gaussian rectangular window; and G is the Gaussian kernel function, whose expression is:
G(x, y) = (1 / (2πσ²)) × exp(-(x² + y²) / (2σ²))
where σ² is the scale of the Gaussian kernel;
F*G denotes the convolution of the Y-channel data with the Gaussian kernel function;
Step 6: convert the YUV data whose Y channel has been processed in Step 5 into an RGB image and output it.
2. The image sharpening method based on local contrast according to claim 1, characterized in that Step 2 is specifically:
converting the RGB image into YUV data through the RGB-to-YUV conversion formula;
where Y denotes the Y-channel data in the YUV color coding method and has the same meaning as F(x, y); U denotes the chrominance signal U and V denotes the chrominance signal V; and R, G, and B are respectively the red, green, and blue component values of the RGB color model.
3. The image sharpening method based on local contrast according to claim 1, characterized in that, in Step 3, the expression of the Weber contrast u(x, y) of the Y-channel data is:
u(x, y) = F(x, y) / [F*G](x, y) - 1.
4. The image sharpening method based on local contrast according to claim 1, characterized in that, in Step 4, the expression of the piecewise function f(u(x, y)) used to apply a stretching transform to the Weber contrast is:
f(u(x, y)) = (u(x, y))^k,  when u(x, y) ≥ 0
f(u(x, y)) = -(u(x, y))^k,  when u(x, y) < 0
where k is a parameter representing the stretching scale.
5. The image sharpening method based on local contrast according to claim 1, characterized in that, in Step 6, the YUV data whose Y channel has been processed in Step 5 is converted into an RGB image through the following formulas:
R = I + 1.4075 × (V - 128)
G = I - 0.3455 × (U - 128) - 0.7169 × (V - 128)
B = I + 1.779 × (U - 128);
where I is the Y-channel data after processing in Step 5 and has the same meaning as I(x, y).
CN201710438517.2A 2017-06-12 2017-06-12 Image sharpening method based on local contrast Expired - Fee Related CN107256539B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710438517.2A CN107256539B (en) 2017-06-12 2017-06-12 Image sharpening method based on local contrast

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710438517.2A CN107256539B (en) 2017-06-12 2017-06-12 Image sharpening method based on local contrast

Publications (2)

Publication Number Publication Date
CN107256539A true CN107256539A (en) 2017-10-17
CN107256539B CN107256539B (en) 2020-06-02

Family

ID=60024496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710438517.2A Expired - Fee Related CN107256539B (en) 2017-06-12 2017-06-12 Image sharpening method based on local contrast

Country Status (1)

Country Link
CN (1) CN107256539B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002089043A1 (en) * 2001-04-30 2002-11-07 Yeda Research And Development Co., Ltd Method and apparatus for image enhancement for the visually impaired
US20080050031A1 (en) * 2006-08-24 2008-02-28 Goh Itoh Image processing apparatus and imaging device
CN102831591A (en) * 2012-06-27 2012-12-19 北京航空航天大学 Gaussian filter-based real-time defogging method for single image
CN104299204A (en) * 2013-07-17 2015-01-21 王垒 Histogram local image contrast enhancing method and histogram local image contrast enhancing device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JIN-HWAN KIM ET AL.: "Optimized contrast enhancement for real-time image and video dehazing", J. Vis. Commun. Image R. *
QIONG SONG ET AL.: "High dynamic range infrared images detail enhancement based on local edge preserving filter", Infrared Physics & Technology *
SHYAM LAL ET AL.: "Efficient Algorithm for Contrast Enhancement of Natural Images", The International Arab Journal of Information Technology *
王大雷 (Wang Dalei): "Analysis and Research of Foggy Image Enhancement Technology", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108805214A (en) * 2018-06-19 2018-11-13 太原学院 A kind of similar image matching process and system based on FUZZY WEIGHTED histogram
CN108805214B (en) * 2018-06-19 2021-07-23 太原学院 Similar image matching method and system based on fuzzy weighted histogram
CN113393397A (en) * 2021-06-21 2021-09-14 杭州微帧信息科技有限公司 Method and system for enhancing image contrast
CN114708342A (en) * 2022-06-06 2022-07-05 广东欧谱曼迪科技有限公司 File conversion method, device, equipment, storage medium and simulation verification method
CN115953331A (en) * 2023-03-14 2023-04-11 极限人工智能有限公司 Image edge enhancement method and system
CN115953331B (en) * 2023-03-14 2023-12-29 极限人工智能有限公司 Image edge enhancement method and system

Also Published As

Publication number Publication date
CN107256539B (en) 2020-06-02

Similar Documents

Publication Publication Date Title
Shi et al. Normalised gamma transformation‐based contrast‐limited adaptive histogram equalisation with colour correction for sand–dust image enhancement
CN104537615B (en) A kind of local Retinex Enhancement Methods based on HSV color spaces
CN107256539A (en) A kind of image sharpening method based on local contrast
CN104537634B (en) The method and system of raindrop influence is removed in dynamic image
CN106097279B (en) A kind of high-dynamics image Enhancement Method
CN107358585B (en) Foggy day image enhancement method based on fractional order differential and dark channel prior
CN105023256B (en) A kind of image defogging method and system
Wang et al. Variational single nighttime image haze removal with a gray haze-line prior
CN104063848A (en) Enhancement method and device for low-illumination image
CN104766319B (en) Lifting night takes pictures the method for image registration accuracy
CN104036468B (en) Single-frame image super-resolution reconstruction method based on the insertion of pre-amplification non-negative neighborhood
CN104574328A (en) Color image enhancement method based on histogram segmentation
TWI698124B (en) Image adjustment method and associated image processing circuit
CN112785534A (en) Ghost-removing multi-exposure image fusion method in dynamic scene
CN104021567B (en) Based on the fuzzy altering detecting method of image Gauss of first numeral law
US11887218B2 (en) Image optimization method, apparatus, device and storage medium
CN111951172A (en) Image optimization method, device, equipment and storage medium
CN116468640B (en) Video image enhancement method for Internet teaching
CN106933579A (en) Image rapid defogging method based on CPU+FPGA
CN104021527A (en) Rain and snow removal method in image
CN103489168A (en) Enhancing method and system for infrared image being converted to pseudo color image in self-adaptive mode
CN111145105A (en) Image rapid defogging method and device, terminal and storage medium
CN105426847A (en) Nonlinear enhancing method for low-quality natural light iris images
CN104616259B (en) A kind of adaptive non-local mean image de-noising method of noise intensity
CN103514588B (en) Image enchancing method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200602