CN116452447A - Low-illumination high-definition image processing method - Google Patents

Low-illumination high-definition image processing method

Info

Publication number
CN116452447A
CN116452447A
Authority
CN
China
Prior art keywords
image
map
illumination
low
adaptive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310367292.1A
Other languages
Chinese (zh)
Inventor
陈鹏
张荣春
林辉
王辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Aerospace Kate Mechanical And Electrical Technology Co ltd
Original Assignee
Chengdu Aerospace Kate Mechanical And Electrical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Aerospace Kate Mechanical And Electrical Technology Co ltd filed Critical Chengdu Aerospace Kate Mechanical And Electrical Technology Co ltd
Priority to CN202310367292.1A priority Critical patent/CN116452447A/en
Publication of CN116452447A publication Critical patent/CN116452447A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to a low-illumination high-definition image processing method comprising the following steps: respectively solving a bright channel image and a dark channel image of the low-illumination image by using a bright channel prior and a dark channel prior, and obtaining a self-adaptive atmospheric illumination map through the bright channel image. In the method, the gray map of the input original image is calculated and its accurate edge map is then computed: gradient amplitudes of the bright pixel points and the dark pixel points of the gray map are calculated with different edge detection operators to obtain a rough edge map, which is denoised, repaired and refined into the accurate edge map. An illumination map is then calculated from the gray map and the accurate edge map by subtracting, in the logarithmic domain, the gray value of the corresponding pixel point in the accurate edge map from the gray value of each pixel point of the gray map.

Description

Low-illumination high-definition image processing method
Technical Field
The invention relates to the technical field of image processing, in particular to a low-illumination high-definition image processing method.
Background
With the rapid development of computer technology and ARM technology, image processing is widely applied in fields such as aerospace, the military, biomedicine and artificial intelligence, and image enhancement is a key step of image processing. In pictures taken in dark environments, much of the content cannot be displayed, which leads to the loss of important information, for example in traffic scenes captured by video monitoring, daily color image processing, images of ships and of criminals at night, and mine images taken under low-illumination conditions such as night and darkness. Research on low-illumination image enhancement methods is therefore very valuable.
In normal use, shooting is performed with a starlight-grade low-illumination 2-megapixel (200W) high-definition camera, with an electronic red dot and a dividing line for auxiliary aiming and AI intelligent recognition and tracking for auxiliary shooting. The captured image is then enhanced with methods such as gray-level transformation represented by histogram equalization, homomorphic filtering based on the illumination-reflection model, gradient-domain enhancement and Retinex enhancement. Gray-level transformation represented by histogram equalization can make the gray-level distribution of the image more uniform and enhance its contrast, but it ignores the frequency and detail information of the image and easily produces an over-enhanced result. Homomorphic filtering based on the illumination-reflection model divides the image into high-frequency and low-frequency parts and filters them so as to enhance contrast while compressing the dynamic range, but over-enhancement still occurs. Gradient-domain enhancement processes the gradients of the original image, making the image more uniform to a certain extent by reducing large gradient values and enhancing edges by increasing local gradient values, but reconstructing the image from the gradient domain is too costly for real-time algorithms.
Disclosure of Invention
(I) Solving the technical problems
Aiming at the defects of the prior art, the invention provides a low-illumination high-definition image processing method which has the advantages of clearer imaging and high practicability, and solves the problem of low imaging definition when the captured images are processed.
(II) technical scheme
In order to achieve the purposes of clearer imaging and high practicability, the invention provides the following technical scheme: a low-illuminance high-definition image processing method, the method comprising the steps of:
1) Respectively solving a bright channel image and a dark channel image of the low-illumination image by using a bright channel prior and a dark channel prior;
2) Obtaining a self-adaptive atmospheric illumination map through the bright channel image;
3) Obtaining a self-adaptive transfer function diagram through the dark channel image and the self-adaptive atmospheric illumination diagram;
4) Restoring a scene image according to the low-illumination image, the self-adaptive atmospheric illumination map and the self-adaptive transfer function map in the atmospheric scattering physical model, wherein the restored scene image is determined according to the following expression:
J(x) = (I(x) - A(x)) / (A(x)·max(t(x), t0)) + 1;
wherein x represents two-dimensional space coordinates, J (x) is a restored scene image, I (x) is a low-illumination image, A (x) is an adaptive atmospheric illumination map, t (x) is an adaptive transfer function map, and t0 has a value of 0.1.
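As an illustration only (not part of the claimed method), the restoration expression of step 4) can be sketched in Python/NumPy as follows; the function name restore_scene, the epsilon guard against division by zero and the assumption that the inputs are floating-point images in [0, 1] are all hypothetical.

import numpy as np

def restore_scene(I, A, t, t0=0.1):
    """Sketch of step 4): J(x) = (I(x) - A(x)) / (A(x) * max(t(x), t0)) + 1.

    I  : low-illumination image as a float array in [0, 1]
    A  : adaptive atmospheric illumination map
    t  : adaptive transfer function map
    t0 : lower bound on t(x), here 0.1, limiting noise amplification
    """
    if I.ndim == 3 and A.ndim == 2:      # broadcast per-pixel maps over color channels
        A = A[..., np.newaxis]
        t = t[..., np.newaxis]
    t_bounded = np.maximum(t, t0)        # max(t(x), t0)
    J = (I - A) / (A * t_bounded + 1e-6) + 1.0   # small epsilon guards against A(x) = 0
    return np.clip(J, 0.0, 1.0)          # keep the restored image in a displayable range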
Preferably, in step 1), the specific method for respectively obtaining the bright channel image and the dark channel image of the low-illumination image by using the bright channel prior and the dark channel prior includes:
obtaining a bright channel image of the low-illumination image by using the bright channel prior:
when the low-illumination image is a gray image, the gray low-illumination image is used as a guide image of a GuidedFilter filter;
the guide image is determined according to the following expression:
Iguide(x)=I(x);
wherein Iguide (x) is a guide image, and I (x) is a gray-scale low-illumination image;
performing maximum filtering on the obtained guide image to obtain a rough extracted bright channel image;
the coarsely extracted bright channel image is determined according to the following expression:
Ilig(x) = max_{y∈Ω(x)}(Iguide(y));
wherein Ilig(x) is the coarsely extracted bright channel image, x is a two-dimensional spatial coordinate, Ω(x) is a square neighborhood centered on the coordinate x, and y ranges over the pixels in Ω(x);
when the low-illumination image is a color image, calculating the maximum value of R, G, B color channels at each pixel point of the color low-illumination image as a guide image of a GuidedFilter filter;
the guide image is determined according to the following expression:
Iguide(x) = max_{c∈{R,G,B}}(Ic(x));
wherein Iguide(x) is the guide image, c is a color channel index with c ∈ {R, G, B}, and Ic(x) is color channel c of the color low-illumination image;
performing maximum filtering on the obtained guide image to obtain a rough extracted bright channel image;
the coarsely extracted bright channel image is determined according to the following expression:
Ilig(x) = max_{y∈Ω(x)}(Iguide(y));
wherein Ilig(x) is the coarsely extracted bright channel image, x is a two-dimensional spatial coordinate, Ω(x) is a square neighborhood centered on the coordinate x, and y ranges over the pixels in Ω(x);
carrying out edge-preserving smooth filtering on the coarsely extracted bright channel image through the GuidedFilter filter, using the guide image as the guidance, to finally obtain the refined bright channel image Ilight(x).
Preferably, in step 3), the specific method for obtaining the self-adaptive transfer function diagram through the dark channel image and the self-adaptive atmospheric illumination diagram is as follows:
obtaining an adaptive transfer function diagram by using the dark channel image and the adaptive atmospheric illumination diagram;
the adaptive transfer function map is determined according to the following expression:
Idark(x) = A(x)(1 - t(x)), namely: t(x) = 1 - w·Idark(x)/A(x);
wherein t(x) is the adaptive transfer function map, Idark(x) is the dark channel image, and w is a correction factor with 0 < w ≤ 1 that retains a small portion of the dark region and increases the depth perception of the scene; w = 0.95 is taken here.
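As an illustrative sketch only, the adaptive transfer function map follows directly from the formula above; the function name and the small epsilon guarding against division by zero are assumptions.

import numpy as np

def transfer_map(I_dark, A, w=0.95):
    """t(x) = 1 - w * Idark(x) / A(x), with correction factor 0 < w <= 1 (w = 0.95 above)."""
    return 1.0 - w * I_dark / np.maximum(A, 1e-6)   # epsilon avoids division by zero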
Preferably, in step 4), the specific method for restoring the scene image according to the low-illumination image, the adaptive atmospheric illumination map and the adaptive transfer function map in the atmospheric scattering physical model is as follows: restoring the scene image through the atmospheric scattering physical model by utilizing the low-illumination image, the self-adaptive atmospheric illumination map and the self-adaptive transfer function map;
the restored scene image is determined according to the following expression:
J(x) = (I(x) - A(x)) / (A(x)·max(t(x), t0)) + 1;
wherein J(x) is the restored scene image, I(x) is the low-illumination image, A(x) is the adaptive atmospheric illumination map, t(x) is the adaptive transfer function map, and t0 is set to 0.1 so that the scene image J(x) does not contain excessive noise when t(x) tends to 0;
and outputting the restored image through the WiFi module.
Preferably, the method comprises the steps of:
calculating a gray scale map of the input original image;
calculating an accurate edge map of the gray map: calculating gradient amplitude values of the bright pixel points and the dark pixel points of the gray level map by adopting different edge detection operators respectively to obtain a rough edge map; denoising, repairing and refining the rough edge map to obtain an accurate edge map;
calculating an illumination map according to the gray level map and the accurate edge map: subtracting the gray value of the corresponding pixel point in the accurate edge map from the gray value of each pixel point of the gray map in the logarithmic domain to obtain an illumination map in the logarithmic domain;
adjusting the contrast of the illumination map;
synthesizing an enhanced gray scale image according to the accurate edge image and the illumination image with adjusted contrast: in the logarithmic domain, adding the gray value of each pixel point of the accurate edge map with the gray value of the corresponding pixel point in the illumination map after the contrast adjustment to obtain a gray map with enhanced brightness in the logarithmic domain;
and calculating and outputting the enhanced color image according to the enhanced gray level image.
Preferably, refining the rough edge map after repair to obtain a precise edge map includes:
initializing all pixel points of the I' edge to be 0;
the pixel points are reassigned by adopting the following method:
1) If Iedge(x, y) is 0 and Iedge(x+1, y) is 1, then I'edge(x+M, y) is set to 1;
2) If Iedge(x, y) is 0 and Iedge(x, y+1) is 1, then I'edge(x, y+M) is set to 1;
wherein (x, y) is the current pixel point, M represents the half width or half height of the template used in repairing the edge map, Iedge represents the repaired rough edge map, and I'edge represents the accurate edge map.
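A direct sketch of the two reassignment rules above, assuming Iedge is a binary (0/1) NumPy array and M is the half width/height just defined; the function name and the bounds checks are illustrative additions.

import numpy as np

def refine_edges(I_edge, M):
    """Build the accurate edge map I'edge from the repaired rough edge map Iedge."""
    h, w = I_edge.shape
    refined = np.zeros_like(I_edge)            # initialize all pixel points of I'edge to 0
    for y in range(h):
        for x in range(w):
            if I_edge[y, x] != 0:
                continue
            # Rule 1: horizontal 0 -> 1 transition, set the edge pixel shifted by M in x
            if x + 1 < w and I_edge[y, x + 1] == 1 and x + M < w:
                refined[y, x + M] = 1
            # Rule 2: vertical 0 -> 1 transition, set the edge pixel shifted by M in y
            if y + 1 < h and I_edge[y + 1, x] == 1 and y + M < h:
                refined[y + M, x] = 1
    return refined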
(III) beneficial effects
Compared with the prior art, the invention provides a low-illumination high-definition image processing method, which has the following beneficial effects:
1. According to the low-illumination high-definition image processing method, the gray map of the input original image is calculated first, and the accurate edge map of the gray map is then computed: gradient amplitudes of the bright pixel points and the dark pixel points of the gray map are calculated with different edge detection operators to obtain a rough edge map, and the rough edge map is denoised, repaired and refined into the accurate edge map. An illumination map is calculated from the gray map and the accurate edge map by subtracting, in the logarithmic domain, the gray value of the corresponding pixel point in the accurate edge map from the gray value of each pixel point of the gray map. After the contrast of the illumination map is adjusted, an enhanced gray map is synthesized from the accurate edge map and the contrast-adjusted illumination map: in the logarithmic domain, the gray value of each pixel point of the accurate edge map is added to the gray value of the corresponding pixel point in the contrast-adjusted illumination map, yielding a brightness-enhanced gray map in the logarithmic domain, from which the enhanced color image is calculated and output. Shooting is performed with a starlight-grade low-illumination 2-megapixel (200W) high-definition camera, with an electronic red dot and a dividing line for auxiliary aiming, and AI intelligent recognition and tracking is provided for auxiliary shooting.
2. According to the low-illumination high-definition image processing method, the repaired rough edge map is refined to obtain the accurate edge map as follows: all pixel points of I'edge are initialized to 0, and the pixel points are reassigned by the following rules:
1) If Iedge(x, y) is 0 and Iedge(x+1, y) is 1, then I'edge(x+M, y) is set to 1;
2) If Iedge(x, y) is 0 and Iedge(x, y+1) is 1, then I'edge(x, y+M) is set to 1;
wherein (x, y) is the current pixel point, M represents the half width or half height of the template used for repairing the edge map, Iedge represents the repaired rough edge map, and I'edge represents the accurate edge map.
Drawings
FIG. 1 is a flow chart of a low-illumination high-definition image processing method according to the present invention;
FIG. 2 is a flowchart of a refined bright/dark channel image of a low-illumination high-definition image processing method according to the present invention;
fig. 3 is a flowchart of image enhancement of a low-illumination high-definition image processing method according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1-3, a low-illumination high-definition image processing method includes the following steps:
1) Respectively solving a bright channel image and a dark channel image of the low-illumination image by using a bright channel prior and a dark channel prior;
2) Obtaining a self-adaptive atmospheric illumination map through the bright channel image;
3) Obtaining a self-adaptive transfer function diagram through the dark channel image and the self-adaptive atmospheric illumination diagram;
4) Restoring a scene image according to the low-illumination image, the self-adaptive atmospheric illumination map and the self-adaptive transfer function map in the atmospheric scattering physical model, wherein the restored scene image is determined according to the following expression:
J(x) = (I(x) - A(x)) / (A(x)·max(t(x), t0)) + 1;
wherein x represents two-dimensional space coordinates, J (x) is a restored scene image, I (x) is a low-illumination image, A (x) is an adaptive atmospheric illumination map, t (x) is an adaptive transfer function map, and t0 has a value of 0.1.
When the method is used, the gray map of the input original image is calculated and its accurate edge map is computed: gradient amplitudes of the bright pixel points and the dark pixel points of the gray map are calculated with different edge detection operators to obtain a rough edge map, and the rough edge map is denoised, repaired and refined into the accurate edge map. An illumination map is calculated from the gray map and the accurate edge map by subtracting, in the logarithmic domain, the gray value of the corresponding pixel point in the accurate edge map from the gray value of each pixel point of the gray map. After the contrast of the illumination map is adjusted, an enhanced gray map is synthesized from the accurate edge map and the contrast-adjusted illumination map: in the logarithmic domain, the gray value of each pixel point of the accurate edge map is added to the gray value of the corresponding pixel point in the contrast-adjusted illumination map, yielding a brightness-enhanced gray map in the logarithmic domain, from which the enhanced color image is calculated and output. Shooting is performed with a starlight-grade low-illumination 2-megapixel (200W) high-definition camera, with an electronic red dot and a dividing line for auxiliary aiming, and AI intelligent recognition and tracking is provided for auxiliary shooting.
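The enhancement steps recited above can be sketched as follows; this is a simplified illustration, in which a Sobel gradient magnitude stands in for the accurate edge map (the separate operators for bright and dark pixels and the denoising, repair and refinement are not reproduced) and a gamma factor stands in for the contrast adjustment of the illumination map.

import cv2
import numpy as np

def enhance_gray(gray, gamma=0.6):
    """Log-domain split of the gray map into edge and illumination maps, then recombination.

    gray  : uint8 grayscale image computed from the input original image
    gamma : illustrative contrast adjustment applied to the log-domain illumination map
    """
    g = gray.astype(np.float32) / 255.0 + 1e-3          # avoid log(0)
    log_gray = np.log(g)

    # Stand-in for the accurate edge map: gradient magnitude of the gray map
    gx = cv2.Sobel(g, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(g, cv2.CV_32F, 0, 1)
    log_edge = np.log(np.clip(cv2.magnitude(gx, gy), 1e-3, None))

    # Illumination map: gray map minus edge map in the logarithmic domain
    log_illum = log_gray - log_edge
    log_illum_adj = gamma * log_illum                    # contrast adjustment

    # Synthesis: edge map plus contrast-adjusted illumination map, still in the log domain
    enhanced = np.exp(log_edge + log_illum_adj)
    enhanced = np.clip(enhanced / enhanced.max(), 0.0, 1.0)
    return (enhanced * 255).astype(np.uint8)

One common way to obtain the enhanced color image from the enhanced gray map, not fixed by the description above, is to scale each R, G, B channel of the original image by the ratio of the enhanced gray value to the original gray value at each pixel.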
In summary, according to the low-illumination high-definition image processing method, the repaired rough edge map is refined to obtain the accurate edge map as follows: all pixel points of I'edge are initialized to 0, and the pixel points are reassigned by the following rules:
1) If Iedge(x, y) is 0 and Iedge(x+1, y) is 1, then I'edge(x+M, y) is set to 1;
2) If Iedge(x, y) is 0 and Iedge(x, y+1) is 1, then I'edge(x, y+M) is set to 1;
wherein (x, y) is the current pixel point, M represents the half width or half height of the template used for repairing the edge map, Iedge represents the repaired rough edge map, and I'edge represents the accurate edge map.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. A low-illuminance high-definition image processing method, characterized in that the method comprises the steps of:
1) Respectively solving a bright channel image and a dark channel image of the low-illumination image by using a bright channel prior and a dark channel prior;
2) Obtaining a self-adaptive atmospheric illumination map through the bright channel image;
3) Obtaining a self-adaptive transfer function diagram through the dark channel image and the self-adaptive atmospheric illumination diagram;
4) Restoring a scene image according to the low-illumination image, the self-adaptive atmospheric illumination map and the self-adaptive transfer function map in the atmospheric scattering physical model, wherein the restored scene image is determined according to the following expression:
J(x) = (I(x) - A(x)) / (A(x)·max(t(x), t0)) + 1;
wherein x represents two-dimensional space coordinates, J (x) is a restored scene image, I (x) is a low-illumination image, A (x) is an adaptive atmospheric illumination map, t (x) is an adaptive transfer function map, and t0 has a value of 0.1.
2. The method for processing a low-illumination high-definition image according to claim 1, wherein in step 1), the specific method for respectively obtaining the bright channel image and the dark channel image of the low-illumination image by using the bright channel prior and the dark channel prior is as follows:
and (3) obtaining a bright channel image of the low-illumination image by using the bright channel prior:
when the low-illumination image is a gray image, the gray low-illumination image is used as a guide image of a GuidedFilter filter;
the guide image is determined according to the following expression:
Iguide(x)=I(x);
wherein Iguide (x) is a guide image, and I (x) is a gray-scale low-illumination image;
performing maximum filtering on the obtained guide image to obtain a rough extracted bright channel image;
the coarsely extracted bright channel image is determined according to the following expression:
Ilig(x) = max_{y∈Ω(x)}(Iguide(y));
wherein Ilig(x) is the coarsely extracted bright channel image, x is a two-dimensional spatial coordinate, Ω(x) is a square neighborhood centered on the coordinate x, and y ranges over the pixels in Ω(x);
when the low-illumination image is a color image, calculating the maximum value of R, G, B color channels at each pixel point of the color low-illumination image as a guide image of a GuidedFilter filter;
the guide image is determined according to the following expression:
Iguide(x) = max_{c∈{R,G,B}}(Ic(x));
wherein Iguide(x) is the guide image, c is a color channel index with c ∈ {R, G, B}, and Ic(x) is color channel c of the color low-illumination image;
performing maximum filtering on the obtained guide image to obtain a rough extracted bright channel image;
the coarsely extracted bright channel image is determined according to the following expression:
Ilig(x) = max_{y∈Ω(x)}(Iguide(y));
wherein Ilig(x) is the coarsely extracted bright channel image, x is a two-dimensional spatial coordinate, Ω(x) is a square neighborhood centered on the coordinate x, and y ranges over the pixels in Ω(x);
and (3) carrying out edge-preserving smooth filtering on the rough extracted bright channel image through a GuidedFilter filter by utilizing the guide image, and finally obtaining a refined bright channel image Ilight (x).
3. The method for processing a low-illuminance high-definition image according to claim 1, wherein in step 2), the specific method for obtaining the adaptive atmospheric illumination map from the bright channel image is as follows:
obtaining a self-adaptive atmospheric illumination map by utilizing the bright channel image;
the adaptive atmospheric illumination map is determined according to the following expression:
Ilight(x) = A(x)·t(x) + A(x)(1 - t(x)), namely: A(x) = Ilight(x);
wherein Ilight(x) is the refined bright channel image, A(x) is the self-adaptive atmospheric illumination map, and t(x) is the self-adaptive transfer function map.
4. A low-luminance high-definition image processing method according to claim 1, wherein in step 3),
the specific method for obtaining the self-adaptive transfer function diagram through the dark channel image and the self-adaptive atmospheric illumination diagram comprises the following steps:
obtaining an adaptive transfer function diagram by using the dark channel image and the adaptive atmospheric illumination diagram;
the adaptive transfer function map is determined according to the following expression:
Idark(x) = A(x)(1 - t(x)), namely: t(x) = 1 - w·Idark(x)/A(x);
wherein t(x) is the adaptive transfer function map, Idark(x) is the dark channel image, and w is a correction factor with 0 < w ≤ 1 that retains a small portion of the dark region and increases the depth perception of the scene; w = 0.95 is taken here.
5. The method for processing a low-illumination high-definition image according to claim 1, wherein in step 4), the specific method for restoring the scene image according to the low-illumination image, the adaptive atmospheric illumination map and the adaptive transfer function map in the atmospheric scattering physical model is as follows: restoring the scene image through the atmospheric scattering physical model by utilizing the low-illumination image, the self-adaptive atmospheric illumination map and the self-adaptive transfer function map;
the restored scene image is determined according to the following expression:
J(x) = (I(x) - A(x)) / (A(x)·max(t(x), t0)) + 1;
wherein J(x) is the restored scene image, I(x) is the low-illumination image, A(x) is the adaptive atmospheric illumination map, t(x) is the adaptive transfer function map, and t0 is set to 0.1 so that the scene image J(x) does not contain excessive noise when t(x) tends to 0;
and outputting the restored image through the WiFi module.
6. A low-illuminance high-definition image processing method according to claim 1, characterized in that the method comprises the steps of:
calculating a gray scale map of the input original image;
calculating an accurate edge map of the gray map: calculating gradient amplitude values of the bright pixel points and the dark pixel points of the gray level map by adopting different edge detection operators respectively to obtain a rough edge map; denoising, repairing and refining the rough edge map to obtain an accurate edge map;
calculating an illumination map according to the gray level map and the accurate edge map: subtracting the gray value of the corresponding pixel point in the accurate edge map from the gray value of each pixel point of the gray map in the logarithmic domain to obtain an illumination map in the logarithmic domain;
adjusting the contrast of the illumination map;
synthesizing an enhanced gray scale image according to the accurate edge image and the illumination image with adjusted contrast: in the logarithmic domain, adding the gray value of each pixel point of the accurate edge map with the gray value of the corresponding pixel point in the illumination map after the contrast adjustment to obtain a gray map with enhanced brightness in the logarithmic domain;
and calculating and outputting the enhanced color image according to the enhanced gray level image.
7. The low-illuminance high-definition image processing method according to claim 1, wherein:
refining the rough edge map after repair to obtain a precise edge map, including:
initializing all pixel points of the I' edge to be 0;
the pixel points are reassigned by adopting the following method:
1) If Iedge(x, y) is 0 and Iedge(x+1, y) is 1, then I'edge(x+M, y) is set to 1;
2) If Iedge(x, y) is 0 and Iedge(x, y+1) is 1, then I'edge(x, y+M) is set to 1.
8. The method of claim 7, wherein (x, y) is the current pixel, M is the half width or half height of the template used for repairing the edge map, Iedge is the repaired rough edge map, and I'edge is the precise edge map.
CN202310367292.1A 2023-04-07 2023-04-07 Low-illumination high-definition image processing method Pending CN116452447A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310367292.1A CN116452447A (en) 2023-04-07 2023-04-07 Low-illumination high-definition image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310367292.1A CN116452447A (en) 2023-04-07 2023-04-07 Low-illumination high-definition image processing method

Publications (1)

Publication Number Publication Date
CN116452447A true CN116452447A (en) 2023-07-18

Family

ID=87135103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310367292.1A Pending CN116452447A (en) 2023-04-07 2023-04-07 Low-illumination high-definition image processing method

Country Status (1)

Country Link
CN (1) CN116452447A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117635506A (en) * 2024-01-24 2024-03-01 成都航天凯特机电科技有限公司 Image enhancement method and device based on AI-energized Mean Shift algorithm
CN117635506B (en) * 2024-01-24 2024-04-05 成都航天凯特机电科技有限公司 Image enhancement method and device based on AI-energized Mean Shift algorithm

Similar Documents

Publication Publication Date Title
Hu et al. Single image defogging based on illumination decomposition for visual maritime surveillance
Shi et al. Normalised gamma transformation‐based contrast‐limited adaptive histogram equalisation with colour correction for sand–dust image enhancement
Wang et al. Biologically inspired image enhancement based on Retinex
CN108090888B (en) Fusion detection method of infrared image and visible light image based on visual attention model
CN109064426B (en) Method and device for suppressing glare in low-illumination image and enhancing image
Gao et al. Detail preserved single image dehazing algorithm based on airlight refinement
CN111968065B (en) Self-adaptive enhancement method for image with uneven brightness
CN109389569B (en) Monitoring video real-time defogging method based on improved DehazeNet
CN110298796B (en) Low-illumination image enhancement method based on improved Retinex and logarithmic image processing
Yu et al. Image and video dehazing using view-based cluster segmentation
Yang et al. Low-light image enhancement based on Retinex theory and dual-tree complex wavelet transform
Wei et al. An image fusion dehazing algorithm based on dark channel prior and retinex
CN116452447A (en) Low-illumination high-definition image processing method
Liu et al. Texture filtering based physically plausible image dehazing
CN112435184A (en) Haze sky image identification method based on Retinex and quaternion
CN115587945A (en) High dynamic infrared image detail enhancement method, system and computer storage medium
Mishra et al. Underwater image enhancement using multiscale decomposition and gamma correction
Si et al. A novel method for single nighttime image haze removal based on gray space
CN117422631A (en) Infrared image enhancement method based on adaptive filtering layering
CN112750089A (en) Optical remote sensing image defogging method based on local block maximum and minimum pixel prior
CN110992287B (en) Method for clarifying non-uniform illumination video
CN115564682A (en) Uneven-illumination image enhancement method and system
Parihar Histogram modification and DCT based contrast enhancement
Ju et al. VRHI: Visibility restoration for hazy images using a haze density model
Mishra et al. A Review Paper on Low Light Image Enhancement Methods for Un-uniform Illumination

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination