CN114565519A - Image fog penetration method and device and storage medium - Google Patents

Image fog penetration method and device and storage medium Download PDF

Info

Publication number
CN114565519A
CN114565519A CN202210037436.2A CN202210037436A CN114565519A CN 114565519 A CN114565519 A CN 114565519A CN 202210037436 A CN202210037436 A CN 202210037436A CN 114565519 A CN114565519 A CN 114565519A
Authority
CN
China
Prior art keywords
fog
output
image
algorithm
fog penetration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210037436.2A
Other languages
Chinese (zh)
Inventor
陈善文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Wanney Science And Technology Co ltd
Original Assignee
Shenzhen Wanney Science And Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Wanney Science And Technology Co ltd filed Critical Shenzhen Wanney Science And Technology Co ltd
Priority to CN202210037436.2A priority Critical patent/CN114565519A/en
Publication of CN114565519A publication Critical patent/CN114565519A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image fog penetration method, an image fog penetration device and a storage medium. The image fog penetration method comprises the following steps: acquiring an original image; carrying out fog penetration processing on the original image by utilizing a fog penetration algorithm to obtain an output image; calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image; and dynamically adjusting the intensity of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time. The invention can flexibly adapt to fog penetration requirements under different scenes in practical application, and optimizes the observation effect and observation distance of the photoelectric observation equipment when the visibility is low.

Description

Image fog penetration method and device and storage medium
Technical Field
The invention relates to the technical field of image processing, in particular to an image fog penetration method, an image fog penetration device and a storage medium.
Background
Under the weather conditions of rain, snow, fog, haze, dust and the like, the atmospheric visibility is low, the quality of outdoor shot images and recorded videos is reduced, and the photoelectric integrated outdoor vision system is seriously influenced to play the effect. Therefore, it is important to realize fog-through processing (also referred to as defogging processing) for a picture. In the prior art, various fog penetration processing methods have been proposed, but different requirements on various aspects such as fog penetration effect, calculated amount and real-time performance in practical application are difficult to meet.
Disclosure of Invention
In order to solve the problems, the invention provides an image fog penetration method, an image fog penetration device and a storage medium, which can flexibly adapt to fog penetration requirements in different scenes in practical application, thereby improving the observation effect when the atmospheric visibility is low with low cost.
A first aspect of the present invention relates to an image fog-penetrating method, including:
acquiring an original image;
carrying out fog penetration processing on the original image by utilizing a fog penetration algorithm to obtain an output image;
calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image;
and dynamically adjusting the intensity of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time.
Optionally, after an original image is acquired, preprocessing the original image to obtain partition coordinate information;
the fog penetrating processing of the original image by utilizing the fog penetrating algorithm to obtain an output image comprises the following steps:
determining a selected partition in the original image according to the partition coordinate information;
and carrying out fog penetration treatment on the selected subarea by utilizing a fog penetration algorithm to obtain the output image.
Optionally, the dynamically adjusting the strength of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time length includes:
comparing the output characteristics, the output noise and/or the fog penetration processing time length with preset thresholds respectively;
and adjusting the intensity of the fog penetration algorithm according to the comparison result.
Optionally, the dynamically adjusting the strength of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time length includes:
carrying out weighted calculation on the values of the output characteristics, the output noise and/or the fog penetration processing time length;
obtaining the intensity grade corresponding to the calculation result in a table look-up mode;
and adjusting the intensity of the fog penetration algorithm according to the intensity level.
Optionally, the output characteristic of the output imageThe feature comprises a boundary feature value, wherein the boundary feature value B is Be*Tw+Ce*Ccw+Ss*CswIn which B iseIs the boundary value of the output image, CeIs the contrast value, S, of the output imagesIs the value of the gray transition, T, of the output imagew、Ccw、CswIs a weight and Tw+Ccw+Csw=1。
Optionally, the method further includes, after the original image is obtained, preprocessing the original image to obtain image features, where the image features include a gray level mean value, gray level histogram information, and/or a contrast of the original image;
the fog penetrating processing of the original image by utilizing the fog penetrating algorithm to obtain an output image comprises the following steps:
acquiring the initial intensity of the fog penetration algorithm according to the image characteristics;
and carrying out fog penetration processing on the original image by utilizing the fog penetration algorithm of the initial intensity to obtain an output image.
Optionally, the performing fog penetration processing on the original image by using a fog penetration algorithm includes:
selecting an algorithm template, wherein the algorithm template is a two-dimensional matrix of (n +1) × (n +1), and n is more than or equal to 0; the larger n in the algorithm template is, the stronger the fog penetration algorithm is;
carrying out fog penetration treatment according to the following formula:
Figure BDA0003468513220000031
wherein S isijFor elements in the selected algorithm template, the SavgThe average gray value of the original image is obtained; ssetSetting a gray value; fcThe fog penetration enhancement coefficient; pnnThe gray value of the pixel with the coordinate of (n, n) in the processed output image is obtained.
A second aspect of the present invention relates to an image fog-penetrating apparatus comprising:
the image preprocessing module is used for acquiring an original image;
the fog penetration algorithm module is used for carrying out fog penetration processing on the original image by utilizing a fog penetration algorithm to obtain an output image, and dynamically adjusting the intensity of the fog penetration algorithm according to the output characteristic, the output noise and/or the fog penetration processing time;
and the output analysis module is used for calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image.
Optionally, the output analysis module comprises:
the output characteristic analysis module is used for calculating the output characteristic of the output image;
the output noise analysis module is used for calculating the output noise of the output image;
and the timing module is used for calculating the fog penetration processing time length.
Optionally, the image preprocessing module is further configured to preprocess the original image to obtain partition coordinate information;
the fog penetrating processing of the original image by utilizing the fog penetrating algorithm to obtain an output image comprises the following steps:
determining a selected partition in the original image according to the partition coordinate information;
and carrying out fog penetration treatment on the selected subarea by utilizing a fog penetration algorithm to obtain the output image.
A third aspect of the invention relates to a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method according to the first aspect.
A fourth aspect of the invention relates to an electronic device comprising a memory having a computer program stored thereon and a processor implementing the method according to the first aspect when executing the program.
According to the invention, the intensity of the fog penetration algorithm is dynamically adjusted according to the output characteristics, the output noise and/or the fog penetration processing time length of the output image after the fog penetration processing, the fog penetration requirements under different scenes in practical application can be flexibly adapted, the definition of the video image is improved at lower cost, the real-time property is considered, the observation effect and the observation distance of the photoelectric observation equipment when the visibility is low are optimized, the hardware system overhead is smaller, and the power consumption is lower.
Drawings
FIG. 1 is a schematic flow chart of an image fog-penetration method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an image fog-penetrating device according to an embodiment of the present invention;
FIG. 3 is a block schematic diagram of an exemplary electronic device capable of implementing embodiments of the present invention.
Detailed Description
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
Referring to fig. 1, an image fog-through method according to an embodiment of the present invention is shown. The method comprises the following steps:
s101, acquiring an original image;
wherein the original image is optionally one frame of a succession of images in a video.
S102, carrying out fog penetration treatment on the original image by utilizing a fog penetration algorithm to obtain an output image;
s103, calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image;
and S104, dynamically adjusting the intensity of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time.
Optionally, in step S101, after an original image is acquired, preprocessing the original image to obtain partition coordinate information;
in step S102, the performing fog-penetrating processing on the original image by using a fog-penetrating algorithm to obtain an output image includes:
determining a selected partition in the original image according to the partition coordinate information;
and carrying out fog penetration treatment on the selected subarea by utilizing a fog penetration algorithm to obtain the output image. Therefore, the fog penetration processing is carried out on the local area of the original image instead of the whole image, the algorithm intensity is adjusted according to the processing effect, the calculated amount of the fog penetration processing is reduced, and the operation efficiency is improved.
In step S102, an output image after the fog penetrating process is obtained, where the output image is obtained by performing the fog penetrating process on the entire original image or on a selected partition in the original image.
Optionally, in step S103, the output feature of the output image includes a boundary feature value, where B is Be*Tw+Ce*Ccw+Ss*CswIn which B iseIs the boundary value of the output image, CeIs a contrast value, S, of said output imagesIs the gray level transition value, T, of the output imagew、Ccw、CswIs a weight and Tw+Ccw+Csw=1。
The boundary characteristic value can represent the fog penetration treatment effect, and the larger the boundary characteristic value is, the clearer the image is, and the better the fog penetration treatment effect is; on the contrary, the more blurred the image, the worse the fog-penetrating treatment effect.
When the intensity of the used fog penetration algorithm is higher, the fog penetration treatment effect is better, but the noise is higher, the fog penetration treatment time is longer, and the real-time performance is poor. Therefore, in step S104, the strength of the fog-penetrating algorithm needs to be determined by balancing the above factors.
As an embodiment, the dynamically adjusting the intensity of the fog penetrating algorithm according to the output characteristics, the output noise and/or the fog penetrating processing time length comprises:
comparing the output characteristics, the output noise and/or the fog penetration processing time length with preset thresholds respectively;
and adjusting the strength of the fog penetration algorithm according to the comparison result, such as increasing or decreasing the strength of the fog penetration algorithm.
The output characteristic is compared with a first preset threshold value to obtain a first comparison result, the output noise is compared with a second preset threshold value to obtain a second comparison result, the fog penetration processing duration is compared with a third preset threshold value to obtain a third comparison result, and then the intensity of the fog penetration algorithm is adjusted according to one or more of the three comparison results according to a preset rule, for example, the intensity of the fog penetration algorithm is increased under the condition that the fog penetration processing duration is smaller than the third preset threshold value and the output noise is lower than the second preset threshold value, so that the fog penetration processing effect is improved.
The adjustment of the algorithm strength is a dynamic process, after the strength of the fog penetrating algorithm is adjusted according to the comparison result, the steps S102-S104 are executed iteratively according to the adjusted algorithm strength until the preset condition is met, and the final strength of the fog penetrating algorithm is determined. The preset condition is, for example, a preset iteration number or an output characteristic, an output noise and/or a fog penetration processing time reaching a preset threshold. In this example, the original image processed at each iteration is the same.
Optionally, if the original image is an image in a video, the plurality of images have the same imaging condition, and in order to improve the overall processing efficiency of the video, after the intensity of the fog-penetrating algorithm is adjusted according to the comparison result, steps S101 to S104 are iteratively executed according to the adjusted algorithm intensity until a preset condition is met, and the final intensity of the fog-penetrating algorithm is determined. In this example, the original image processed at each iteration is different.
According to another embodiment, said dynamically adjusting the intensity of said fog-penetrating algorithm according to said output characteristics, output noise and/or fog-penetrating duration comprises:
performing weighted calculation, such as summation, on the values of the output characteristics, the output noise and/or the fog penetration processing time length;
obtaining the intensity grade corresponding to the calculation result in a table look-up mode;
and adjusting the intensity of the fog penetration algorithm according to the intensity level.
Wherein the weighting calculation is, for example, a weighted sum. The table used in the table look-up manner is generated in advance, for example, the intensity levels of the fog penetrating algorithm to be adopted by different weighting calculation results are predetermined according to empirical values to form a table.
Optionally, in step S102, the performing fog-penetrating processing on the original image by using a fog-penetrating algorithm includes:
selecting an algorithm template, wherein the algorithm template is a two-dimensional matrix of (n +1) × (n +1), and n is more than or equal to 0; the larger n in the algorithm template is, the stronger the fog penetration algorithm is;
carrying out fog penetration treatment according to the following formula:
Figure BDA0003468513220000081
wherein S isijFor the elements in the selected algorithm template, corresponding to the gray values of the corresponding pixels in the original image, said SavgThe average gray value of the original image is obtained; ssetSetting a gray value; fcThe fog penetration enhancement coefficient; pnnThe gray value of the pixel with the coordinate of (n, n) in the processed output image is obtained.
Optionally, the step S101 further includes preprocessing the original image to obtain an image feature, where the image feature includes a gray-scale mean value, gray-scale histogram information, and/or a contrast of the original image. In step S102, the initial intensity of the fog penetrating algorithm is obtained according to the image characteristics, and the fog penetrating algorithm with the initial intensity is used to perform fog penetrating processing on the original image, so as to obtain an output image.
By the image fog penetration method, the intensity of the fog penetration algorithm is dynamically adjusted according to the image characteristics of the original image, the output characteristics of the output image after fog penetration processing, the output noise and/or the fog penetration processing time length, the fog penetration requirement in practical application can be flexibly adapted, a video image with higher definition is obtained with lower cost and smaller noise, the real-time performance is considered, and the observation effect and the observation distance of the photoelectric observation equipment when the visibility is low are optimized.
Referring to fig. 2, there is shown an image fog-penetrating apparatus according to an embodiment of the present invention, including:
an image preprocessing module 201, configured to obtain an original image;
the fog penetrating algorithm module 202 is used for performing fog penetrating processing on the original image by utilizing a fog penetrating algorithm to obtain an output image, and dynamically adjusting the intensity of the fog penetrating algorithm according to the output characteristic, the output noise and/or the fog penetrating processing time;
and the output analysis module 203 is used for calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image.
Optionally, the output analysis module 203 comprises:
an output feature analysis module 2031 configured to calculate output features of the output image;
an output noise analysis module 2032 for calculating output noise of the output image;
a timing module 2033 configured to calculate the fog penetration processing time.
Optionally, the image preprocessing module 201 is further configured to preprocess the original image to obtain partition coordinate information;
the fog penetrating processing of the original image by utilizing the fog penetrating algorithm to obtain an output image comprises the following steps:
determining a selected partition in the original image according to the partition coordinate information;
and carrying out fog penetration treatment on the selected subarea by utilizing a fog penetration algorithm to obtain the output image.
Optionally, the output feature of the output image comprises a boundary feature value, and the boundary feature value B ═ Be*Tw+Ce*Ccw+Ss*CswIn which B iseIs the boundary value of the output image, CeIs the contrast value, S, of the output imagesIs the value of the gray transition, T, of the output imagew、Ccw、CswIs a weight and Tw+Ccw+Csw=1。
Specifically, the input of the image preprocessing module 201 is an original image, the output end of the image preprocessing module is connected to an input end of the fog penetrating algorithm module 202, the original image, the preprocessed partition coordinate information, the preprocessed image features, and the like are input into the fog penetrating algorithm module 202, and the output end of the fog penetrating algorithm module 202 is connected to the input end of a next-stage processing module (not shown), and is simultaneously connected to the input ends of the output feature analysis module 2031 and the output noise analysis module 2032. The fog penetrating algorithm module 202 performs fog penetrating processing on the original image to obtain an output image, the output image is output to the output feature analysis module 2031 and the output noise analysis module 2032, the output feature analysis module 2031 analyzes the output image to obtain an output feature, the output noise analysis module 2032 analyzes the output image to obtain an output noise, and the output feature and the output noise are transmitted to the fog penetrating algorithm module 202. The fog penetrating algorithm module 202 is further connected to the timing module 2033, when the original image is input into the fog penetrating algorithm module 202, the fog penetrating algorithm module 202 sends a reset signal to the timing module 2033, the timing module 2033 starts timing, when the fog penetrating algorithm module 202 outputs an image, optionally, the output noise analysis module 2032 sends a timing end signal, the timing module 2033 terminates timing, thereby obtaining a time difference between input and output of the image in the fog penetrating algorithm module 202, and defining the time difference as time consumed by executing the module, that is, time duration of fog penetrating processing.
Further, the fog penetrating algorithm module 202 further includes a fog penetrating processing module 2021 and an intensity adjusting module 2022, where the fog penetrating processing module 2021 is configured to perform fog penetrating processing on the original image by using a fog penetrating algorithm to obtain an output image. The intensity adjusting module 2022 is configured to dynamically adjust the intensity of the fog penetrating algorithm according to the output characteristic, the output noise, and/or the fog penetrating processing duration.
The dynamically adjusting the strength of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time length comprises:
comparing the output characteristics, the output noise and/or the fog penetration processing time length with preset thresholds respectively;
and adjusting the strength of the fog penetration algorithm according to the comparison result, such as increasing or decreasing the strength of the fog penetration algorithm.
According to another embodiment, said dynamically adjusting the intensity of said fog-penetrating algorithm according to said output characteristics, output noise and/or fog-penetrating duration comprises:
performing weighted calculation, such as summation, on the values of the output characteristics, the output noise and/or the fog penetration processing time length;
obtaining the intensity grade corresponding to the calculation result in a table look-up mode;
and adjusting the intensity of the fog penetration algorithm according to the intensity level.
It can be understood that each module of the image fog-penetrating device provided in the above embodiment of the present invention is used for correspondingly implementing each step in the above image fog-penetrating method, and specific contents are not described herein again.
According to the image fog penetration device provided by the embodiment of the invention, the intensity of the fog penetration algorithm is dynamically adjusted according to the output characteristics, the output noise and/or the fog penetration processing time length of the output image after fog penetration processing, the fog penetration requirement in practical application can be flexibly adapted, a video image with higher definition is obtained with lower cost and smaller noise, the real-time performance is considered, and the observation effect and the observation distance of the photoelectric observation equipment in the low visibility are optimized. The image fog penetration device is simple in structure, low in system overhead and power consumption, and can be popularized and applied.
According to another embodiment of the present invention, a flow of performing the image fog-penetrating method on the image fog-penetrating device of the present invention is specifically explained as follows:
a. the image preprocessing module acquires an original image for preprocessing to obtain the partition coordinate information of the original image, the global gray level mean value of the original image, the gray level histogram information and/or the contrast.
The subarea of the original image is a selected area with the most remarkable characteristics in a complete image and is used as a selected subarea for subsequent processing.
The parameters of the original image are as follows:
image width: the amount of W is greater than the amount of W,
image height: h
Image abscissa: x
Image ordinate: y is
Image column coordinate start position: o isx
Image line coordinate start position: o isy
Selecting the number of the horizontal pixels of the partition: px
Selecting the number of vertical pixels of the subarea: py
Selecting the total pixel number of the partitions: a ═ Px*Py
Thus, the partition coordinate information obtained after the preprocessing is (O)x,Oy)~(Ox+ Px,Oy+Py) I.e. with (O)x,Oy) And (O)x+Px,Oy+Py) A rectangular area of vertices.
Gray value range: s is belonged to (0, 2)q-1),Smin=0,Smax=2q-1,q= 5,6,7,8,9……
Histogram: hi∈(HSmin,HSmax) The number of pixels with gray value equal to i belongs to (0, 2)q-1)
Mean gray level:
Figure BDA0003468513220000131
Sithe gray value of the ith pixel point is referred to;
contrast value:
Figure BDA0003468513220000132
to simplify the calculation, formulas may be used
Figure BDA0003468513220000133
b. The fog penetrating algorithm module 202 performs fog penetrating processing on the original image by using a fog penetrating algorithm to obtain an output image.
Setting a gray level SsetThe image gradation is SavgCoefficient of fog penetration enhancement Fc(∈[0.0,12.0]db), the two-dimensional template sequence used by the fog-penetrating algorithm is as follows:
TS0=[S0],
Figure BDA0003468513220000134
Figure BDA0003468513220000135
Figure BDA0003468513220000136
Figure BDA0003468513220000137
Figure BDA0003468513220000141
TS0,TS1,TS2,TS3,TS4,TS5… …, corresponding to algorithm templates with different depths, the calculation amount is gradually increased, and the consumed system time is gradually increased, which is defined as the algorithm depth. Element S in the matrix00,S01,……,S0i,……,Sj0,Sj1,……,SjiCorresponding to the gray value of the corresponding pixel of the current original image.
The calculation method is as follows:
TS0:P0=(S0-Savg)*Fc+Sset
TS1
Figure BDA0003468513220000142
TS2
Figure BDA0003468513220000143
TS3
Figure BDA0003468513220000144
TS4
Figure BDA0003468513220000145
TS5
Figure BDA0003468513220000146
……
TSn
Figure BDA0003468513220000147
after removing the image boundary, the area needs to be calculated:
Figure BDA0003468513220000148
image boundary adoption TS0Treatment of P0=(S0-Savg)*Fc+Sset
The method for processing the finite word length effect of the calculation result comprises the following steps:
Figure BDA0003468513220000149
Pnnaccording to TSnAfter the template calculation, the coordinates are the gray scale values of the (n, n) output pixels.
c. An output feature analysis module calculates output features of the output image.
The outline boundary judgment of the output image can adopt the existing mature algorithm, and is not described in detail here. The longer the continuous boundary, and even the closed loop, the better the algorithm.
Boundary threshold value: t isb
Boundary length: t isl
Boundary value: b ise=1(Tl≥Tb),Be=0(Tl<Tb)
Boundary weight: t isw
Grayscale mean of output image:
Figure BDA0003468513220000151
wherein SiThe gray value of the ith pixel point in the output image is referred to.
The contrast is calculated as follows:
contrast value:
Figure BDA0003468513220000152
to simplify the calculation, formulas may be used
Figure BDA0003468513220000153
Contrast ratio weight: ccw
The gray level jump is calculated as follows:
number of consecutive pixels: p iss(≥5)
Gray level difference threshold value: st(∈[5,2q-1],q=8)
Gray level transition value:
Figure BDA0003468513220000154
wherein P isiTo sequentially two-dimensionally roll up a template CtxAnd CtyAnd calculating the result of the corresponding coordinates.
The two-dimensional convolution template is, for example:
Figure BDA0003468513220000161
gray level transition weight: csw
Boundary characteristic value: b ═ Be*Tw+Ce*Ccw+Ss*Csw,(Tw+Ccw+ Csw1), the larger the boundary characteristic value is, the sharper the image is; otherwise, the image is obtainedThe more blurred.
d. An output noise analysis module calculates noise of the output image.
The noise of the output image can be obtained by adopting the existing image noise calculation method. Optionally, the specific calculation method used here is:
outside the contour closed loop of the moving target, selecting N relatively fixed points, and calculating a standard difference:
Figure BDA0003468513220000162
to increase the calculation speed, the following calculation formula can also be adopted:
ση=(|X1-X|+|X2-X|+...+|XN-X|)/N。
e. the input of the original image triggers a timer to start timing, and the time at the moment is Ti(ii) a The output of the output image triggers the timer to stop, at which time ToCalculating the time difference tau ═ To-TiAnd the time difference is the duration of the fog penetration algorithm.
f. And the fog penetrating algorithm module dynamically adjusts the intensity of the fog penetrating algorithm according to the boundary characteristic value, the output noise and/or the fog penetrating processing time.
Taking continuous videos in severe weather such as haze and the like as an example, the real-time requirement is high, and the calculation is carried out according to the frame rate of FHz, so that the whole system has to be used for a single image
Figure BDA0003468513220000163
All image processing algorithms delay threshold values within seconds after calculation is finished
Figure BDA0003468513220000171
Seconds, based on empirical values for machine time
Figure BDA0003468513220000172
Within the range, the effect is better, and the processing time of the fog penetration algorithm is determinedThe threshold value is
Figure BDA0003468513220000173
And seconds. Optionally, the threshold may also be set based on the specific scenario
Figure BDA0003468513220000174
Or
Figure BDA0003468513220000175
And so on. Likewise, a noise threshold for the output noise is determined based on the demand.
And respectively comparing the boundary characteristic value, the output noise and/or the fog penetration processing time with corresponding threshold values. During fog-through treatment
Figure BDA0003468513220000176
When the output noise is lower than the noise threshold, a deeper algorithm template is selected, so that the algorithm strength is increased; in that
Figure BDA0003468513220000177
When the time or output noise is higher than the noise threshold, a simpler template is selected, and the algorithm intensity is reduced to reduce the processing time or the output noise.
After each adjustment, the adjusted algorithm template is used for carrying out fog penetration processing on the current image or the next image, the algorithm intensity is adjusted again according to the output noise and the fog penetration processing time length of the output image until the algorithm intensity is converged to a certain algorithm intensity, and at the moment, the output noise is lower than the noise threshold value and the algorithm processing time length tau is obtainedtApproach to
Figure BDA0003468513220000178
And second, performing fog penetration treatment on subsequent images in the video by using the algorithm template at the moment. Therefore, the fog penetration processing can not increase image noise obviously, and can not make the delay of the system difficult to tolerate.
The above example only illustrates one implementation manner, and other adjustment manners may be adopted according to requirements, for example, priority is given to fog penetration effect, and details are not repeated herein.
In some embodiments of the invention, an electronic device is also provided. The electronic device includes: a memory having a computer program stored thereon and a processor implementing the method as described above when executing the program. Further, a computer-readable storage medium is also provided, on which a computer program is stored which, when being executed by a processor, carries out the method as described above. FIG. 3 shows a schematic block diagram of an electronic device 700 that may be used to implement embodiments of the present disclosure. As shown in fig. 3, electronic device 700 includes a Central Processing Unit (CPU)701 that may perform various appropriate actions and processes in accordance with computer program instructions stored in a Read Only Memory (ROM)702 or computer program instructions loaded from a storage unit 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the electronic device 700 can also be stored. The CPU 701, the ROM702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
A number of components in the electronic device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard, mouse, etc.; an output unit 707 such as various types of displays, speakers, and the like; a storage unit 708 such as a magnetic disk, optical disk, or the like; and a communication unit 709 such as a network card, modem, wireless communication transceiver, etc. The communication unit 709 allows the electronic device 700 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The processing unit 701 performs the various methods and processes described above. For example, in some embodiments, the methods may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 708. In some embodiments, part or all of a computer program may be loaded onto and/or installed onto device 700 via ROM702 and/or communications unit 709. When the computer program is loaded into the RAM 703 and executed by the CPU 701, one or more steps of the methods described above may be performed. Alternatively, in other embodiments, CPU 701 may be configured to perform the method in any other suitable manner (e.g., by way of firmware).
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a load programmable logic device (CPLD), and the like.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium may include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. An image fog-penetrating method, comprising:
acquiring an original image;
carrying out fog penetration processing on the original image by utilizing a fog penetration algorithm to obtain an output image;
calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image;
and dynamically adjusting the intensity of the fog penetration algorithm according to the output characteristics, the output noise and/or the fog penetration processing time.
2. The method according to claim 1, characterized in that after an original image is acquired, the original image is preprocessed to obtain partition coordinate information;
the fog penetrating processing of the original image by utilizing the fog penetrating algorithm to obtain an output image comprises the following steps:
determining a selected partition in the original image according to the partition coordinate information;
and carrying out fog penetration treatment on the selected subarea by utilizing a fog penetration algorithm to obtain the output image.
3. The method of claim 1 or 2, wherein dynamically adjusting the intensity of the fog-penetrating algorithm as a function of the output characteristic, output noise, and/or fog-penetrating duration comprises:
comparing the output characteristics, the output noise and/or the fog penetration processing time length with preset thresholds respectively;
and adjusting the intensity of the fog penetration algorithm according to the comparison result.
4. The method of claim 1 or 2, wherein dynamically adjusting the intensity of the fog-penetrating algorithm as a function of the output characteristic, output noise, and/or fog-penetrating duration comprises:
carrying out weighted calculation on the values of the output characteristics, the output noise and/or the fog penetration processing time length;
obtaining the intensity grade corresponding to the calculation result in a table look-up mode;
and adjusting the intensity of the fog penetration algorithm according to the intensity level.
5. The method of claim 1, wherein the output features of the output image comprise boundary feature values, wherein B is Be*Tw+Ce*Ccw+Ss*CswIn which B iseIs a boundary value, C, of the output imageeIs the contrast value, S, of the output imagesIs the gray level transition value, T, of the output imagew、Ccw、CswIs a weight and Tw+Ccw+Csw=1。
6. The method according to claim 1 or 2, further comprising, after obtaining an original image, preprocessing the original image to obtain image features, wherein the image features comprise a gray level mean, gray level histogram information and/or contrast of the original image;
the fog penetrating processing of the original image by utilizing the fog penetrating algorithm to obtain an output image comprises the following steps:
acquiring the initial intensity of the fog penetration algorithm according to the image characteristics;
and carrying out fog penetration processing on the original image by utilizing the fog penetration algorithm of the initial intensity to obtain an output image.
7. The method of claim 5, wherein the fog-penetrating the original image using a fog-penetrating algorithm comprises:
selecting an algorithm template, wherein the algorithm template is a two-dimensional matrix of (n +1) × (n +1), and n is more than or equal to 0; the larger n in the algorithm template is, the stronger the fog penetration algorithm is;
carrying out fog penetration treatment according to the following formula:
Figure FDA0003468513210000031
wherein S isijFor elements in the selected algorithm template, the SavgThe average gray value of the original image is obtained; ssetSetting a gray value; fcThe fog penetration enhancement coefficient; pnnThe gray value of the pixel with the coordinate of (n, n) in the processed output image is obtained.
8. An image fog-penetrating apparatus, comprising:
the image preprocessing module is used for acquiring an original image;
the fog penetration algorithm module is used for carrying out fog penetration processing on the original image by utilizing a fog penetration algorithm to obtain an output image, and dynamically adjusting the intensity of the fog penetration algorithm according to the output characteristic, the output noise and/or the fog penetration processing time;
and the output analysis module is used for calculating the output characteristics, the output noise and/or the fog penetration processing time length of the output image.
9. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, the memory having a computer program stored thereon, wherein the processor when executing the program performs the method of any of claims 1-7.
CN202210037436.2A 2022-01-13 2022-01-13 Image fog penetration method and device and storage medium Pending CN114565519A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210037436.2A CN114565519A (en) 2022-01-13 2022-01-13 Image fog penetration method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210037436.2A CN114565519A (en) 2022-01-13 2022-01-13 Image fog penetration method and device and storage medium

Publications (1)

Publication Number Publication Date
CN114565519A true CN114565519A (en) 2022-05-31

Family

ID=81711909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210037436.2A Pending CN114565519A (en) 2022-01-13 2022-01-13 Image fog penetration method and device and storage medium

Country Status (1)

Country Link
CN (1) CN114565519A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015192718A1 (en) * 2014-06-18 2015-12-23 深圳市金立通信设备有限公司 Image processing method and apparatus
CN106023110A (en) * 2016-05-20 2016-10-12 河海大学 Image defogging method with high fidelity
CN107872608A (en) * 2016-09-26 2018-04-03 华为技术有限公司 Image capture device and image processing method
CN108093175A (en) * 2017-12-25 2018-05-29 北京航空航天大学 A kind of adaptive defogging method of real-time high-definition video and device
EP3745348A1 (en) * 2019-05-27 2020-12-02 Canon Kabushiki Kaisha Image processing for removing fog or haze in images

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015192718A1 (en) * 2014-06-18 2015-12-23 深圳市金立通信设备有限公司 Image processing method and apparatus
CN106023110A (en) * 2016-05-20 2016-10-12 河海大学 Image defogging method with high fidelity
CN107872608A (en) * 2016-09-26 2018-04-03 华为技术有限公司 Image capture device and image processing method
CN108093175A (en) * 2017-12-25 2018-05-29 北京航空航天大学 A kind of adaptive defogging method of real-time high-definition video and device
EP3745348A1 (en) * 2019-05-27 2020-12-02 Canon Kabushiki Kaisha Image processing for removing fog or haze in images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HENG LIU 等: "Large size single image fast defogging and the real time video defogging FPGA architecture", 《NEUROCOMPUTING》, 1 June 2017 (2017-06-01), pages 97 *
白雪纯: "降质视频图像清晰化处理及其应用研究", 《中国优秀硕士学位论文全文数据库 信息科技辑》, 15 January 2021 (2021-01-15), pages 138 - 1346 *

Similar Documents

Publication Publication Date Title
US10521885B2 (en) Image processing device and image processing method
CN109743473A (en) Video image 3 D noise-reduction method, computer installation and computer readable storage medium
WO2016206087A1 (en) Low-illumination image processing method and device
CN102446352B (en) Method of video image processing and device
CN108665428B (en) Image enhancement method, device, equipment and storage medium
CN109767408B (en) Image processing method, image processing device, storage medium and computer equipment
US9704227B2 (en) Method and apparatus for image enhancement
CN112529854B (en) Noise estimation method, device, storage medium and equipment
JP2004310475A (en) Image processor, cellular phone for performing image processing, and image processing program
CN113449730A (en) Image processing method, system, automatic walking device and readable storage medium
WO2021128498A1 (en) Image adaptive noise reduction method and apparatus
CN110378860B (en) Method, device, computer equipment and storage medium for repairing video
CN115984570A (en) Video denoising method and device, storage medium and electronic device
US9036938B2 (en) Image processing apparatus, image processing method, and program
CN113011433B (en) Filtering parameter adjusting method and device
CN110136085B (en) Image noise reduction method and device
CN115660994B (en) Image enhancement method based on regional least square estimation
CN114565519A (en) Image fog penetration method and device and storage medium
CN111445411A (en) Image denoising method and device, computer equipment and storage medium
Nguyen et al. FPGA-based Haze removal architecture using multiple-exposure fusion
CN114298936A (en) Noise reduction and sharpening combined processing method and device
CN113438386A (en) Dynamic and static judgment method and device applied to video processing
CN112950515A (en) Image processing method and device, computer readable storage medium and electronic device
Kim Edge-preserving and adaptive transmission estimation for effective single image haze removal
CN112752064A (en) Processing method and system for power communication optical cable monitoring video

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination