CN112435183A - Image noise reduction method and device and storage medium - Google Patents

Image noise reduction method and device and storage medium

Info

Publication number
CN112435183A
Authority
CN
China
Prior art keywords
noise reduction
visible
intensity
brightness
infrared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011290556.0A
Other languages
Chinese (zh)
Inventor
冉昭
张东
王松
刘晓沐
俞克强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202011290556.0A priority Critical patent/CN112435183A/en
Publication of CN112435183A publication Critical patent/CN112435183A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20182 Noise reduction or smoothing in the temporal domain; Spatio-temporal filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image noise reduction method and device and a storage medium. The image noise reduction method first acquires an infrared image and a visible light image at the current moment, the infrared image comprising an infrared brightness channel and the visible light image comprising a visible brightness channel and a visible color channel; spatial domain noise reduction and time domain noise reduction are then performed on the infrared image and the visible light image respectively by using signals of the infrared brightness channel, the visible brightness channel and the visible color channel. When performing spatial domain and time domain noise reduction on the infrared image and the visible light image, the method and device can use signals of one or more of the infrared brightness channel of the infrared image and the visible brightness channel and visible color channel of the visible light image; the spatial domain or time domain noise reduction of the visible light image can be implemented with different noise reduction algorithms according to the two different channel characteristics of brightness and color, so that the two different channels receive differentiated noise reduction processing. The signal-to-noise ratio of the image is thereby improved.

Description

Image noise reduction method and device and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image denoising method and apparatus, and a storage medium.
Background
When a monitoring device acquires a monitored image, it is generally affected by the external environment and by its own software and hardware, for example by defects of sensor materials, circuit structures, transmission media, recording equipment and the like, so noise in the image is often difficult to avoid. A noise reduction algorithm therefore needs to be developed to process the image, reduce the noise in it and improve its signal-to-noise ratio, so as to give a better visual appearance.
Disclosure of Invention
The technical problem mainly solved by the present application is to provide an image noise reduction method and apparatus and a storage medium, which can reduce noise in an image.
In order to solve the technical problem, the application adopts a technical scheme that:
provided is an image noise reduction method including:
acquiring an infrared image and a visible light image at the current moment, wherein the infrared image comprises an infrared brightness channel, and the visible light image comprises a visible brightness channel and a visible color channel;
and respectively carrying out space domain noise reduction and time domain noise reduction on the infrared image and the visible light image by utilizing signals of the infrared brightness channel, the visible brightness channel and the visible color channel.
In order to solve the above technical problem, another technical solution adopted by the present application is:
provided is an image noise reduction device including: a memory and a processor coupled to each other; wherein the memory stores program instructions, and the processor is capable of executing the program instructions to implement the image denoising method according to the above technical solution.
In order to solve the above technical problem, another technical solution adopted by the present application is:
there is provided a computer readable storage medium having stored thereon program instructions executable by a processor to implement the image denoising method according to the above technical solution.
The beneficial effects of this application are as follows. Unlike the prior art, the image noise reduction method provided by the application first acquires an infrared image and a visible light image at the current moment, the infrared image comprising an infrared brightness channel and the visible light image comprising a visible brightness channel and a visible color channel; spatial domain noise reduction and time domain noise reduction are then performed on the infrared image and the visible light image respectively by using signals of the infrared brightness channel, the visible brightness channel and the visible color channel. That is to say, when performing spatial domain and time domain noise reduction on the infrared image and the visible light image, the application can use signals of one or more of the infrared brightness channel of the infrared image and the visible brightness channel and visible color channel of the visible light image; for the spatial domain or time domain noise reduction of the visible light image, different noise reduction algorithms can be applied according to the two different channel characteristics of brightness and color, so that the two different channels receive differentiated noise reduction processing. Therefore, the image noise reduction method can process the image, reduce the noise in the image, improve the signal-to-noise ratio of the image and give a better visual appearance.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort. In the drawings:
FIG. 1 is a schematic flowchart illustrating an embodiment of an image denoising method according to the present application;
FIG. 2 is a schematic flow chart illustrating an embodiment of step S12 in FIG. 1;
FIG. 3 is a flowchart illustrating an embodiment of step S21 in FIG. 2;
FIG. 4 is a flowchart illustrating an embodiment of step S22 in FIG. 2;
FIG. 5 is a flowchart illustrating an embodiment of step S42 in FIG. 4;
FIG. 6 is a diagram of the functional relationship between the infrared brightness differentiated spatial domain noise reduction intensity and the motion information;
FIG. 7 is a schematic flow chart illustrating another embodiment of step S22 in FIG. 2;
FIG. 8 is a schematic structural diagram of an embodiment of an image noise reduction apparatus according to the present application;
FIG. 9 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments that can be obtained by a person skilled in the art without making any inventive step based on the embodiments in the present application belong to the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of an image denoising method according to the present application, including the following steps:
step S11, acquiring an infrared image and a visible light image at the current time, where the infrared image includes an infrared luminance channel and the visible light image includes a visible luminance channel and a visible color channel. Taking the visible light image in YUV format as an example, the visible luminance channel refers to the Y channel, and the visible color channel refers to the UV channel.
In the process of acquiring a monitored image, the monitoring device is usually affected by the external environment and by software and hardware, for example by defects of sensor materials, electronic components and circuit structures, transmission media, recording devices and the like, so noise in the image is often difficult to avoid.
Noise in an image not only degrades its visual appearance but also directly affects the performance of higher-level computer vision applications such as face recognition and vehicle detection. Noise therefore needs to be suppressed by a noise reduction algorithm; however, when the noise in an image is severe, the algorithm has difficulty distinguishing signal from noise effectively, which poses a great challenge. This phenomenon becomes increasingly obvious in low-illumination environments, and low-illumination conditions are common in practical application scenarios. Although a white fill light can compensate well for weak ambient light, it also causes fairly serious light pollution, so its usage scenarios are limited. By contrast, an infrared fill light does not bring excessive light pollution, and the infrared image collected by an infrared camera has a higher signal-to-noise ratio; the infrared image nevertheless has an inherent defect in that it carries almost no color information of the object and cannot well reflect the real information of the scene. Using either the visible light image or the infrared image alone therefore has drawbacks. Since the two images are complementary, if the infrared image information can be used to guide the noise reduction of the visible light image, signal and noise in the visible light image can be discriminated effectively, so that the noise reduction effect of the visible light image is better and the real scene information is reflected, while avoiding the defect that the infrared image cannot reflect the color information of objects. A joint noise reduction algorithm therefore needs to be developed to process the images: by mining the infrared image information and feeding it back into the noise reduction process of the visible light image, the noise reduction effect of the visible light image is improved and the visual appearance is better.
However, the infrared image itself may have a low signal-to-noise ratio region, an infrared information loss region, an overexposure region, and the like, and the infrared image also needs noise reduction processing. Of course, the infrared luminance channel of the infrared image after noise reduction may also be used to perform noise reduction guidance on the visible light image. In addition, after the infrared image and the visible light image are acquired, the infrared image and the visible light image are registered and aligned, so that subsequent noise reduction processing is facilitated.
And step S12, respectively performing spatial domain noise reduction and temporal domain noise reduction on the infrared image and the visible light image by using signals of the infrared brightness channel, the visible brightness channel and the visible color channel. After the infrared image and the visible light image at the current moment are obtained, the color information of the infrared image is discarded because the infrared image carries almost no color information of the photographed object, which yields the infrared brightness channel, the visible brightness channel and the visible color channel. The visible color channel is not limited to a single channel; that is, the present application does not limit the representation format of the visible light image (for example HSV or YUV), as long as it contains a brightness channel and color channels. When there are multiple color channels, the same noise reduction algorithm is used for all of them. Taking YUV as an example, if the color channels are the U channel and the V channel, the same noise reduction algorithm is applied to the U channel and to the V channel to obtain the noise reduction intensity at each position, and the two intensities are then weighted and fused to obtain the noise reduction intensity corresponding to the visible color channel, as sketched below; the weighting coefficient can be specified externally. Hereinafter, "three channels" refers to the infrared brightness channel, the visible brightness channel and the visible color channel, where the visible color channel refers to the fused channel when there are several color channels.
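As a minimal sketch of this weighted fusion of multiple color channels, the following Python snippet blends per-position noise reduction intensities of the U and V channels with an externally specified weight; the function and variable names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def fuse_color_strengths(strength_u: np.ndarray, strength_v: np.ndarray, w: float = 0.5) -> np.ndarray:
    # The same noise reduction algorithm is first run on U and on V to get per-position
    # intensities; the two maps are then blended with the external weight w (0 <= w <= 1).
    return w * strength_u + (1.0 - w) * strength_v
```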
After the signals of the three channels are obtained, the signals can be used for respectively carrying out space domain noise reduction and time domain noise reduction on the infrared image and the visible light image. When the noise reduction is performed on the visible light image in a space domain or a time domain, different noise reduction intensities can be adopted for the visible brightness channel and the visible color channel so as to improve the performance of the noise reduction algorithm. In addition, the order of spatial domain noise reduction and time domain noise reduction is not limited, and time domain noise reduction can be performed first and then spatial domain noise reduction can be performed, or spatial domain noise reduction can be performed first and then time domain noise reduction can be performed.
In the embodiment, when performing spatial domain noise reduction and time domain noise reduction on the infrared image and the visible light image, signals of one or more channels of an infrared brightness channel of the infrared image, a visible brightness channel of the visible light image and a visible color channel can be used, and for spatial domain or time domain noise reduction of the visible light image, different noise reduction algorithms can be realized according to two different channel characteristics of brightness and color, so that differentiated noise reduction processing is performed on the two different channels. Therefore, the image denoising method can be used for processing the image, reducing the noise in the image, improving the signal to noise ratio of the image and enabling the visual impression to be better.
In some embodiments, referring to fig. 2, fig. 2 is a flowchart illustrating an embodiment of step S12 in fig. 1, and the following steps may be performed to perform spatial domain noise reduction and temporal domain noise reduction on the infrared image and the visible light image, respectively.
And step S21, acquiring motion information corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel respectively.
Specifically, referring to fig. 3, fig. 3 is a flowchart illustrating an embodiment of step S21 in fig. 2, and the motion information corresponding to the infrared luminance channel, the visible luminance channel, and the visible color channel may be obtained through the following steps.
Step S31, respectively acquiring a first frame difference image, a second frame difference image and a third frame difference image between the current frame image and the previous frame image of the infrared luminance channel, the visible luminance channel and the visible color channel. Here, the third frame difference image may be obtained from a single color channel or by fusing multiple color channels, and the fusion weight may be specified externally.
In each channel, the previous frame image is subtracted from the current frame image to obtain the frame difference image between the two frames. In another embodiment, when a noise-reduced image of the previous frame exists, the noise-reduced previous frame image may be subtracted from the current frame image instead to obtain the corresponding frame difference image. Since there are three channels in total, namely the infrared brightness channel, the visible brightness channel and the visible color channel, three corresponding frame difference images are obtained.
Step S32, performing mean filtering processing on the first frame difference image, the second frame difference image, and the third frame difference image, and taking the absolute value result as the motion information corresponding to the infrared luminance channel, the visible luminance channel, and the visible color channel, respectively.
After the frame difference images corresponding to the three channels are obtained, n × n mean filtering is performed on each frame difference image, and the absolute value of the mean result is taken as the corresponding motion information, where the window size n of the mean filtering is externally specified; the specific operation of mean filtering is the same as in the prior art and is not repeated here. After this processing, the motion information corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel is obtained. In addition, the larger the motion information value of a pixel, the more that pixel tends toward a motion region; conversely, the smaller the value, the more it tends toward a static region.
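As an illustration of steps S31 and S32, the following Python sketch computes the motion information of one channel from a frame difference followed by n × n mean filtering; the function name, the default window size and the use of OpenCV are assumptions made only for illustration.

```python
import cv2
import numpy as np

def motion_info(cur: np.ndarray, prev: np.ndarray, n: int = 5) -> np.ndarray:
    """Per-pixel motion information for one channel (larger value = closer to a motion region)."""
    diff = cur.astype(np.float32) - prev.astype(np.float32)  # frame difference (the denoised previous frame may be used instead)
    mean = cv2.blur(diff, (n, n))                            # n x n mean filtering, window size externally specified
    return np.abs(mean)                                      # absolute value taken as the motion information

# Called once per channel:
# m_nir   = motion_info(nir_y_cur,  nir_y_prev)    # infrared brightness channel
# m_visy  = motion_info(vis_y_cur,  vis_y_prev)    # visible brightness channel
# m_visuv = motion_info(vis_uv_cur, vis_uv_prev)   # visible color channel (single or fused)
```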
The method and the device adopt a frame difference and mean filtering mode to obtain the motion information of each channel, and perform noise reduction processing subsequently, so that the algorithm is mature, and the accuracy and the efficiency of the noise reduction algorithm are improved.
And step S22, acquiring infrared brightness space domain noise reduction intensity, visible brightness space domain noise reduction intensity and visible color space domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by utilizing the motion information, and acquiring infrared brightness time domain noise reduction intensity, visible brightness time domain noise reduction intensity and visible color time domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by utilizing the motion information.
After the motion information corresponding to the three channels is obtained, the motion information is used to obtain the spatial domain noise reduction intensity and the temporal domain noise reduction intensity corresponding to the three channels, that is, three spatial domain noise reduction intensities and three temporal domain noise reduction intensities are obtained in total, so as to perform spatial domain noise reduction and temporal domain noise reduction on the infrared image and the visible light image respectively in different channels, and a specific process of obtaining the noise reduction intensity will be described below.
And step S23, respectively performing space domain noise reduction on the infrared brightness channel, the visible brightness channel and the visible color channel by utilizing the infrared brightness space domain noise reduction intensity, the visible brightness space domain noise reduction intensity and the visible color space domain noise reduction intensity so as to respectively perform space domain noise reduction on the infrared image and the visible light image, and respectively performing time domain noise reduction on the infrared brightness channel, the visible brightness channel and the visible color channel by utilizing the infrared brightness time domain noise reduction intensity, the visible brightness time domain noise reduction intensity and the visible color time domain noise reduction intensity so as to respectively perform time domain noise reduction on the infrared image and the visible light image.
After the three space domain noise reduction strengths and the three time domain noise reduction strengths are obtained, the three space domain noise reduction strengths are utilized to respectively perform space domain noise reduction on the three corresponding channels, and the three time domain noise reduction strengths are utilized to respectively perform time domain noise reduction on the three corresponding channels, so that the noise reduction processing on the infrared image and the visible light image is completed. The specific spatial domain or temporal domain denoising process is the same as that in the prior art, and is not described herein again.
In the embodiment, the motion information of each channel is obtained by adopting a frame difference and mean filtering mode, three space domain noise reduction intensities and three time domain noise reduction intensities corresponding to the three channels are further obtained by utilizing the motion information, and the infrared image and the visible light image are subjected to noise reduction processing in the three channels respectively. The method is used for reducing noise in a space domain or a time domain of a visible light image, different noise reduction intensities can be realized according to two different channel characteristics of brightness and color, and the two different channels are subjected to differentiated noise reduction processing. Therefore, the image denoising method can be used for processing the image, reducing the noise in the image and improving the performance of the denoising algorithm.
In some embodiments, referring to fig. 4, fig. 4 is a flowchart illustrating an embodiment of step S22 in fig. 2, and the spatial noise reduction strength corresponding to each of the three channels may be obtained through the following steps.
And step S41, obtaining the corresponding infrared brightness initial spatial domain noise reduction intensity, visible brightness initial spatial domain noise reduction intensity and visible color initial spatial domain noise reduction intensity according to the respective edge information and non-edge information of the infrared brightness channel, the visible brightness channel and the visible color channel.
Taking hard-threshold wavelet denoising as an example, the initial spatial domain noise reduction intensity is the size of the hard threshold: for non-edge regions of the image, a larger hard threshold is specified for regional smoothing, and for edge regions a smaller hard threshold is specified for edge preservation. Edge regions and non-edge regions can be distinguished with an edge detection operator such as Sobel or Prewitt. In addition, when the edge and non-edge regions are computed, the image relied upon can be either the current frame image or any historical denoised frame. These operations are performed in the infrared brightness channel, the visible brightness channel and the visible color channel respectively, so as to obtain the infrared brightness initial spatial domain noise reduction intensity, the visible brightness initial spatial domain noise reduction intensity and the visible color initial spatial domain noise reduction intensity, three initial spatial domain noise reduction intensities in total, as illustrated below for one channel. Of course, in other noise reduction algorithms such as NLM or BM3D, the initial spatial domain noise reduction intensities of the different channels may be specified according to the specific algorithm flow.
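As a sketch of step S41 for a single channel, the snippet below uses a Sobel gradient magnitude to assign a smaller hard threshold (the initial spatial domain noise reduction intensity under hard-threshold wavelet denoising) to edge pixels and a larger one to non-edge pixels; the edge criterion and the numeric values are assumptions chosen only for illustration.

```python
import cv2
import numpy as np

def initial_spatial_strength(channel: np.ndarray,
                             edge_thresh: float = 30.0,
                             strength_edge: float = 2.0,   # small hard threshold: preserve edges
                             strength_flat: float = 8.0    # large hard threshold: smooth flat regions
                             ) -> np.ndarray:
    gx = cv2.Sobel(channel, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(channel, cv2.CV_32F, 0, 1, ksize=3)
    edge_mag = cv2.magnitude(gx, gy)                        # gradient magnitude as edge information
    return np.where(edge_mag > edge_thresh, strength_edge, strength_flat).astype(np.float32)
```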
And step S42, acquiring the infrared brightness differentiated spatial domain noise reduction intensity, the visible brightness differentiated spatial domain noise reduction intensity and the visible color differentiated spatial domain noise reduction intensity by utilizing the motion information, and acquiring the spatial domain fusion noise reduction intensity by utilizing the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity.
Specifically, referring to fig. 5, fig. 5 is a flowchart illustrating an embodiment of step S42 in fig. 4, and the infrared luminance difference spatial domain noise reduction intensity, the visible luminance difference spatial domain noise reduction intensity, and the visible color difference spatial domain noise reduction intensity may be obtained through the following steps.
In step S51, it is determined whether the motion information is smaller than the first threshold.
And step S52, if yes, assigning the corresponding differential spatial domain noise reduction intensity as a first intensity.
Step S53, otherwise, it is further determined whether the motion information is greater than a second threshold, where the second threshold is greater than the first threshold.
And step S54, if yes, assigning the corresponding differential spatial domain noise reduction intensity as a second intensity, wherein the second intensity is greater than the first intensity.
And step S55, otherwise, assigning the corresponding differential spatial noise reduction intensity as a third intensity, wherein the third intensity is greater than or equal to the first intensity and less than or equal to the second intensity, and is positively correlated with the motion information.
Steps S51 to S55 are repeated for the infrared luminance channel, the visible luminance channel and the visible color channel respectively, so as to obtain the differentiated spatial domain noise reduction intensity corresponding to each channel. Specifically, taking the infrared luminance channel as an example, please refer to fig. 6, which shows the functional relationship between the infrared luminance differentiated spatial domain noise reduction intensity and the motion information. In the figure, T1 and T2 respectively correspond to the first threshold and the second threshold of the motion information of the infrared luminance channel, and S1 and S2 respectively correspond to the first intensity and the second intensity of the infrared luminance differentiated spatial domain noise reduction intensity. As can be seen from fig. 6, in the infrared luminance channel, when the motion information is less than the first threshold T1, the infrared luminance differentiated spatial domain noise reduction intensity is the first intensity S1; when the motion information is greater than the second threshold T2, it is the second intensity S2; and when the motion information is between T1 and T2, it is a third intensity that lies between S1 and S2, is positively correlated with the motion information, and changes linearly from S1 to S2 as the motion information changes from T1 to T2.
For each of the three channels, namely the infrared brightness channel, the visible brightness channel and the visible color channel, the first threshold, the second threshold, the first intensity and the second intensity can be specified externally and may be the same or different across channels; likewise, the functional relation between the differentiated spatial domain noise reduction intensity and the motion information may be the same or different in each channel. When the motion information value of a pixel in a channel is smaller than the first threshold, the pixel is regarded as lying in a static region, and when the motion information value is larger than the second threshold, the pixel is regarded as lying in a motion region.
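The piecewise-linear relation of FIG. 6 can be written compactly; the sketch below applies steps S51 to S55 to one channel, where t1, t2, s1 and s2 stand for the externally specified thresholds and intensities (illustrative names).

```python
import numpy as np

def differentiated_spatial_strength(motion: np.ndarray,
                                    t1: float, t2: float,
                                    s1: float, s2: float) -> np.ndarray:
    """s1 below t1 (static region), s2 above t2 (motion region), a linear ramp in between."""
    ramp = s1 + (s2 - s1) * (motion - t1) / (t2 - t1)   # third intensity, positively correlated with motion
    return np.where(motion < t1, s1, np.where(motion > t2, s2, ramp))
```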
In this embodiment, differentiated processing is applied to motion regions and static regions when the spatial domain noise reduction intensity is obtained: the differentiated spatial domain noise reduction intensity of a motion region is noticeably higher than that of a static region, and the trailing effect caused by simply increasing the noise reduction intensity of the motion region can be effectively suppressed, so that the final noise reduction effect is better and the accuracy of the noise reduction algorithm is improved.
Further, the spatial domain fusion noise reduction intensity can be obtained by utilizing the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity through the following steps:
and taking the sum of the product of the infrared brightness initial spatial domain noise reduction intensity and the first spatial domain fusion coefficient and the product of the visible brightness initial spatial domain noise reduction intensity and the second spatial domain fusion coefficient as the spatial domain fusion noise reduction intensity, wherein the sum of the first spatial domain fusion coefficient and the second spatial domain fusion coefficient is 1.
That is, the spatial domain fusion noise reduction intensity beta_fuvisy is calculated by the following formula:
beta_fuvisy = beta_orinir * m1 + beta_orivisy * m2
wherein beta_orinir and beta_orivisy respectively represent the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity, and m1 and m2 respectively represent the first spatial domain fusion coefficient and the second spatial domain fusion coefficient, both ranging from 0 to 1, with m1 + m2 = 1.
Specifically, the spatial domain fusion noise reduction intensity beta_fuvisy can be calculated according to the quality difference between the infrared image and the visible light image; here image brightness is used as the example criterion for judging image quality. First, a first brightness range and a second brightness range are defined: if the brightness of a pixel in the infrared image falls within the first brightness range, the quality of that pixel in the infrared image is considered good, and if the brightness of a pixel in the visible light image falls within the second brightness range, the quality of that pixel in the visible light image is considered good. Both brightness ranges are externally specified and may be the same or different.
When the brightness of a pixel in the infrared image is within the first brightness range and the brightness of the corresponding pixel in the visible light image is within the second brightness range, the quality of both the infrared image and the visible light image is considered good, and m1 = m2 = 0.5 can be defined, that is, the infrared brightness channel and the visible brightness channel carry equivalent fusion weights.
When the brightness of a pixel in the infrared image is within the first brightness range and the brightness of the corresponding pixel in the visible light image is not within the second brightness range, the quality of the infrared image is considered better than that of the visible light image, and m1 > 0.5 > m2 can be defined, that is, spatial domain fusion relies mainly on the infrared brightness channel.
When the brightness of a pixel in the infrared image is not within the first brightness range and the brightness of the corresponding pixel in the visible light image is within the second brightness range, the quality of the visible light image is considered better than that of the infrared image, and m2 > 0.5 > m1 can be defined, that is, spatial domain fusion relies mainly on the visible brightness channel.
When the brightness of a pixel in the infrared image is not within the first brightness range and the brightness of the corresponding pixel in the visible light image is not within the second brightness range, the quality of both the infrared image and the visible light image is considered poor; m2 > 0.5 > m1 is still defined, that is, spatial domain fusion still relies mainly on the visible brightness channel, but the visible brightness initial spatial domain noise reduction intensity used in the above formula is replaced with that of the historical noise-reduced visible light image.
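A possible per-pixel realization of this brightness-based choice of m1 and m2 is sketched below; the concrete brightness ranges and the 0.7/0.3 split are assumptions, since the description only requires the stated inequalities together with m1 + m2 = 1.

```python
import numpy as np

def spatial_fusion_coeffs(nir_y: np.ndarray, vis_y: np.ndarray,
                          nir_range=(30, 220), vis_range=(30, 220)):
    nir_good = (nir_y >= nir_range[0]) & (nir_y <= nir_range[1])   # first brightness range
    vis_good = (vis_y >= vis_range[0]) & (vis_y <= vis_range[1])   # second brightness range
    m1 = np.full(nir_y.shape, 0.5, dtype=np.float32)               # both good: equal fusion weights
    m1[nir_good & ~vis_good] = 0.7                                 # infrared better:  m1 > 0.5 > m2
    m1[~nir_good & vis_good] = 0.3                                 # visible better:   m2 > 0.5 > m1
    m1[~nir_good & ~vis_good] = 0.3                                # both poor: still lean on the visible channel,
                                                                   # whose historical denoised intensity is used
    return m1, 1.0 - m1                                            # m2 = 1 - m1
```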
Of course, in other embodiments, indexes such as variance of an image and high and low frequency information may be used as criteria for determining whether the infrared image and the visible light image are good or bad, and the present application does not limit the criteria.
In this embodiment, when the infrared image is used to guide the spatial domain noise reduction of the visible light image, the various situations in which the infrared image is of poorer quality than the visible light image, of better quality, or of comparable quality are all considered. For the situation where both the visible light image and the infrared image are of poor quality, the historical denoising result, which has less noise and better edge information, is used to generate the spatial domain fusion noise reduction intensity, so that the spatial domain noise reduction distinguishes better between edge regions and non-edge regions, the final noise reduction effect is better, and the accuracy of the noise reduction algorithm is improved.
And step S43, acquiring infrared brightness spatial domain noise reduction intensity by using the infrared brightness initial spatial domain noise reduction intensity and the infrared brightness differentiated spatial domain noise reduction intensity, acquiring visible brightness spatial domain noise reduction intensity by using the spatial domain fusion noise reduction intensity and the visible brightness differentiated spatial domain noise reduction intensity, and acquiring visible color spatial domain noise reduction intensity by using the visible color initial spatial domain noise reduction intensity and the visible color differentiated spatial domain noise reduction intensity.
Specifically, step S43 includes:
obtaining a first product of the infrared brightness differentiation spatial domain noise reduction intensity and a first weight and a second product of the infrared brightness initial spatial domain noise reduction intensity and a second weight, and taking the sum of the first product and the second product as the infrared brightness spatial domain noise reduction intensity, wherein the sum of the first weight and the second weight is 1; obtaining a third product of the visible brightness differential spatial domain noise reduction intensity and a third weight and a fourth product of the spatial domain fusion noise reduction intensity and a fourth weight, and taking the sum of the third product and the fourth product as the visible brightness spatial domain noise reduction intensity, wherein the sum of the third weight and the fourth weight is 1; and obtaining a fifth product of the visible color differential spatial noise reduction intensity and a fifth weight and a sixth product of the visible color initial spatial noise reduction intensity and a sixth weight, and taking the sum of the fifth product and the sixth product as the visible color spatial noise reduction intensity, wherein the sum of the fifth weight and the sixth weight is 1.
That is, the infrared luminance spatial domain noise reduction intensity, the visible luminance spatial domain noise reduction intensity, and the visible color spatial domain noise reduction intensity may be calculated using the following formulas:
beta_nir = alpha0 * beta_movenir + (1 - alpha0) * beta_orinir
beta_visy = alpha1 * beta_movevisy + (1 - alpha1) * beta_fuvisy
beta_visuv = alpha2 * beta_movevisuv + (1 - alpha2) * beta_orivisuv
wherein beta_nir, beta_visy and beta_visuv respectively represent the infrared luminance spatial domain noise reduction intensity, the visible luminance spatial domain noise reduction intensity and the visible color spatial domain noise reduction intensity; beta_movenir, beta_movevisy and beta_movevisuv respectively represent the infrared luminance differentiated spatial domain noise reduction intensity, the visible luminance differentiated spatial domain noise reduction intensity and the visible color differentiated spatial domain noise reduction intensity; beta_orinir and beta_orivisuv respectively represent the infrared luminance initial spatial domain noise reduction intensity and the visible color initial spatial domain noise reduction intensity; beta_fuvisy represents the spatial domain fusion noise reduction intensity; and alpha0, alpha1 and alpha2 respectively represent the preset first weight, third weight and fifth weight, which can be externally specified and all range from 0 to 1.
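Putting the pieces together, step S43 reduces to three weighted sums; the sketch below mirrors the formulas above, with underscore-separated names standing for the quantities defined in the text and alpha0, alpha1 and alpha2 as the externally specified weights (default values are illustrative).

```python
def final_spatial_strengths(beta_movenir, beta_orinir,
                            beta_movevisy, beta_fuvisy,
                            beta_movevisuv, beta_orivisuv,
                            alpha0=0.5, alpha1=0.5, alpha2=0.5):
    beta_nir   = alpha0 * beta_movenir   + (1 - alpha0) * beta_orinir     # infrared brightness spatial domain intensity
    beta_visy  = alpha1 * beta_movevisy  + (1 - alpha1) * beta_fuvisy     # visible brightness spatial domain intensity
    beta_visuv = alpha2 * beta_movevisuv + (1 - alpha2) * beta_orivisuv   # visible color spatial domain intensity
    return beta_nir, beta_visy, beta_visuv
```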
In this embodiment, the spatial domain noise reduction intensities corresponding to the three channels are each calculated as weighted sums using externally specified weight coefficients; the guidance of the infrared image on the spatial domain noise reduction of the visible light image is realized in this calculation, and the differentiated spatial domain noise reduction intensities for motion regions and static regions are also introduced into the final calculation, so the final noise reduction effect is better and the accuracy of the noise reduction algorithm is improved.
In some embodiments, referring to fig. 7, fig. 7 is a schematic flowchart of another embodiment of step S22 in fig. 2, and the time domain noise reduction intensities corresponding to the three channels may be obtained through the following steps.
And step S61, acquiring infrared brightness initial time domain noise reduction intensity, visible brightness initial time domain noise reduction intensity and visible color initial time domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by utilizing the motion information, and assigning the infrared brightness time domain noise reduction intensity as the infrared brightness initial time domain noise reduction intensity.
Specifically, after the motion information corresponding to the three channels is obtained in the steps S31-S32, the infrared brightness initial time domain noise reduction intensity, the visible brightness initial time domain noise reduction intensity, and the visible color initial time domain noise reduction intensity are assigned to the motion information corresponding to the infrared brightness channel, the visible brightness channel, and the visible color channel, respectively. And further assigning the infrared brightness time domain noise reduction intensity as the infrared brightness initial time domain noise reduction intensity, namely, the infrared brightness time domain noise reduction intensity is equal to the motion information corresponding to the infrared brightness channel.
And step S62, acquiring a first time domain fusion noise reduction intensity by using the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, and assigning the visible brightness time domain noise reduction intensity as the first time domain fusion noise reduction intensity.
Specifically, the first time domain fusion noise reduction strength may be obtained by:
and taking the sum of the product of the infrared brightness initial time domain noise reduction intensity and the first time domain fusion coefficient and the product of the visible brightness initial time domain noise reduction intensity and the second time domain fusion coefficient as the first time domain fusion noise reduction intensity, wherein the sum of the first time domain fusion coefficient and the second time domain fusion coefficient is 1.
That is, the first time domain fusion noise reduction intensity gama_fuvisy is calculated using the following formula:
gama_fuvisy = gama_orinir * n1 + gama_orivisy * n2
wherein gama_fuvisy represents the first time domain fusion noise reduction intensity, gama_orinir and gama_orivisy respectively represent the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, and n1 and n2 respectively represent the first time domain fusion coefficient and the second time domain fusion coefficient, both ranging from 0 to 1, with n1 + n2 = 1.
In the same way as the spatial domain fusion, the first time domain fusion noise reduction intensity gama_fuvisy can be calculated according to the quality difference between the infrared image and the visible light image; here image brightness is again used as the criterion for judging image quality. First, a first brightness range and a second brightness range are defined: if the brightness of a pixel in the infrared image falls within the first brightness range, the quality of that pixel in the infrared image is considered good, and if the brightness of a pixel in the visible light image falls within the second brightness range, the quality of that pixel in the visible light image is considered good. Both brightness ranges are externally specified and may be the same or different.
When the brightness of a pixel in the infrared image is within the first brightness range and the brightness of the corresponding pixel in the visible light image is not within the second brightness range, the quality of the infrared image is considered better than that of the visible light image, and n1 > 0.5 > n2 can be defined, that is, time domain fusion relies mainly on the infrared brightness channel.
When the brightness of a pixel in the infrared image is not within the first brightness range and the brightness of the corresponding pixel in the visible light image is within the second brightness range, the quality of the visible light image is considered better than that of the infrared image, and n2 > 0.5 > n1 can be defined, that is, time domain fusion relies mainly on the visible brightness channel.
When the brightness of a pixel in the infrared image is within the first brightness range and the brightness of the corresponding pixel in the visible light image is within the second brightness range, or when neither brightness is within its respective range, the quality of the two images is considered equally good or equally poor, and n1 = n2 = 0.5 can be defined, that is, the infrared brightness channel and the visible brightness channel carry equivalent fusion weights.
After the first time domain fusion noise reduction intensity is obtained through the steps, the visible brightness time domain noise reduction intensity is assigned as the first time domain fusion noise reduction intensity, namely, the infrared brightness channel is used for guiding the time domain noise reduction of the visible brightness channel.
And step S63, acquiring a second time domain fusion noise reduction intensity by using the visible color initial time domain noise reduction intensity and the first time domain fusion noise reduction intensity, and assigning the visible color time domain noise reduction intensity as the second time domain fusion noise reduction intensity.
Specifically, the second time domain fusion noise reduction strength may be obtained by:
and taking the sum of the product of the visible color initial time domain noise reduction intensity and the third time domain fusion coefficient and the product of the first time domain fusion noise reduction intensity and the fourth time domain fusion coefficient as the second time domain fusion noise reduction intensity, wherein the sum of the third time domain fusion coefficient and the fourth time domain fusion coefficient is 1.
That is, the second time domain fusion noise reduction intensity gama_fuvisuv is calculated using the following formula:
gama_fuvisuv = gama_orivisuv * n3 + gama_fuvisy * n4
wherein gama_fuvisuv represents the second time domain fusion noise reduction intensity, gama_fuvisy represents the first time domain fusion noise reduction intensity, gama_orivisuv represents the visible color initial time domain noise reduction intensity, and n3 and n4 respectively represent the third time domain fusion coefficient and the fourth time domain fusion coefficient, both ranging from 0 to 1, with n3 + n4 = 1.
Because the first time domain fusion noise reduction intensity is obtained by weighted calculation from the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, this embodiment treats the time domain noise reduction of the visible color channel specially, so that the visible color channel can effectively use the information of the infrared brightness channel and the visible brightness channel at the same time. That is to say, this embodiment uses the infrared image to guide the time domain noise reduction of both the visible brightness channel and the visible color channel, so that the final noise reduction effect is better and the accuracy of the noise reduction algorithm is improved.
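Steps S61 to S63 can be summarized as a short chain of assignments and fusions; the sketch below uses the motion information of the three channels as the initial time domain intensities and assumes externally specified coefficients n1 and n3, with n2 = 1 - n1 and n4 = 1 - n3 (names and defaults are illustrative).

```python
def temporal_strengths(m_nir, m_visy, m_visuv, n1=0.5, n3=0.5):
    gama_nir    = m_nir                                   # step S61: infrared brightness time domain intensity
    gama_fuvisy = n1 * m_nir + (1 - n1) * m_visy          # step S62: first time domain fusion intensity
    gama_visy   = gama_fuvisy                             #           assigned to the visible brightness channel
    gama_visuv  = n3 * m_visuv + (1 - n3) * gama_fuvisy   # step S63: second time domain fusion intensity
    return gama_nir, gama_visy, gama_visuv                #           assigned to the visible color channel
```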
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of an image noise reduction apparatus according to the present application, where the image noise reduction apparatus includes a memory 810 and a processor 820 coupled to each other, where the memory 810 stores program instructions, and the processor 820 can execute the program instructions to implement an image noise reduction method according to any of the above embodiments. For details, reference may be made to the above embodiments, which are not described in detail.
In addition, the present application further provides a computer-readable storage medium, please refer to fig. 9, fig. 9 is a schematic structural diagram of an embodiment of the computer-readable storage medium of the present application, the storage medium 900 stores program instructions 910, and the program instructions 910 can be executed by a processor to implement the image denoising method according to any of the above embodiments. For details, reference may be made to the above embodiments, which are not described in detail.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (13)

1. An image noise reduction method, comprising:
acquiring an infrared image and a visible light image at the current moment, wherein the infrared image comprises an infrared brightness channel, and the visible light image comprises a visible brightness channel and a visible color channel;
and respectively carrying out space domain noise reduction and time domain noise reduction on the infrared image and the visible light image by utilizing signals of the infrared brightness channel, the visible brightness channel and the visible color channel.
2. The image denoising method according to claim 1, wherein the step of performing spatial domain denoising and temporal domain denoising on the infrared image and the visible light image respectively using the signals of the infrared luminance channel, the visible luminance channel, and the visible color channel comprises:
acquiring motion information corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel respectively;
acquiring infrared brightness space domain noise reduction intensity, visible brightness space domain noise reduction intensity and visible color space domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by using the motion information, and acquiring infrared brightness time domain noise reduction intensity, visible brightness time domain noise reduction intensity and visible color time domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by using the motion information;
the infrared brightness space domain noise reduction intensity, the visible brightness space domain noise reduction intensity and the visible color space domain noise reduction intensity are respectively used for carrying out space domain noise reduction on the infrared brightness channel, the visible brightness channel and the visible color channel so as to respectively carry out space domain noise reduction on the infrared image and the visible light image, and the infrared brightness time domain noise reduction intensity, the visible brightness time domain noise reduction intensity and the visible color time domain noise reduction intensity are respectively used for carrying out time domain noise reduction on the infrared brightness channel, the visible brightness channel and the visible color channel so as to respectively carry out time domain noise reduction on the infrared image and the visible light image.
3. The image noise reduction method according to claim 2, wherein the step of obtaining the motion information corresponding to the infrared luminance channel, the visible luminance channel, and the visible color channel respectively comprises:
respectively acquiring a first frame difference image, a second frame difference image and a third frame difference image of a current frame image and a previous frame image of the infrared brightness channel, the visible brightness channel and the visible color channel;
and respectively carrying out mean value filtering processing on the first frame difference image, the second frame difference image and the third frame difference image, and taking an absolute value result as the motion information respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel.
4. The image denoising method according to claim 3, wherein the step of obtaining the infrared luminance spatial domain denoising intensity, the visible luminance spatial domain denoising intensity, and the visible color spatial domain denoising intensity respectively corresponding to the infrared luminance channel, the visible luminance channel, and the visible color channel by using the motion information comprises:
acquiring the corresponding infrared brightness initial spatial domain noise reduction intensity, visible brightness initial spatial domain noise reduction intensity and visible color initial spatial domain noise reduction intensity according to the respective edge information and non-edge information of the infrared brightness channel, the visible brightness channel and the visible color channel;
acquiring infrared brightness differentiated spatial domain noise reduction intensity, visible brightness differentiated spatial domain noise reduction intensity and visible color differentiated spatial domain noise reduction intensity by utilizing the motion information, and acquiring spatial domain fusion noise reduction intensity by utilizing the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity;
obtaining the infrared brightness spatial domain noise reduction intensity by utilizing the infrared brightness initial spatial domain noise reduction intensity and the infrared brightness differentiated spatial domain noise reduction intensity, obtaining the visible brightness spatial domain noise reduction intensity by utilizing the spatial domain fusion noise reduction intensity and the visible brightness differentiated spatial domain noise reduction intensity, and obtaining the visible color spatial domain noise reduction intensity by utilizing the visible color initial spatial domain noise reduction intensity and the visible color differentiated spatial domain noise reduction intensity.
5. The image denoising method according to claim 4, wherein the step of obtaining the infrared luminance-differentiated spatial domain denoising strength, the visible luminance-differentiated spatial domain denoising strength, and the visible color-differentiated spatial domain denoising strength by using the motion information comprises:
judging whether the motion information is smaller than a first threshold value;
if so, assigning the corresponding differential spatial domain noise reduction intensity as a first intensity;
otherwise, further judging whether the motion information is larger than a second threshold value, wherein the second threshold value is larger than the first threshold value;
if so, assigning the corresponding differential airspace noise reduction intensity as a second intensity, wherein the second intensity is greater than the first intensity;
otherwise, assigning the corresponding differential spatial noise reduction intensity as a third intensity, wherein the third intensity is greater than or equal to the first intensity and less than or equal to the second intensity, and is positively correlated with the motion information.
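One way to realize this piecewise assignment; the linear ramp between the two thresholds is only one possible choice of an intensity "positively correlated with the motion information", and all threshold and intensity values are placeholders:

```python
import numpy as np

def differentiated_spatial_strength(motion, thr1, thr2, s1, s2):
    """Claim 5 piecewise mapping, assuming thr1 < thr2 and s1 < s2
    (values chosen by the caller). The ramp between the thresholds is an
    assumption; the claim does not fix the exact curve."""
    motion = np.asarray(motion, dtype=np.float32)
    ramp = s1 + (s2 - s1) * (motion - thr1) / (thr2 - thr1)   # third intensity, within [s1, s2]
    return np.where(motion < thr1, s1,                        # first intensity for small motion
                    np.where(motion > thr2, s2, ramp))        # second intensity for large motion
```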
6. The image noise reduction method according to claim 4, wherein the step of obtaining the spatial domain fusion noise reduction intensity by using the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity comprises:
taking the sum of the product of the infrared brightness initial spatial domain noise reduction intensity and a first spatial domain fusion coefficient and the product of the visible brightness initial spatial domain noise reduction intensity and a second spatial domain fusion coefficient as the spatial domain fusion noise reduction intensity, wherein the sum of the first spatial domain fusion coefficient and the second spatial domain fusion coefficient is 1.
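As a one-line sketch, this is a convex combination of the two initial intensities; the claim does not give the coefficient value, so 0.5 below is an assumption:

```python
def spatial_fusion_strength(ir_initial, vis_initial, alpha=0.5):
    """Claim 6: alpha is the first spatial domain fusion coefficient and
    (1 - alpha) the second; the 0.5 default is only a placeholder."""
    return alpha * ir_initial + (1.0 - alpha) * vis_initial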
7. The image noise reduction method according to claim 6, wherein the step of obtaining the infrared brightness spatial domain noise reduction intensity by using the infrared brightness initial spatial domain noise reduction intensity and the infrared brightness differentiated spatial domain noise reduction intensity, obtaining the visible brightness spatial domain noise reduction intensity by using the spatial domain fusion noise reduction intensity and the visible brightness differentiated spatial domain noise reduction intensity, and obtaining the visible color spatial domain noise reduction intensity by using the visible color initial spatial domain noise reduction intensity and the visible color differentiated spatial domain noise reduction intensity comprises:
obtaining a first product of the infrared brightness differentiated spatial domain noise reduction intensity and a first weight and a second product of the infrared brightness initial spatial domain noise reduction intensity and a second weight, and taking the sum of the first product and the second product as the infrared brightness spatial domain noise reduction intensity, wherein the sum of the first weight and the second weight is 1;
obtaining a third product of the visible brightness differentiated spatial domain noise reduction intensity and a third weight and a fourth product of the spatial domain fusion noise reduction intensity and a fourth weight, and taking the sum of the third product and the fourth product as the visible brightness spatial domain noise reduction intensity, wherein the sum of the third weight and the fourth weight is 1; and
obtaining a fifth product of the visible color differentiated spatial domain noise reduction intensity and a fifth weight and a sixth product of the visible color initial spatial domain noise reduction intensity and a sixth weight, and taking the sum of the fifth product and the sixth product as the visible color spatial domain noise reduction intensity, wherein the sum of the fifth weight and the sixth weight is 1.
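Putting claims 6 and 7 together, each final spatial domain intensity is a weighted sum of a differentiated intensity and an initial (or fused) intensity. A compact sketch; only the pairwise sum-to-1 constraint on the weights comes from the claims, the 0.5 defaults are placeholders:

```python
def final_spatial_strengths(ir_initial, ir_diff,
                            vis_lum_diff, spatial_fusion,
                            vis_col_initial, vis_col_diff,
                            w1=0.5, w3=0.5, w5=0.5):
    """Claim 7: w1, w3, w5 play the roles of the first, third and fifth
    weights; their partners are (1 - w)."""
    ir_spatial      = w1 * ir_diff      + (1 - w1) * ir_initial       # infrared brightness channel
    vis_lum_spatial = w3 * vis_lum_diff + (1 - w3) * spatial_fusion   # visible brightness channel
    vis_col_spatial = w5 * vis_col_diff + (1 - w5) * vis_col_initial  # visible color channel
    return ir_spatial, vis_lum_spatial, vis_col_spatial
```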
8. The image noise reduction method according to claim 3, wherein the step of obtaining the infrared brightness time domain noise reduction intensity, the visible brightness time domain noise reduction intensity and the visible color time domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by using the motion information comprises:
acquiring infrared brightness initial time domain noise reduction intensity, visible brightness initial time domain noise reduction intensity and visible color initial time domain noise reduction intensity which respectively correspond to the infrared brightness channel, the visible brightness channel and the visible color channel by using the motion information, and assigning the infrared brightness time domain noise reduction intensity as the infrared brightness initial time domain noise reduction intensity;
acquiring a first time domain fusion noise reduction intensity by using the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, and assigning the visible brightness time domain noise reduction intensity as the first time domain fusion noise reduction intensity;
and acquiring a second time domain fusion noise reduction intensity by using the visible color initial time domain noise reduction intensity and the first time domain fusion noise reduction intensity, and assigning the visible color time domain noise reduction intensity as the second time domain fusion noise reduction intensity.
9. The image noise reduction method according to claim 8, wherein the step of acquiring the infrared brightness initial time domain noise reduction intensity, the visible brightness initial time domain noise reduction intensity and the visible color initial time domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel by using the motion information comprises:
assigning the infrared brightness initial time domain noise reduction intensity, the visible brightness initial time domain noise reduction intensity and the visible color initial time domain noise reduction intensity to be the motion information respectively corresponding to the infrared brightness channel, the visible brightness channel and the visible color channel.
10. The image noise reduction method according to claim 8, wherein the step of acquiring the first time domain fusion noise reduction intensity by using the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity comprises:
and taking the sum of the product of the infrared brightness initial time domain noise reduction intensity and the first time domain fusion coefficient and the product of the visible brightness initial time domain noise reduction intensity and the second time domain fusion coefficient as the first time domain fusion noise reduction intensity, wherein the sum of the first time domain fusion coefficient and the second time domain fusion coefficient is 1.
11. The image noise reduction method according to claim 10, wherein the step of acquiring the second time domain fusion noise reduction intensity by using the visible color initial time domain noise reduction intensity and the first time domain fusion noise reduction intensity comprises:
and taking the sum of the product of the visible color initial time domain noise reduction intensity and a third time domain fusion coefficient and the product of the first time domain fusion noise reduction intensity and a fourth time domain fusion coefficient as the second time domain fusion noise reduction intensity, wherein the sum of the third time domain fusion coefficient and the fourth time domain fusion coefficient is 1.
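Claims 8 to 11 chain the time domain intensities: the infrared channel keeps its initial intensity, the visible brightness channel takes a convex combination with the infrared one, and the visible color channel takes a further convex combination with that result. A compact sketch, with all fusion coefficients as assumed placeholders:

```python
def temporal_strengths(ir_motion, vis_lum_motion, vis_col_motion,
                       t1=0.5, t3=0.5):
    """Claims 8-11: initial time domain intensities equal the per-channel
    motion information (claim 9); t1 / (1 - t1) act as the first / second
    time domain fusion coefficients (claim 10) and t3 / (1 - t3) as the
    third / fourth ones (claim 11). The 0.5 defaults are assumptions."""
    ir_temporal = ir_motion                                         # claim 8: infrared keeps its initial intensity
    first_fusion = t1 * ir_motion + (1 - t1) * vis_lum_motion       # claim 10: first time domain fusion intensity
    vis_lum_temporal = first_fusion                                  # claim 8: visible brightness intensity
    second_fusion = t3 * vis_col_motion + (1 - t3) * first_fusion   # claim 11: second time domain fusion intensity
    vis_col_temporal = second_fusion                                 # claim 8: visible color intensity
    return ir_temporal, vis_lum_temporal, vis_col_temporal
```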
12. An image noise reduction apparatus, comprising:
a memory and a processor coupled to each other;
wherein the memory stores program instructions executable by the processor to implement the image noise reduction method of any one of claims 1 to 11.
13. A computer-readable storage medium, wherein the storage medium stores program instructions executable by a processor to implement the image noise reduction method according to any one of claims 1 to 11.
CN202011290556.0A 2020-11-17 2020-11-17 Image noise reduction method and device and storage medium Pending CN112435183A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011290556.0A CN112435183A (en) 2020-11-17 2020-11-17 Image noise reduction method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011290556.0A CN112435183A (en) 2020-11-17 2020-11-17 Image noise reduction method and device and storage medium

Publications (1)

Publication Number Publication Date
CN112435183A true CN112435183A (en) 2021-03-02

Family

ID=74692697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011290556.0A Pending CN112435183A (en) 2020-11-17 2020-11-17 Image noise reduction method and device and storage medium

Country Status (1)

Country Link
CN (1) CN112435183A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140301662A1 (en) * 2013-03-17 2014-10-09 ISC8 Inc. Analysis, Labeling and Exploitation of Sensor Data in Real Time
CN103345735A (en) * 2013-07-16 2013-10-09 上海交通大学 Compressed space-time multi-sensor fusion tracking method based on Kalman filter
CN106952245A (en) * 2017-03-07 2017-07-14 深圳职业技术学院 A kind of processing method and system for visible images of taking photo by plane
CN107945149A (en) * 2017-12-21 2018-04-20 西安工业大学 Strengthen the auto Anti-Blooming Method of IHS Curvelet conversion fusion visible ray and infrared image
CN110490811A (en) * 2019-05-31 2019-11-22 杭州海康威视数字技术股份有限公司 Image noise reduction apparatus and image denoising method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PENG HAI: "Research on fusion methods for infrared and visible light images", China Master's Theses Full-text Database, Information Science and Technology, no. 7, 15 July 2012 (2012-07-15) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538255A (en) * 2021-05-31 2021-10-22 浙江大华技术股份有限公司 Motion fusion noise reduction method and device and computer readable storage medium
CN114674817A (en) * 2022-05-30 2022-06-28 秦皇岛水熊科技有限公司 Colorimetric value signal denoising and smoothing processing method of spectral titration method

Similar Documents

Publication Publication Date Title
Zhang et al. Enhancing underwater image via color correction and bi-interval contrast enhancement
US11108970B2 (en) Flicker mitigation via image signal processing
US8699818B2 (en) Method, system, and program for determining image quality based on pixel changes between image frames
Dong et al. Underwater image enhancement via integrated RGB and LAB color models
Jin et al. Quaternion-based impulse noise removal from color video sequences
JP2015188234A (en) Depth estimation based on global motion
CN108174057B (en) Method and device for rapidly reducing noise of picture by utilizing video image inter-frame difference
JP7124037B2 (en) Method, computer program product, apparatus and frequency modulated continuous wave radar system
CN112435183A (en) Image noise reduction method and device and storage medium
JP2000050109A (en) Nonlinear image filter for removing noise
CN111311524B (en) MSR-based high dynamic range video generation method
CN106412383A (en) Processing method and apparatus of video image
Mathias et al. Underwater image restoration based on diffraction bounded optimization algorithm with dark channel prior
CN110246088B (en) Image brightness noise reduction method based on wavelet transformation and image noise reduction system thereof
WO2021238655A1 (en) Image processing method and apparatus, storage medium and terminal
WO2023273868A1 (en) Image denoising method and apparatus, terminal, and storage medium
KR20050007106A (en) Methods and apparatus for adaptive reduction of ringing artifacts
CN109523474A (en) A kind of enhancement method of low-illumination image based on greasy weather degradation model
CN111539895B (en) Video denoising method and device, mobile terminal and storage medium
Jeon et al. Low-light image enhancement using inverted image normalized by atmospheric light
JP4932910B2 (en) Method and system for reducing mosquito noise in digital images
CN112446889A (en) Medical video reading method based on ultrasound
Ponomaryov et al. Fuzzy color video filtering technique for sequences corrupted by additive Gaussian noise
Gao et al. Single image haze removal algorithm using pixel-based airlight constraints
CN116468636A (en) Low-illumination enhancement method, device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination