CN115334294B - Video noise reduction method of local self-adaptive force - Google Patents

Video noise reduction method of local self-adaptive force

Info

Publication number
CN115334294B
CN115334294B CN202210775955.9A
Authority
CN
China
Prior art keywords
local
bandwidth
noise reduction
curve
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210775955.9A
Other languages
Chinese (zh)
Other versions
CN115334294A (en)
Inventor
凌毅
范益波
曾晓洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fudan University
Original Assignee
Fudan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fudan University filed Critical Fudan University
Priority to CN202210775955.9A priority Critical patent/CN115334294B/en
Publication of CN115334294A publication Critical patent/CN115334294A/en
Application granted granted Critical
Publication of CN115334294B publication Critical patent/CN115334294B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/646Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Picture Signal Circuits (AREA)

Abstract

The invention belongs to the technical field of image signal processing and specifically relates to a video noise reduction method with locally adaptive strength. The method comprises the following steps: generating a local gain map; generating a local bandwidth by matching the local brightness against the BW0 and BW1 bandwidth curves; using the local bandwidth as the 3 dB bandwidth control point of a similarity-to-weight curve to obtain the final weight values; and finally weighting, clipping, and fusing with the original image to obtain the final noise reduction result. The invention accurately derives a local gain map and, matched with brightness-noise curves, a local bandwidth, so that the noise model at each stage affected by digital gain is modeled at its source and a different noise reduction strategy is applied to each pixel of the same frame, achieving a more accurate and better local noise reduction effect.

Description

Video noise reduction method of local self-adaptive force
Technical Field
The invention belongs to the technical field of image signal processing and specifically relates to a video noise reduction method with locally adaptive strength.
Background
The raw data output by an image sensor contains various kinds of noise, which can be classified into random noise and fixed-pattern noise: random noise is independent of time and position, while fixed-pattern noise appears at fixed positions in the image because of inconsistent pixel characteristics. This noise degrades image quality and also limits the sensitivity of the image sensor. Different noise types call for different noise reduction methods; the main video noise reduction techniques include spatial-domain, temporal-domain, and transform-domain noise reduction.
Video noise reduction is commonly performed in the Bayer domain or the YUV domain, and the Bayer domain generally uses spatial-domain noise reduction. Bayer-domain noise reduction is typically placed in the pipeline before modules that leave the Bayer linear domain, such as tone mapping. The Bayer domain preserves the original noise model to the greatest extent, so a better noise reduction effect can be achieved there; at the same time, the cost of noise reduction in the Bayer domain is higher and the technique is more difficult. Moreover, the Bayer linear domain is not completely linear: various digital gains, which often vary from region to region and from scene to scene, distort the noise distribution to different degrees. Common Bayer-domain noise reduction algorithms, such as those based on bilateral filtering or the traditional non-local means algorithm, offer no solution to this problem.
The invention provides a video noise reduction method with locally adaptive strength, which replaces the traditional fitting approach by modeling, at its source, the various noise models distorted by digital gain and applying a different noise reduction strategy to each pixel within the same frame. The factors that influence the noise characteristics are taken as input and a local noise reduction bandwidth is output; this bandwidth, combined with a similarity-to-weight curve, determines the weight with which the current pixel participates in noise reduction and thus its final noise reduction effect, realizing locally adaptive noise reduction strength.
Disclosure of Invention
The invention aims to provide a video noise reduction method with locally adaptive strength that addresses the problem of noise model distortion in Bayer-domain linear noise reduction.
The method replaces the traditional fitting approach with locally adaptive strength, so that the various noise models distorted by digital gain are modeled at their source and different noise reduction strategies are applied to different pixels within the same frame. The factors influencing the noise characteristics are taken as input and a local noise reduction bandwidth is output; combined with a similarity-to-weight curve, this bandwidth determines the weight with which the current pixel participates in noise reduction and thus its final noise reduction effect, realizing locally adaptive noise reduction.
The video noise reduction method with locally adaptive strength provided by the invention comprises the following specific steps:
(1) Generating a local gain map;
(2) Generating a local bandwidth by matching the local brightness against the BW0 and BW1 bandwidth curves;
(3) Using the local bandwidth as the 3 dB bandwidth control point of a similarity-to-weight curve to obtain the final weight values;
(4) Finally weighting, clipping, and fusing with the original image to obtain the final noise reduction result.
The local gain map of step (1) is generated as follows: the digital gain of automatic exposure control, the automatic white balance channel gain, the lens shading correction channel gain, the channel gains used in high dynamic range fusion (including motion and overexposed regions), and other digital gain signals are multiplied together point by point. The gathering process requires line buffering, synchronization, and similar functions so that every pixel is fully aligned; the result is the local gain map.
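As a sketch, the point-by-point fusion described above amounts to an elementwise product of aligned per-pixel gain maps. The function and argument names below are illustrative, not prescribed by the patent:

```python
import numpy as np

def build_local_gain_map(gain_maps):
    """Fuse per-pixel digital gain maps into one local gain map.

    `gain_maps` is an iterable of HxW arrays (e.g. AE digital gain,
    AWB channel gain, LSC channel gain, HDR-fusion channel gain),
    assumed already aligned pixel-for-pixel by the line-buffer and
    synchronization logic mentioned in the text.
    """
    gain_maps = list(gain_maps)
    out = np.ones_like(gain_maps[0], dtype=np.float64)
    for g in gain_maps:
        out *= g  # point-by-point multiplication
    return out
```

Each output entry is the total digital gain seen by that pixel, which is what later stages use to look up the local noise bandwidth.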
Step (2) generates the local bandwidth by matching the local brightness against the BW0 and BW1 bandwidth curves, as follows:
(2.1) Pixels of the same channel are extracted from the original Bayer image, a 7x7 brightness window is taken around the current pixel, and the brightness of the center pixel, i.e. the local brightness, is obtained with a low-pass filter;
(2.2) The local gain map obtained in step (1) is sent to the local bandwidth 0 curve (BW0) module (fig. 3), and the local brightness divided by the local gain map is sent to the local bandwidth 1 curve (BW1) module (fig. 4). The local bandwidth 0 curve (BW0) takes the local gain as input and outputs Local_BW0; it is a 16-segment fitted curve (fig. 3) whose control-point coefficients are calibrated per image sensor (CMOS sensor), so that different gains correspond to different noise reduction bandwidths. The local bandwidth 1 curve (BW1) takes the original pixel brightness as input (the original brightness is the local brightness divided by the local gain) and outputs Local_BW1; it is likewise a 16-segment fitted curve (fig. 4) whose control-point coefficients are calibrated per image sensor, so that different sensed brightnesses correspond to different noise reduction bandwidths. Finally, local bandwidth 0 is multiplied by local bandwidth 1 to obtain the final local bandwidth.
In step (3), the smaller the similarity value, the larger the corresponding weight should be. The local bandwidth is used as the 3 dB bandwidth control point, and the final weight values are obtained through the similarity-to-weight curve (fig. 5). The specific flow is as follows: the similarity between each point and the center point within the 7x7 window centered on the current pixel is recorded in a table S[7,7], giving 49 similarity values; these are converted into 49 corresponding weight values, recorded in a weight table W[7,7]. Specifically, the conversion is fitted with the low-pass filter shown in fig. 5, whose 3 dB bandwidth is controlled by the local bandwidth obtained in step (2) and is divided into 9 steps. In this way each pixel gets its own noise reduction bandwidth, i.e. the noise reduction strength most suitable for its own situation. According to the similarity-to-weight curve, all 49 points of S[7,7] are converted into the W[7,7] weight table, as shown in fig. 6. The weights of the 7x7 neighboring pixels needed for the noise reduction weighting are now available, and the noise reduction of the current pixel can begin.
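The patent does not give the exact shape of the similarity-to-weight curve, only its behavior: maximum weight at similarity 0, half weight at the 3 dB control point, decaying beyond. As a hedged illustration, a Gaussian-shaped low-pass response with its half-power point pinned to the local bandwidth reproduces that behavior:

```python
import numpy as np

def similarity_to_weight(sim, bw3db):
    """Map a similarity (difference) value to a weight.

    Modeled here as a Gaussian-shaped response whose half-power
    (3 dB, weight = 0.5) point sits at sim == bw3db. The actual
    curve in the patent is a calibrated fit quantized into 9
    steps; this closed form is an illustrative assumption.
    """
    sim = np.asarray(sim, dtype=np.float64)
    return 0.5 ** ((sim / bw3db) ** 2)  # weight(bw3db) == 0.5
```

A larger local bandwidth widens the curve, so more dissimilar neighbors still receive significant weight, i.e. stronger noise reduction.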
Step (4) performs weighting, clipping, and fusion with the original image to obtain the final noise reduction result. Given the 7x7 weight table W[7,7] generated in step (3), the noise reduction process is relatively simple: the final output is obtained by weighting, clipping, and fusing with the original image, as shown in fig. 7. The specific formulas are as follows:
NR_o = Clip( (Σ_{i=0}^{48} W[i] · Raw[i]) / (Σ_{i=0}^{48} W[i]) );
Fusion_o = NR_o · alpha + Raw[24] · (1 − alpha), where alpha ∈ [0, 1];
where NR_o is the noise-reduced result for the current pixel, and Fusion_o is the output after fusing the noise-reduced result with the original pixel before noise reduction; Clip denotes a clamping operation in which values below 0 are limited to 0 and values above 4095 are limited to 4095; Raw[i] denotes the 7×7 raw data centered on the current pixel (49 samples, with Raw[24] the center pixel), and W[i] denotes the corresponding weights of the 7×7 pixels centered on the current pixel, i.e. W[7,7].
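The weighting, clipping, and fusion above can be sketched directly from the formulas, for a single pixel's 7x7 window (function and argument names are illustrative):

```python
import numpy as np

def denoise_pixel(raw49, w49, alpha, clip_max=4095):
    """Weighted-average noise reduction for one pixel.

    raw49: the 49 same-channel samples of the 7x7 window, with
    raw49[24] the center pixel; w49: the weights from the
    similarity-to-weight step; alpha in [0, 1] blends the denoised
    value with the original. Mirrors the NR_o / Fusion_o formulas.
    """
    raw49 = np.asarray(raw49, dtype=np.float64)
    w49 = np.asarray(w49, dtype=np.float64)
    # NR_o: normalized weighted sum, clamped to the 12-bit range
    nr_o = np.clip(np.dot(w49, raw49) / w49.sum(), 0, clip_max)
    # Fusion_o: alpha-blend with the original center pixel
    return nr_o * alpha + raw49[24] * (1.0 - alpha)
```

With alpha = 1 the output is the fully denoised value; with alpha = 0 the original pixel passes through unchanged.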
The invention has the following benefits: the video noise reduction method with locally adaptive strength accurately derives a local gain map and, matched with brightness-noise curves, a local bandwidth, modeling at its source the noise model at each stage affected by digital gain; by adapting to each pixel of the same frame with a different noise reduction strategy, it achieves a more accurate and better local noise reduction effect.
Drawings
Fig. 1 is a flow chart of local bandwidth generation for locally adaptive noise reduction strength.
Fig. 2 shows the generation of the local gain map.
Fig. 3 is the local bandwidth 0 curve.
Fig. 4 is the local bandwidth 1 curve.
Fig. 5 is the similarity-to-weight curve.
Fig. 6 shows the generation of the 7x7 weight table.
Fig. 7 shows the noise reduction fusion process.
Fig. 8 illustrates the position of the method of the present invention in a video noise reduction reference flow.
Detailed Description
The locally adaptive strength noise reduction method of the present invention is embedded in a concrete noise reduction algorithm, for example as the local gain and weight calculation part of the video noise reduction module RawNR (raw-image noise reduction), as shown in fig. 8. A raw video image is input, and the fused, noise-reduced video is output. The specific embodiments of the invention are as follows:
Fig. 1 shows local bandwidth generation. First the local gain map must be generated; this is done in the local gain module, which integrates the various digital gain signals: the digital gain of automatic exposure control, the automatic white balance channel gain, the lens shading correction channel gain, the channel gains used in high dynamic range fusion, and so on. The integration process uses line buffering, synchronization, and similar functions to align every pixel exactly, finally producing the local gain map. Meanwhile, the 7x7 brightness window delivered by the channel window yields the local brightness of its center through a low-pass filter; the local gain map is sent to the local bandwidth 0 curve module, and the local brightness divided by the local gain map is sent to the local bandwidth 1 curve module. The local bandwidth 0 curve outputs local bandwidth 0 from the input local gain via a 16-segment fitted curve, calibrated per sensor, with different gains corresponding to different noise reduction bandwidths. The local bandwidth 1 curve outputs local bandwidth 1 from the original brightness (the input local brightness divided by the local gain), also via a 16-segment fitted curve calibrated per sensor, with different sensed brightnesses corresponding to different noise reduction bandwidths. Finally, local bandwidth 0 is multiplied by local bandwidth 1 to obtain the final local bandwidth.
Fig. 2 details the generation of the local gain map. A typical pipeline places WBG (white balance gain), LSC (lens shading correction), and HDR Fusion (high dynamic range fusion) modules before RawNR: WBG includes the global automatic white balance gain and the automatic exposure digital gain; LSC contributes local channel gains correcting luminance shading and color shading; and HDR Fusion applies different gains depending on whether long, medium, or short exposures are selected for each region during fusion. All of these gains ultimately change the noise characteristics of each pixel. The local gain map fuses all of the relevant digital gains together into one complete map, accurate to each pixel.
Fig. 3 is the local bandwidth 0 curve: with the local gain as input, it is fitted as a 16-segment broken line calibrated for the specific image sensor, and different local gains correspond to different values of local bandwidth 0 (noise intensity).
Fig. 4 is the local bandwidth 1 curve: with the original local brightness (the local brightness divided by the local gain) as input, it is fitted as a 16-segment broken line calibrated for the specific image sensor, and different local original brightnesses correspond to different values of local bandwidth 1 (noise intensity).
The outputs of the two curves are multiplied to obtain the final local bandwidth, which is fed as a parameter into the similarity-to-weight curve, as shown in fig. 5.
The smaller the similarity value S, the larger the corresponding weight should be. The conversion is fitted with the low-pass filter described in fig. 5, divided into 9 steps, whose 3 dB bandwidth is controlled by the local bandwidth obtained above. In this way each pixel gets its own noise reduction bandwidth, i.e. the noise reduction strength most suitable for its own situation.
According to the similarity-to-weight curve, all 49 points of S[7,7] are converted into the W[7,7] weight table, as shown in fig. 6. The weights of the 7x7 neighboring pixels needed for the noise reduction weighting are now available, and the noise reduction of the current pixel can begin.
The 7x7 weighted noise reduction process itself is relatively simple: the final noise-reduced output is obtained by weighting, clipping, and fusing with the original image.

Claims (1)

1. A video noise reduction method with locally adaptive strength, characterized by comprising the following specific steps:
(1) generating a local gain map;
(2) generating a local bandwidth by matching the local brightness against the BW0 and BW1 bandwidth curves;
(3) using the local bandwidth as the 3 dB bandwidth control point and obtaining the weight values through a similarity-to-weight curve;
(4) finally weighting, clipping, and fusing with the original image to obtain the final noise reduction result;
in step (1), the local gain map is generated as follows: the digital gain of automatic exposure control, the automatic white balance channel gain, the lens shading correction channel gain, and the channel gains used in high dynamic range fusion are multiplied together point by point; the gathering process buffers and synchronizes the signals so that every pixel is fully aligned, finally producing the local gain map;
in step (2), the BW0 bandwidth curve is also called the local bandwidth 0 curve, and the BW1 bandwidth curve is also called the local bandwidth 1 curve; the specific flow is as follows:
(2.1) pixels of the same channel are extracted from the original Bayer image, and the brightness of the center pixel, i.e. the local brightness, is obtained from the 7x7 brightness window centered on the current pixel with a low-pass filter;
(2.2) the local gain map obtained in step (1) is sent to the local bandwidth 0 curve module, and the local brightness divided by the local gain map is sent to the local bandwidth 1 curve module;
the local bandwidth 0 curve takes the local gain as input and outputs Local_BW0; it is a 16-segment fitted curve whose control-point coefficients are calibrated per image sensor, so that different gains correspond to different noise reduction bandwidths;
the local bandwidth 1 curve takes the original pixel brightness as input, the original brightness being the local brightness divided by the local gain, and outputs Local_BW1; it is a 16-segment fitted curve whose control-point coefficients are calibrated per image sensor, so that different sensed brightnesses correspond to different noise reduction bandwidths;
finally, local bandwidth 0 is multiplied by local bandwidth 1 to obtain the final local bandwidth;
in step (3), the smaller the similarity value, the larger the corresponding weight; the local bandwidth is used as the 3 dB bandwidth control point, and the final weight values are obtained through the similarity-to-weight curve; the specific flow is as follows: the similarity between each point and the center point within the 7x7 window centered on the current pixel is recorded in a table S[7,7] of 49 similarity values, which is converted into a corresponding weight table W[7,7]; the conversion is fitted with a low-pass filter divided into 9 steps, whose 3 dB bandwidth is controlled by the local bandwidth obtained in step (2), so that each pixel gets its own noise reduction bandwidth and thus the noise reduction strength most suitable for its own situation; according to the similarity-to-weight curve, the 49 points of S[7,7] are converted into the W[7,7] weight table, yielding the weights of the 7x7 neighboring pixels required for the noise reduction weighting;
in step (4), weighting, clipping, and fusion with the original image yield the final noise reduction result, according to the following formulas:
NR_o = Clip( (Σ_{i=0}^{48} W[i] · Raw[i]) / (Σ_{i=0}^{48} W[i]) );
Fusion_o = NR_o · alpha + Raw[24] · (1 − alpha), where alpha ∈ [0, 1];
where NR_o is the noise-reduced result for the current pixel, and Fusion_o is the output after fusing the noise-reduced result with the original pixel before noise reduction; Clip denotes a clamping operation in which values below 0 are limited to 0 and values above 4095 are limited to 4095; Raw[i] denotes the 7×7 raw data centered on the current pixel (49 samples, with Raw[24] the center pixel), and W[i] denotes the corresponding weights of the 7×7 pixels centered on the current pixel, i.e. W[7,7].
CN202210775955.9A 2022-07-02 2022-07-02 Video noise reduction method of local self-adaptive force Active CN115334294B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210775955.9A CN115334294B (en) 2022-07-02 2022-07-02 Video noise reduction method of local self-adaptive force

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210775955.9A CN115334294B (en) 2022-07-02 2022-07-02 Video noise reduction method of local self-adaptive force

Publications (2)

Publication Number Publication Date
CN115334294A CN115334294A (en) 2022-11-11
CN115334294B true CN115334294B (en) 2023-06-20

Family

ID=83916985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210775955.9A Active CN115334294B (en) 2022-07-02 2022-07-02 Video noise reduction method of local self-adaptive force

Country Status (1)

Country Link
CN (1) CN115334294B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115003A (en) * 2023-02-15 2023-11-24 荣耀终端有限公司 Method and device for removing noise

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106339989A (en) * 2016-08-13 2017-01-18 浙江莱达信息技术有限公司 Medical X-ray image adaptive noise reduction method
CN112819721A (en) * 2021-02-04 2021-05-18 湖南兴芯微电子科技有限公司 Method and system for reducing noise of image color noise

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9148580B2 (en) * 2013-07-16 2015-09-29 Texas Instruments Incorporated Transforming wide dynamic range images to reduced dynamic range images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106339989A (en) * 2016-08-13 2017-01-18 浙江莱达信息技术有限公司 Medical X-ray image adaptive noise reduction method
CN112819721A (en) * 2021-02-04 2021-05-18 湖南兴芯微电子科技有限公司 Method and system for reducing noise of image color noise

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Video noise reduction algorithm using background extraction and adaptive filtering; Cui Jianwei; Gu Yuantao; Tang Kun; Video Engineering (Supplement No. S2); full text *

Also Published As

Publication number Publication date
CN115334294A (en) 2022-11-11

Similar Documents

Publication Publication Date Title
US11849224B2 (en) Global tone mapping
CN110378859B (en) Novel high dynamic range image generation method
US8144214B2 (en) Imaging apparatus, imaging method, integrated circuit, and storage medium
US8169500B2 (en) Dynamic range compression apparatus, dynamic range compression method, computer-readable recording medium, integrated circuit, and imaging apparatus
JP6083897B2 (en) Imaging apparatus and image signal processing apparatus
CN110033418B (en) Image processing method, image processing device, storage medium and electronic equipment
CN110022469B (en) Image processing method, image processing device, storage medium and electronic equipment
CN102340673B (en) White balance method for video camera aiming at traffic scene
KR20030097687A (en) Image processing apparatus, camera apparatus, and automatic exposure control method
WO2019104047A1 (en) Global tone mapping
US20180197282A1 (en) Method and device for producing a digital image
CN111586310B (en) Real-time high-dynamic imaging method and imaging system
CN115334294B (en) Video noise reduction method of local self-adaptive force
JP2011041056A (en) Imaging apparatus and imaging method
CN110572585B (en) Image processing method, image processing device, storage medium and electronic equipment
CN114862698A (en) Method and device for correcting real overexposure image based on channel guidance
CN114785995A (en) Automatic white balance implementation method based on FPGA
US9013605B2 (en) Apparatus and method for processing intensity of image in digital camera
JP3652902B2 (en) White balance adjustment device
JP6423668B2 (en) Image processing apparatus, control method therefor, and program
CN112714301A (en) Dual-mode image signal processor and image sensor
CN116723413A (en) RAW domain image denoising method and shooting device
CN103916594A (en) Imaging apparatus and imaging method
CN102318333A (en) Image processing device, method, and program
TW202011729A (en) Video recording device and video operation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant