CN110796614A - Image processing method, image processing apparatus, and computer-readable storage medium - Google Patents

Image processing method, image processing apparatus, and computer-readable storage medium

Info

Publication number
CN110796614A
Authority
CN
China
Prior art keywords
image
current frame
target
video image
frame video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910994775.8A
Other languages
Chinese (zh)
Other versions
CN110796614B (en)
Inventor
李志成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910994775.8A priority Critical patent/CN110796614B/en
Publication of CN110796614A publication Critical patent/CN110796614A/en
Application granted granted Critical
Publication of CN110796614B publication Critical patent/CN110796614B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application discloses an image processing method, an image processing device and a computer readable storage medium, wherein the method comprises the following steps: acquiring a current frame video image, and performing image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image; if the noise intensity of the target image is greater than or equal to the preset image noise intensity, starting an image denoising function, and determining a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity; and executing image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising. The embodiment of the application is beneficial to improving the denoising effect of the video image.

Description

Image processing method, image processing apparatus, and computer-readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, and a computer-readable storage medium.
Background
At present, the process of denoising a video image is as follows: a video image denoising device acquires a current frame video image; the video image denoising device executes image denoising operation on the current frame video image by using the fixed image denoising parameters to obtain the current frame video image after image denoising. This approach leads to an unsatisfactory de-noising effect of the video image.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device and a computer readable storage medium, which are used for improving the denoising effect of a video image.
In a first aspect, an embodiment of the present application provides a video image processing method, including:
acquiring a current frame video image, and performing image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image;
if the target image noise intensity is larger than or equal to the preset image noise intensity, starting an image denoising function, and determining a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity;
and executing image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
In a second aspect, an embodiment of the present application provides a video image processing apparatus, including:
the acquisition unit is used for acquiring a current frame video image;
the calculating unit is used for executing image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image;
the switch unit is used for starting an image denoising function if the noise intensity of the target image is greater than or equal to the preset image noise intensity;
the determining unit is used for determining a target image denoising parameter required for executing image denoising operation on the current frame video image according to the target image noise intensity;
and the denoising unit is used for executing image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
In a third aspect, an embodiment of the present application provides a video image processing apparatus, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and where the program includes instructions for performing some or all of the steps of the method according to the first aspect of the embodiment of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium for storing a computer program, where the computer program is executed by a processor to implement some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in a method as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
In the embodiment of the present application, if the target image noise intensity of the current frame video image is greater than or equal to the preset image noise intensity, the image denoising function is enabled and the image denoising operation is performed on the current frame video image using target image denoising parameters determined according to the target image noise intensity, so as to obtain the current frame video image after image denoising. Compared with an approach that ignores the image noise intensity of the current frame video image and directly performs the image denoising operation with fixed image denoising parameters, this helps improve the denoising effect of the video image.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1A is a schematic flowchart of a first video image processing method according to an embodiment of the present application;
fig. 1B is a schematic diagram illustrating a color space value calculation of a pixel point according to an embodiment of the present disclosure;
fig. 1C is a schematic diagram of color space value calculation of another pixel point according to the embodiment of the present application;
FIG. 1D is a diagram illustrating a computation of a target convolution kernel based on a Laplace mask according to an embodiment of the present application;
fig. 1E is a schematic diagram of a current frame video image before and after image denoising according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a second video image processing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a third video image processing method according to an embodiment of the present application;
fig. 4 is a block diagram of functional units of a video image processing apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a video image processing apparatus according to an embodiment of the present application.
Detailed description of the invention
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic flowchart of a first video image processing method according to an embodiment of the present application, where the video image processing method includes steps 101-103, which are as follows:
101: the video image processing device acquires a current frame video image, and executes image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image.
Here, image noise refers to factors in an image that interfere with human perception and understanding of the information the image conveys.
In one possible example, the video image processing apparatus performs an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, and includes:
the video image processing device executes color space value calculation operation on the current frame video image to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image;
the video image processing device executes image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image;
the video image processing device executes color space value variance calculation operation on a plurality of color space values corresponding to each subframe video image block in the plurality of subframe video image blocks to obtain a plurality of target variances, wherein the plurality of target variances are in one-to-one correspondence with the plurality of subframe video image blocks;
the video image processing device determines the average value of the target variances as the target image noise intensity corresponding to the current frame video image.
Further, firstly, the video image processing device executes image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image; then, the video image processing device performs color space value calculation operation on the plurality of sub-frame video image blocks to obtain a plurality of color space values corresponding to a plurality of pixel points included in each sub-frame video image block in the plurality of sub-frame video image blocks, wherein the plurality of color space values are in one-to-one correspondence with the plurality of pixel points.
The color space value may be a gray value or a chrominance value.
Specifically, the video image processing apparatus performs a color space value calculation operation on the current frame video image, and an embodiment of obtaining a color space value of each pixel point of a plurality of pixel points included in the current frame video image may be:
the video image processing device executes image resolution acquisition operation on the current frame video image to obtain a target image resolution corresponding to the current frame video image;
if the resolution of the target image is greater than or equal to the preset image resolution, the video image processing device executes color space value calculation operation on the current frame video image in a first color space value calculation mode to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image;
and if the resolution of the target image is smaller than the preset image resolution, the video image processing device executes color space value calculation operation on the current frame video image in a second color space value calculation mode to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image.
Specifically, the video image processing apparatus performs an image resolution obtaining operation on the current frame video image, and an embodiment of obtaining a target image resolution corresponding to the current frame video image may be as follows:
the video image processing device obtains the number of target pixel points corresponding to the current frame video image;
the video image processing device obtains the size of a target image corresponding to the video image of the current frame;
and the video image processing device determines the ratio of the number of the target pixel points to the size of the target image as the resolution of the target image corresponding to the video image of the current frame.
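For illustration only, the following minimal sketch shows this mode selection; the resolution unit is an assumption, since the patent only defines the resolution as the ratio of the pixel count to the image size.

```python
def select_color_space_mode(num_pixels, image_size, preset_resolution):
    """Pick the color space value calculation mode from the image resolution.

    The target image resolution is the ratio of the pixel count to the image
    size (the size unit, e.g. square inches, is assumed here).
    Returns True for the first (high-resolution) mode, False for the second.
    """
    target_resolution = num_pixels / image_size
    return target_resolution >= preset_resolution
```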
Specifically, the video image processing apparatus performs the color space value calculation operation on the current frame video image in the first color space value calculation manner, and an implementation manner of obtaining the color space value of each pixel point of the plurality of pixel points included in the current frame video image may be:
the video image processing device obtains an initial color space value of each pixel point in N pixel points included in a current frame video image, wherein N is an integer greater than 1;
the video image processing device determines the average value of the initial color space values of pixel points A1-A9 as the color space value of pixel point A5, where pixel point A5 is any one of the N pixel points; pixel point A1 is located at the (m-1)th row and (n-1)th column, pixel point A2 at the (m-1)th row and nth column, pixel point A3 at the (m-1)th row and (n+1)th column, pixel point A4 at the mth row and (n-1)th column, pixel point A5 at the mth row and nth column, pixel point A6 at the mth row and (n+1)th column, pixel point A7 at the (m+1)th row and (n-1)th column, pixel point A8 at the (m+1)th row and nth column, and pixel point A9 at the (m+1)th row and (n+1)th column;
the video image processing device performs the same operation on (N-1) pixels except for the pixel A5 among the N pixels to obtain the color space value of each pixel among the (N-1) pixels.
For example, as shown in fig. 1B, fig. 1B is a schematic diagram of color space value calculation of a pixel point provided in this embodiment of the application. The 9 pixel points are A1-A9, the initial color space value of pixel point A1 is a1, that of A2 is a2, that of A3 is a3, that of A4 is a4, that of A5 is a5, that of A6 is a6, that of A7 is a7, that of A8 is a8, and that of A9 is a9, and the color space value of pixel point A5 is (a1 + a2 + a3 + a4 + a5 + a6 + a7 + a8 + a9)/9.
Specifically, the video image processing apparatus performs the color space value calculation operation on the current frame video image in the second color space value calculation manner, and an implementation manner of obtaining the color space value of each pixel point of the plurality of pixel points included in the current frame video image may be:
the video image processing device obtains an initial color space value of each pixel point in N pixel points included in a current frame video image, wherein N is an integer greater than 1;
the video image processing device determines the average value of the initial color space values of pixel points B2, B4, B5, B6 and B8 as the color space value of pixel point B5, where pixel point B5 is any one of the N pixel points; pixel point B2 is located at the (p-1)th row and qth column, pixel point B4 at the pth row and (q-1)th column, pixel point B5 at the pth row and qth column, pixel point B6 at the pth row and (q+1)th column, and pixel point B8 at the (p+1)th row and qth column;
the video image processing device performs the same operation on (N-1) pixels except for the pixel B5 among the N pixels to obtain the color space value of each pixel among the (N-1) pixels.
For example, as shown in fig. 1C, fig. 1C is a schematic diagram of color space value calculation of another pixel point provided in this embodiment of the application. The pixel points shown are B1-B9, among which B2, B4, B5, B6 and B8 are used; the initial color space value of pixel point B1 is b1, that of B2 is b2, that of B3 is b3, that of B4 is b4, that of B5 is b5, that of B6 is b6, that of B7 is b7, that of B8 is b8, and that of B9 is b9, and the color space value of pixel point B5 is (b2 + b4 + b5 + b6 + b8)/5.
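The two calculation modes amount to small mean filters. The numpy sketch below is not from the patent; the function name and the 'nearest' border handling are assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def color_space_values(initial_values, high_resolution):
    """Smooth per-pixel initial color space values (e.g. gray values).

    high_resolution=True applies the first mode (mean of the 3x3 neighborhood
    A1-A9); False applies the second mode (mean of the cross B2, B4, B5, B6,
    B8). Border handling ('nearest') is an assumption, not from the patent.
    """
    if high_resolution:
        kernel = np.ones((3, 3), dtype=np.float64) / 9.0
    else:
        kernel = np.array([[0, 1, 0],
                           [1, 1, 1],
                           [0, 1, 0]], dtype=np.float64) / 5.0
    return convolve(initial_values.astype(np.float64), kernel, mode='nearest')
```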
Specifically, the video image processing apparatus performs an image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image, and the implementation manner of the video image processing apparatus may be:
the video image processing device determines the number of target image blocks corresponding to the image resolution range where the target image resolution is located according to the mapping relation between the pre-stored image resolution range and the number of the image blocks;
the video image processing device divides the current frame video image into a sub-frame video image block with the target image block number.
In one possible example, the video image processing apparatus performs an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, and includes:
the video image processing device executes image definition calculation operation on the current frame video image to obtain the target image definition corresponding to the current frame video image;
the video image processing device determines a target image noise intensity estimation algorithm corresponding to the target image definition according to the mapping relation between the image definition and the image noise intensity estimation algorithm;
and the video image processing device uses the target image noise intensity estimation algorithm to carry out image noise intensity estimation operation on the current frame video image so as to obtain the target image noise intensity corresponding to the current frame video image.
Specifically, the video image processing apparatus performs an image sharpness calculation operation on the current frame video image, and an embodiment of obtaining a target image sharpness corresponding to the current frame video image may be:
the video image processing device obtains N gray values corresponding to N pixel points included in a current frame video image, wherein the N gray values correspond to the N pixel points one by one, and N is an integer greater than 1;
the video image processing device calculates the mean value and the mean square error of the N gray values, determines the sum of the mean value and the mean square error as a first gray value, and determines the difference of the mean value and the mean square error as a second gray value;
the video image processing device determines an average value of at least one gray value which is greater than or equal to the first gray value in the N gray values as a third gray value, and determines an average value of at least one gray value which is smaller than the second gray value in the N gray values as a fourth gray value;
the video image processing device determines the difference between the third gray value and the fourth gray value as a fifth gray value and determines the sum of the third gray value and the fourth gray value as a sixth gray value;
and the video image processing device determines the ratio of the fifth gray value to the sixth gray value as the definition of the target image corresponding to the current frame video image.
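A minimal sketch of this sharpness computation follows, assuming that "mean square error" denotes the standard deviation of the gray values; the function name is illustrative and degenerate inputs (an all-flat image) are not handled.

```python
import numpy as np

def target_image_sharpness(gray):
    """Sharpness from the gray-value statistics described above."""
    g = gray.astype(np.float64).ravel()
    mean, std = g.mean(), g.std()
    first, second = mean + std, mean - std      # first and second gray values
    third = g[g >= first].mean()                # average of the brighter pixels
    fourth = g[g < second].mean()               # average of the darker pixels
    fifth, sixth = third - fourth, third + fourth
    return fifth / sixth                        # target image sharpness
```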
The mapping relation between the image definition and the image noise intensity estimation algorithm is stored in the video image processing device in advance, and is shown in the following table 1:
TABLE 1
Image sharpness | Image noise intensity estimation algorithm
Image sharpness 1 | Image noise intensity estimation algorithm 1
Image sharpness 2 | Image noise intensity estimation algorithm 2
Image sharpness 3 | Image noise intensity estimation algorithm 3
…… | ……
Here, in terms of image sharpness: image sharpness 1 > image sharpness 2 > image sharpness 3; and in terms of image noise intensity estimation accuracy: image noise intensity estimation algorithm 1 > image noise intensity estimation algorithm 2 > image noise intensity estimation algorithm 3.
In one possible example, a video image processing apparatus performs an image noise intensity estimation operation on a current frame video image to obtain a target image noise intensity corresponding to the current frame video image, and includes:
the video image processing device determines a target convolution kernel according to a first Laplace mask, a second Laplace mask and a convolution kernel formula which are stored in advance;
the video image processing device obtains the width and the height of the current frame video image and the color space value of each pixel point;
and the video image processing device determines the noise intensity of the target image corresponding to the current frame video image according to the target convolution kernel, the width and the height of the current frame video image, the color space value of each pixel point and a noise intensity formula.
The first Laplacian mask is L1, the second Laplacian mask is L2, and the convolution kernel formula is N = 2(L2 - L1), where N is the target convolution kernel.
For example, as shown in fig. 1D, fig. 1D is a schematic diagram of calculating a target convolution kernel from the Laplacian masks according to an embodiment of the present application, where L1 is the 3x3 mask [[0, 1, 0], [1, -4, 1], [0, 1, 0]], L2 is the 3x3 mask [[0.5, 0, 0.5], [0, -2, 0], [0.5, 0, 0.5]], and N is therefore the 3x3 mask [[1, -2, 1], [-2, 4, -2], [1, -2, 1]].
The noise intensity formula is:
σ_n = sqrt(π/2) · (1 / (6(W - 2)(H - 2))) · Σ_{x,y} | I(x, y) * N |
where σ_n is the target image noise intensity corresponding to the current frame video image, W is the width of the current frame video image, H is the height of the current frame video image, I(x, y) is the color space value of the pixel point at position (x, y) of the current frame video image, * denotes convolution with the target convolution kernel N, and the sum is taken over all pixel points of the current frame video image.
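The kernel N above matches the classic fast convolution-based noise estimator, so the following numpy sketch computes σ_n under that assumption; the kernel values are taken from fig. 1D and the border handling is an assumption.

```python
import numpy as np
from scipy.ndimage import convolve

# N = 2 * (L2 - L1): difference of the two Laplacian masks from fig. 1D.
L1 = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)
L2 = np.array([[0.5, 0, 0.5], [0, -2, 0], [0.5, 0, 0.5]], dtype=np.float64)
N = 2.0 * (L2 - L1)  # [[1, -2, 1], [-2, 4, -2], [1, -2, 1]]

def estimate_noise_laplacian(values):
    """sigma_n from the convolution-based noise intensity formula above."""
    height, width = values.shape
    response = convolve(values.astype(np.float64), N, mode='nearest')
    return (np.sqrt(np.pi / 2.0) / (6.0 * (width - 2) * (height - 2))
            * np.abs(response).sum())
```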
102: and if the target image noise intensity is greater than or equal to the preset image noise intensity, starting an image denoising function by the video image processing device, and determining a target image denoising parameter required for executing image denoising operation on the current frame video image according to the target image noise intensity.
In one possible example, the video image processing apparatus determines a target image denoising parameter required for performing an image denoising operation on a current frame video image according to a target image noise intensity, and includes:
the video image processing device determines target denoising intensity corresponding to the target image noise intensity according to a pre-stored denoising intensity formula corresponding to a three-dimensional block matching algorithm;
the video image processing device determines a target sliding step length corresponding to the noise intensity of a target image according to a pre-stored sliding step length formula corresponding to a three-dimensional block matching algorithm;
the video image processing device determines a target search radius corresponding to the noise intensity of a target image according to a pre-stored search radius formula corresponding to a three-dimensional block matching algorithm;
the video image processing device determines the target denoising strength, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
The formula of the denoising intensity is as follows:
sigma = min(n/N × 100 + 1, 1000),
where sigma is the denoising strength, n is the image noise intensity, and N is the number of noise intensity levels.
The sliding step formula is:
bstep = min(11 - n/N × 10, 4),
where bstep is the sliding step, n is the image noise intensity, and N is the number of noise intensity levels.
The search radius formula is:
range = min(n/N × 10 + 9, 32),
where range is the search radius, n is the image noise intensity, and N is the number of noise intensity levels.
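The three formulas translate directly into code. In this sketch, num_levels stands for the number of noise intensity levels written as N above, and the quantities are unitless as in the patent.

```python
def bm3d_parameters(n, num_levels):
    """Denoising strength, sliding step and search radius from the formulas.

    n is the estimated image noise intensity and num_levels is the number of
    noise intensity levels (written N in the formulas above).
    """
    sigma = min(n / num_levels * 100 + 1, 1000)      # denoising strength
    bstep = min(11 - n / num_levels * 10, 4)         # sliding step
    search_range = min(n / num_levels * 10 + 9, 32)  # search radius
    return sigma, bstep, search_range
```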
In one possible example, the video image processing apparatus determines a target image denoising parameter required for performing an image denoising operation on a current frame video image according to a target image noise intensity, and includes:
the video image processing device determines a target denoising strength, a target sliding step length and a target search radius corresponding to the target image noise strength according to an image noise strength-denoising strength/sliding step length/search radius table;
the video image processing device determines the target denoising strength, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
The image noise intensity-denoising intensity/sliding step length/search radius table is pre-stored in the video image processing device, and is shown in the following table 2:
TABLE 2
Image noise intensity | Denoising strength (sigma) | Sliding step (bstep) | Search radius (range)
Image noise intensity 1 | Denoising strength 1 | Sliding step 1 | Search radius 1
Image noise intensity 2 | Denoising strength 2 | Sliding step 2 | Search radius 2
Image noise intensity 3 | Denoising strength 3 | Sliding step 3 | Search radius 3
…… | …… | …… | ……
The image noise intensity and the denoising intensity are in a direct proportion relation, the image noise intensity and the sliding step length are in an inverse proportion relation, and the image noise intensity and the search radius are in a direct proportion relation.
It can be seen that, unlike an approach that directly uses fixed image denoising parameters without considering the image noise intensity of the current frame video image, in this example the image denoising parameters change as the image noise intensity changes, which helps improve the image denoising effect.
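As an illustration of how such a table might be consulted, the following sketch uses a hypothetical, purely illustrative instance of Table 2; the patent does not give concrete values, so the numbers merely respect the stated relations (sigma and range grow with the noise intensity level, bstep shrinks).

```python
# Hypothetical instance of Table 2 (values are illustrative only).
NOISE_LEVEL_TABLE = {
    1: {"sigma": 10, "bstep": 4, "range": 10},
    2: {"sigma": 40, "bstep": 3, "range": 16},
    3: {"sigma": 80, "bstep": 2, "range": 24},
}

def lookup_denoise_parameters(noise_level):
    """Target denoising parameters for a quantized image noise intensity."""
    entry = NOISE_LEVEL_TABLE[noise_level]
    return entry["sigma"], entry["bstep"], entry["range"]
```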
In one possible example, the video image processing apparatus determines a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity, and includes:
the video image processing device determines a target image noise intensity-denoising intensity/sliding step length/searching radius table corresponding to the target image noise intensity estimation algorithm according to the mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/searching radius table;
the video image processing device determines a target denoising strength, a target sliding step length and a target search radius corresponding to the target image noise strength according to a target image noise strength-denoising strength/sliding step length/search radius table;
and the video image processing device determines the target denoising intensity, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
The mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/search radius table is stored in the video image processing device in advance, and is shown in the following table 3:
TABLE 3
Image noise intensity estimation algorithm | Image noise intensity-denoising intensity/sliding step length/search radius table
Image noise intensity estimation algorithm 1 | Image noise intensity-denoising intensity/sliding step length/search radius table 1
Image noise intensity estimation algorithm 2 | Image noise intensity-denoising intensity/sliding step length/search radius table 2
Image noise intensity estimation algorithm 3 | Image noise intensity-denoising intensity/sliding step length/search radius table 3
…… | ……
It can be seen that, unlike an approach that directly uses fixed image denoising parameters regardless of the image noise intensity of the current frame video image, in this example the video image processing device first determines the image noise intensity-denoising intensity/sliding step length/search radius table according to the image noise intensity estimation algorithm, and then determines the image denoising parameters according to the image noise intensity; the image denoising parameters change as the image noise intensity changes, which helps improve the image denoising effect.
103: and the video image processing device executes image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
For example, as shown in fig. 1E, fig. 1E is a schematic diagram of a current frame video image before and after image denoising according to an embodiment of the present application.
In the embodiment of the present application, if the target image noise intensity of the current frame video image is greater than or equal to the preset image noise intensity, the image denoising function is enabled and the image denoising operation is performed on the current frame video image using target image denoising parameters determined according to the target image noise intensity, so as to obtain the current frame video image after image denoising. Compared with an approach that ignores the image noise intensity of the current frame video image and directly performs the image denoising operation with fixed image denoising parameters, this helps improve the denoising effect of the video image.
In one possible example, the method further comprises:
and if the noise intensity of the target image is smaller than the preset image noise intensity, the video image processing device keeps the image denoising function in an un-started state.
It can be seen that, unlike an approach that directly performs the image denoising operation on the current frame video image with fixed image denoising parameters regardless of its image noise intensity, in this example the video image processing device keeps the image denoising function disabled when the target image noise intensity is smaller than the preset image noise intensity, which avoids losing video image details by denoising a noise-free video image and helps improve the video image denoising effect.
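Putting steps 101-103 together, the following high-level sketch reuses the illustrative helpers sketched above and treats the three-dimensional block matching denoiser as an injected placeholder, since its implementation is not specified in the patent.

```python
def process_frame(frame_values, preset_noise_intensity, num_levels, denoise_fn):
    """Steps 101-103: estimate noise, gate the denoising function, denoise.

    frame_values is a 2-D array of color space values for the current frame;
    denoise_fn is a placeholder for a three-dimensional block matching style
    denoiser taking (frame, denoise_strength, sliding_step, search_radius).
    """
    noise = estimate_noise_by_blocks(frame_values)       # step 101
    if noise < preset_noise_intensity:
        return frame_values                              # denoising stays off
    sigma, bstep, search_radius = bm3d_parameters(noise, num_levels)  # step 102
    return denoise_fn(frame_values, sigma, bstep, search_radius)      # step 103
```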
Consistent with the embodiment shown in fig. 1A, please refer to fig. 2, fig. 2 is a schematic flow chart of a second video image processing method provided in the embodiment of the present application, in which the video image processing method includes steps 201-209, as follows:
201: the video image processing device acquires a current frame video image.
202: and the video image processing device executes color space value calculation operation on the current frame video image to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image.
203: and the video image processing device executes image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image.
204: the video image processing device executes color space value variance calculation operation on a plurality of color space values corresponding to each of the plurality of sub-frame video image blocks to obtain a plurality of target variances, wherein the plurality of target variances are in one-to-one correspondence with the plurality of sub-frame video image blocks.
205: the video image processing device determines the average value of the target variances as the target image noise intensity corresponding to the current frame video image.
206: and if the noise intensity of the target image is greater than or equal to the preset image noise intensity, starting an image denoising function by the video image processing device.
207: and the video image processing device determines the target denoising strength, the target sliding step length and the target search radius corresponding to the target image noise strength according to the image noise strength-denoising strength/sliding step length/search radius table.
208: the video image processing device determines the target denoising strength, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
209: and the video image processing device executes image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
It should be noted that, the specific implementation of the steps of the method shown in fig. 2 can refer to the specific implementation described in the above method, and will not be described here.
Consistent with the embodiment shown in fig. 1A, please refer to fig. 3, where fig. 3 is a flowchart illustrating a third video image processing method provided in the embodiment of the present application, the video image processing method includes steps 301-309, which are as follows:
301: the video image processing device acquires a current frame video image.
302: and the video image processing device executes image definition calculation operation on the current frame video image to obtain the target image definition corresponding to the current frame video image.
303: and the video image processing device determines a target image noise intensity estimation algorithm corresponding to the target image definition according to the mapping relation between the image definition and the image noise intensity estimation algorithm.
304: and the video image processing device uses the target image noise intensity estimation algorithm to carry out image noise intensity estimation operation on the current frame video image so as to obtain the target image noise intensity corresponding to the current frame video image.
305: and if the noise intensity of the target image is greater than or equal to the preset image noise intensity, starting an image denoising function by the video image processing device.
306: the video image processing device determines a target image noise intensity-denoising intensity/sliding step length/search radius table corresponding to the target image noise intensity estimation algorithm according to the mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/search radius table.
307: the video image processing device determines a target denoising strength, a target sliding step length and a target search radius corresponding to the target image noise strength according to a target image noise strength-denoising strength/sliding step length/search radius table.
308: and the video image processing device determines the target denoising intensity, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
309: and the video image processing device executes image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
It should be noted that, the specific implementation of the steps of the method shown in fig. 3 can refer to the specific implementation described in the above method, and will not be described here.
The above embodiments mainly introduce the scheme of the embodiments of the present application from the perspective of the method-side implementation process. It is understood that the video image processing apparatus includes hardware structures and/or software modules corresponding to the respective functions in order to implement the above-described functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional units of the video image processing device according to the method, for example, each functional unit can be divided corresponding to each function, or two or more functions can be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
The following is an embodiment of the apparatus of the present application, which is used to execute the method implemented by the embodiment of the method of the present application. Referring to fig. 4, fig. 4 is a block diagram illustrating functional units of a video image processing apparatus according to an embodiment of the present disclosure, where the video image processing apparatus 400 includes:
an obtaining unit 401, configured to obtain a current frame video image;
a calculating unit 402, configured to perform an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image;
the switch unit 403 is configured to start an image denoising function if the target image noise intensity is greater than or equal to a preset image noise intensity;
a determining unit 404, configured to determine, according to the target image noise intensity, a target image denoising parameter required for performing an image denoising operation on the current frame video image;
and a denoising unit 405, configured to perform an image denoising operation on the current frame video image according to the target image denoising parameter, so as to obtain the current frame video image after image denoising.
In the embodiment of the present application, if the target image noise intensity of the current frame video image is greater than or equal to the preset image noise intensity, the image denoising function is enabled and the image denoising operation is performed on the current frame video image using target image denoising parameters determined according to the target image noise intensity, so as to obtain the current frame video image after image denoising. Compared with an approach that ignores the image noise intensity of the current frame video image and directly performs the image denoising operation with fixed image denoising parameters, this helps improve the denoising effect of the video image.
In a possible example, in terms of performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, the calculating unit 402 is specifically configured to:
performing color space value calculation operation on the current frame video image to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image;
performing image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image;
performing color space value variance calculation operation on a plurality of color space values corresponding to each of the plurality of subframe video image blocks to obtain a plurality of target variances, wherein the plurality of target variances are in one-to-one correspondence with the plurality of subframe video image blocks;
and determining the average value of the target variances as the target image noise intensity corresponding to the current frame video image.
In a possible example, in terms of performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, the calculating unit 402 is specifically configured to:
performing image definition calculation operation on the current frame video image to obtain a target image definition corresponding to the current frame video image;
determining a target image noise intensity estimation algorithm corresponding to the target image definition according to the mapping relation between the image definition and the image noise intensity estimation algorithm;
and performing image noise intensity estimation operation on the current frame video image by using the target image noise intensity estimation algorithm to obtain the target image noise intensity corresponding to the current frame video image.
In one possible example, in determining a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity, the determining unit 404 is specifically configured to:
determining a target image noise intensity-denoising intensity/sliding step length/searching radius table corresponding to the target image noise intensity estimation algorithm according to the mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/searching radius table;
determining a target denoising strength, a target sliding step length and a target search radius corresponding to the target image noise strength according to a target image noise strength-denoising strength/sliding step length/search radius table;
and determining the target denoising strength, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
In a possible example, the switch unit 403 is further configured to keep the image denoising function in an inactive state if the target image noise intensity is smaller than the preset image noise intensity.
Consistent with the embodiments shown in fig. 1A, fig. 2 and fig. 3, please refer to fig. 5, fig. 5 is a schematic structural diagram of a video image processing apparatus provided in an embodiment of the present application, where the video image processing apparatus 500 includes a processor, a memory, a communication interface, and one or more programs, the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the following steps:
acquiring a current frame video image, and performing image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image;
if the target image noise intensity is larger than or equal to the preset image noise intensity, starting an image denoising function, and determining a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity;
and executing image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
In the embodiment of the present application, if the target image noise intensity of the current frame video image is greater than or equal to the preset image noise intensity, the image denoising function is enabled and the image denoising operation is performed on the current frame video image using target image denoising parameters determined according to the target image noise intensity, so as to obtain the current frame video image after image denoising. Compared with an approach that ignores the image noise intensity of the current frame video image and directly performs the image denoising operation with fixed image denoising parameters, this helps improve the denoising effect of the video image.
In one possible example, in terms of performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, the program includes instructions specifically configured to perform the following steps:
performing color space value calculation operation on the current frame video image to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image;
performing image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image;
performing color space value variance calculation operation on a plurality of color space values corresponding to each of the plurality of subframe video image blocks to obtain a plurality of target variances, wherein the plurality of target variances are in one-to-one correspondence with the plurality of subframe video image blocks;
and determining the average value of the target variances as the target image noise intensity corresponding to the current frame video image.
In one possible example, in terms of performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, the program includes instructions specifically configured to perform the following steps:
performing image definition calculation operation on the current frame video image to obtain a target image definition corresponding to the current frame video image;
determining a target image noise intensity estimation algorithm corresponding to the target image definition according to the mapping relation between the image definition and the image noise intensity estimation algorithm;
and performing image noise intensity estimation operation on the current frame video image by using the target image noise intensity estimation algorithm to obtain the target image noise intensity corresponding to the current frame video image.
In one possible example, in determining target image denoising parameters required to perform an image denoising operation on the current frame video image according to the target image noise intensity, the program includes instructions specifically for performing the following steps:
determining a target image noise intensity-denoising intensity/sliding step length/searching radius table corresponding to the target image noise intensity estimation algorithm according to the mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/searching radius table;
determining a target denoising strength, a target sliding step length and a target search radius corresponding to the target image noise strength according to a target image noise strength-denoising strength/sliding step length/search radius table;
and determining the target denoising strength, the target sliding step length and the target search radius as target image denoising parameters required for executing image denoising operation on the current frame video image.
In one possible example, the program further includes instructions for performing the steps of:
and if the noise intensity of the target image is smaller than the preset image noise intensity, keeping the image denoising function in an un-started state.
Embodiments of the present application provide a computer-readable storage medium for storing a computer program, the computer program being executed by a processor to implement part or all of the steps of any one of the methods as described in the above method embodiments, and the computer including a video image processing apparatus.
Embodiments of the present application provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising video image processing means.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application may be substantially implemented or a part of or all or part of the technical solution contributing to the prior art may be embodied in the form of a software product stored in a memory, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above-mentioned method of the embodiments of the present application. And the aforementioned memory comprises: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware. The program may be stored in a computer-readable memory, which may include a flash memory disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
The embodiments of the present application have been described in detail above to illustrate the principles and implementations of the present application; the above description of the embodiments is provided only to help understand the method and the core concept of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as a limitation of the present application.

Claims (10)

1. A video image processing method, comprising:
acquiring a current frame video image, and performing image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image;
if the target image noise intensity is greater than or equal to a preset image noise intensity, starting an image denoising function, and determining a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity;
and executing image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
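To make the control flow of claim 1 concrete, the following minimal Python sketch estimates a noise intensity for the current frame, enables denoising only when the estimate reaches a preset threshold, and maps the estimate to filter parameters before filtering. The threshold value, the simple local-variance estimator, the parameter values, and the use of OpenCV's non-local-means filter are all illustrative assumptions rather than details taken from this application; claims 2-4 below recite particular estimation and parameter-selection schemes.

```python
import cv2
import numpy as np

PRESET_NOISE_INTENSITY = 12.0  # hypothetical preset image noise intensity

def estimate_noise_intensity(frame_bgr: np.ndarray) -> float:
    """Rough estimate: mean local variance of the luma channel (an assumed estimator)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    local_mean = cv2.blur(gray, (8, 8))
    local_sq_mean = cv2.blur(gray * gray, (8, 8))
    return float(np.mean(np.maximum(local_sq_mean - local_mean ** 2, 0.0)))

def select_denoise_params(noise_intensity: float) -> dict:
    """Hypothetical mapping: stronger noise -> stronger filtering, wider search."""
    if noise_intensity < 25.0:
        return {"h": 3, "template": 7, "search": 21}
    return {"h": 10, "template": 7, "search": 35}

def process_frame(frame_bgr: np.ndarray) -> np.ndarray:
    noise_intensity = estimate_noise_intensity(frame_bgr)
    if noise_intensity < PRESET_NOISE_INTENSITY:
        return frame_bgr  # denoising function stays off for clean frames
    p = select_denoise_params(noise_intensity)
    return cv2.fastNlMeansDenoisingColored(
        frame_bgr, None, p["h"], p["h"], p["template"], p["search"])
```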
2. The method according to claim 1, wherein said performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image comprises:
performing color space value calculation operation on the current frame video image to obtain the color space value of each pixel point in a plurality of pixel points included in the current frame video image;
performing image blocking operation on the current frame video image to obtain a plurality of sub-frame video image blocks corresponding to the current frame video image;
performing color space value variance calculation operation on a plurality of color space values corresponding to each of the plurality of sub-frame video image blocks to obtain a plurality of target variances, wherein the plurality of target variances are in one-to-one correspondence with the plurality of sub-frame video image blocks;
and determining the average value of the target variances as the target image noise intensity corresponding to the current frame video image.
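A direct reading of claim 2's estimator, as a hedged Python sketch: the choice of luma as the per-pixel colour-space value and the 16x16 block size are assumptions made for illustration; what the sketch preserves is the claimed structure of per-block variances averaged into a single noise intensity.

```python
import numpy as np

def estimate_noise_intensity_blocks(frame_bgr: np.ndarray, block: int = 16) -> float:
    """Block-variance noise estimate in the spirit of claim 2 (luma and 16x16 blocks assumed)."""
    # Colour space value for each pixel: BT.601 luma computed from the BGR channels.
    b, g, r = frame_bgr[..., 0], frame_bgr[..., 1], frame_bgr[..., 2]
    y = 0.114 * b + 0.587 * g + 0.299 * r

    # Image blocking: tile the current frame into sub-frame image blocks.
    h, w = y.shape
    variances = []
    for top in range(0, h - block + 1, block):
        for left in range(0, w - block + 1, block):
            tile = y[top:top + block, left:left + block]
            variances.append(tile.var())  # variance of this block's colour space values

    # Target image noise intensity: average of the per-block variances.
    return float(np.mean(variances))
```

In practice a robust statistic (for example a low percentile of the block variances) is often preferred over the plain mean, since strongly textured blocks inflate the estimate; the mean is kept here because that is what the claim recites.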
3. The method according to claim 1, wherein said performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image comprises:
performing image definition calculation operation on the current frame video image to obtain a target image definition corresponding to the current frame video image;
determining a target image noise intensity estimation algorithm corresponding to the target image definition according to the mapping relation between the image definition and the image noise intensity estimation algorithm;
and performing image noise intensity estimation operation on the current frame video image by using the target image noise intensity estimation algorithm to obtain the target image noise intensity corresponding to the current frame video image.
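Claim 3 first measures the frame's definition (sharpness) and then picks a noise intensity estimation algorithm from a definition-to-algorithm mapping. The sketch below uses Laplacian variance as the definition measure and two stand-in estimators separated by an arbitrary band boundary; all of these specific choices are assumptions, not values from this application.

```python
import cv2
import numpy as np

def image_definition(frame_bgr: np.ndarray) -> float:
    """Definition (sharpness) proxy: variance of the Laplacian response (assumed measure)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def estimator_for_sharp_frames(frame_bgr: np.ndarray) -> float:
    """Stand-in estimator for sharp frames: robust scale of the Laplacian response."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    return float(np.median(np.abs(cv2.Laplacian(gray, cv2.CV_32F))) * 1.4826)

def estimator_for_soft_frames(frame_bgr: np.ndarray) -> float:
    """Stand-in estimator for blurrier frames: global standard deviation of luma."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    return float(gray.std())

# Mapping relation between image definition bands and estimation algorithms (boundary assumed).
DEFINITION_TO_ESTIMATOR = [
    (100.0, estimator_for_sharp_frames),
    (0.0, estimator_for_soft_frames),
]

def estimate_noise_intensity_by_definition(frame_bgr: np.ndarray) -> float:
    definition = image_definition(frame_bgr)
    for lower_bound, estimator in DEFINITION_TO_ESTIMATOR:
        if definition >= lower_bound:
            return estimator(frame_bgr)
    return estimator_for_soft_frames(frame_bgr)
```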
4. The method as claimed in claim 3, wherein said determining target image denoising parameters required for performing an image denoising operation on the current frame video image according to the target image noise intensity comprises:
determining a target image noise intensity-denoising intensity/sliding step length/search radius table corresponding to the target image noise intensity estimation algorithm according to the mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/search radius table;
determining a target denoising intensity, a target sliding step length and a target search radius corresponding to the target image noise intensity according to the target image noise intensity-denoising intensity/sliding step length/search radius table;
and determining the target denoising intensity, the target sliding step length and the target search radius as target image denoising parameters required for performing the image denoising operation on the current frame video image.
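Claim 4 resolves the three denoising parameters through two lookups: the estimation algorithm selects a table, and the measured noise intensity selects a row of that table. A minimal sketch with hypothetical table contents follows; the table names, band boundaries, and parameter values are illustrative only.

```python
# Hypothetical per-algorithm tables mapping noise-intensity bands to
# (denoising intensity, sliding step length, search radius). All numbers are assumptions.
NOISE_INTENSITY_TABLES = {
    "block_variance": [  # rows: (upper bound on noise intensity, parameters)
        (20.0, {"denoising_intensity": 3, "sliding_step": 2, "search_radius": 10}),
        (50.0, {"denoising_intensity": 7, "sliding_step": 2, "search_radius": 15}),
        (float("inf"), {"denoising_intensity": 12, "sliding_step": 1, "search_radius": 21}),
    ],
    "definition_based": [
        (30.0, {"denoising_intensity": 5, "sliding_step": 2, "search_radius": 12}),
        (float("inf"), {"denoising_intensity": 10, "sliding_step": 1, "search_radius": 18}),
    ],
}

def target_denoising_params(algorithm: str, noise_intensity: float) -> dict:
    """First lookup: estimation algorithm -> table; second lookup: noise intensity -> row."""
    table = NOISE_INTENSITY_TABLES[algorithm]
    for upper_bound, params in table:
        if noise_intensity <= upper_bound:
            return params
    return table[-1][1]

# Example: target_denoising_params("block_variance", 42.0)
# -> {"denoising_intensity": 7, "sliding_step": 2, "search_radius": 15}
```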
5. The method according to any one of claims 1-3, further comprising:
and if the target image noise intensity is less than the preset image noise intensity, keeping the image denoising function in a non-activated state.
6. A video image processing apparatus characterized by comprising:
the acquisition unit is used for acquiring a current frame video image;
the calculating unit is used for executing image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image;
the switch unit is used for starting an image denoising function if the noise intensity of the target image is greater than or equal to the preset image noise intensity;
the determining unit is used for determining a target image denoising parameter required for executing image denoising operation on the current frame video image according to the target image noise intensity;
and the denoising unit is used for executing image denoising operation on the current frame video image according to the target image denoising parameter to obtain the current frame video image subjected to image denoising.
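For readers who want to see how the five units of claim 6 could fit together in code, here is one possible skeleton. The class layout, the placeholder bodies, and the default preset are assumptions; the calculating and determining units could be backed by, for example, the estimator sketched after claim 2 and the table lookup sketched after claim 4.

```python
import numpy as np

class VideoImageProcessingApparatus:
    """Illustrative decomposition mirroring the units of claim 6 (an assumed design)."""

    def __init__(self, preset_noise_intensity: float = 12.0):
        self.preset = preset_noise_intensity  # hypothetical preset image noise intensity
        self.denoising_enabled = False        # state managed by the switch unit

    def acquire(self, frames):
        # Acquisition unit: pull the current frame from an iterator/stream.
        return next(frames)

    def calculate(self, frame_bgr: np.ndarray) -> float:
        # Calculating unit: placeholder estimate (global variance of the frame).
        return float(np.var(frame_bgr.astype(np.float32)))

    def switch(self, noise_intensity: float) -> bool:
        # Switch unit: enable denoising only when the estimate reaches the preset.
        self.denoising_enabled = noise_intensity >= self.preset
        return self.denoising_enabled

    def determine(self, noise_intensity: float) -> dict:
        # Determining unit: placeholder parameters; a table lookup would go here.
        return {"denoising_intensity": 5, "sliding_step": 2, "search_radius": 15}

    def denoise(self, frame_bgr: np.ndarray, params: dict) -> np.ndarray:
        # Denoising unit: placeholder pass-through; a real filter would use params.
        return frame_bgr
```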
7. The apparatus according to claim 6, wherein in terms of performing an image noise intensity estimation operation on the current frame video image to obtain a target image noise intensity corresponding to the current frame video image, the calculating unit is specifically configured to:
performing image definition calculation operation on the current frame video image to obtain a target image definition corresponding to the current frame video image;
determining a target image noise intensity estimation algorithm corresponding to the target image definition according to the mapping relation between the image definition and the image noise intensity estimation algorithm;
and performing image noise intensity estimation operation on the current frame video image by using the target image noise intensity estimation algorithm to obtain the target image noise intensity corresponding to the current frame video image.
8. The apparatus according to claim 7, wherein in determining a target image denoising parameter required for performing an image denoising operation on the current frame video image according to the target image noise intensity, the determining unit is specifically configured to:
determining a target image noise intensity-denoising intensity/sliding step length/search radius table corresponding to the target image noise intensity estimation algorithm according to the mapping relation between the image noise intensity estimation algorithm and the image noise intensity-denoising intensity/sliding step length/search radius table;
determining a target denoising intensity, a target sliding step length and a target search radius corresponding to the target image noise intensity according to the target image noise intensity-denoising intensity/sliding step length/search radius table;
and determining the target denoising intensity, the target sliding step length and the target search radius as the target image denoising parameters required for performing the image denoising operation on the current frame video image.
9. An image processing apparatus comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing some or all of the steps of the method of any of claims 1-5.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium is used to store a computer program, which is executed by a processor to implement the method according to any of claims 1-5.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910994775.8A CN110796614B (en) 2019-10-18 2019-10-18 Image processing method, device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910994775.8A CN110796614B (en) 2019-10-18 2019-10-18 Image processing method, device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110796614A true CN110796614A (en) 2020-02-14
CN110796614B CN110796614B (en) 2024-06-28

Family

ID=69439315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910994775.8A Active CN110796614B (en) 2019-10-18 2019-10-18 Image processing method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110796614B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111880225A (en) * 2020-09-01 2020-11-03 贵州朗星智能有限公司 Cable detector device and method for dynamically filtering environmental interference
WO2021230708A1 (en) * 2020-05-15 2021-11-18 Samsung Electronics Co., Ltd. Image processing method, electronic device and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003234662A (en) * 2002-02-07 2003-08-22 Canon Inc Signal processor and signal processing method
US20040119861A1 (en) * 2002-08-23 2004-06-24 Stmicroelectronics S.R.L. Method for filtering the noise of a digital image sequence
CN101076078A (en) * 2006-05-17 2007-11-21 广达电脑股份有限公司 Method and device for processing image
CN101094313A (en) * 2007-07-25 2007-12-26 北京中星微电子有限公司 Device and method for restraining noise
CN102547074A (en) * 2012-01-04 2012-07-04 西安电子科技大学 Surfacelet domain BKF model Bayes video denoising method
CN105208376A (en) * 2015-08-28 2015-12-30 青岛中星微电子有限公司 Digital noise reduction method and device

Also Published As

Publication number Publication date
CN110796614B (en) 2024-06-28

Similar Documents

Publication Publication Date Title
JP7030493B2 (en) Image processing equipment, image processing methods and programs
CN110706174B (en) Image enhancement method, terminal equipment and storage medium
US20160253787A1 (en) Methods and systems for denoising images
CN106663315B (en) Method for denoising noisy images
CN110796614A (en) Image processing method, image processing apparatus, and computer-readable storage medium
CN109636730B (en) Method for filtering pseudo pixels in a depth map
CN109690612B (en) Image processing apparatus and recording medium
CN107451978B (en) Image processing method, device and equipment
US20150071561A1 (en) Removing noise from an image via efficient patch distance computations
CN112150371B (en) Image noise reduction method, device, equipment and storage medium
CN107784631B (en) Image deblurring method and device
US9286653B2 (en) System and method for increasing the bit depth of images
CN111415317B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108961260B (en) Image binarization method and device and computer storage medium
JPWO2012098854A1 (en) Image processing system, image processing method, and image processing program
CN111709894B (en) Image processing method, image processing device, electronic equipment and storage medium
CN107481203B (en) Image-oriented filtering method and computing device
CN110517201B (en) Method and device for smoothing filtering along environment-friendly edge and electronic equipment
EP3316212A1 (en) Method for deblurring a video, corresponding device and computer program product
CN111260590B (en) Image noise reduction method and related product
CN108475430B (en) Picture quality evaluation method and device
CN111861947B (en) Method and device for improving information entropy of histogram technology enhanced image
CN110580880B (en) RGB (red, green and blue) triangular sub-pixel layout-based sub-pixel rendering method and system and display device
CN108090884B (en) Image optimization method and related device
CN110505485B (en) Motion compensation method, motion compensation device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40022106
Country of ref document: HK
SE01 Entry into force of request for substantive examination
GR01 Patent grant