CN117132474A - Image sharpening method and device, electronic equipment and computer storage medium - Google Patents


Info

Publication number
CN117132474A
CN117132474A (application CN202210556027.3A)
Authority
CN
China
Prior art keywords
image
restored
point spread
spread function
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210556027.3A
Other languages
Chinese (zh)
Inventor
王月 (Wang Yue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Xiongan ICT Co Ltd
China Mobile System Integration Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Xiongan ICT Co Ltd
China Mobile System Integration Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by China Mobile Communications Group Co Ltd, China Mobile Xiongan ICT Co Ltd, and China Mobile System Integration Co Ltd
Priority claimed from application CN202210556027.3A
Publication of CN117132474A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators
    • G06T 5/10 Image enhancement or restoration using non-spatial domain filtering
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20048 Transform domain processing
    • G06T 2207/20056 Discrete and fast Fourier transform [DFT, FFT]
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20192 Edge enhancement; Edge preservation
    • G06T 2207/20201 Motion blur correction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application relates to the technical field of image processing and provides an image sharpening method and device, electronic equipment, and a computer storage medium. The method includes: estimating the point spread function of an image to be restored; performing spatial-domain high-pass filtering on the image to be restored with a preset convolution template to obtain an original sharpened image of the image to be restored; performing frequency-domain Wiener filtering on the point spread function and the original sharpened image to obtain the Fourier transform of the original sharpened image; and inverse-transforming that Fourier transform to obtain the final sharpened image of the image to be restored. By estimating the point spread function of the image to be restored, applying spatial-domain high-pass filtering to it, and applying frequency-domain Wiener filtering to the point spread function and the original sharpened image, the method makes the image information in the final sharpened image clearer and more complete, improving the restoration quality of the image to be restored.

Description

Image sharpening method and device, electronic equipment and computer storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image sharpening method, an image sharpening device, an electronic device, and a computer storage medium.
Background
Existing methods for sharpening motion-blurred images generally fall into the following categories: restoration by reducing the camera's exposure time, and restoration by establishing a mathematical image restoration model, which solves the motion-blur restoration problem through the model. The latter approach is general-purpose and is the main means of studying and solving the motion-blur problem. Classical image restoration methods such as inverse filtering, Wiener filtering, constrained least squares, and maximum-entropy restoration remain widely used today.
Reducing the camera's exposure time can lessen the degree of blurring, but the exposure time cannot be reduced without limit, and the signal-to-noise ratio of the resulting image decreases as the exposure time shrinks, lowering restoration quality. Classical image restoration methods have certain limitations: they restore some images poorly, which also reduces restoration quality. Restoration with a mathematical image restoration model requires sample data for training and testing, and when the data volume is small, overfitting can distort the model's results and thereby reduce restoration quality.
Disclosure of Invention
The application provides an image sharpening method and device, electronic equipment, and a computer storage medium, with the aim of improving image restoration quality.
In a first aspect, the present application provides an image sharpening method, including:
estimating a point spread function of an image to be restored to obtain the point spread function of the image to be restored;
performing spatial-domain high-pass filtering on the image to be restored through a preset convolution template to obtain an original sharpened image of the image to be restored;
performing frequency-domain Wiener filtering on the point spread function and the original sharpened image to obtain the Fourier transform of the original sharpened image;
and performing an inverse transform on the Fourier transform of the original sharpened image to obtain a final sharpened image of the image to be restored.
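The four steps above can be sketched end to end as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the 3×3 kernel, the Wiener constant `nsr`, and the function name are assumptions, and the point spread function is taken as already estimated.

```python
import numpy as np

def sharpen_image(blurred, psf, nsr=0.01):
    """Sketch of the claimed pipeline: spatial-domain high-pass filtering,
    then frequency-domain Wiener filtering with the estimated PSF, then an
    inverse Fourier transform back to the spatial domain."""
    # Step 2: spatial-domain high-pass filtering with a 3x3 template
    # (a classic Laplacian-derived sharpening kernel stands in for the
    # patent's matrix H, which is not reproduced in this text).
    K = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
    pad = np.pad(blurred.astype(float), 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(pad, (3, 3))
    sharp = np.einsum("ijkl,kl->ij", windows, K)
    # Step 3: frequency-domain Wiener filtering with the PSF.
    Hf = np.fft.fft2(psf, s=sharp.shape)
    F_hat = np.conj(Hf) / (np.abs(Hf) ** 2 + nsr) * np.fft.fft2(sharp)
    # Step 4: inverse Fourier transform gives the final sharpened image.
    return np.real(np.fft.ifft2(F_hat))
```

With a delta PSF the Wiener step reduces to a uniform scaling by 1/(1 + nsr), which makes the sketch easy to sanity-check.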
In one embodiment, the performing frequency-domain Wiener filtering on the point spread function and the original sharpened image to obtain a Fourier transform of the original sharpened image includes:
performing a Fourier transform on the image to be restored, the point spread function, and the original sharpened image to convert each to the frequency domain, obtaining respectively the frequency-domain image to be restored, the frequency-domain point spread function, and the frequency-domain original sharpened image;
determining a frequency-domain degradation model from the frequency-domain image to be restored, the frequency-domain point spread function, and the frequency-domain original sharpened image;
and combining the Wiener filtering frequency-domain solution formula with the frequency-domain degradation model to obtain the Fourier transform of the original sharpened image.
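In their standard forms (assuming the patent's elided formulas match the classical ones), the frequency-domain degradation model and the Wiener solution referred to above are:

```latex
G(u,v) = H(u,v)\,F(u,v) + N(u,v), \qquad
\hat{F}(u,v) = \frac{H^{*}(u,v)}{\lvert H(u,v)\rvert^{2} + K}\, G(u,v)
```

where G is the Fourier transform of the degraded image, H that of the point spread function, N the noise term, K a constant approximating the noise-to-signal power ratio, and F-hat the estimated Fourier transform of the restored image.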
Estimating the point spread function of the image to be restored to obtain the point spread function of the image to be restored includes:
performing bilinear interpolation and directional differentiation on the image to be restored to obtain a blur direction;
rotating the image to be restored according to the blur direction to obtain a horizontal image to be restored, i.e. an image whose motion direction has been rotated onto the horizontal axis;
performing first-order differentiation and autocorrelation on the horizontal image to be restored to obtain a blur scale;
and determining the point spread function based on the blur direction and the blur scale.
Performing bilinear interpolation and directional differentiation on the image to be restored to obtain the blur direction includes:
determining the image to be restored as first target image data, and determining an arc centered on the first target image data with a preset radius;
determining, through bilinear interpolation combined with the arc, each piece of first image data to be processed that forms a different included-angle range with the first target image data;
determining each directional differential multiplier from the first target image data and each piece of first image data to be processed, and differentiating the image to be restored in each direction with each directional differential multiplier to obtain differential images;
and determining the included angle between the motion direction and the horizontal axis from the differential images, and taking that angle as the blur direction.
Determining the included angle between the motion direction and the horizontal axis from the differential images includes:
summing the absolute values of the gray values of each differential image to obtain a total image gray value for each differential image;
determining the smallest of the total image gray values;
and taking the included angle corresponding to the smallest total image gray value as the included angle between the motion direction and the horizontal axis.
Performing first-order differentiation and autocorrelation on the horizontal image to be restored to obtain the blur scale includes:
performing first-order differentiation on the horizontal image to be restored in the horizontal direction to obtain a target differential image of the horizontal image to be restored;
performing autocorrelation on the target differential image in the horizontal direction to obtain an autocorrelation function, and summing the columns of the autocorrelation function to obtain the suppressed-noise values;
determining the minimum of the suppressed-noise values, and determining the number of motion points in the horizontal direction at the position of that minimum;
and taking the number of motion points in the horizontal direction as the blur scale.
Determining the point spread function based on the blur direction and the blur scale includes:
determining a first blur scale in a first blur direction and a second blur scale in a second blur direction;
if the first blur scale is greater than the second blur scale, determining the point spread function based on the first blur direction, the first blur scale, and the second blur scale;
and if the second blur scale is greater than or equal to the first blur scale, determining the point spread function based on the second blur direction, the first blur scale, and the second blur scale.
In a second aspect, the present application provides an image sharpening device comprising:
the estimating module is configured to estimate the point spread function of the image to be restored to obtain the point spread function of the image to be restored;
the first processing module is configured to perform spatial-domain high-pass filtering on the image to be restored through a preset convolution template to obtain an original sharpened image of the image to be restored;
the second processing module is configured to perform frequency-domain Wiener filtering on the point spread function and the original sharpened image to obtain the Fourier transform of the original sharpened image;
and the inverse transform module is configured to perform an inverse transform on the Fourier transform of the original sharpened image to obtain a final sharpened image of the image to be restored.
In a third aspect, the present application also provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the image sharpening method of the first aspect when executing the program.
In a fourth aspect, the present application also provides a non-transitory computer-readable storage medium including a computer program which, when executed by a processor, implements the image sharpening method of the first aspect.
In a fifth aspect, the present application also provides a computer program product including a computer program which, when executed by a processor, implements the image sharpening method of the first aspect.
With the image sharpening method and device, electronic equipment, and computer storage medium provided herein, in the process of sharpening the image to be restored, the point spread function of the image to be restored is estimated, spatial-domain high-pass filtering is applied to the image to be restored, and frequency-domain Wiener filtering is applied to the point spread function and the original sharpened image, yielding a final sharpened image with enhanced edge detail. The image information in the final sharpened image is thus clearer, more complete, and easier to identify, the final sharpened image is closer to the actual scene, and the restoration quality of the image to be restored is improved.
Drawings
In order to illustrate the technical solutions of the present application more clearly, the following briefly introduces the drawings used in the embodiments or in the description of the prior art. It is obvious that the drawings described below are some embodiments of the present application, and that other drawings can be obtained from these drawings without inventive effort by a person skilled in the art.
FIG. 1 is a flow chart of an image sharpening method provided by the application;
fig. 2 is a schematic structural view of an image sharpening device provided by the application;
fig. 3 is a schematic structural diagram of an electronic device provided by the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The image sharpening method and device, electronic equipment, and computer storage medium provided by the application are described below with reference to figs. 1 to 3.
The embodiments of the present application provide embodiments of an image sharpening method. It should be noted that although a logical order is shown in the flowchart, under certain conditions the steps shown or described may be performed in an order different from the one here.
The embodiments of the present application take an electronic device as the execution subject by way of example, with an image processing system as one form of the electronic device; this is not limiting.
Referring to fig. 1, fig. 1 is a flow chart of an image sharpening method provided by the application. The image sharpening method provided by the embodiment of the application comprises the following steps:
and S10, estimating a point spread function of the image to be restored to obtain the point spread function of the image to be restored.
It should be noted that, in the embodiment of the present application, the image to be restored is exemplified by a motion-blurred image; that is, the embodiment performs image restoration on a motion-blurred image.
Further, when restoring the motion-blurred image, the image processing system first needs to estimate the point spread function of the motion-blurred image, as described in steps S101 to S104.
Further, the descriptions of steps S101 to S104 are as follows:
Step S101, performing bilinear interpolation and directional differentiation on the image to be restored to obtain the blur direction;
Step S102, rotating the image to be restored according to the blur direction to obtain a horizontal image to be restored whose motion direction has been rotated onto the horizontal axis;
Step S103, performing first-order differentiation and autocorrelation on the horizontal image to be restored to obtain the blur scale;
Step S104, determining the point spread function based on the blur direction and the blur scale.
Specifically, the image processing system performs bilinear interpolation and directional differentiation of a preset size on the motion-blurred image to obtain the included angle between the motion direction and the horizontal axis. In one embodiment the preset size may be 3×3; that is, the image processing system performs bilinear interpolation and 3×3 directional differentiation on the motion-blurred image to obtain the included angle between the motion direction and the horizontal axis, and determines that angle as the blur direction of the motion-blurred image, as described in steps S1011 to S1014. Further, the image processing system rotates the motion-blurred image according to the blur direction so that its motion direction is rotated onto the horizontal axis, obtaining a horizontal image to be restored whose motion direction lies along the horizontal axis.
Further, the image processing system performs first-order differentiation on the horizontal image to be restored and then performs autocorrelation to obtain an autocorrelation function. Next, the image processing system determines the position of the minimum point of the autocorrelation function and determines the blur scale from that position, as described in steps S1031 to S1034. Further, the image processing system obtains the point spread function of the motion-blurred image from the blur direction and the blur scale, as described in steps S1041 to S1043.
According to the embodiment of the application, the point spread function of the motion-blurred image is determined from the blur direction and the blur scale, providing the basic data for frequency-domain Wiener filtering and for obtaining a final sharpened image with enhanced edge detail.
Further, the descriptions of steps S1011 to S1014 are as follows:
Step S1011, determining the image to be restored as first target image data, and determining an arc centered on the first target image data with a preset radius;
Step S1012, determining, through bilinear interpolation combined with the arc, each piece of first image data to be processed that forms a different included-angle range with the first target image data;
Step S1013, determining each directional differential multiplier from the first target image data and each piece of first image data to be processed, and differentiating the image to be restored in each direction with each directional differential multiplier to obtain differential images;
Step S1014, determining the included angle between the motion direction and the horizontal axis from the differential images, and taking that angle as the blur direction.
Specifically, the image processing system determines the image to be restored as first target image data, which may be denoted g(i, j), and determines an arc of preset radius centered on g(i, j). The preset radius is set according to the actual situation; in the embodiment of the present application it is 2, i.e. the arc is centered on g(i, j) with radius 2. Further, the image processing system calculates, through bilinear interpolation, each point on the arc forming a different included-angle range α with g(i, j), i.e. each piece of first image data to be processed, which may be denoted g(i′, j′). The included-angle range α takes values in six ranges (the specific range bounds are given by formulas not reproduced in this text).
Further, the image processing system uses the defining relation of the directional differential multiplier, g(i′, j′) − g(i, j) = g(i, j) ∗ D_α, where D_α is the directional differential multiplier. Substituting the first target image data g(i, j) and each piece of first image data to be processed g(i′, j′) into this relation, the system calculates each 3×3 directional differential multiplier D_α.
For each of the six included-angle ranges α, a corresponding 3×3 directional differential multiplier D_α is given (the six range bounds and the six matrices are formulas not reproduced in this text extraction).
Further, within the stated angular range, the image processing system steps the included angle α in increments of 1° and differentiates the motion-blurred image g(i, j) over a length of 2 in each direction with the corresponding 3×3 multiplier D_α, obtaining the differential images Δg(i, j)_α, where Δg(i, j)_α = g(i, j) ∗ D_α. Finally, the image processing system determines the included angle between the motion direction and the horizontal axis from the differential images Δg(i, j)_α and takes that angle as the blur direction, as described in steps S10141 to S10143.
The embodiment of the application determines the blur direction, providing basic data for determining the point spread function of the motion-blurred image.
Further, the descriptions of steps S10141 to S10143 are as follows:
Step S10141, summing the absolute values of the gray values of each differential image to obtain a total image gray value for each differential image;
Step S10142, determining the smallest of the total image gray values;
Step S10143, taking the included angle corresponding to the smallest total image gray value as the included angle between the motion direction and the horizontal axis.
Specifically, the image processing system sums the absolute values of the gray values of each differential image Δg(i, j)_α to obtain the total image gray value I(Δg)_α for each. Further, the image processing system determines the smallest of these totals, I_min(Δg)_α. Then, the image processing system takes the included angle α corresponding to I_min(Δg)_α as the included angle between the motion direction and the horizontal axis.
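As a rough illustration of steps S1011 through S10143, the sketch below scans candidate angles in 1° steps and picks the one minimizing the summed absolute directional derivative. Bilinear interpolation at radius 2 stands in for the elided 3×3 multiplier matrices, so the function names and details are assumptions, not the patent's exact procedure.

```python
import numpy as np

def directional_diff_energy(g, alpha_deg, radius=2.0):
    """Sum of |g(i', j') - g(i, j)| where (i', j') lies on a circle of the
    given radius around (i, j) in direction alpha, found by bilinear
    interpolation (assumed stand-in for the elided D_alpha matrices)."""
    a = np.deg2rad(alpha_deg)
    dy, dx = radius * np.sin(a), radius * np.cos(a)
    h, w = g.shape
    ii, jj = np.meshgrid(np.arange(h, dtype=float),
                         np.arange(w, dtype=float), indexing="ij")
    yi = np.clip(ii + dy, 0, h - 1)
    xj = np.clip(jj + dx, 0, w - 1)
    y0, x0 = np.floor(yi).astype(int), np.floor(xj).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    fy, fx = yi - y0, xj - x0
    # Standard bilinear interpolation from the four neighboring pixels.
    interp = (g[y0, x0] * (1 - fy) * (1 - fx) + g[y1, x0] * fy * (1 - fx)
              + g[y0, x1] * (1 - fy) * fx + g[y1, x1] * fy * fx)
    return np.sum(np.abs(interp - g))

def estimate_blur_direction(g):
    """Scan angles in 1-degree steps and return the angle minimizing the
    total absolute directional derivative (steps S10141 to S10143)."""
    angles = np.arange(0, 180)
    energies = [directional_diff_energy(g.astype(float), a) for a in angles]
    return float(angles[int(np.argmin(energies))])
```

The intuition matches the text: along the motion direction the blur has already averaged the image, so the directional derivative (and hence the total gray value) is smallest there.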
Further, the descriptions of steps S1031 to S1034 are as follows:
Step S1031, performing first-order differentiation on the horizontal image to be restored in the horizontal direction to obtain a target differential image of the horizontal image to be restored;
Step S1032, performing autocorrelation on the target differential image in the horizontal direction to obtain an autocorrelation function, and summing the columns of the autocorrelation function to obtain the suppressed-noise values;
Step S1033, determining the minimum of the suppressed-noise values, and determining the number of motion points in the horizontal direction at the position of that minimum;
Step S1034, taking the number of motion points in the horizontal direction as the blur scale.
It should be noted that, in the embodiment of the present application, the image to be restored is exemplified by a motion-blurred image; the horizontal image to be restored is therefore a horizontal motion-blurred image.
Specifically, the image processing system performs first-order differentiation in the horizontal direction on a horizontal motion-blurred image g(i, j) of size M×M to obtain its target differential image g′(i, j), where the first-order difference may be written g′(i, j) = g(i, j + 1) − g(i, j) (the original formula is not reproduced in this text).
Further, the image processing system performs autocorrelation on the target differential image g′(i, j) in the horizontal direction to obtain the autocorrelation function s(i, j) (the original formula is not reproduced in this text; the usual form is s(i, j) = Σ_k g′(i, k) · g′(i, k + j)). Then, the image processing system sums the columns of s(i, j) to obtain the suppressed-noise values s_add(j), i.e. s_add(j) = Σ_i s(i, j); the purpose of the summation is to suppress the influence of noise and so improve the accuracy and reliability of the discrimination. Finally, the image processing system determines the smallest suppressed-noise value s_add(j)_min. Supposing the motion blur scale is N, the smallest suppressed-noise values occur at N − 1 and 1 − N. The system determines the number of motion points in the horizontal direction at the position of s_add(j)_min and takes it as the blur scale; the blur scale can be understood as the number of blur points.
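The blur-scale estimate described above can be sketched as follows. The elided formulas are reconstructed as the usual first-order difference and autocorrelation forms, so the exact expressions are assumptions rather than the patent's own.

```python
import numpy as np

def estimate_blur_scale(g_horiz):
    """Rough illustration of steps S1031 to S1034: horizontal first-order
    difference, row-wise autocorrelation, summation, then the lag of the
    minimum."""
    g = g_horiz.astype(float)
    d = g[:, 1:] - g[:, :-1]            # first-order horizontal difference
    n = d.shape[1]
    lags = np.arange(-(n - 1), n)
    s_add = np.empty(lags.size)
    for k, lag in enumerate(lags):
        if lag >= 0:
            prod = d[:, :n - lag] * d[:, lag:]
        else:
            prod = d[:, -lag:] * d[:, :n + lag]
        s_add[k] = prod.sum()           # summation suppresses noise (s_add)
    # The text places the minima at +/-(N-1) for N motion points; with the
    # difference convention above the minimum falls at the averaging length,
    # so the two conventions can differ by one pixel (assumption).
    return int(abs(lags[int(np.argmin(s_add))]))
```

For a uniform horizontal blur, the differenced image is a scaled difference of pixels one averaging-length apart, which is why the autocorrelation dips sharply at that lag.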
The embodiment of the application determines the blur scale, providing basic data for determining the point spread function of the motion-blurred image.
Further, the descriptions of steps S1041 to S1043 are as follows:
Step S1041, determining a first blur scale in a first blur direction and a second blur scale in a second blur direction;
Step S1042, if the first blur scale is greater than the second blur scale, determining the point spread function based on the first blur direction, the first blur scale, and the second blur scale;
Step S1043, if the second blur scale is greater than or equal to the first blur scale, determining the point spread function based on the second blur direction, the first blur scale, and the second blur scale.
Specifically, the image processing system determines the first blur scale m in the first blur direction x and the second blur scale n in the second blur direction y; these are blur point counts.
It can thus be appreciated that the image processing system determines the first blur point count m in the first blur direction x and the second blur point count n in the second blur direction y, and then compares m with n, i.e. determines whether m is greater than n.
Further, if m is determined to be less than or equal to n, the image processing system determines the point spread function from the second blur direction y, m, and n, i.e. according to x = [m·y/n], y = 0, 1, 2, …, n − 1, where [·] denotes rounding.
Further, if m is determined to be greater than n, the image processing system determines the point spread function from the first blur direction x, m, and n, i.e. according to y = [n·x/m], x = 0, 1, 2, …, m − 1, where [·] denotes rounding.
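The PSF construction rule above can be sketched as follows. Floor rounding for [·] and uniform weighting of the rasterized points are assumptions; the patent does not reproduce those details in this text.

```python
import numpy as np

def build_psf(m, n):
    """Rasterize the PSF support from blur point counts m (direction x)
    and n (direction y) using the quoted rounding rules, then normalize."""
    if m > n:
        pts = [(x, (n * x) // m) for x in range(m)]   # y = [n*x/m], x = 0..m-1
    else:
        pts = [((m * y) // n, y) for y in range(n)]   # x = [m*y/n], y = 0..n-1
    h = np.zeros((max(y for _, y in pts) + 1, max(x for x, _ in pts) + 1))
    for x, y in pts:
        h[y, x] = 1.0
    return h / h.sum()   # normalize so the PSF conserves total intensity
```

This is the familiar line-rasterization idea: iterate along the longer axis so every step lands on exactly one pixel of the motion path.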
According to the embodiment of the application, the point spread function of the motion-blurred image is determined from the blur direction and the blur scale, providing the basic data for frequency-domain Wiener filtering and for obtaining a final sharpened image with enhanced edge detail.
Step S20, performing spatial-domain high-pass filtering on the image to be restored through a preset convolution template to obtain the original sharpened image of the image to be restored.
Before spatial-domain high-pass filtering is applied to the motion-blurred image, a spatial-domain high-pass filtering convolution template needs to be selected. In the embodiment of the present application, various spatial-domain high-pass filtering convolution templates were tested and analyzed, and the convolution template of the Laplacian-operator-derived matrix H was chosen as the preset convolution template. The matrix H is as follows (not reproduced in this text extraction):
The image processing system performs spatial domain high-pass filtering on the motion-blurred image through the spatial domain high-pass filtering convolution template of the Laplacian derivative matrix H; that is, it convolves the pixels of the motion-blurred image with the template H, which increases the edge information of the motion-blurred image and strengthens the edge details of the restored image, yielding the original sharpened image, i.e., the sharpening-enhanced motion-blurred image.
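The excerpt does not reproduce the matrix H itself, so the sketch below substitutes the common 3×3 Laplacian-derived sharpening template (center 9, neighbors −1, coefficients summing to 1) purely for illustration; the patent's actual H may differ.

```python
import numpy as np

# Stand-in for the Laplacian-derived template H (the source does not
# reproduce the actual matrix); coefficients sum to 1.
H = np.array([[-1, -1, -1],
              [-1,  9, -1],
              [-1, -1, -1]], dtype=float)

def spatial_highpass_sharpen(img: np.ndarray) -> np.ndarray:
    """Convolve a grayscale image with H using symmetric border padding.

    H is symmetric, so correlation and convolution coincide here.
    """
    p = np.pad(img, 1, mode="symmetric")
    out = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            out += H[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out
```

Because the template sums to 1, flat regions pass through unchanged while edges are amplified.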
Step S30, performing frequency domain Wiener filtering processing on the point spread function and the original sharpened image to obtain the Fourier transform of the original sharpened image.
The image processing system performs Fourier transforms on the point spread function and the original sharpened image respectively, converting them to the frequency domain, and then performs Wiener filtering to obtain the Fourier transform of the original sharpened image, as described in steps S301 to S303.
Further, the descriptions of step S301 to step S303 are as follows:
Step S301, performing Fourier transforms on the image to be restored, the point spread function and the original sharpened image, and converting them to the frequency domain to respectively obtain the image to be restored of the frequency domain, the point spread function of the frequency domain and the original sharpened image of the frequency domain;
Step S302, determining a degradation model of the frequency domain according to the image to be restored of the frequency domain, the point spread function of the frequency domain and the original sharpened image of the frequency domain;
Step S303, combining the Wiener filtering frequency domain solution formula in the frequency domain Wiener filtering with the degradation model of the frequency domain to obtain the Fourier transform of the original sharpened image.
Specifically, the image processing system constructs an original degradation model of the motion-blurred image from the motion-blurred image g(x, y), the point spread function h(x, y) and the original sharpened image f(x, y), namely g(x, y) = f(x, y) * h(x, y) + n(x, y), where * denotes convolution and n(x, y) is an additive noise term. The image processing system then performs Fourier transforms on the motion-blurred image g(x, y), the point spread function h(x, y), the original sharpened image f(x, y) and the additive noise n(x, y) in the original degradation model, converting them to the frequency domain to obtain the motion-blurred image G(u, v) of the frequency domain, the point spread function H(u, v) of the frequency domain, the original sharpened image F(u, v) of the frequency domain and the additive noise N(u, v) of the frequency domain; these are, respectively, the Fourier transforms of the motion-blurred image, the point spread function, the original sharpened image and the additive noise.
Further, the image processing system determines the degradation model of the frequency domain from the image to be restored G(u, v) of the frequency domain, the point spread function H(u, v) of the frequency domain, the original sharpened image F(u, v) of the frequency domain and the additive noise N(u, v) of the frequency domain. By the convolution theorem, convolution in the spatial domain equals multiplication in the frequency domain, so the degradation model of the frequency domain is G(u, v) = F(u, v)H(u, v) + N(u, v). The image processing system then performs Wiener filtering in the frequency domain on this degradation model to obtain the Fourier transform of the original sharpened image.
Specifically, the image processing system combines the Wiener filtering frequency domain solution formula in frequency domain Wiener filtering with the degradation model G(u, v) = F(u, v)H(u, v) + N(u, v) of the frequency domain to obtain the Fourier transform F̂(u, v) of the original sharpened image. The Wiener filtering frequency domain solution formula is:

F̂(u, v) = [ (1/H(u, v)) · |H(u, v)|² / ( |H(u, v)|² + γ · S_n(u, v)/S_f(u, v) ) ] · G(u, v)

where S_n(u, v) and S_f(u, v) are the noise power spectrum and the original image signal power spectrum, respectively, and γ is the Lagrangian coefficient. Since S_n(u, v) and S_f(u, v) are difficult to obtain, their ratio is replaced by a constant k, and the simplified Wiener filtering frequency domain solution formula is:

F̂(u, v) = [ (1/H(u, v)) · |H(u, v)|² / ( |H(u, v)|² + γk ) ] · G(u, v)
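A minimal sketch of the simplified solution formula: since |H|² = H·H*, the factor (1/H)·|H|²/(|H|² + γk) equals H*/(|H|² + γk), which avoids dividing by zeros of H. The constant k below absorbs γk; its value is an assumption to be tuned per image.

```python
import numpy as np

def wiener_deblur(g: np.ndarray, psf: np.ndarray, k: float = 0.01) -> np.ndarray:
    """Frequency domain Wiener filtering sketch.

    g   : degraded input image
    psf : point spread function (support no larger than g)
    k   : constant replacing gamma * S_n/S_f (assumed; tune per image)
    """
    H = np.fft.fft2(psf, s=g.shape)            # zero-pad PSF to image size
    G = np.fft.fft2(g)
    # F_hat = conj(H) / (|H|^2 + k) * G, equivalent to
    # (1/H) * |H|^2 / (|H|^2 + k) * G but stable where H -> 0
    F_hat = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))        # back to the spatial domain
```

With an identity PSF and a very small k, the filter returns (almost exactly) the input image, which is a quick sanity check on the implementation.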
The embodiment of the application determines the Fourier transform of the original sharpened image, providing basic data for obtaining the final sharpened image with enhanced edge detail information.
Step S40, performing an inverse transform on the Fourier transform of the original sharpened image to obtain the final sharpened image of the image to be restored.
The image processing system performs an inverse transform on the Fourier transform F̂(u, v) of the original sharpened image, converting it back to the spatial domain to obtain the final sharpened image f̂(x, y).
According to the image sharpening method provided by the embodiment of the application, in the process of sharpening the image to be restored, the point spread function of the image to be restored is estimated, spatial domain high-pass filtering is performed on the image to be restored, and frequency domain Wiener filtering is performed on the point spread function and the original sharpened image, yielding a final sharpened image with enhanced edge detail information. The image information in the final sharpened image is therefore clearer, more complete and easier to identify, the final sharpened image is closer to the actual scene, and the restoration quality of the image to be restored is improved.
Further, the image sharpening device provided by the application is described below, and the image sharpening device and the image sharpening method can be correspondingly referred to each other.
As shown in fig. 2, fig. 2 is a schematic structural diagram of an image sharpening device according to the present application, the image sharpening device includes:
an estimation module 201, configured to perform point spread function estimation on an image to be restored, so as to obtain a point spread function of the image to be restored;
the first processing module 202 is configured to perform spatial domain high-pass filtering processing on the image to be restored through a preset convolution template, so as to obtain an original sharpened image of the image to be restored;
the second processing module 203 is configured to perform frequency domain wiener filtering processing on the point spread function and the original sharpened image, so as to obtain fourier transform of the original sharpened image;
and the inverse transformation module 204 is configured to perform inverse transformation on the fourier transform of the original sharpened image, so as to obtain a final sharpened image of the image to be restored.
Further, the second processing module 203 is further configured to:
performing Fourier transforms on the image to be restored, the point spread function and the original sharpened image, and converting them to the frequency domain to respectively obtain the image to be restored of the frequency domain, the point spread function of the frequency domain and the original sharpened image of the frequency domain;
determining a degradation model of the frequency domain according to the image to be restored of the frequency domain, the point spread function of the frequency domain and the original sharpened image of the frequency domain;
and combining the Wiener filtering frequency domain solution formula in the frequency domain Wiener filtering with the degradation model of the frequency domain to obtain the Fourier transform of the original sharpened image.
Further, the estimation module 201 is further configured to:
performing bilinear interpolation and direction differentiation on the image to be restored to obtain a blur direction;
rotating the image to be restored according to the blur direction to obtain a horizontal image to be restored, wherein the motion direction of the horizontal image to be restored is rotated onto the horizontal axis;
performing first-order differentiation and autocorrelation on the horizontal image to be restored to obtain a blur scale;
and determining the point spread function based on the blur direction and the blur scale.
Further, the estimation module 201 is further configured to:
determining the image to be restored as first target image data, and determining an arc of a preset radius centered on the first target image data;
determining, through bilinear interpolation combined with the arc, each piece of first image data to be processed forming different angle ranges with the first target image data;
determining each direction differential multiplier according to the first target image data and each piece of first image data to be processed, and performing direction differentiation on the image to be restored through each direction differential multiplier to obtain each differential image;
and determining the angle between the motion direction and the horizontal axis according to each differential image, and determining that angle as the blur direction.
Further, the estimation module 201 is further configured to:
summing the absolute gray values of each differential image to obtain a total image gray value for each;
determining the smallest of the total image gray values;
and determining the angle corresponding to the smallest total image gray value as the angle between the motion direction and the horizontal axis.
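The direction search described by the module above — directionally differentiate the image for each candidate angle and keep the angle whose differential image has the smallest total absolute gray value — can be sketched as follows. The unit step radius, one-degree angle grid, and edge-clamped bilinear sampler are illustrative assumptions, not details fixed by the source.

```python
import numpy as np

def shift_bilinear(img, dy, dx):
    """Sample img at (r+dy, c+dx) with bilinear interpolation, edge-clamped."""
    h, w = img.shape
    r = np.clip(np.arange(h)[:, None] + dy, 0, h - 1)
    c = np.clip(np.arange(w)[None, :] + dx, 0, w - 1)
    r0, c0 = np.floor(r).astype(int), np.floor(c).astype(int)
    r1, c1 = np.minimum(r0 + 1, h - 1), np.minimum(c0 + 1, w - 1)
    fr, fc = r - r0, c - c0
    top = img[r0, c0] * (1 - fc) + img[r0, c1] * fc
    bot = img[r1, c0] * (1 - fc) + img[r1, c1] * fc
    return top * (1 - fr) + bot * fr

def estimate_blur_direction(img, radius=1.0, angles=None):
    """Angle (degrees) whose directional differential image has the
    smallest total absolute gray value."""
    if angles is None:
        angles = np.arange(0.0, 180.0, 1.0)   # assumed 1-degree grid
    best_angle, best_total = 0.0, np.inf
    for a in angles:
        t = np.deg2rad(a)
        shifted = shift_bilinear(img, radius * np.sin(t), radius * np.cos(t))
        total = np.abs(shifted - img).sum()    # directional-difference energy
        if total < best_total:
            best_total, best_angle = total, float(a)
    return best_angle
```

Along the blur direction, gray values change least, so the directional differential image there has the smallest total absolute value.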
Further, the estimation module 201 is further configured to:
performing first-order differentiation on the horizontal image to be restored in the horizontal direction to obtain a target differential image of the horizontal image to be restored;
performing autocorrelation on the target differential image in the horizontal direction to obtain an autocorrelation function, and summing the columns of the autocorrelation function to obtain the noise-suppressed values;
determining the minimum of the noise-suppressed values, and determining the number of motion points from the horizontal position of that minimum;
and determining the number of motion points in the horizontal direction as the blur scale.
Further, the estimation module 201 is further configured to:
determining a first blur scale in a first blur direction and determining a second blur scale in a second blur direction;
if the number of the first blur scales is greater than the number of the second blur scales, determining the point spread function based on the first blur direction, the first blur scale and the second blur scale;
and if the number of the second fuzzy scales is greater than or equal to the number of the first fuzzy scales, determining the point spread function based on the second fuzzy direction, the first fuzzy scales and the second fuzzy scales.
The specific embodiments of the image sharpening device provided by the application are basically the same as the embodiments of the image sharpening method, and are not described herein.
Fig. 3 illustrates a physical schematic diagram of an electronic device, as shown in fig. 3, the electronic device may include: processor 310, communication interface (Communications Interface) 320, memory 330 and communication bus 340, wherein processor 310, communication interface 320, memory 330 accomplish communication with each other through communication bus 340. The processor 310 may invoke logic instructions in the memory 330 to perform an image sharpening method comprising:
estimating a point spread function of an image to be restored to obtain the point spread function of the image to be restored;
performing spatial domain high-pass filtering processing on the image to be restored through a preset convolution template to obtain an original clear image of the image to be restored;
carrying out frequency domain wiener filtering treatment on the point spread function and the original sharpened image to obtain Fourier transform of the original sharpened image;
and carrying out inverse transformation on the Fourier transformation of the original sharpened image to obtain a final sharpened image of the image to be restored.
Further, the logic instructions in the memory 330 described above may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In another aspect, the present application also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the image sharpening method provided by the above methods, the method comprising:
estimating a point spread function of an image to be restored to obtain the point spread function of the image to be restored;
performing spatial domain high-pass filtering processing on the image to be restored through a preset convolution template to obtain an original clear image of the image to be restored;
carrying out frequency domain wiener filtering treatment on the point spread function and the original sharpened image to obtain Fourier transform of the original sharpened image;
and carrying out inverse transformation on the Fourier transformation of the original sharpened image to obtain a final sharpened image of the image to be restored.
In yet another aspect, the present application also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, is implemented to perform the image sharpening methods provided above, the method comprising:
estimating a point spread function of an image to be restored to obtain the point spread function of the image to be restored;
performing spatial domain high-pass filtering processing on the image to be restored through a preset convolution template to obtain an original clear image of the image to be restored;
carrying out frequency domain wiener filtering treatment on the point spread function and the original sharpened image to obtain Fourier transform of the original sharpened image;
and carrying out inverse transformation on the Fourier transformation of the original sharpened image to obtain a final sharpened image of the image to be restored.
The apparatus embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present application without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. An image sharpening method, comprising:
estimating a point spread function of an image to be restored to obtain the point spread function of the image to be restored;
performing spatial domain high-pass filtering processing on the image to be restored through a preset convolution template to obtain an original clear image of the image to be restored;
carrying out frequency domain wiener filtering treatment on the point spread function and the original sharpened image to obtain Fourier transform of the original sharpened image;
and carrying out inverse transformation on the Fourier transformation of the original sharpened image to obtain a final sharpened image of the image to be restored.
2. The method for sharpening an image according to claim 1, wherein said performing a frequency domain wiener filtering process on said point spread function and said original sharpened image to obtain a fourier transform of said original sharpened image comprises:
performing Fourier transformation on the image to be restored, the point spread function and the original sharpened image, and converting the image to be restored, the point spread function and the original sharpened image to be restored to a frequency domain to respectively obtain the image to be restored to the frequency domain, the point spread function of the frequency domain and the original sharpened image of the frequency domain;
determining a degradation model of the frequency domain according to the image to be restored of the frequency domain, the point spread function of the frequency domain and the original sharpened image of the frequency domain;
and combining a wiener filtering frequency domain solution formula in the frequency domain wiener filtering and a degradation model of the frequency domain to obtain Fourier transform of the original clear image.
3. The method for image sharpening according to claim 1, wherein the performing the point spread function estimation on the image to be restored to obtain the point spread function of the image to be restored includes:
performing bilinear interpolation and direction differentiation on the image to be restored to obtain a fuzzy direction;
rotating the image to be restored according to the blurring direction to obtain a horizontal image to be restored, wherein the movement direction of the horizontal image is rotated to a horizontal axis;
performing first-order differentiation and autocorrelation on the horizontal image to be restored to obtain a fuzzy scale;
the point spread function is determined based on the blur direction and the blur scale.
4. The method for image sharpening according to claim 3, wherein said performing bilinear interpolation and direction differentiation on the image to be restored to obtain a blurred direction comprises:
determining the image to be restored as first target image data, and determining an arc with the first target image data as a circle center and a preset radius;
determining each first image data to be processed which forms different included angle ranges with the first target image data by combining the circular arcs through bilinear interpolation;
determining each direction differential multiplier according to the first target image data and each first image data to be processed, and carrying out direction differential on the image to be restored through each direction differential multiplier to obtain each differential image;
and determining an included angle between the motion direction and the horizontal axis according to each differential image, and determining the included angle between the motion direction and the horizontal axis as the blurring direction.
5. The image sharpening method of claim 4, wherein said determining an angle between a motion direction and a horizontal axis from each of said differential images comprises:
adding and summing absolute values of gray values of the differential images to obtain gray total values of the images;
determining the image gray level total value with the smallest numerical value in the image gray level total values;
and determining the included angle in the image gray level total value with the minimum numerical value as the included angle between the motion direction and the horizontal axis.
6. The image sharpening method of claim 3, wherein said subjecting the horizontal image to be restored to first-order differentiation and autocorrelation to obtain a blur scale comprises:
performing first-order differentiation on the horizontal image to be restored in the horizontal direction to obtain a target differential image of the horizontal image to be restored;
carrying out autocorrelation on the target differential image in the horizontal direction to obtain an autocorrelation function, and adding and summing all columns of the autocorrelation function to obtain all suppression noise values;
determining the minimum value of each suppression noise value, and determining the number of the motion points of the position of the minimum value of the suppression noise value in the horizontal direction;
and determining the number of the motion points in the horizontal direction as the blurring scale.
7. A method of image sharpening according to claim 3, wherein said determining said point spread function based on said blurring direction and said blurring scale comprises:
determining a first blur scale in a first blur direction and determining a second blur scale in a second blur direction;
if the number of the first blur scales is greater than the number of the second blur scales, determining the point spread function based on the first blur direction, the first blur scale and the second blur scale;
and if the number of the second fuzzy scales is greater than or equal to the number of the first fuzzy scales, determining the point spread function based on the second fuzzy direction, the first fuzzy scales and the second fuzzy scales.
8. An image sharpening device, comprising:
the estimating module is used for estimating the point spread function of the image to be restored to obtain the point spread function of the image to be restored;
the first processing module is used for performing spatial domain high-pass filtering processing on the image to be restored through a preset convolution template to obtain an original clear image of the image to be restored;
the second processing module is used for carrying out frequency domain wiener filtering processing on the point spread function and the original sharpened image to obtain Fourier transform of the original sharpened image;
and the inverse transformation module is used for carrying out inverse transformation on the Fourier transformation of the original sharpened image to obtain a final sharpened image of the image to be restored.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the image sharpening method of any one of claims 1 to 7 when executing the computer program.
10. A non-transitory computer readable storage medium comprising a computer program, characterized in that the computer program, when executed by a processor, implements the image sharpening method of any one of claims 1 to 7.
CN202210556027.3A 2022-05-20 2022-05-20 Image sharpening method and device, electronic equipment and computer storage medium Pending CN117132474A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210556027.3A CN117132474A (en) 2022-05-20 2022-05-20 Image sharpening method and device, electronic equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210556027.3A CN117132474A (en) 2022-05-20 2022-05-20 Image sharpening method and device, electronic equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN117132474A true CN117132474A (en) 2023-11-28

Family

ID=88861561

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210556027.3A Pending CN117132474A (en) 2022-05-20 2022-05-20 Image sharpening method and device, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN117132474A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118010579A (en) * 2024-04-03 2024-05-10 山东科技大学 Marine plastic particle primary screening system for ship and image detection method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118010579A (en) * 2024-04-03 2024-05-10 山东科技大学 Marine plastic particle primary screening system for ship and image detection method thereof

Similar Documents

Publication Publication Date Title
CN108805840B (en) Image denoising method, device, terminal and computer readable storage medium
CN109242799B (en) Variable-threshold wavelet denoising method
CN105335947B (en) Image de-noising method and image denoising device
CN111275626A (en) Video deblurring method, device and equipment based on ambiguity
CN112508810A (en) Non-local mean blind image denoising method, system and device
WO2014070273A1 (en) Recursive conditional means image denoising
CN112862753B (en) Noise intensity estimation method and device and electronic equipment
JP2007018379A (en) Image processing method and image processing device
CN117132474A (en) Image sharpening method and device, electronic equipment and computer storage medium
CN115082336A (en) SAR image speckle suppression method based on machine learning
Xu et al. An image-enhancement method based on variable-order fractional differential operators
CN111415317B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN117218149B (en) Image reconstruction method and system based on self-coding neural network
CN113592725B (en) Medical optical imaging noise elimination method
KR20100101463A (en) Apparatus and method for eliminating noise
CN108629740B (en) Image denoising processing method and device
CN111260590B (en) Image noise reduction method and related product
Sheta Restoration of medical images using genetic algorithms
Pan et al. Fractional directional derivative and identification of blur parameters of motion-blurred image
de Paiva et al. A hybrid genetic algorithm for image denoising
CN111340724B (en) Image jitter removing method and device in LED screen correction process
CN114119377A (en) Image processing method and device
CN113706392A (en) Moire pattern processing method, computer-readable storage medium and terminal device
Dong et al. Maximum likelihood interpolation for aliasing-aware image restoration
CN111652811A (en) Motion blurred image restoration method based on edge function and optimal window wiener filtering

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination