CN110717922A - Image definition evaluation method and device


Info

Publication number
CN110717922A
Authority
CN
China
Prior art keywords
image
edge
target image
target
subjected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201810758155.XA
Other languages
Chinese (zh)
Inventor
杨茜
侯国梁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Putian Information Technology Co Ltd
Original Assignee
Putian Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Putian Information Technology Co Ltd
Priority to CN201810758155.XA
Publication of CN110717922A
Status: Withdrawn

Classifications

    • G06T7/13 Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T2207/20032 Median filtering
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image definition evaluation method and device. The method comprises the following steps: acquiring the edge width of a target image to be subjected to image definition evaluation; acquiring the edge variance of the target image to be subjected to image definition evaluation; and evaluating the definition of the target image according to the edge width and the edge variance. By exploiting the obvious difference that images of different definition levels show in edge variance and the slight difference they show in edge width, evaluating definition from both the edge width and the edge variance improves the discrimination between definition levels.

Description

Image definition evaluation method and device
Technical Field
The invention relates to the technical field of computers, in particular to an image definition evaluation method and device.
Background
With the rapid development of information technology and the wide application of intelligent terminal devices, images are used more and more widely as an information medium. Image quality evaluation is an important research direction in the field of image processing and is commonly used for imaging quality detection and control of imaging systems, evaluation of image processing algorithms, and the like. Image quality evaluation methods are divided into subjective evaluation and objective evaluation, and objective evaluation is further divided into three categories. Full-reference image quality evaluation gives a result by comparing the difference between the image to be evaluated and an original reference image, which amounts to computing the fidelity of the image. Partial-reference (reduced-reference) evaluation has no complete reference image but uses a combination of features of the reference image as prior information; its evaluation principle is the same as that of the former method. No-reference evaluation assesses the quality of a single image without any reference information. In most applications a reference image is generally unavailable, so no-reference image quality evaluation is a research focus and difficulty in the field of image quality evaluation.
Blur is the most common type of image and video distortion in daily life and is introduced during acquisition, transmission, compression, and other processing. The blur degree of an image corresponds inversely to its sharpness, so the evaluation of image sharpness is an indispensable part of image quality evaluation. Existing definition evaluation methods fall into three main categories: transform-domain methods, statistics-based methods, and spatial-domain methods. Transform-domain methods mainly examine the frequency components of the image: a sharply focused image contains more high-frequency components, whereas a blurred image contains more low-frequency components. Such methods have strong noise immunity and high accuracy, but they require a spatial transform of the image, such as the Fourier transform, discrete cosine transform, or wavelet transform, which is computationally expensive and makes it difficult to meet real-time engineering requirements. Statistics-based methods, mainly the information-entropy method, depend strongly on image content, are sensitive to noise, and have low accuracy. Spatial-domain methods are the most common; they examine the gray-level change or gradient information of the image and operate directly on pixel values or gray-level gradients, so the computation is simple and meets real-time requirements, but their noise resistance is poor and they are easily disturbed by illumination and background.
At present, spatial-domain methods are roughly classified into two categories: definition evaluation based on edge width and definition evaluation based on gray-level gradient. Edge-width methods obtain the edge width by locating the left and right extreme points of each edge point; they are easily affected by the brightness distribution of the image and by how rich its edge information is, errors easily arise when locating the extreme points, and the resulting discrimination between definition levels is not high. Gray-gradient methods are easily disturbed by the image background, and their thresholds are difficult to set.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides an image definition evaluation method and device.
Specifically, the invention provides the following technical scheme:
in a first aspect, the present invention provides an image sharpness evaluation method, including:
acquiring the edge width of a target image to be subjected to image definition evaluation; the target image to be subjected to image definition evaluation is a gray image subjected to gray processing;
acquiring the edge variance of the target image to be subjected to image definition evaluation;
and evaluating the definition of the target image according to the edge width and the edge variance.
Further, the acquiring the edge width of the target image to be subjected to image sharpness evaluation specifically includes:
extracting edges in the horizontal direction and the vertical direction by using a sobel operator to obtain a horizontal gray level gradient image and a vertical gray level gradient image;
obtaining a horizontal and vertical strong edge gray gradient map by taking a gradient mean value as a threshold value according to the horizontal and vertical gray gradient maps;
calculating the edge widths of all edge points based on the horizontal strong edge gray gradient map, and taking the maximum value or the average value as the horizontal direction edge width of the target image;
calculating the edge widths of all edge points based on the vertical strong edge gray gradient map, and taking the maximum value or the average value as the edge width of the target image in the vertical direction;
and carrying out weighted summation on the horizontal direction edge width and the vertical direction edge width to obtain the edge width of the target image.
Further, when the horizontal-direction edge width and the vertical-direction edge width are weighted and summed, a first weighting factor corresponds to the horizontal-direction edge width and a second weighting factor corresponds to the vertical-direction edge width, and the sum of the first and second weighting factors is 1; when the target feature in the target image moves in the vertical direction relative to the camera, the second weighting factor is greater than the first weighting factor, and when the target feature moves in the horizontal direction relative to the camera, the first weighting factor is greater than the second weighting factor.
Further, the acquiring the edge variance of the target image to be subjected to image sharpness evaluation specifically includes:
adopting a canny edge detection operator to carry out edge extraction on the target image to obtain an edge image of the target image;
expanding the edge image through a square operator with a preset size, processing the target image by taking the expanded gradient image as a mask, extracting an effective edge area of the target image, and setting the rest areas as black backgrounds to obtain an effective edge image of the target image;
and acquiring the Laplacian gradient variance of the effective edge image as the edge variance of the target image to be subjected to image definition evaluation.
Further, the performing sharpness evaluation on the target image according to the edge width and the edge variance specifically includes:
and evaluating the definition of the target image according to the edge width and the edge variance by using the following first relation model:
Sharpness=edge_width/(edge_var+C);
wherein Sharpness is the definition evaluation index; the smaller the value, the clearer the image, and the larger the value, the more blurred the image. edge_width is the edge width, edge_var is the edge variance, and C is a preset adjustment parameter used to ensure that the denominator is not zero.
Further, before obtaining the edge width and the edge variance of the target image, the method further comprises:
and preprocessing a target image to be subjected to image definition evaluation.
Further, the preprocessing the target image to be subjected to the image sharpness evaluation specifically includes:
carrying out alignment and cropping processing on target features in a target image to be subjected to image definition evaluation to obtain an image only containing the target features;
carrying out median filtering and gray level equalization processing on the image only containing the target characteristics to obtain a processed gray level image;
and carrying out image size normalization processing on the processed gray level image to obtain an image with a preset size.
In a second aspect, the present invention further provides an image sharpness evaluation apparatus, including:
the first acquisition module is used for acquiring the edge width of a target image to be subjected to image definition evaluation; the target image to be subjected to image definition evaluation is a gray image subjected to gray processing;
the second acquisition module is used for acquiring the edge variance of the target image to be subjected to image definition evaluation;
and the definition evaluation module is used for evaluating the definition of the target image according to the edge width and the edge variance.
In a third aspect, the present invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the image sharpness evaluation method according to the first aspect are implemented.
In a fourth aspect, the present invention also provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the image sharpness evaluation method according to the first aspect.
According to the above technical scheme, the image definition evaluation method provided by the invention obtains the edge width and the edge variance of the target image to be evaluated and then evaluates the definition of the target image from both. The method thus exploits the obvious difference that images of different definition levels show in edge variance and the slight difference they show in edge width; evaluating definition from the edge width and the edge variance together improves the discrimination between definition levels.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of an image sharpness evaluation method according to an embodiment of the present invention;
fig. 2 is another flowchart of an image sharpness evaluation method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an image sharpness evaluation apparatus according to another embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to yet another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides an image definition evaluation method that makes full use of the remarkable difference that images of different definition levels show in edge strength and of the slight difference they show in edge width, and proposes a definition index calculated from the edge width and the edge variance so as to improve the discrimination between definition levels. The image sharpness evaluation method provided by the present invention is described in detail below through specific embodiments.
An embodiment of the present invention provides an image sharpness evaluation method, which includes the following steps, with reference to fig. 1:
step 101: acquiring the edge width of a target image to be subjected to image definition evaluation; and the target image to be subjected to image definition evaluation is a gray image subjected to gray processing.
Step 102: and acquiring the edge variance of the target image to be subjected to image definition evaluation.
Step 103: and evaluating the definition of the target image according to the edge width and the edge variance.
As can be seen from the above description, the image sharpness evaluation method provided in this embodiment obtains the edge width and the edge variance of the target image to be evaluated and then evaluates its sharpness from both. By exploiting the significant difference that images of different sharpness levels show in edge variance and the slight difference they show in edge width, evaluating sharpness from the edge width and the edge variance together improves the discrimination between sharpness levels.
In a preferred embodiment, the step 101 is implemented as follows:
and A1, extracting edges in the horizontal direction and the vertical direction by using a sobel operator to obtain a horizontal gray gradient image and a vertical gray gradient image.
In this step, the Sobel operator is used to extract edges in the horizontal and vertical directions, yielding a horizontal gray-level gradient map and a vertical gray-level gradient map. The Sobel operator extracts texture gradient information in the horizontal and vertical directions, which matches the way human eyes extract texture when observing an image.
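By way of illustration, step A1 could be realized as in the following minimal sketch, assuming an OpenCV/NumPy environment; the function name and the kernel size are illustrative choices, not taken from the patent.

    import cv2
    import numpy as np

    def sobel_gradient_maps(gray):
        # Horizontal gradient (dx=1) responds to intensity changes along x,
        # vertical gradient (dy=1) to changes along y; the absolute values are
        # kept as the horizontal and vertical gray-level gradient maps.
        grad_x = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
        grad_y = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
        return np.abs(grad_x), np.abs(grad_y)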
And A2, obtaining a horizontal and vertical strong edge gray gradient map by taking the gradient mean value as a threshold value according to the horizontal and vertical gray gradient maps.
In this step, after the horizontal and vertical strong-edge gray gradient maps are obtained, false edges produced by isolated noise points can be further removed: only pixels that have at least 3 edge points in their 8-neighborhood are retained, which improves the accuracy of the subsequent definition evaluation.
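A sketch of step A2 under the same assumptions: the gradient mean is used as the threshold for strong edges, and pixels with fewer than 3 edge points in their 8-neighborhood are discarded as isolated noise. The use of scipy.ndimage for the neighbor count is an implementation choice, not part of the patent.

    import numpy as np
    from scipy.ndimage import convolve

    def strong_edge_map(grad):
        # Keep only gradient responses above the gradient mean (strong edges).
        strong = grad * (grad > grad.mean())
        mask = (strong > 0).astype(np.uint8)
        # Count edge points in each 8-neighborhood (center excluded).
        kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
        neighbour_count = convolve(mask, kernel, mode='constant', cval=0)
        # Remove false edges caused by isolated noise points.
        strong[neighbour_count < 3] = 0
        return strong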
And A3, calculating the edge width of all edge points based on the horizontal strong edge gray gradient map, and taking the maximum value or the average value as the horizontal direction edge width of the target image.
In this step, the edge widths of all edge points are calculated from the horizontal strong-edge gray gradient map, and their maximum or average value is taken to represent the horizontal-direction edge width of the image. The edge width itself is computed with an existing method, such as the one proposed by Marziliano, in which the extent between the local maximum and local minimum of the intensity profile on either side of an edge point is taken as the horizontal-direction edge width.
And A4, calculating the edge width of all edge points based on the vertical strong edge gray gradient map, and taking the maximum value or the average value as the vertical direction edge width of the target image.
In this step, the edge widths of all edge points are calculated from the vertical strong-edge gray gradient map, and their maximum or average value is taken to represent the vertical-direction edge width of the image. The edge width is computed with the same existing method, e.g. Marziliano's, applied along the vertical direction.
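The width computation of steps A3 and A4 could look like the following sketch; walking the intensity profile to the nearest local extrema on both sides of an edge point is a simplified rendering of the cited Marziliano-style method, and the choice between maximum and mean is left to the caller.

    import numpy as np

    def edge_width_at(profile, x):
        # Width of the edge crossing index x of a 1-D intensity profile:
        # walk outwards from x until the local extrema on each side are reached.
        rising = profile[min(x + 1, len(profile) - 1)] >= profile[max(x - 1, 0)]
        left, right = x, x
        while left > 0 and (profile[left - 1] < profile[left]) == rising:
            left -= 1
        while right < len(profile) - 1 and (profile[right + 1] > profile[right]) == rising:
            right += 1
        return right - left

    def directional_edge_width(gray, strong_edges, axis='horizontal', use_max=False):
        # Edge width of the image in one direction: the maximum or mean width
        # over all strong edge points, measured along rows for the horizontal
        # direction and along columns for the vertical direction.
        widths = []
        ys, xs = np.nonzero(strong_edges)
        for y, x in zip(ys, xs):
            profile = gray[y, :] if axis == 'horizontal' else gray[:, x]
            pos = x if axis == 'horizontal' else y
            widths.append(edge_width_at(profile, pos))
        if not widths:
            return 0.0
        return float(np.max(widths)) if use_max else float(np.mean(widths))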
And A5, carrying out weighted summation on the horizontal direction edge width and the vertical direction edge width to obtain the edge width of the target image.
In this step, the edge widths in the horizontal and vertical directions are weighted and summed to obtain the edge width of the target image.
It should be noted that, when the horizontal-direction edge width and the vertical-direction edge width are weighted and summed, a first weighting factor corresponds to the horizontal-direction edge width and a second weighting factor corresponds to the vertical-direction edge width, and the sum of the two factors is 1. In one embodiment, to simplify the calculation, the first and second weighting factors are both set to 0.5. To improve the accuracy of the subsequent definition evaluation, however, another embodiment increases the horizontal or vertical weighting factor according to the dominant direction of movement of the target feature in the target image relative to the camera: when the target feature moves in the vertical direction relative to the camera, the second weighting factor is made greater than the first, and when it moves in the horizontal direction, the first weighting factor is made greater than the second. When the movement direction of the target feature relative to the camera cannot be determined to be horizontal or vertical, both weighting factors may be set to 0.5.
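A short sketch of step A5 under the stated assumptions; the concrete weights 0.7/0.3 are illustrative, since the description only requires that the two factors sum to 1 and that the factor matching the movement direction of the target feature be the larger one.

    def combined_edge_width(width_h, width_v, motion_direction=None):
        # motion_direction is the dominant movement direction of the target
        # feature relative to the camera: 'horizontal', 'vertical' or None.
        if motion_direction == 'vertical':
            w_h, w_v = 0.3, 0.7
        elif motion_direction == 'horizontal':
            w_h, w_v = 0.7, 0.3
        else:
            w_h, w_v = 0.5, 0.5
        return w_h * width_h + w_v * width_v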
In a preferred embodiment, the step 102 is implemented as follows:
and B1, adopting a canny edge detection operator to carry out edge extraction on the target image, and obtaining an edge image of the target image.
In the step, edge extraction is carried out on the target image by adopting a canny edge detection operator, and an edge image of the target image is obtained. In general, the edge-extracted gradient map is a binary image having a value of 0 or 255.
And B2, expanding the edge image through a square operator with a preset size, processing the target image by taking the expanded gradient image as a mask, extracting an effective edge area of the target image, and setting the rest areas as black backgrounds to obtain the effective edge image of the target image.
In this step, the edge image is dilated with a square operator of a preset size (e.g., 3 × 3), the dilated gradient map is used as a mask on the target image to extract its effective edge region, and the remaining regions are set to a black background, giving the effective edge image of the target image. It should be noted that the edge image extracted directly by the Canny operator in step B1 is a binary image. Step B2 first performs a dilation operation on that edge image and then applies the dilated map as a mask to the original target image, so the extracted effective edge region is a gray-value image. Compared with using the binarized Canny edge image directly, this processing effectively reduces edge decision errors.
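Steps B1 and B2 could be sketched as follows, assuming OpenCV; the Canny thresholds and the 3 × 3 kernel size are illustrative values.

    import cv2
    import numpy as np

    def effective_edge_image(gray, low=50, high=150, kernel_size=3):
        # B1: binary edge image (values 0 or 255).
        edges = cv2.Canny(gray, low, high)
        # B2: dilate with a square operator of preset size, then use the dilated
        # map as a mask so that only the effective edge region keeps its gray
        # values and the remaining regions become a black background.
        kernel = np.ones((kernel_size, kernel_size), np.uint8)
        dilated = cv2.dilate(edges, kernel)
        return cv2.bitwise_and(gray, gray, mask=dilated)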
And B3, acquiring the Laplacian gradient variance of the effective edge image, and taking the Laplacian gradient variance as the edge variance of the target image to be subjected to image definition evaluation.
In this step, the Laplacian gradient variance of the effective edge image is calculated and used as an index representing the edge information of the image. On the one hand, it reflects the information richness of the region of the image that human eyes are interested in: the larger the variance, the clearer the image. On the other hand, against the black background region it also reflects the sharpness of the image edges.
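Step B3 reduces to a one-line computation in this assumed OpenCV setting:

    import cv2

    def edge_variance(effective_edge_img):
        # Variance of the Laplacian of the effective edge image, taken as the
        # edge variance of the target image.
        return cv2.Laplacian(effective_edge_img, cv2.CV_64F).var()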
In a preferred embodiment, the step 103 is implemented as follows: and evaluating the definition of the target image according to the edge width and the edge variance by using the following first relation model:
Sharpness=edge_width/(edge_var+C);
wherein Sharpness is the definition evaluation index, whose value is distributed between 0 and 1; the smaller the value, the clearer the image, and the larger the value, the more blurred the image. edge_width is the edge width, edge_var is the edge variance, usually normalized to (0, 100), and C is a preset adjustment parameter used to ensure that the denominator is not zero; C generally takes a small value, such as 0.01.
Note that, in general, an image with Sharpness less than or equal to 0.3 is regarded as clear. The maximum value of Sharpness can be clipped to 1; the closer the value is to 1, the more blurred the image.
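Putting the first relation model into code gives the following sketch; the clipping to 1 and the assumption that edge_var has already been normalized to (0, 100) follow the description above, and the 0.3 threshold is quoted there as a typical value rather than a fixed rule.

    def sharpness_index(edge_width, edge_var, C=0.01):
        # Smaller values mean a clearer image; values at or below about 0.3 are
        # typically treated as clear, values near 1 as blurred.
        sharpness = edge_width / (edge_var + C)
        return min(sharpness, 1.0)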
In a preferred embodiment, before obtaining the edge width and the edge variance of the target image, referring to fig. 2, the method further comprises:
step 100: and preprocessing a target image to be subjected to image definition evaluation.
In a preferred embodiment, the step 100 is implemented as follows:
and C1, carrying out alignment shearing processing on the target features in the target image to be subjected to image definition evaluation to obtain an image only containing the target features.
In this step, in order to reduce the influence of the image background as much as possible, the target features in the target image are aligned and cropped so that the resulting image contains only the target features. The alignment and cropping of the target feature can use an existing processing method, which is not described in detail in this embodiment. For example, if the target feature in the target image is a face, the face is first aligned and the image is then cropped to the size of the face; the face alignment technique can likewise use an existing method and is not repeated here.
And C2, performing median filtering and gray level equalization processing on the image only containing the target characteristics to obtain a processed gray level image.
In this step, to reduce the influence of non-uniform illumination and remove isolated noise points, the grayscale image is further subjected to median filtering and gray-level equalization, for example with a 5 × 5 median filter and an 8 × 8 gray-level equalizer, giving the processed grayscale image.
And C3, carrying out image size normalization processing on the processed gray level image to obtain an image with a preset size.
In this step, the processed grayscale image may be subjected to image size normalization processing to obtain an image with a preset size (e.g., 160 × 160), so as to facilitate subsequent processing.
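The preprocessing of steps C1 to C3 could be sketched as follows, assuming OpenCV; the bounding box of the aligned target feature (target_box) is taken as given, since the alignment itself uses an existing, application-specific method, and the CLAHE call is only one possible reading of the 8 × 8 gray-level equalizer mentioned above.

    import cv2

    def preprocess(image_bgr, target_box, size=(160, 160)):
        # C1: keep only the target feature (e.g. an aligned face region).
        x, y, w, h = target_box
        cropped = image_bgr[y:y + h, x:x + w]
        gray = cv2.cvtColor(cropped, cv2.COLOR_BGR2GRAY)
        # C2: 5x5 median filtering and 8x8 tile gray-level equalization.
        gray = cv2.medianBlur(gray, 5)
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        gray = clahe.apply(gray)
        # C3: normalize to the preset image size.
        return cv2.resize(gray, size)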
As can be seen from the above description, this embodiment combines the edge width and the edge variance to compute definition, forming a scheme in which the edge variance provides coarse discrimination and the edge width provides fine discrimination, which effectively improves the discrimination between definition levels. In addition, this embodiment computes the edge variance only over the effective edge region of the image, which effectively reduces the influence of background pixels. Moreover, the definition index of this embodiment is distributed in (0, 1), so the threshold is easy to set and is independent of image content.
Based on the same inventive concept, another embodiment of the present invention provides an image sharpness evaluating apparatus, referring to fig. 3, including: a first acquisition module 21, a second acquisition module 22 and a sharpness evaluation module 23, wherein:
the first acquisition module 21 is configured to acquire an edge width of a target image to be subjected to image sharpness evaluation; the target image to be subjected to image definition evaluation is a gray image subjected to gray processing;
the second obtaining module 22 is configured to obtain an edge variance of the target image to be subjected to image sharpness evaluation;
and the definition evaluation module 23 is configured to perform definition evaluation on the target image according to the edge width and the edge variance.
In a preferred embodiment, the first obtaining module 21 is specifically configured to:
extracting edges in the horizontal direction and the vertical direction by using a sobel operator to obtain a horizontal gray level gradient image and a vertical gray level gradient image;
obtaining a horizontal and vertical strong edge gray gradient map by taking a gradient mean value as a threshold value according to the horizontal and vertical gray gradient maps;
calculating the edge widths of all edge points based on the horizontal strong edge gray gradient map, and taking the maximum value or the average value as the horizontal direction edge width of the target image;
calculating the edge widths of all edge points based on the vertical strong edge gray gradient map, and taking the maximum value or the average value as the edge width of the target image in the vertical direction;
and carrying out weighted summation on the horizontal direction edge width and the vertical direction edge width to obtain the edge width of the target image.
In a preferred embodiment, the second obtaining module 22 is specifically configured to:
adopting a canny edge detection operator to carry out edge extraction on the target image to obtain an edge image of the target image;
expanding the edge image through a square operator with a preset size, processing the target image by taking the expanded gradient image as a mask, extracting an effective edge area of the target image, and setting the rest areas as black backgrounds to obtain an effective edge image of the target image;
and acquiring the Laplacian gradient variance of the effective edge image as the edge variance of the target image to be subjected to image definition evaluation.
In a preferred embodiment, the sharpness evaluation module 23 is specifically configured to:
and evaluating the definition of the target image according to the edge width and the edge variance by using the following first relation model:
Sharpness=edge_width/(edge_var+C);
wherein Sharpness is the definition evaluation index; the smaller the value, the clearer the image, and the larger the value, the more blurred the image. edge_width is the edge width, edge_var is the edge variance, and C is a preset adjustment parameter used to ensure that the denominator is not zero.
The image sharpness evaluation apparatus described in this embodiment may be configured to execute the image sharpness evaluation method described in the foregoing embodiment, and the principle and the technical effect are similar, which are not described herein again.
Based on the same inventive concept, another embodiment of the present invention provides an electronic device, which specifically includes the following components, with reference to fig. 4: a processor 701, a memory 702, a communication interface 703 and a bus 704;
the processor 701, the memory 702 and the communication interface 703 complete mutual communication through the bus 704; the communication interface 703 is used for realizing information transmission between related devices such as modeling software, an intelligent manufacturing equipment module library and the like;
the processor 701 is configured to call a computer program stored in the memory 702; when executing the computer program, the processor implements all the steps of the image sharpness evaluation method of the above embodiment, for example the following steps:
step 101: acquiring the edge width of a target image to be subjected to image definition evaluation; and the target image to be subjected to image definition evaluation is a gray image subjected to gray processing.
Step 102: and acquiring the edge variance of the target image to be subjected to image definition evaluation.
Step 103: and evaluating the definition of the target image according to the edge width and the edge variance.
Based on the same inventive concept, another embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements all the steps of the above image sharpness evaluation method, for example the following steps:
step 101: acquiring the edge width of a target image to be subjected to image definition evaluation; and the target image to be subjected to image definition evaluation is a gray image subjected to gray processing.
Step 102: and acquiring the edge variance of the target image to be subjected to image definition evaluation.
Step 103: and evaluating the definition of the target image according to the edge width and the edge variance.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above examples are only for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An image sharpness evaluation method is characterized by comprising:
acquiring the edge width of a target image to be subjected to image definition evaluation; the target image to be subjected to image definition evaluation is a gray image subjected to gray processing;
acquiring the edge variance of the target image to be subjected to image definition evaluation;
and evaluating the definition of the target image according to the edge width and the edge variance.
2. The method according to claim 1, wherein the acquiring an edge width of a target image to be subjected to image sharpness evaluation specifically comprises:
extracting edges in the horizontal direction and the vertical direction by using a sobel operator to obtain a horizontal gray level gradient image and a vertical gray level gradient image;
obtaining a horizontal and vertical strong edge gray gradient map by taking a gradient mean value as a threshold value according to the horizontal and vertical gray gradient maps;
calculating the edge widths of all edge points based on the horizontal strong edge gray gradient map, and taking the maximum value or the average value as the horizontal direction edge width of the target image;
calculating the edge widths of all edge points based on the vertical strong edge gray gradient map, and taking the maximum value or the average value as the edge width of the target image in the vertical direction;
and carrying out weighted summation on the horizontal direction edge width and the vertical direction edge width to obtain the edge width of the target image.
3. The method according to claim 2, wherein when the horizontal edge width and the vertical edge width are weighted and summed, a first weighting factor is corresponding to the horizontal edge width, and a second weighting factor is corresponding to the vertical edge width, wherein the sum of the first weighting factor and the second weighting factor is 1, and wherein the second weighting factor is greater than the first weighting factor when the target feature in the target image is in the vertical direction with respect to the moving direction of the camera, and wherein the first weighting factor is greater than the second weighting factor when the target feature in the target image is in the horizontal direction with respect to the moving direction of the camera.
4. The method according to claim 1, wherein the obtaining of the edge variance of the target image to be subjected to image sharpness evaluation specifically includes:
adopting a canny edge detection operator to carry out edge extraction on the target image to obtain an edge image of the target image;
expanding the edge image through a square operator with a preset size, processing the target image by taking the expanded gradient image as a mask, extracting an effective edge area of the target image, and setting the rest areas as black backgrounds to obtain an effective edge image of the target image;
and acquiring the Laplacian gradient variance of the effective edge image as the edge variance of the target image to be subjected to image definition evaluation.
5. The method according to claim 1, wherein the evaluating the sharpness of the target image according to the edge width and the edge variance specifically comprises:
and evaluating the definition of the target image according to the edge width and the edge variance by using the following first relation model:
Sharpness=edge_width/(edge_var+C);
wherein, Sharpness is a definition evaluation index, and a smaller value indicates that an image is clearer, otherwise, the image is more fuzzy; edge_width is the edge width, edge_var is the edge variance, and C is a preset adjustment parameter used to ensure that the denominator is not zero.
6. The method according to any one of claims 1 to 5, wherein before acquiring the edge width and edge variance of the target image, the method further comprises:
and preprocessing a target image to be subjected to image definition evaluation.
7. The method according to claim 6, wherein the preprocessing of the target image to be subjected to image sharpness evaluation specifically comprises:
carrying out alignment and cropping processing on target features in a target image to be subjected to image definition evaluation to obtain an image only containing the target features;
carrying out median filtering and gray level equalization processing on the image only containing the target characteristics to obtain a processed gray level image;
and carrying out image size normalization processing on the processed gray level image to obtain an image with a preset size.
8. An image sharpness evaluation apparatus, comprising:
the first acquisition module is used for acquiring the edge width of a target image to be subjected to image definition evaluation; the target image to be subjected to image definition evaluation is a gray image subjected to gray processing;
the second acquisition module is used for acquiring the edge variance of the target image to be subjected to image definition evaluation;
and the definition evaluation module is used for evaluating the definition of the target image according to the edge width and the edge variance.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the image sharpness evaluation method according to any of claims 1 to 7 are implemented when the processor executes the program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image sharpness evaluation method according to any one of claims 1 to 7.
CN201810758155.XA 2018-07-11 2018-07-11 Image definition evaluation method and device Withdrawn CN110717922A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810758155.XA CN110717922A (en) 2018-07-11 2018-07-11 Image definition evaluation method and device

Publications (1)

Publication Number Publication Date
CN110717922A 2020-01-21

Family

ID=69208965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810758155.XA Withdrawn CN110717922A (en) 2018-07-11 2018-07-11 Image definition evaluation method and device

Country Status (1)

Country Link
CN (1) CN110717922A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050248655A1 (en) * 2004-04-21 2005-11-10 Fuji Photo Film Co. Ltd. Image processing method, image processing apparatus, and image processing program
US20080187225A1 (en) * 2007-02-05 2008-08-07 Fujitsu Limited Computer-readable record medium in which a telop character extraction program is recorded, telop character extraction method and telop character extraction apparatus
EP2187350A1 (en) * 2008-11-05 2010-05-19 Thomson Licensing Method and device for assessing image quality degradation
CN103093419A (en) * 2011-10-28 2013-05-08 浙江大华技术股份有限公司 Method and device for detecting image definition
CN104134204A (en) * 2014-07-09 2014-11-05 中国矿业大学 Image definition evaluation method and image definition evaluation device based on sparse representation
CN106651834A (en) * 2016-10-20 2017-05-10 国网山东省电力公司电力科学研究院 Method and device for evaluating quality of substation equipment infrared thermal image with no reference image

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111369531B (en) * 2020-03-04 2023-09-01 浙江大华技术股份有限公司 Image definition scoring method, device and storage device
CN111369531A (en) * 2020-03-04 2020-07-03 浙江大华技术股份有限公司 Image definition grading method, equipment and storage device
CN111754491A (en) * 2020-06-28 2020-10-09 国网电子商务有限公司 Picture definition judging method and device
CN111968144A (en) * 2020-09-07 2020-11-20 北京凌云光技术集团有限责任公司 Image edge point acquisition method and device
CN111968144B (en) * 2020-09-07 2024-03-29 凌云光技术股份有限公司 Image edge point acquisition method and device
CN112258563A (en) * 2020-09-23 2021-01-22 成都旷视金智科技有限公司 Image alignment method and device, electronic equipment and storage medium
CN112561890A (en) * 2020-12-18 2021-03-26 深圳赛安特技术服务有限公司 Image definition calculation method and device and computer equipment
CN114040155A (en) * 2021-10-31 2022-02-11 中汽院(重庆)汽车检测有限公司 Panoramic all-around image testing system for vehicle
CN114782276A (en) * 2022-04-29 2022-07-22 电子科技大学 Resistivity imaging dislocation correction method based on adaptive gradient projection
CN114782276B (en) * 2022-04-29 2023-04-11 电子科技大学 Resistivity imaging dislocation correction method based on adaptive gradient projection
CN114972084A (en) * 2022-05-13 2022-08-30 杭州汇萃智能科技有限公司 Image focusing accuracy evaluation method and system
CN115266536A (en) * 2022-09-26 2022-11-01 南通钧儒卫生用品有限公司 Method for detecting water absorption performance of paper diaper
CN115631171B (en) * 2022-10-28 2023-09-15 上海为旌科技有限公司 Picture definition evaluation method, system and storage medium

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20200121)