WO2019173954A1 - Method and apparatus for detecting image sharpness - Google Patents


Info

Publication number
WO2019173954A1
WO2019173954A1 (PCT/CN2018/078751)
Authority
WO
WIPO (PCT)
Prior art keywords
image
threshold
metric value
determining
detection result
Prior art date
Application number
PCT/CN2018/078751
Other languages
English (en)
Chinese (zh)
Inventor
丁欣
董辰
郜文美
姜永涛
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN201880077809.0A priority Critical patent/CN111417981A/zh
Priority to PCT/CN2018/078751 priority patent/WO2019173954A1/fr
Publication of WO2019173954A1 publication Critical patent/WO2019173954A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • the present application relates to the field of computer technologies, and in particular, to a method and device for detecting image sharpness.
  • Blurred images can cause many image-based application functions to fail, such as face recognition and video surveillance.
  • a commonly used method for detecting image sharpness is machine-learning-based sharpness detection.
  • this method builds an image sample library containing a large number of clear images and a large number of blurred images, establishes a deep learning model, and trains the model on the clear and blurred images in the library.
  • image sharpness detection is then performed by the trained deep learning model.
  • however, this machine-learning-based sharpness detection method requires collecting a large number of clear and blurred images when constructing the image sample library, which entails a heavy workload and high algorithm complexity.
  • the embodiments of the present application provide a method and a device for detecting image sharpness, intended to solve the problem that image sharpness detection involves a heavy workload and high algorithm complexity.
  • an embodiment of the present application provides an image sharpness detection method, applicable to an electronic device, including: scaling a first image of a first size to obtain a second image of a second size; determining a first metric value of the first image and a second metric value of the second image, the first metric value characterizing the sharpness of the first image and the second metric value characterizing the sharpness of the second image; performing an operation on the first metric value and the second metric value to obtain an operation result; comparing the operation result with a first threshold; and determining, according to the comparison result, whether the first image is a clear image.
  • in the embodiments of the present application, blurred images and clear images change in sharpness to different degrees after scaling. For example, when a blurred image is reduced, its sharpness changes little, whereas when a clear image is reduced, its sharpness changes greatly.
  • by exploiting this phenomenon, the embodiments of the present application can accurately determine the sharpness of an image with a small workload and low algorithm complexity.
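The core idea above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the patented implementation: the choice of Laplacian variance as the metric, the 2x block-mean downscaling, and the ratio threshold of 3.0 are all assumed values for demonstration.

```python
import numpy as np

def laplacian_variance(img):
    # Variance of the 3x3 discrete Laplacian, one of the sharpness
    # metrics named in this application.
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] -
           4.0 * img[1:-1, 1:-1])
    return float(lap.var())

def downscale(img, factor=2):
    # Naive block-mean downscaling by an integer factor.
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    return img[:h, :w].reshape(h // factor, factor,
                               w // factor, factor).mean(axis=(1, 3))

def is_sharp(img, ratio_threshold=3.0):
    # Ratio of the metric before/after reduction: large for clear
    # images (high-frequency detail is lost by reduction), small for
    # blurred images (little detail to lose).
    small = downscale(img)
    ratio = laplacian_variance(img) / max(laplacian_variance(small), 1e-12)
    return ratio > ratio_threshold, ratio
```

A high-frequency (noise-like) image loses much of its Laplacian energy when reduced, while a smooth gradient loses essentially none, which is exactly the separation the method exploits.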
  • determining the first metric value of the first image includes: detecting a target area in the first image, performing mask processing on the target area in the first image to obtain a first mask image, and determining a metric value of the first mask image as the metric value of the first image.
  • determining the second metric value of the second image includes: detecting a target area in the second image, performing mask processing on the target area in the second image to obtain a second mask image, and determining a metric value of the second mask image as the metric value of the second image.
  • performing an operation on the first metric value and the second metric value to obtain an operation result may be implemented by determining the ratio of the first metric value to the second metric value. Comparing the operation result with the first threshold and determining whether the first image is a clear image may then be implemented as follows: if the ratio is greater than the first threshold, the first image is determined to be a clear image; if the ratio is less than or equal to the first threshold, the first image is determined to be a blurred image.
  • the ratio of the first metric value to the second metric value reflects how the metric value changes before and after the scaling process is performed on the first image, and the sharpness of the first image is determined from that change.
  • in this way, the sharpness of the first image can be determined accurately with a small amount of work.
  • the first detection result is the determination that the first image is a clear image or a blurred image according to the comparison of the operation result with the first threshold. If the first detection result is inaccurate, the first threshold is adjusted. In this design, by checking whether the detection result is accurate and adjusting the first threshold when it is not, a more accurate detection result can be obtained from the adjusted first threshold, thereby improving the accuracy of image sharpness detection.
  • before the operation is performed on the first metric value and the second metric value to obtain the operation result, the second metric value is compared with a second threshold and a third threshold, and the obtained comparison result is that the second metric value is smaller than the second threshold and greater than the third threshold, where the second threshold is greater than the third threshold.
  • in this way, the complexity of image sharpness detection can be effectively reduced.
  • for images that are particularly clear or particularly blurred, the detection result can be obtained directly by comparing the second metric value with the second and third thresholds, thereby reducing the complexity of image sharpness detection.
  • if the second metric value is greater than or equal to the second threshold, after determining that the detection result of the first image is a clear image, it is determined whether the second detection result is accurate; the second detection result is the determination, according to the second metric value and the second threshold, that the first image is a clear image. If the second detection result is inaccurate, the second threshold is adjusted. If the second metric value is less than or equal to the third threshold, after determining that the first image is a blurred image, it is determined whether the third detection result is accurate; the third detection result is the determination, according to the second metric value and the third threshold, that the first image is a blurred image. If the third detection result is inaccurate, the third threshold is adjusted.
  • before the operation is performed on the first metric value and the second metric value to obtain the operation result, the first metric value is compared with a fourth threshold and a fifth threshold, and the obtained comparison result is that the first metric value is smaller than the fourth threshold and greater than the fifth threshold, where the fourth threshold is greater than the fifth threshold.
  • for images that are particularly clear or particularly blurred, the detection result can be obtained directly by comparing the first metric value with the fourth and fifth thresholds, thereby reducing the complexity of image sharpness detection.
  • if the first metric value is greater than or equal to the fourth threshold, it is determined whether the fourth detection result is accurate after the first image is determined to be a clear image; the fourth detection result is the determination, according to the first metric value and the fourth threshold, that the first image is a clear image. If the fourth detection result is inaccurate, the fourth threshold is adjusted. If the first metric value is less than or equal to the fifth threshold, it is determined whether the fifth detection result is accurate after the first image is determined to be a blurred image; the fifth detection result is the determination, according to the first metric value and the fifth threshold, that the first image is a blurred image. If the fifth detection result is inaccurate, the fifth threshold is adjusted.
  • an embodiment of the present application provides an image sharpness detecting apparatus, including: a scaling module, configured to perform a scaling process on a first image of a first size to obtain a second image of a second size.
  • a determining module configured to determine a first metric value of the first image and a second metric value of the second image obtained by the scaling module, the first metric value being used to characterize the sharpness of the first image and the second metric value being used to characterize the sharpness of the second image.
  • an operation module configured to perform operations on the first metric value and the second metric value determined by the determining module, to obtain an operation result.
  • a comparison module configured to compare the operation result obtained by the operation module with a first threshold, and obtain a comparison result.
  • a determining module configured to determine, according to the comparison result obtained by the comparing module, whether the first image is a clear image.
  • the operation module is specifically configured to: determine a ratio of the first metric value and the second metric value.
  • the comparing module is specifically configured to: compare the ratio with a first threshold.
  • the determining module is configured to: if the ratio is greater than the first threshold, determine that the first image is a clear image; or, if the ratio is less than or equal to the first threshold, determine that the first image is a blurred image.
  • the determining module is further configured to: after the operation result is compared with the first threshold and whether the first image is a clear image is determined according to the obtained comparison result, determine whether the first detection result is accurate; the first detection result is the determination that the first image is a clear image or a blurred image according to the comparison of the operation result with the first threshold.
  • the device further includes an adjustment module, where the adjustment module is configured to adjust the first threshold when the first detection result is inaccurate.
  • the comparing module is further configured to: before the operation is performed on the first metric value and the second metric value to obtain the operation result, compare the second metric value with the second threshold and the third threshold, the obtained comparison result being that the second metric value is smaller than the second threshold and greater than the third threshold, where the second threshold is greater than the third threshold.
  • the determining module is further configured to: when the second metric value is greater than or equal to the second threshold, determine that the first image is a clear image; or, in the When the second metric is less than or equal to the third threshold, it is determined that the first image is a blurred image.
  • the determining module is further configured to: after determining that the detection result of the first image is a clear image, determine whether the second detection result is accurate.
  • the second detection result is the determination, according to the second metric value and the second threshold, that the first image is a clear image.
  • the device further includes an adjustment module, where the adjustment module is configured to adjust the second threshold when the second detection result determined by the determining module is inaccurate.
  • the determining module is further configured to: after determining that the first image is a blurred image, determine whether the third detection result is accurate.
  • the third detection result is that the first image is a blurred image according to the second metric value and the third threshold.
  • the adjusting module is further configured to: when the third detection result is inaccurate, adjust the third threshold.
  • the comparing module is further configured to: before the operation is performed on the first metric value and the second metric value to obtain the operation result, compare the first metric value with the fourth threshold and the fifth threshold, the obtained comparison result being that the first metric value is smaller than the fourth threshold and greater than the fifth threshold, where the fourth threshold is greater than the fifth threshold.
  • the determining module is further configured to: when the first metric value is greater than or equal to the fourth threshold, determine that the first image is a clear image, or When the first metric value is less than or equal to the fifth threshold, it is determined that the first image is a blurred image.
  • the determining module is further configured to: after determining that the first image is a clear image, determine whether the fourth detection result is accurate.
  • the fourth detection result is the determination, according to the first metric value and the fourth threshold, that the first image is a clear image.
  • the device further includes an adjustment module, where the adjustment module is configured to adjust the fourth threshold when the fourth detection result is inaccurate.
  • the determining module is further configured to: after determining that the first image is a blurred image, determine whether the fifth detection result is accurate.
  • the fifth detection result is that the first image is a blurred image according to the first metric value and the fifth threshold.
  • the adjusting module is further configured to: when the fifth detection result is inaccurate, adjust the fifth threshold.
  • when determining the first metric value of the first image, the determining module is specifically configured to: detect a target area in the first image, perform mask processing on the target area in the first image to obtain a first mask image, and determine the metric value of the first mask image.
  • when determining the second metric value of the second image, the determining module is specifically configured to: detect a target area in the second image, perform mask processing on the target area in the second image to obtain a second mask image, and determine the metric value of the second mask image.
  • an embodiment of the present application further provides a terminal, where the terminal includes a processor and a memory, the memory being used to store a software program, and the processor being configured to read the software program stored in the memory and implement the method provided by the first aspect or any design of the first aspect.
  • the electronic device can be a mobile terminal, a computer, or the like.
  • an embodiment of the present application further provides a computer storage medium, where the computer storage medium stores a software program, and the software program, when read and executed by one or more processors, can implement the method provided by the first aspect or any design of the first aspect.
  • an embodiment of the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method described in the first aspect or any design of the first aspect, or the method provided by the second aspect or any design of the second aspect.
  • FIG. 1A is a clear image provided by an embodiment of the present application.
  • FIG. 1B is a blurred image provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the change in sharpness of a blurred image and a clear image before and after a reduction process according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a method for detecting image sharpness according to an embodiment of the present application.
  • FIG. 4 is a schematic view of an elliptical mask provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a first determination provided by an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a second determination provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of adjusting a threshold according to an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a method for detecting the sharpness of a face image according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an image sharpness detecting apparatus according to an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an image sharpness detecting apparatus according to an embodiment of the present application.
  • one image-based application function is to assess the skin condition of a face from the user's facial image, such as detecting and analyzing skin features including pores, blackheads, fine lines, stains, and subcutaneous red areas of the face.
  • this application function places high requirements on the sharpness of the face photo; problems such as camera shake and poor focus when photographing all affect the stability and reliability of the skin detection.
  • in a sharper facial image, a large number of pores are visible with relatively clear pore boundaries, and blackheads are also clearly visible, as shown in FIG. 1A; in a less sharp facial image, only the larger pores can be seen.
  • this difference in sharpness leads to large differences between the results of two skin tests performed on the same user.
  • the first method is a sharpness detection method based on a reference image.
  • this method generally requires a clear image of the same scene or content as the image to be detected to serve as a reference image, and determines the sharpness of the image to be detected by comparing properties such as gradient and frequency between the reference image and the image to be detected.
  • this method can only detect the sharpness of images whose scene or content matches that of the reference image, and therefore has certain limitations and poor adaptability.
  • the second method is a non-reference sharpness detection method.
  • this method analyzes frequency-domain information of the image to be detected, such as that obtained by applying a Fourier transform or a wavelet transform to the image, or analyzes edge information of the image to be detected, such as edge-width peak information and gradient peak information, to compute an index characterizing sharpness; the sharpness is then determined from the comparison of this index with a threshold.
  • however, different images may require different thresholds when determining sharpness, so using the same threshold for all images leads to poor accuracy of the determination.
  • in addition, this method has a large amount of computation, high computational complexity, and poor real-time performance.
  • the third method is a machine learning based sharpness detection method.
  • this method builds an image sample library containing a large number of clear images and a large number of blurred images, establishes a deep learning model, and trains the model on the clear and blurred images in the library.
  • image sharpness detection is then performed by the trained deep learning model.
  • this machine-learning-based sharpness detection method requires collecting a large number of clear and blurred images when constructing the image sample library, which entails a heavy workload and high algorithm complexity.
  • after reduction processing, a blurred image changes little in sharpness, whereas a clear image changes greatly in sharpness.
  • the first face image is the blurred image before the reduction processing is performed;
  • the second face image is the blurred image after the reduction processing is performed;
  • the Laplacian variance ratio between the first face image and the second face image is 1.8050.
  • the third face image is the clear image before the reduction processing is performed;
  • the fourth face image is the clear image after the reduction processing is performed;
  • the Laplacian variance ratio between the third face image and the fourth face image is 5.5153. It can be seen that, compared with the blurred image, the clear image shows a larger visual difference before and after the reduction processing, and its Laplacian variance ratio is also larger.
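The observation above can be reproduced synthetically. The sketch below is illustrative only: an assumed 3x3 box blur stands in for camera defocus, and the comparison shows that blurring an image lowers its before/after-reduction Laplacian-variance ratio, mirroring the 1.8050 vs. 5.5153 comparison.

```python
import numpy as np

def laplacian_variance(img):
    # Variance of the 3x3 discrete Laplacian.
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] - 4.0 * img[1:-1, 1:-1])
    return float(lap.var())

def downscale2(img):
    # 2x block-mean reduction.
    h, w = (d - d % 2 for d in img.shape)
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def box_blur3(img):
    # 3x3 mean filter as a crude stand-in for defocus blur.
    p = np.pad(img, 1, mode='edge')
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def ratio(img):
    # Before/after-reduction Laplacian-variance ratio.
    return laplacian_variance(img) / max(laplacian_variance(downscale2(img)), 1e-12)
```

Running `ratio` on a detailed image and on a blurred copy of it shows the clear image's ratio is consistently the larger of the two, which is the separation the first threshold exploits.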
  • the embodiment of the present application provides a method and device for detecting image sharpness, which is used to solve the problem that the workload is large and the algorithm complexity is high when the image sharpness detection is more accurately determined.
  • the method and the device are based on the same inventive concept. Since the principles by which the method and the device solve the problem are similar, the implementations of the device and the method can refer to each other, and repeated descriptions are omitted.
  • the embodiments of the present application may be applied to an electronic device, such as a computer, a tablet, a notebook, a smart phone, a server, etc.
  • the electronic device may include, but is not limited to, a camera, an image processor, a central processing unit, and a storage medium.
  • the camera can be used to collect the image to be detected;
  • the image processor can be used for scaling processing, mask processing, and the like;
  • the central processing unit can be used to perform sharpness detection on the image to be detected;
  • the storage medium can be used to store image data, software programs, and so on.
  • the fields of application of the embodiments of the present application include, but are not limited to, a face image field, a vehicle image field, a plant image field, or other types of image fields.
  • the embodiments of the present application may be, but are not limited to, applied to the following scenarios: face recognition, identity information collection, facial skin detection, video tracking, and the like.
  • the embodiments of the present application may be, but are not limited to, applied to the following scenarios: automatic picture screening, photo blurred reminders, and the like.
  • "Multiple" means two or more.
  • the method includes:
  • the scaling process includes a reduction process or an enlargement process. If the first image of the first size is reduced, a second image of a second size is obtained, where the second size may be smaller than the first size, for example 1/4 of the first size. If the first image of the first size is enlarged to obtain a second image of a second size, the second size may be greater than the first size, for example 4 times the first size.
  • the first metric value may be, but is not limited to, the Laplacian variance, Sobel variance, or grayscale variance computed over the pixel values of the first image.
  • the second metric value may be, but is not limited to, the Laplacian variance, Sobel variance, or grayscale variance computed over the pixel values of the second image.
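Pure-numpy sketches of the three metric families named here follow. The exact formulations are assumptions on my part: for example, "Sobel variance" is read here as the variance of the squared Sobel gradient magnitude, which is one common convention.

```python
import numpy as np

def grayscale_variance(img):
    # Variance of the raw pixel values.
    return float(img.var())

def laplacian_variance(img):
    # Variance of the 3x3 discrete Laplacian response.
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] - 4.0 * img[1:-1, 1:-1])
    return float(lap.var())

def sobel_variance(img):
    # 3x3 Sobel gradients; metric = variance of the squared gradient
    # magnitude (one plausible reading of "Sobel variance").
    gx = (img[:-2, 2:] + 2 * img[1:-1, 2:] + img[2:, 2:]
          - img[:-2, :-2] - 2 * img[1:-1, :-2] - img[2:, :-2])
    gy = (img[2:, :-2] + 2 * img[2:, 1:-1] + img[2:, 2:]
          - img[:-2, :-2] - 2 * img[:-2, 1:-1] - img[:-2, 2:])
    return float((gx ** 2 + gy ** 2).var())
```

All three vanish on a constant image and grow with high-frequency detail, which is why any of them can serve as the metric value in the ratio test.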
  • in the embodiments of the present application, blurred images and clear images change in sharpness to different degrees after scaling. For example, when a blurred image is reduced, its sharpness changes little, whereas when a clear image is reduced, its sharpness changes greatly.
  • by exploiting this phenomenon, the embodiments of the present application can accurately determine the sharpness of an image with a small workload and low algorithm complexity.
  • the first image may be acquired first.
  • the manner of acquiring the first image includes, but is not limited to, acquiring a first image by a sensor such as a camera, acquiring a first image in a database, and the like.
  • in practice, attention is often paid only to the sharpness of the target area in an image.
  • for a face image, for example, attention is paid to the sharpness of the face area, so a complex and varied background can affect the accuracy of the detection.
  • the influence of the background on image sharpness detection can be reduced by filtering out the background in the image.
  • when acquiring the first metric value of the first image, the method may be implemented as follows:
  • A1. Filter the background in the first image to obtain the target area of the first image.
  • the filtering of the background in the first image may be, but is not limited to, implemented as follows:
  • Method 1: detecting the target area in the first image.
  • a detection algorithm based on adaptive boosting (AdaBoost), a detection algorithm based on a convolutional neural network (CNN), a detection algorithm based on a support vector machine (SVM), or a detection algorithm based on principal component analysis (PCA) can be used.
  • other methods may be used for the detection of the target area.
  • a masking process is performed on the target area in the first image to obtain a mask image of the first image.
  • an elliptical mask covering the target area may be generated according to the size and position of the target area. Taking the target area as a face area as an example, the elliptical mask is shown in FIG. 4.
  • Method 2: filtering the background in the first image by using a neural network model.
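An elliptical mask inscribed in the detected bounding box can be generated directly from the box geometry. The `(x, y, w, h)` box convention and the helper names below are assumptions for illustration.

```python
import numpy as np

def elliptical_mask(shape, box):
    # shape: (height, width) of the image; box: (x, y, w, h) of the
    # detected target region. True inside the inscribed ellipse.
    img_h, img_w = shape
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0   # ellipse center
    rx, ry = w / 2.0, h / 2.0           # semi-axes
    yy, xx = np.ogrid[:img_h, :img_w]
    return ((xx - cx) / rx) ** 2 + ((yy - cy) / ry) ** 2 <= 1.0

def masked_metric(img, mask, metric):
    # Evaluate a sharpness metric over the target region only,
    # filtering out the background pixels.
    return metric(img[mask])
```

For instance, `masked_metric(img, elliptical_mask(img.shape, face_box), np.var)` computes the grayscale variance over the face ellipse alone.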
  • A2. Determine a metric value of a target area of the first image.
  • when acquiring the second metric value of the second image, the method may be implemented analogously.
  • before the ratio is computed, a sharpness determination for the first image may first be made based on the first metric value or the second metric value alone, so that for particularly clear or particularly blurred images a detection result can be obtained directly from the first or second metric value.
  • only images in an intermediate state, neither particularly clear nor particularly blurred, are further processed by performing the operation on the first metric value and the second metric value and comparing the operation result with the first threshold.
  • in this way, for particularly clear or particularly blurred images, the detection result is obtained without operating on the first and second metric values, thereby reducing the amount of computation and the complexity of the sharpness detection algorithm.
  • before step S303 is performed, that is, before the operation on the first metric value and the second metric value to obtain the operation result, a first determination of the sharpness of the first image may be made based on the second metric value.
  • the process of the first determination can be implemented as follows, as shown in FIG. 5:
  • step S503 Determine whether the second metric value is less than a third threshold; if yes, perform step S504; if not, perform a second determination on the sharpness of the first image based on the first metric value.
  • the second threshold is greater than the third threshold.
  • the process of the second determination can be implemented by the following process, as shown in FIG. 6:
  • step S601 Determine whether the first metric value is greater than a fourth threshold; if yes, execute step S602; if no, perform step S603.
  • step S603. Determine whether the first metric value is less than a fifth threshold; if yes, perform step S604; if not, proceed to perform the operation on the first metric value and the second metric value (step S303).
  • the fourth threshold is greater than the fifth threshold.
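Combining the flows of FIG. 5 and FIG. 6, the two early-exit determinations and the final ratio test can be sketched as one function. The default threshold values and the exact inequality directions are assumptions, chosen only to be consistent with the description (second threshold > third, fourth > fifth).

```python
def detect_sharpness(m1, m2, t1=3.0, t2=200.0, t3=5.0, t4=200.0, t5=5.0):
    """m1/m2: metric values of the original/scaled image.
    First determination (FIG. 5): decide extreme cases from m2 alone.
    Second determination (FIG. 6): decide extreme cases from m1 alone.
    Otherwise fall through to the ratio test against the first threshold."""
    if m2 >= t2:          # second metric very high -> clearly sharp
        return "clear"
    if m2 <= t3:          # second metric very low -> clearly blurred
        return "blurred"
    if m1 >= t4:          # first metric very high -> clearly sharp
        return "clear"
    if m1 <= t5:          # first metric very low -> clearly blurred
        return "blurred"
    # Intermediate case: compare the metric-value ratio with the first threshold.
    return "clear" if m1 / m2 > t1 else "blurred"
```

Only the intermediate case pays for the ratio computation, which is the complexity reduction the design describes.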
  • the first threshold, the second threshold, the third threshold, the fourth threshold, and the fifth threshold in the embodiments of the present application may be determined from empirical values or through extensive experiments. For example, the second threshold may be determined by scaling a known clear image and determining the metric value of the processed image; the second threshold may then be that metric value.
  • the third threshold may be determined by scaling a known blurred image and determining the metric value of the processed image; the third threshold may then be that metric value. The fourth threshold may be determined as the metric value of a particularly clear image.
  • the fifth threshold may be determined as the metric value of a particularly blurred image.
  • the first threshold may be determined from the metric-value ratio of the most blurred image that is still acceptable.
  • the first threshold, the second threshold, the third threshold, the fourth threshold, and the fifth threshold may be adjusted based on the feedback mechanism. Specifically, as shown in FIG. 7 :
  • for example, if the second metric value is greater than the second threshold, the first image is determined to be a clear image.
  • if feedback information indicates that the first image is actually not a clear image, the second threshold is adjusted, and the adjusted second threshold is then used when image sharpness detection is next performed.
  • the process of adjusting the first threshold, or the third threshold, or the fourth threshold, or the fifth threshold based on the feedback mechanism may refer to the process of adjusting the second threshold, and details are not repeatedly described herein.
  • the adjusted first threshold, second threshold, third threshold, fourth threshold, or fifth threshold is used as the threshold for the next image sharpness detection.
  • the method of adjusting the threshold based on the feedback mechanism may enable more accurate detection results to be determined according to the first threshold, the second threshold, the third threshold, the fourth threshold, and the fifth threshold, thereby improving the accuracy of image sharpness detection.
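The description does not fix an update rule for the feedback mechanism, so the sketch below is one plausible reading: when feedback marks a verdict as wrong, move the offending threshold just past the metric value that produced it. The function name, step size, and rule are all assumptions.

```python
def adjust_threshold(threshold, metric, verdict_was_clear, feedback_says_correct,
                     step=0.05):
    # No change when the feedback confirms the detection result.
    if feedback_says_correct:
        return threshold
    if verdict_was_clear:
        # Wrongly judged clear: raise the threshold above the metric value
        # so the same image would now be judged blurred.
        return max(threshold, metric) * (1.0 + step)
    # Wrongly judged blurred: lower the threshold below the metric value.
    return min(threshold, metric) * (1.0 - step)
```

The same rule can be applied to any of the five thresholds, since each compares a metric value (or ratio) against a bound where "above" means clear.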
  • FIG. 8 is a schematic diagram of the process of detecting the sharpness of a face image.
  • Step S801. Acquire a face image to be detected. Step S802 is performed.
  • Step S802. Perform reduction processing on the face image to be detected to obtain a reduced image.
  • the size of the reduced image may be 1/4 of that of the image to be detected.
  • Step S803 is performed.
  • Step S803. Perform face detection and positioning on the reduced image. Step S804 is performed.
  • Step S804. Generate a mask covering the face region according to the size and position of the face region obtained by the face detection and positioning, to obtain a first mask region.
  • the first mask may be elliptical.
  • Step S805. Calculate a metric value of the first mask region.
  • the metric value may be a Laplacian variance, a Sobel variance, or a grayscale variance of the pixel values of the first mask region.
  • Step S806 is performed.
  • Step S810. Perform face detection and positioning on the face image to be detected. Step S811 is performed.
  • Step S811. Generate a mask covering the face region according to the size and position of the face region obtained by the face detection and positioning, to obtain a second mask region.
  • the second mask may be elliptical.
  • Step S812. Calculate a metric value of the second mask region.
  • the metric value may be a Laplacian variance, a Sobel variance, or a grayscale variance of the pixel values of the second mask region. Step S813 is performed.
  • Step S815. Determine the ratio between the metric value of the second mask region and the metric value of the first mask region. Step S816 is performed.
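Steps S801–S815 above can be sketched end-to-end as follows, assuming the face box (position and size) has already been obtained from some face detector. The function names, the 2×2 block-average reduction, and computing the Laplacian variance over the whole masked image are illustrative assumptions, not details taken from the embodiments.

```python
import numpy as np

def laplacian_var(img):
    """Sharpness metric (S805/S812): variance of a 3x3 Laplacian response."""
    img = img.astype(np.float64)
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] - 4.0 * img[1:-1, 1:-1])
    return float(lap.var())

def downscale_half(img):
    """Reduction (S802): 2x2 block averaging, 1/4 of the original area."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def elliptical_mask(img, box):
    """Masking (S804/S811): zero out pixels outside an ellipse inscribed
    in the face box (x, y, w, h); the elliptical shape follows the text."""
    x, y, w, h = box
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    cx, cy = x + w / 2.0, y + h / 2.0
    inside = ((xs - cx) / (w / 2.0)) ** 2 + ((ys - cy) / (h / 2.0)) ** 2 <= 1.0
    return img * inside

def face_sharpness_ratio(img, face_box):
    """Ratio (S815) of the full-size mask metric to the reduced mask metric."""
    img = img.astype(np.float64)
    small = downscale_half(img)
    small_box = tuple(v // 2 for v in face_box)  # face box scales with image
    m_first = laplacian_var(elliptical_mask(small, small_box))   # first mask region
    m_second = laplacian_var(elliptical_mask(img, face_box))     # second mask region
    return m_second / max(m_first, 1e-12)
```

A ratio well above 1 indicates that detail was lost when the image was reduced, i.e., the original image carries genuine high-frequency content; comparing this ratio against a threshold is what the subsequent steps perform.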
  • feedback information may be acquired, and it is determined according to the feedback information whether the detection result that the first image is a clear image, obtained according to the second threshold, is accurate; if so, the second threshold is not adjusted; if not, the second threshold is adjusted.
  • feedback information may be acquired, and it is determined according to the feedback information whether the detection result that the first image is a clear image, obtained according to the fourth threshold, is accurate; if so, the fourth threshold is not adjusted; if not, the fourth threshold is adjusted.
  • feedback information may be acquired, and it is determined according to the feedback information whether the detection result that the first image is a clear image, obtained according to the first threshold, is accurate; if so, the first threshold is not adjusted; if not, the first threshold is adjusted.
  • feedback information may be acquired, and it is determined according to the feedback information whether the detection result that the first image is a blurred image, obtained according to the third threshold, is accurate; if so, the third threshold is not adjusted; if not, the third threshold is adjusted.
  • feedback information may be acquired, and it is determined according to the feedback information whether the detection result that the first image is a blurred image, obtained according to the fifth threshold, is accurate; if so, the fifth threshold is not adjusted; if not, the fifth threshold is adjusted.
  • feedback information may be acquired, and it is determined according to the feedback information whether the detection result that the first image is a blurred image, obtained according to the first threshold, is accurate; if so, the first threshold is not adjusted; if not, the first threshold is adjusted.
  • an embodiment of the present application provides a terminal device, specifically configured to implement the methods described in the embodiments shown in FIG. 3 to FIG. 8.
  • the structure of the device is as shown in FIG.
  • the scaling module 901 is configured to perform scaling processing on the first image of the first size to obtain a second image of the second size.
  • a determining module 902, configured to determine a first metric value of the first image and a second metric value of the second image obtained by the scaling module 901, where the first metric value is used to characterize the sharpness of the first image, and the second metric value is used to characterize the sharpness of the second image.
  • an operation module 903, configured to perform an operation on the first metric value and the second metric value determined by the determining module 902 to obtain an operation result; and a comparison module 904, configured to compare the operation result obtained by the operation module 903 with a first threshold to obtain a comparison result.
  • the determining module 905 is configured to determine, according to the comparison result obtained by the comparison module 904, whether the first image is a clear image.
  • the operation module 903 is specifically configured to: determine a ratio of the first metric value and the second metric value.
  • the comparison module 904 is specifically configured to: compare the ratio with a first threshold.
  • the determining module 905 is specifically configured to: if the ratio is greater than the first threshold, determine that the first image is a clear image; or, if the ratio is less than or equal to the first threshold, determine that the first image is a blurred image.
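The cooperation of modules 901–905 can be sketched compactly. Here the metric is a gradient-energy variance (a simplified central-difference stand-in for the Sobel variance named earlier), and the 2×2 block-average scaling and all function names are assumptions for illustration.

```python
import numpy as np

def gradient_var(img):
    """Simplified sharpness metric: variance of squared central-difference
    gradients (an illustrative stand-in for the Sobel variance)."""
    img = img.astype(np.float64)
    gx = img[1:-1, 2:] - img[1:-1, :-2]
    gy = img[2:, 1:-1] - img[:-2, 1:-1]
    return float((gx ** 2 + gy ** 2).var())

def is_clear(img, first_threshold):
    img = np.asarray(img, dtype=np.float64)
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    # Scaling module 901: 2x2 block averaging to a second, smaller image.
    small = img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    m1 = gradient_var(img)        # first metric value  (determining module 902)
    m2 = gradient_var(small)      # second metric value (determining module 902)
    ratio = m1 / max(m2, 1e-12)   # operation module 903
    return ratio > first_threshold  # comparison module 904 + determining module 905
```

For a sharp image, scaling destroys high-frequency detail and the metric drops sharply, so the ratio is large; for an already blurred image the two metric values are closer, so the ratio stays small.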
  • the determining module 902 is further configured to: after the operation result is compared with the first threshold and it is determined, according to the obtained comparison result, whether the first image is a clear image, determine whether the first detection result is accurate; the first detection result is the determination that the first image is a clear image or a blurred image according to the comparison result of the operation result and the first threshold.
  • the device further includes an adjustment module 906, configured to adjust the first threshold when the first detection result is inaccurate.
  • the comparison module 904 is further configured to: before the operation is performed on the first metric value and the second metric value, compare the second metric value with a second threshold and a third threshold, where the obtained comparison result is that the second metric value is smaller than the second threshold and greater than the third threshold, and the second threshold is greater than the third threshold.
  • the determining module 905 is further configured to: when the second metric value is greater than or equal to the second threshold, determine that the first image is a clear image; or, when the second metric value is less than or equal to the third threshold, determine that the first image is a blurred image.
  • the determining module 902 is further configured to: after it is determined that the first image is a clear image, determine whether the second detection result is accurate.
  • the second detection result is the determination that the first image is a clear image according to the second metric value and the second threshold.
  • the adjusting module 906 is configured to adjust the second threshold when the second detection result determined by the determining module 902 is inaccurate.
  • the determining module 902 is further configured to: after it is determined that the first image is a blurred image, determine whether the third detection result is accurate.
  • the third detection result is the determination that the first image is a blurred image according to the second metric value and the third threshold.
  • the adjusting module 906 is further configured to: when the third detection result is inaccurate, adjust the third threshold.
  • the comparison module 904 is further configured to: before the operation is performed on the first metric value and the second metric value to obtain the operation result, compare the first metric value with a fourth threshold and a fifth threshold, where the obtained comparison result is that the first metric value is smaller than the fourth threshold and greater than the fifth threshold, and the fourth threshold is greater than the fifth threshold.
  • the determining module 905 is further configured to: when the first metric value is greater than or equal to the fourth threshold, determine that the first image is a clear image; or, when the first metric value is less than or equal to the fifth threshold, determine that the first image is a blurred image.
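Putting the five thresholds together, the decision cascade described for modules 904/905 can be sketched as follows. The relative order of the two fast-path checks, and the rule that the ratio is only computed when both metric values fall strictly between their respective thresholds, are assumptions consistent with (but not spelled out by) the text above.

```python
def classify(m1, m2, t1, t2, t3, t4, t5):
    """Five-threshold decision cascade (illustrative sketch).

    m1: first metric value (original image); m2: second metric value
    (scaled image); t2 > t3 and t4 > t5, as stated above.
    """
    if m2 >= t2:   # second metric value alone indicates a clear image
        return "clear"
    if m2 <= t3:   # second metric value alone indicates a blurred image
        return "blurred"
    if m1 >= t4:   # first metric value alone indicates a clear image
        return "clear"
    if m1 <= t5:   # first metric value alone indicates a blurred image
        return "blurred"
    # Both metric values are inconclusive: fall back to the ratio
    # comparison against the first threshold.
    return "clear" if m1 / m2 > t1 else "blurred"
```

The fast paths avoid computing the ratio when a single metric value is already decisive, which is what makes the five-threshold scheme cheaper than always performing the operation.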
  • the determining module 902 is further configured to: after it is determined that the first image is a clear image because the first metric value is greater than or equal to the fourth threshold, determine whether the fourth detection result is accurate.
  • the fourth detection result is the determination that the first image is a clear image according to the first metric value and the fourth threshold.
  • the adjusting module 906 is configured to adjust the fourth threshold when the fourth detection result is inaccurate.
  • the determining module 902 is further configured to: after it is determined that the first image is a blurred image, determine whether the fifth detection result is accurate.
  • the fifth detection result is the determination that the first image is a blurred image according to the first metric value and the fifth threshold.
  • the adjusting module 906 is further configured to: when the fifth detection result is inaccurate, adjust the fifth threshold.
  • when determining the first metric value of the first image, the determining module 902 is specifically configured to: detect a target area in the first image, perform mask processing on the target area in the first image to obtain a first mask image, and determine a metric value of the first mask image.
  • when determining the second metric value of the second image, the determining module 902 is specifically configured to: detect a target area in the second image, perform mask processing on the target area in the second image to obtain a second mask image, and determine a metric value of the second mask image.
  • each functional module in the embodiments of the present application may be integrated into one processing unit, each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the terminal device may include the processor 1002.
  • the hardware entity corresponding to the above modules may be the processor 1002.
  • the processor 1002 can be a central processing unit (CPU), or a digital processing module or the like.
  • the terminal device may further include a collector 1001, and the processor 1002 collects an image through the collector 1001.
  • the apparatus also includes a memory 1003 for storing a program executed by the processor 1002.
  • the memory 1003 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory, such as a random-access memory (RAM).
  • the memory 1003 may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the processor 1002 is configured to execute the program code stored in the memory 1003, and is specifically configured to perform any one of the methods described in the embodiments shown in FIG. 3 to FIG. 8; for details, reference may be made to those embodiments, and they are not repeated herein.
  • the connection medium among the collector 1001, the processor 1002, and the memory 1003 is not limited in the embodiments of the present application.
  • the memory 1003, the processor 1002, and the collector 1001 are connected by a bus 1004 in FIG. 10, where the bus is indicated by a thick line in FIG. 10; the connection manner between other components is merely schematically illustrated and is not limiting.
  • the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is shown in FIG. 10, but it does not mean that there is only one bus or one type of bus.
  • an embodiment of the present application further provides a chip, where the chip includes the foregoing collector and processor, and is configured to support the terminal device in implementing any one of the methods described in the embodiments shown in FIG. 3 to FIG. 8.
  • an embodiment of the present application further provides a computer-readable storage medium, configured to store computer software instructions to be executed by the foregoing processor, including a program to be executed by the foregoing processor.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product.
  • the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware.
  • the application may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising the instruction device.
  • the instruction device implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing.
  • the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method and apparatus for detecting the sharpness of an image, used to solve the problems of a heavy workload and high algorithm complexity when the sharpness of an image is detected. The method comprises: scaling a first image of a first size to obtain a second image of a second size, and determining a first metric value of the first image and a second metric value of the second image, the first metric value being used to characterize the sharpness of the first image and the second metric value being used to characterize the sharpness of the second image; then performing an operation on the first metric value and the second metric value to obtain an operation result, comparing the operation result with a first threshold, and determining whether the first image is a clear image according to the obtained comparison result.
PCT/CN2018/078751 2018-03-12 2018-03-12 Procédé et appareil de détection de résolution d'image WO2019173954A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880077809.0A CN111417981A (zh) 2018-03-12 2018-03-12 一种图像清晰度检测方法及装置
PCT/CN2018/078751 WO2019173954A1 (fr) 2018-03-12 2018-03-12 Procédé et appareil de détection de résolution d'image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/078751 WO2019173954A1 (fr) 2018-03-12 2018-03-12 Procédé et appareil de détection de résolution d'image

Publications (1)

Publication Number Publication Date
WO2019173954A1 true WO2019173954A1 (fr) 2019-09-19

Family

ID=67907281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/078751 WO2019173954A1 (fr) 2018-03-12 2018-03-12 Procédé et appareil de détection de résolution d'image

Country Status (2)

Country Link
CN (1) CN111417981A (fr)
WO (1) WO2019173954A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111010556A (zh) * 2019-12-27 2020-04-14 成都极米科技股份有限公司 投影双向热失焦补偿的方法、装置及可读存储介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112346968B (zh) * 2020-10-20 2024-04-19 北京达佳互联信息技术有限公司 一种多媒体文件清晰度的自动化检测方法及装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050248655A1 (en) * 2004-04-21 2005-11-10 Fuji Photo Film Co. Ltd. Image processing method, image processing apparatus, and image processing program
CN104091340A (zh) * 2014-07-18 2014-10-08 厦门美图之家科技有限公司 一种模糊图像的快速检测方法
CN106548468A (zh) * 2016-10-13 2017-03-29 广州酷狗计算机科技有限公司 图像清晰度的判别方法及装置
CN106934806A (zh) * 2017-03-09 2017-07-07 东南大学 一种基于结构清晰度的无参考图失焦模糊区域分割方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6548800B2 (en) * 2001-04-09 2003-04-15 Microsoft Corporation Image blur detection methods and arrangements
JP2003067748A (ja) * 2001-08-28 2003-03-07 Kddi Corp アニメーション画像抽出装置
CN103093419B (zh) * 2011-10-28 2016-03-02 浙江大华技术股份有限公司 一种检测图像清晰度的方法及装置
CN102903098A (zh) * 2012-08-28 2013-01-30 四川虹微技术有限公司 一种基于图像清晰度差异的深度估计方法
CN105574857B (zh) * 2015-12-11 2019-02-15 小米科技有限责任公司 图像分析方法及装置
GB2549068B (en) * 2016-03-22 2021-09-29 Toshiba Europe Ltd Image adjustment
CN107784645A (zh) * 2016-08-26 2018-03-09 广州康昕瑞基因健康科技有限公司 图像清晰度评价方法及***、自动聚焦方法
CN107481238A (zh) * 2017-09-20 2017-12-15 众安信息技术服务有限公司 图像质量评估方法及装置


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111010556A (zh) * 2019-12-27 2020-04-14 成都极米科技股份有限公司 投影双向热失焦补偿的方法、装置及可读存储介质
US11934089B2 (en) 2019-12-27 2024-03-19 Chengdu Xgimi Technology Co., Ltd. Bidirectional compensation method and apparatus for projection thermal defocusing, and readable storage medium

Also Published As

Publication number Publication date
CN111417981A (zh) 2020-07-14


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18910200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18910200

Country of ref document: EP

Kind code of ref document: A1