CN118261885A - Image definition identification method, intelligent terminal and storage medium - Google Patents


Info

Publication number: CN118261885A
Authority: CN (China)
Prior art keywords: value, image, pixel, pixels, target
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202410422854.2A
Other languages: Chinese (zh)
Inventors: 高群, 叶碧发, 郑富文, 吕福康
Current assignee: Shenzhen Jiadi Technology Co ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Shenzhen Jiadi Technology Co ltd
Application filed by Shenzhen Jiadi Technology Co ltd
Priority to CN202410422854.2A
Publication of CN118261885A
Legal status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

The application relates to an image definition identification method, an intelligent terminal and a storage medium, belonging to the field of image analysis. The method comprises the following steps: acquiring an image edge of a target image based on a preset edge detection algorithm; acquiring the number of edge pixels of the target pixels of the image edge and the gradient value of each target pixel; calculating an average gradient value of the image edge based on the number of edge pixels and the gradient values; evaluating the image definition of the target image based on the number of edge pixels and the average gradient value to obtain an evaluation result, the evaluation result comprising a clear result and a fuzzy result; if the evaluation result is the clear result, judging that the target image has been uploaded successfully; and if the evaluation result is the fuzzy result, judging that uploading of the target image has failed and sending out failure information. The application has the effect of effectively improving the working efficiency of the staff responsible for recruitment on an online platform.

Description

Image definition identification method, intelligent terminal and storage medium
Technical Field
The present application relates to the field of image analysis, and in particular, to an image definition recognition method, an intelligent terminal, and a storage medium.
Background
On some online platforms that require personal identification, such as talent recruitment websites, an identification image uploaded by the user serves as the main proof of identity, so that recruiting enterprises can learn the user's identity information. A user who wants to search for a job on such a platform must upload an image proving his or her identity; the platform authenticates the image after it is uploaded, and the user can search for jobs only after the authentication passes.
In the prior art, when an online platform authenticates an identity image uploaded by a user, the image passes authentication as long as its size and resolution meet the requirements. The applicant therefore believes that if a user uploads a blurred image whose size and resolution meet the requirements, the user can still pass authentication and log in to the platform. As a result, the staff responsible for recruitment on the platform cannot accurately verify the image or the user's identity, which increases the difficulty of recruitment work and greatly reduces working efficiency.
Disclosure of Invention
In order to effectively improve the working efficiency of workers responsible for recruitment on an online platform, the application provides an image definition identification method, an intelligent terminal and a storage medium.
In a first aspect, the present application provides an image sharpness recognition method, which adopts the following technical scheme:
an image sharpness recognition method, comprising:
Acquiring an image edge of a target image based on a preset edge detection algorithm;
Acquiring the number of edge pixels of target pixels of the image edge and the gradient value of each target pixel;
calculating an average gradient value of the image edge based on the number of edge pixels and the gradient value;
Based on the number of the edge pixels and the average gradient value, evaluating the image definition of the target image, and obtaining an evaluation result; the evaluation result comprises a clear result and a fuzzy result;
If the evaluation result is the clear result, judging that the uploading of the target image is successful;
and if the evaluation result is the fuzzy result, judging that the uploading of the target image fails, and sending out failure information.
By adopting the above technical scheme, the sharpness of the target image is evaluated according to the number of edge pixels and the average gradient value: uploading of the target image is judged successful when the evaluation result is a clear result, and judged failed when the evaluation result is a fuzzy result. Thus, when a user uploads a blurred image whose size and resolution happen to meet the requirements, the probability that the user still passes authentication is effectively reduced, which makes it easier for the staff responsible for recruitment on the online platform to verify the user's identity accurately and effectively improves their working efficiency.
Optionally, the estimating the image sharpness of the target image based on the number of edge pixels and the average gradient value, and obtaining an estimation result includes:
if the number of the edge pixels is larger than a preset number threshold and the average gradient value is larger than a preset gradient value threshold, generating the evaluation result as the clear result;
If the number of edge pixels is smaller than the number threshold and the average gradient value is smaller than the gradient value threshold, generating the evaluation result as the fuzzy result;
And if the number of the edge pixels is greater than the number threshold and the average gradient value is less than or equal to the gradient value threshold, generating an evaluation result as the clear result.
By adopting the technical scheme, the image definition of the target image can be effectively evaluated according to the number of the edge pixels and the average gradient value, so that whether the target image is successfully uploaded or not can be conveniently determined according to the image definition, and the working efficiency of a worker responsible for recruitment on an on-line platform is effectively improved.
Optionally, the evaluation result further includes a result to be tested;
The step of evaluating the image definition of the target image based on the number of edge pixels and the average gradient value, and obtaining an evaluation result, further includes:
and if the number of the edge pixels is smaller than or equal to the number threshold and the average gradient value is larger than the gradient value threshold, generating an evaluation result as the to-be-measured result.
By adopting the technical scheme, if the image definition cannot be estimated through the number of the edge pixels and the average gradient value, an estimation result is generated as a result to be measured, so that the image definition can be further estimated later.
Optionally, after the generating the evaluation result is the to-be-measured result, the method includes:
determining a neighborhood of each target pixel;
For each target pixel on the target image, acquiring all pixels to be detected in the neighborhood taking the target pixel as a center, and acquiring a pixel value of each pixel to be detected;
calculating a variance value of each pixel to be detected according to each pixel value, and calculating an average value of all variance values;
If the average value is larger than a preset average value threshold value, judging that the evaluation result of the target image is the clear result;
and if the average value is not greater than the average value threshold value, judging that the evaluation result of the target image is the fuzzy result.
By adopting the technical scheme, the average value of the variance values of all the pixels to be detected is compared with the average value threshold value, and an evaluation result is generated, so that a user can upload clear images conveniently, and the working efficiency of the staff responsible for recruitment on the online platform is improved effectively.
Optionally, the calculating the variance value of each pixel to be measured according to each pixel value includes:
Acquiring the total number of pixels of all the pixels to be detected, and calculating the pixel average value of all the pixels to be detected in the neighborhood of each target pixel according to each pixel value;
substituting the pixel average value and the total number of pixels into a preset variance value calculation formula, and calculating to obtain a variance value of each pixel to be detected;
The variance value calculation formula is as follows:
var = Σ_{i,j} (P_{i,j} − mean)² / (M × N)
wherein var is the variance value of each pixel to be measured, P_{i,j} is a pixel value in the neighborhood of each target pixel, mean is the average value of all the pixels to be measured in the neighborhood, and M × N represents the total number of pixels.
By adopting the technical scheme, the variance value is calculated based on the variance value calculation formula, so that the image definition of the target image is conveniently evaluated according to the variance value, and the work efficiency of the staff responsible for recruitment on the online platform is conveniently improved.
Optionally, after the generating the evaluation result is the to-be-measured result, the method further includes:
Calculating a gray matrix of the target image, and performing zero-mean processing on the target image;
calculating an autocorrelation function between different pixels in the target image based on the gray matrix;
Obtaining the maximum value and the minimum value of the autocorrelation function, and determining the peak value of the autocorrelation function according to the maximum value and the minimum value;
if the peak value is larger than a preset peak value threshold value, generating the evaluation result as the clear result;
and if the peak value is not greater than the peak value threshold value, generating the evaluation result as the fuzzy result.
By adopting the technical scheme, the image definition of the target image is evaluated according to the maximum value and the minimum value of the autocorrelation function, and an evaluation result is generated, so that a user can upload the clear image conveniently, and the working efficiency of a worker responsible for recruitment on an on-line platform is further effectively improved.
Optionally, the calculating an autocorrelation function between different pixels in the target image based on the gray matrix includes:
acquiring a pixel position of each target pixel on the target image, and acquiring a pixel gray value of each target pixel on the pixel position based on the gray matrix;
Acquiring the total number of the image pixels of the target image and the target pixel value of each target pixel, and calculating the target pixel average value of the target image according to the total number of the image pixels and the target pixel value;
Substituting the pixel positions, the total number of pixels of the image, the pixel gray values and the average value of the target pixels into a preset autocorrelation function calculation formula, and calculating to obtain autocorrelation functions among different pixels in the target image;
The autocorrelation function has the following calculation formula:
R(u, v) = Σ_{x,y} (I(x, y) − m) · (I(x + u, y + v) − m) / N
Wherein R(u, v) is the autocorrelation function, (u, v) is a displacement on the image, (x, y) is a pixel position on the target image, I(x, y) is the pixel gray value at that pixel position, m is the target pixel average value, and N is the total number of pixels of the image.
By adopting the technical scheme, the autocorrelation function is calculated based on the autocorrelation function calculation formula, so that the image definition of the target image can be conveniently evaluated according to the autocorrelation function, and the working efficiency of the staff responsible for recruitment on the online platform can be conveniently improved.
In a second aspect, the present application provides an intelligent terminal, which adopts the following technical scheme:
An intelligent terminal comprises a memory, a processor, and a computer program stored in the memory and capable of running on the processor, wherein the processor, when loading and executing the computer program, carries out the image definition identification method described above.
By adopting the above technical scheme, a computer program embodying the image definition identification method is stored in the memory to be loaded and executed by the processor, so that an intelligent terminal is built from the memory and the processor, which makes the method convenient to use.
In a third aspect, the present application provides a computer readable storage medium, which adopts the following technical scheme:
A computer readable storage medium having a computer program stored therein, wherein the computer program, when loaded and executed by a processor, carries out the image sharpness recognition method described above.
By adopting the above technical scheme, a computer program embodying the image definition identification method is stored in the computer readable storage medium to be loaded and executed by a processor, and the storage medium makes the computer program convenient to read and store.
In summary, the application has at least one of the following beneficial technical effects:
1. The sharpness of the target image is evaluated according to the number of edge pixels and the average gradient value: uploading of the target image is judged successful when the evaluation result is a clear result, and judged failed when the evaluation result is a fuzzy result. Thus, when a user uploads a blurred image whose size and resolution happen to meet the requirements, the probability that the user still passes authentication is effectively reduced, which makes it easier for the staff responsible for recruitment on the online platform to verify the user's identity accurately and effectively improves their working efficiency.
2. The average value of the variance values of all the pixels to be measured is compared with an average value threshold to generate an evaluation result, which prompts users to upload clear images and thus effectively improves the working efficiency of the staff responsible for recruitment on the online platform.
3. The autocorrelation function is calculated based on an autocorrelation function calculation formula, so that the image definition of a target image can be conveniently evaluated according to the autocorrelation function, and the work efficiency of a worker responsible for recruitment on an on-line platform can be conveniently improved.
Drawings
Fig. 1 is a flowchart of an image sharpness recognition method according to an embodiment of the present application.
Fig. 2 is a flowchart of one implementation of the image sharpness recognition method according to an embodiment of the present application.
Fig. 3 is a flowchart of one implementation of the image sharpness recognition method according to an embodiment of the present application.
Fig. 4 is a flowchart of one implementation of the image sharpness recognition method according to an embodiment of the present application.
Fig. 5 is a flowchart of one implementation of the image sharpness recognition method according to an embodiment of the present application.
Fig. 6 is a flowchart of one implementation of the image sharpness recognition method according to an embodiment of the present application.
Detailed Description
The application is described in further detail below with reference to fig. 1 to 6.
The embodiment of the application discloses an image definition identification method.
Referring to fig. 1, an image sharpness recognition method includes the steps of:
s101, acquiring an image edge of a target image based on a preset edge detection algorithm.
In this embodiment, the edge detection algorithm is the Canny edge detection algorithm, which is used to detect the image edge of the target image.
S102, acquiring the number of edge pixels of target pixels of the image edge and the gradient value of each target pixel.
The gradient value of a target pixel identifies the change in its pixel value. The number of edge pixels of the image edge and the gradient value of each target pixel can be obtained from the edge detection algorithm. Specifically, the gradient value of a target pixel is the magnitude of the gradient of the pixel values at that pixel's position in the image, and describes how strongly the gray value of the image changes around the target pixel.
S103, calculating an average gradient value of the image edge based on the number of edge pixels and the gradient value.
The step of calculating the average gradient value of the image edge based on the number of edge pixels and the gradient value is as follows:
Adding the gradient values of all the target pixels, and calculating to obtain a total gradient value;
Dividing the total gradient value by the number of edge pixels to obtain an average gradient value.
From the above steps, the average gradient value is the sum of the gradient values of all target pixels divided by the number of edge pixels.
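The two steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patented implementation: it assumes the edge mask comes from a Canny detector (e.g. OpenCV's `cv2.Canny`, not shown here), and it approximates the per-pixel gradient with `np.gradient` rather than the Sobel operator Canny uses internally.

```python
import numpy as np

def edge_stats(gray, edge_mask):
    """Return (number of edge pixels, average gradient value) for an image.

    gray      -- 2-D grayscale array (the target image)
    edge_mask -- boolean array marking edge pixels, e.g. cv2.Canny(...) > 0
    """
    gy, gx = np.gradient(gray.astype(float))      # per-pixel derivatives
    magnitude = np.hypot(gx, gy)                  # gradient value at each pixel
    n_edge = int(edge_mask.sum())                 # number of edge pixels
    # Average gradient = total gradient over edge pixels / edge-pixel count.
    avg_gradient = float(magnitude[edge_mask].sum() / n_edge) if n_edge else 0.0
    return n_edge, avg_gradient
```

For a sharp step edge the masked gradient magnitudes are large; for a blurred edge they fall off, which is what step S104 thresholds on.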
S104, evaluating the image definition of the target image based on the number of edge pixels and the average gradient value, and obtaining an evaluation result; the evaluation results include clear results and fuzzy results.
And S105, if the evaluation result is a clear result, judging that the uploading of the target image is successful.
And S106, if the evaluation result is a fuzzy result, judging that the uploading of the target image fails, and sending out failure information.
The sharpness of the target image is evaluated according to the number of edge pixels and the average gradient value. In general, the clearer the target image, the greater the number of edge pixels and the larger the average gradient value; a blurred image is the opposite, i.e., the more blurred the target image, the fewer its edge pixels and the smaller its average gradient value. On this basis, the image sharpness of the target image can be evaluated.
If the evaluation result is a clear result, the target image is clear, and the target image can be uploaded to an online platform, and the success of uploading the target image is judged; if the evaluation result is a fuzzy result, the target image is fuzzy and cannot be uploaded to the online platform, namely, failure in uploading the target image is judged, and failure information is sent out to prompt a user to upload a clearer image.
The implementation principle of this embodiment is as follows: the sharpness of the target image is evaluated according to the number of edge pixels and the average gradient value; uploading of the target image is judged successful when the evaluation result is a clear result, and judged failed when the evaluation result is a fuzzy result. Thus, when a user uploads a blurred image whose size and resolution happen to meet the requirements, the probability that the user still passes authentication is effectively reduced, which makes it easier for the staff responsible for recruitment on the online platform to verify the user's identity accurately and effectively improves their working efficiency.
A detailed description will be given by way of fig. 2 based on one of the implementations of the embodiment shown in fig. 1.
Referring to fig. 2, the image sharpness of the target image is evaluated based on the number of edge pixels and the average gradient value, and an evaluation result is obtained, including the steps of:
S201, if the number of the edge pixels is larger than a preset number threshold and the average gradient value is larger than a preset gradient value threshold, generating an evaluation result as a clear result.
S202, if the number of the edge pixels is smaller than the number threshold and the average gradient value is smaller than the gradient value threshold, generating an evaluation result as a fuzzy result.
S203, if the number of the edge pixels is greater than the number threshold and the average gradient value is less than or equal to the gradient value threshold, generating an evaluation result as a clear result.
The number threshold and the gradient value threshold are preset manually. If the number of edge pixels is larger than the preset number threshold and the average gradient value is larger than the preset gradient value threshold, the target image is clear and meets the requirement for uploading to the online platform, so the evaluation result is generated as a clear result.
If the number of the edge pixels is smaller than a preset number threshold and the average gradient value is smaller than a preset gradient value threshold, the target image is fuzzy, the requirement of uploading to an on-line platform cannot be met, and an evaluation result is generated to be a fuzzy result.
If the number of edge pixels is greater than the number threshold and the average gradient value is less than or equal to the gradient value threshold, the target image contains many edge pixels, i.e., its edges are clearly demarcated even though the gradient along them is only moderate; the target image is therefore a clear image at this point, and the evaluation result is generated as a clear result.
The evaluation result also comprises a result to be tested;
based on the number of edge pixels and the average gradient value, evaluating the image definition of the target image, and obtaining an evaluation result, and further comprising:
If the number of the edge pixels is smaller than or equal to the number threshold and the average gradient value is larger than the gradient value threshold, generating an evaluation result as a to-be-measured result.
If the number of edge pixels is less than or equal to the number threshold and the average gradient value is greater than the gradient value threshold, the edges of the target image are smooth, i.e., they do not change sharply, yet the edges remain visible because the gradient values along them are relatively large. For example, in a landscape photograph the boundary between forest and sky may not be very sharp, but owing to the presence of leaves, clouds and the like, the gradients along it are strong, so the forest and the sky can still be clearly distinguished.
In this case the target image may be slightly blurred, but it cannot yet be concluded that the target image is a blurred image, so the evaluation result is generated as a to-be-measured result, which indicates that the image sharpness of the target image needs to be evaluated in another way.
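The three branches above plus the to-be-measured case collapse into a small decision function. A sketch (the threshold values are illustrative, since the patent leaves them to be preset manually; as an assumption, boundary cases the text does not enumerate are grouped with the fuzzy result):

```python
def evaluate_sharpness(n_edge, avg_gradient, n_threshold, g_threshold):
    """Map edge-pixel count and average gradient to an evaluation result."""
    if n_edge > n_threshold:
        # S201 (both values above threshold) and S203 (gradient at or below
        # threshold) both yield a clear result when the edge count is high.
        return "clear"
    if avg_gradient > g_threshold:
        # Few edge pixels but strong gradients: sharpness is undecided,
        # so the result is "to be measured" and another check runs later.
        return "to_be_measured"
    # Few edge pixels and weak gradients: S202, a fuzzy (blurred) result.
    return "fuzzy"
```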
According to the image definition identification method provided by the embodiment, the image definition of the target image can be effectively evaluated according to the number of the edge pixels and the average gradient value, so that whether the target image is successfully uploaded or not can be conveniently determined according to the image definition, and the working efficiency of a worker responsible for recruitment on an on-line platform is effectively improved. If the image definition cannot be estimated through the number of edge pixels and the average gradient value, an estimation result is generated as a result to be measured, so that the image definition can be further estimated later.
A detailed description will be given by way of fig. 3 based on one of the implementations of the embodiment shown in fig. 1.
Referring to fig. 3, after generating an evaluation result as a test result, the method includes the steps of:
s301, determining the neighborhood of each target pixel.
A neighborhood refers to a local pixel area in a target image centered around a certain target pixel, also referred to as a subset of pixels. The size of the neighborhood is typically expressed in terms of number of pixels, such as 3x3, 5x5, etc. In this embodiment, the size of the neighborhood is selected to be 3x3.
S302, for each target pixel on the target image, acquiring all pixels to be detected in the neighborhood taking the target pixel as a center, and acquiring a pixel value of each pixel to be detected.
After the neighborhood of the target pixel is determined, all pixels to be detected in the neighborhood with the target pixel as the center can be determined, and all pixels to be detected in the neighborhood are traversed to obtain the pixel value of each pixel to be detected. Specifically, the pixel value of each pixel in the neighborhood can be obtained by using a function or method provided by image processing software or programming language through parameters such as the target pixel position, the neighborhood size and the like.
S303, calculating the variance value of each pixel to be detected according to each pixel value, and calculating the average value of all the variance values.
The variance value of a pixel to be measured is the sum of the squared deviations of a set of pixel values from their mean, divided by the number of pixels in the set. In image processing, the variance of pixels represents the variation and spread of a set of pixel values.
After calculating the variance value of each pixel to be measured, on the premise of obtaining the number of the pixels to be measured, the average value of all the variance values is the sum of the variance values of all the pixels to be measured divided by the number of the pixels to be measured.
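Steps S301 to S303 can be sketched as follows. The sketch assumes the 3x3 neighborhood used in this embodiment and, as a simplifying assumption not spelled out in the text, skips border pixels whose neighborhood would fall outside the image:

```python
import numpy as np

def mean_local_variance(gray, k=3):
    """Average, over all interior target pixels, of the variance of the
    k x k neighborhood of pixel values centered on each target pixel."""
    g = gray.astype(float)
    h, w = g.shape
    r = k // 2
    variances = []
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = g[y - r:y + r + 1, x - r:x + r + 1]
            variances.append(patch.var())  # population variance over M*N values
    return float(np.mean(variances))
```

A perfectly flat image scores 0; texture or sharp detail raises the score, which S304/S305 then compare against the average-value threshold.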
S304, if the average value is larger than a preset average value threshold value, judging that the evaluation result of the target image is a clear result.
And S305, if the average value is not greater than the average value threshold value, judging that the evaluation result of the target image is a fuzzy result.
The average value of the variance values is used as a definition evaluation index of the target image, if the average value is larger than an average value threshold value, the condition that the target image meets the uploading condition of the online platform is indicated, and the evaluation result of the target image is judged to be a definition result; if the average value is not greater than the average value threshold value, the target image is indicated not to meet the uploading condition of the online platform, and the evaluation result of the target image is judged to be a fuzzy result.
According to the image definition identification method provided by the embodiment, the average value of the variance values of all pixels to be detected is compared with the average value threshold value, and the evaluation result is generated, so that a user can upload clear images conveniently, and further the working efficiency of workers responsible for recruitment on an on-line platform is effectively improved.
A detailed description will be given by way of fig. 4 based on one of the implementations of the embodiment shown in fig. 1.
Referring to fig. 4, calculating a variance value of each pixel to be measured according to each pixel value includes the steps of:
S401, obtaining the total number of pixels of all pixels to be detected, and calculating the pixel average value of all the pixels to be detected in the neighborhood of each target pixel according to each pixel value.
After the neighborhood of a target pixel is determined, all the pixels to be measured in the neighborhood centered on that target pixel, and their count, i.e., the total number of pixels, can be determined. The pixel average value of all the pixels to be measured in the neighborhood of each target pixel is the sum of their pixel values divided by the total number of pixels.
S402, substituting the pixel average value and the total number of pixels into a preset variance value calculation formula, and calculating to obtain the variance value of each pixel to be detected.
The variance value calculation formula is:
var = Σ_{i,j} (P_{i,j} − mean)² / (M × N)
wherein var is the variance value of each pixel to be measured, P_{i,j} is a pixel value in the neighborhood of each target pixel, mean is the pixel average value of all the pixels to be measured in the neighborhood, and M × N represents the total number of pixels.
According to the image definition identification method provided by the embodiment, the variance value is calculated based on the variance value calculation formula, so that the image definition of the target image can be conveniently evaluated according to the variance value, and further the work efficiency of the staff responsible for recruitment on the online platform can be conveniently improved.
A detailed description will be given by way of fig. 5 based on one of the implementations of the embodiment shown in fig. 1.
Referring to fig. 5, after generating the evaluation result as the to-be-measured result, the method further includes the steps of:
S501, calculating a gray matrix of the target image, and performing zero-averaging processing on the target image.
The gray matrix of the target image is a matrix for representing gray information of each target pixel in the target image. The gray matrix is typically a two-dimensional matrix in which each element represents a gray value of a corresponding pixel in the target image. The gray value range is typically 0-255, where 0 represents black and 255 represents white. In a gray scale image, there is only one gray scale value per pixel, so the rows and columns of the gray scale matrix are equal to the rows and columns of the image.
For color images, it is necessary to first convert them into gray images and then calculate the gray matrix. The conversion can be realized by taking the average, a weighted average, or the like of the red, green and blue channels.
Zero-averaging the target image means subtracting the average of all the pixel values of the target image from each pixel value, so that the average value of the processed image is 0. This processing reduces bias from the overall brightness and color of the target image and improves the clarity and contrast of image details.
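A minimal sketch of S501, graying a color image and zero-averaging it. The 0.299/0.587/0.114 luma weights are one common choice of "weighted average"; the text does not mandate specific weights, so they are an assumption here:

```python
import numpy as np

def to_gray(rgb):
    """Weighted average of the R, G, B channels (common luma weights)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def zero_mean(gray):
    """Subtract the image-wide average so the processed image has mean 0."""
    g = gray.astype(float)
    return g - g.mean()
```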
S502, calculating an autocorrelation function between different pixels in a target image based on the gray matrix.
The autocorrelation function is used to represent the interrelationship between different pixels within the target image. It is defined as the average of the products of the pixel value of each target pixel and the pixel values of its adjacent pixels. From the known gray matrix, the total number of target pixels in the target image and the pixel value of each target pixel can be obtained, and the autocorrelation function can then be calculated from these quantities.
S503, obtaining the maximum value and the minimum value of the autocorrelation function, and determining the peak value of the autocorrelation function according to the maximum value and the minimum value.
Given the autocorrelation function, its maximum value and minimum value can be obtained, and the peak value of the autocorrelation function can be determined from them. Specifically, the peak value is determined as follows:
Search for two consecutive values between the maximum value and the minimum value of the autocorrelation function that are sufficiently close to the peak value; the interval between these two values is regarded as the interval in which the peak value lies, and the maximum value within that interval is taken as the peak value of the autocorrelation function.
If no value between the maximum value and the minimum value is sufficiently close to the peak value, the interval may be gradually expanded until a suitable interval is found. If the autocorrelation function has multiple peaks, the above steps may be repeated to find all of them.
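A simplified, non-limiting stand-in for the interval search described above is to scan a one-dimensional sequence of autocorrelation values for local maxima (points higher than both neighbors); the function name and this simplification are assumptions for illustration:

```python
def find_peaks(values):
    """Return (index, value) pairs for each local maximum of a 1-D sequence,
    i.e. each point strictly higher than both of its neighbours."""
    peaks = []
    for i in range(1, len(values) - 1):
        if values[i] > values[i - 1] and values[i] > values[i + 1]:
            peaks.append((i, values[i]))
    return peaks
```

The largest of the returned values would then serve as the peak value compared against the peak value threshold in steps S504/S505.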
S504, if the peak value is larger than a preset peak value threshold value, generating an evaluation result as a clear result.
S505, if the peak value is not greater than the peak value threshold value, generating an evaluation result as a fuzzy result.
The peak value threshold is preset manually. In general, the larger the peak value, the clearer the image. Therefore, if the peak value is greater than the preset peak value threshold, the target image may be uploaded to the online platform, and the evaluation result generated at this time is the clear result; if the peak value is not greater than the peak value threshold, the target image is not uploaded to the online platform, and the generated evaluation result is the fuzzy result.
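The threshold decision of steps S504/S505 can be sketched as a one-line rule; the function name and string labels are illustrative assumptions only:

```python
def evaluate_peak(peak, peak_threshold):
    """Map the autocorrelation peak to an evaluation result: a peak greater
    than the preset threshold yields the clear result, otherwise the fuzzy result."""
    return "clear" if peak > peak_threshold else "fuzzy"
```

Note that a peak exactly equal to the threshold is "not greater" and therefore yields the fuzzy result, matching step S505.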
According to the image definition identification method provided by this embodiment, the image definition of the target image is evaluated according to the maximum value and the minimum value of the autocorrelation function and an evaluation result is generated, so that users upload only clear images, which effectively improves the working efficiency of the staff responsible for recruitment on the online platform.
One implementation of the embodiment shown in fig. 1 will now be described in detail with reference to fig. 6.
Referring to fig. 6, based on the gray matrix, an autocorrelation function between different pixels within a target image is calculated, comprising the steps of:
S601, acquiring a pixel position of each target pixel on the target image, and acquiring a pixel gray value of each target pixel on the pixel position based on the gray matrix.
In this embodiment, the pixel position of each target pixel on the target image is expressed in coordinates. Specifically, an image coordinate system of the target image is first constructed by determining an x-axis, a y-axis and an origin; each target pixel on the target image is then traversed to obtain its pixel coordinates, i.e., its pixel position.
Since each element in the gray matrix represents the gray value of the corresponding pixel in the target image, the pixel gray value of each target pixel at the pixel position can be obtained according to the gray matrix.
S602, acquiring the total number of image pixels of the target image and the target pixel value of each target pixel, and calculating the target pixel average value of the target image according to the total number of image pixels and the target pixel value.
The total number of image pixels of the target image and the target pixel value of each target pixel are obtained through traversal; the target pixel average value of the target image is the sum of all target pixel values divided by the total number of image pixels.
S603, substituting the pixel positions, the total number of pixels of the image, the pixel gray values and the average value of the target pixels into a preset autocorrelation function calculation formula, and calculating to obtain autocorrelation functions among different pixels in the target image.
The autocorrelation function is calculated as follows:

R(u, v) = (1/N) · Σ over (x, y) of (I(x, y) - m) · (I(x + u, y + v) - m)

wherein R(u, v) is the autocorrelation function, (u, v) is a displacement on the image, (x, y) is a pixel position on the target image, I(x, y) is the pixel gray value at the pixel position, m is the target pixel average value, and N is the total number of pixels of the image.
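As a non-limiting sketch of this calculation (the overlap-only boundary handling and the function name are illustrative assumptions; the text does not fix a boundary convention for pixels shifted outside the image):

```python
import numpy as np

def autocorrelation(img, u, v):
    """R(u, v) = (1/N) * sum over (x, y) of (I(x, y) - m) * (I(x+u, y+v) - m),
    where m is the target pixel average value and N the total number of pixels.
    Only positions where both shifted and unshifted pixels exist are summed."""
    img = np.asarray(img, dtype=np.float64)
    m = img.mean()
    n = img.size
    h, w = img.shape
    a = img[:h - v, :w - u] - m   # I(x, y) - m
    b = img[v:, u:] - m           # I(x + u, y + v) - m
    return (a * b).sum() / n
```

At zero displacement, R(0, 0) reduces to the pixel variance of the image, which gives a quick sanity check on the implementation.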
According to the image definition identification method provided by this embodiment, the autocorrelation function is calculated by the autocorrelation function calculation formula, so that the image definition of the target image can be evaluated from the autocorrelation function, which in turn helps improve the working efficiency of the staff responsible for recruitment on the online platform.
The embodiment of the application also discloses an intelligent terminal which comprises a memory, a processor and a computer program which is stored in the memory and can run on the processor, wherein the image definition identification method in the embodiment is adopted when the processor executes the computer program.
The intelligent terminal may adopt a computer device such as a desktop computer, a notebook computer or a cloud server, and the intelligent terminal includes, but is not limited to, a processor and a memory, for example, the intelligent terminal may further include an input/output device, a network access device, a bus, and the like.
The processor may be a central processing unit (CPU) or, depending on actual use, another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor or any conventional processor, which is not limited in this respect.
The memory may be an internal storage unit of the intelligent terminal, for example a hard disk or memory of the intelligent terminal; it may also be an external storage device of the intelligent terminal, for example a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card (FC) provided on the intelligent terminal; it may also be a combination of the internal storage unit and the external storage device. The memory is used to store the computer program and other programs and data required by the intelligent terminal, and may also be used to temporarily store data that has been output or is to be output, which is not limited by the present application.
Through the intelligent terminal, the image definition identification method in the above embodiment is stored in the memory of the intelligent terminal and loaded and executed on its processor, which facilitates the use of the method.
The embodiment of the application also discloses a computer readable storage medium, and the computer readable storage medium stores a computer program, wherein the computer program is executed by a processor, and the image definition identification method in the embodiment is adopted.
The computer program may be stored in a computer-readable medium. The computer program includes computer program code, which may be in source code form, object code form, executable file form or some intermediate form. The computer-readable medium includes any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. The computer-readable medium includes, but is not limited to, the above components.
Through the present computer-readable storage medium, the image definition identification method in the above embodiment is stored in the computer-readable storage medium and loaded and executed on a processor, which facilitates the storage and application of the method.
The above embodiments are not intended to limit the protection scope of the present application; therefore, all equivalent changes made according to the structure, shape and principle of the present application shall be covered by the protection scope of the present application.

Claims (9)

1. An image sharpness recognition method, comprising:
Acquiring an image edge of a target image based on a preset edge detection algorithm;
Acquiring the number of edge pixels of target pixels of the image edge and the gradient value of each target pixel;
calculating an average gradient value of the image edge based on the number of edge pixels and the gradient value;
Based on the number of the edge pixels and the average gradient value, evaluating the image definition of the target image, and obtaining an evaluation result; the evaluation result comprises a clear result and a fuzzy result;
If the evaluation result is the clear result, judging that the uploading of the target image is successful;
and if the evaluation result is the fuzzy result, judging that the uploading of the target image fails, and sending out failure information.
2. The method according to claim 1, wherein the evaluating the image sharpness of the target image based on the number of edge pixels and the average gradient value, and obtaining the evaluation result, comprises:
if the number of the edge pixels is larger than a preset number threshold and the average gradient value is larger than a preset gradient value threshold, generating the evaluation result as the clear result;
If the number of edge pixels is smaller than the number threshold and the average gradient value is smaller than the gradient value threshold, generating the evaluation result as the blurring result;
And if the number of the edge pixels is greater than the number threshold and the average gradient value is less than or equal to the gradient value threshold, generating the evaluation result as the clear result.
3. The method for identifying the definition of an image according to claim 2, wherein the evaluation result further comprises a to-be-measured result;
The step of evaluating the image definition of the target image based on the number of edge pixels and the average gradient value, and obtaining an evaluation result, further includes:
and if the number of the edge pixels is smaller than or equal to the number threshold and the average gradient value is larger than the gradient value threshold, generating an evaluation result as the to-be-measured result.
4. A method of image sharpness recognition according to claim 3, characterized in that, after the generated evaluation result is the to-be-measured result, the method comprises:
determining a neighborhood of each target pixel;
For each target pixel on the target image, acquiring all pixels to be detected in the neighborhood taking the target pixel as a center, and acquiring a pixel value of each pixel to be detected;
calculating a variance value of each pixel to be detected according to each pixel value, and calculating an average value of all variance values;
If the average value is larger than a preset average value threshold value, judging that the evaluation result of the target image is the clear result;
and if the average value is not greater than the average value threshold value, judging that the evaluation result of the target image is the fuzzy result.
5. The method of claim 4, wherein calculating a variance value of each pixel to be measured from each pixel value, comprises:
Acquiring the total number of pixels of all the pixels to be detected, and calculating the pixel average value of all the pixels to be detected in the neighborhood of each target pixel according to each pixel value;
substituting the pixel average value and the total number of pixels into a preset variance value calculation formula, and calculating to obtain a variance value of each pixel to be detected;
The variance value calculation formula is as follows:

var = (1 / (M×N)) · Σ over (i, j) of (P(i, j) - mean)²

wherein var is the variance value of each pixel to be measured, P(i, j) is the pixel value at position (i, j) in the neighborhood of each target pixel, mean is the pixel average value of all the pixels to be measured in the neighborhood, and M×N is the total number of pixels.
6. The method according to claim 3, further comprising, after the generated evaluation result is the to-be-measured result:
Calculating a gray matrix of the target image, and performing zero-mean processing on the target image;
calculating an autocorrelation function between different pixels in the target image based on the gray matrix;
Obtaining the maximum value and the minimum value of the autocorrelation function, and determining the peak value of the autocorrelation function according to the maximum value and the minimum value;
if the peak value is larger than a preset peak value threshold value, generating the evaluation result as the clear result;
and if the peak value is not greater than the peak value threshold value, generating the evaluation result as the fuzzy result.
7. The method of claim 6, wherein calculating an autocorrelation function between different pixels in the target image based on the gray matrix, comprises:
acquiring a pixel position of each target pixel on the target image, and acquiring a pixel gray value of each target pixel on the pixel position based on the gray matrix;
Acquiring the total number of the image pixels of the target image and the target pixel value of each target pixel, and calculating the target pixel average value of the target image according to the total number of the image pixels and the target pixel value;
Substituting the pixel positions, the total number of pixels of the image, the pixel gray values and the average value of the target pixels into a preset autocorrelation function calculation formula, and calculating to obtain autocorrelation functions among different pixels in the target image;
The autocorrelation function has the following calculation formula:

R(u, v) = (1/N) · Σ over (x, y) of (I(x, y) - m) · (I(x + u, y + v) - m)

wherein R(u, v) is the autocorrelation function, (u, v) is a displacement on the image, (x, y) is a pixel position on the target image, I(x, y) is the pixel gray value at the pixel position, m is the target pixel average value, and N is the total number of pixels of the image.
8. A smart terminal comprising a memory, a processor and a computer program stored in the memory and capable of running on the processor, characterized in that the method according to any one of claims 1 to 7 is used when the computer program is loaded and executed by the processor.
9. A computer readable storage medium having a computer program stored therein, characterized in that the method of any of claims 1 to 7 is employed when the computer program is loaded and executed by a processor.
CN202410422854.2A 2024-04-30 2024-04-30 Image definition identification method, intelligent terminal and storage medium Pending CN118261885A (en)

Publications (1)

Publication Number Publication Date
CN118261885A true CN118261885A (en) 2024-06-28

Family

ID=91608562



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination