CN116843683B - Equipment imaging definition evaluation method, system and device - Google Patents


Info

Publication number
CN116843683B
Authority
CN
China
Prior art keywords
image
score
frequency domain
images
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311099700.6A
Other languages
Chinese (zh)
Other versions
CN116843683A
Inventor
姚可为
张肇宁
陈雪飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311099700.6A
Publication of CN116843683A
Application granted
Publication of CN116843683B


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/0002 — Inspection of images, e.g. flaw detection
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/30 — Subject of image; Context of image processing
    • G06T2207/30168 — Image quality inspection


Abstract

The device imaging definition evaluation method, system and apparatus help solve the problem of low accuracy in evaluating the imaging definition of a terminal device. The method comprises the following steps: acquiring a plurality of images, the images being obtained by a second terminal device photographing a preset image in each of a plurality of light source environments; calculating a frequency domain parameter of each of the plurality of images and determining a frequency domain composite score of the plurality of images based on the frequency domain parameters, wherein the frequency domain parameter comprises at least one of sharpness, sharpening strength and sharpening cutoff frequency, and the frequency domain composite score comprises at least one of a sharpness composite score, a sharpening strength composite score and a sharpening cutoff frequency composite score; calculating the similarity between each of the plurality of images and the preset image and determining a similarity composite score of the plurality of images based on the similarities; and evaluating the imaging definition of the second terminal device based on the frequency domain composite score and the similarity composite score.

Description

Equipment imaging definition evaluation method, system and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method, a system, and an apparatus for evaluating imaging sharpness of a device.
Background
With the development of terminal technology, terminal devices have become a major photographic tool for people. The sharpness of the image of the terminal device is an important indicator for evaluating the imaging quality of the terminal device.
At present, the definition of an image is influenced by various factors, such as the resolution, contrast and sharpening strength of the image. In the related art, image definition can be evaluated by determining the sharpness, contrast, sharpening strength or resolving power of the image, but such evaluation is limited to the frequency domain dimension, so the accuracy of image definition evaluation is low.
Therefore, a device imaging definition evaluation method is needed that helps solve the problem of low accuracy in device imaging definition evaluation. In addition, after the device imaging has been evaluated, developers can adjust the parameters of the terminal device according to the definition evaluation data, so as to improve the imaging quality of the terminal device and the user experience.
Disclosure of Invention
The device imaging definition evaluation method, system and apparatus of the present application help solve the problem of low accuracy in evaluating the imaging definition of a terminal device.
In a first aspect, a device imaging definition evaluation method is provided, which may be executed by a first terminal device. The method comprises: acquiring a plurality of images, the images being obtained by a second terminal device photographing a preset image in each of a plurality of light source environments, the light source environments having different light source brightnesses; calculating a frequency domain parameter of each of the plurality of images and determining a frequency domain composite score of the plurality of images based on the frequency domain parameters, wherein the frequency domain parameter comprises at least one of sharpness, sharpening strength and sharpening cutoff frequency, and the frequency domain composite score comprises at least one of a sharpness composite score, a sharpening strength composite score and a sharpening cutoff frequency composite score; calculating the similarity between each of the plurality of images and the preset image and determining a similarity composite score of the plurality of images based on the similarities; and evaluating the imaging definition of the second terminal device based on the frequency domain composite score and the similarity composite score.
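As a minimal sketch (not the patent's implementation), the first-aspect flow amounts to collapsing per-image frequency domain scores and similarity scores into two composite scores and combining them; the function name, the equal default weights, and the simple mean are illustrative assumptions:

```python
def evaluate_imaging_definition(freq_scores, sim_scores, freq_weight=0.5, sim_weight=0.5):
    """Illustrative sketch: average per-image frequency domain scores and
    per-image similarity scores into two composite scores, then combine them.
    The default weights and the plain mean are assumptions, not the patent's
    exact method."""
    freq_composite = sum(freq_scores) / len(freq_scores)  # frequency domain composite score
    sim_composite = sum(sim_scores) / len(sim_scores)     # similarity composite score
    return freq_weight * freq_composite + sim_weight * sim_composite
```

For example, frequency domain scores of 80 and 90 together with similarity scores of 70 and 80 yield an overall score of 80 under equal weights.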
According to the device imaging definition evaluation method, the first terminal device calculates the frequency domain scores and the similarity scores of the plurality of images captured by the second terminal device in different light source environments, and evaluates the imaging definition of the second terminal device from these scores. In this way, the influence of the light source environment on device imaging can be reduced, and the evaluation integrates two dimensions, the frequency domain dimension and the spatial domain dimension (similarity), which improves the accuracy of device imaging definition evaluation.
In one possible implementation manner, after the imaging definition of the second terminal device is evaluated, a developer may adjust parameters of the terminal device according to the definition evaluation data of the imaging of the device, so as to improve the imaging quality of the terminal device and improve the user experience.
Optionally, the preset image may be a dead leaves chart. A dead leaves chart contains rich texture detail and resembles a natural image, so it can be used to evaluate device imaging definition. It should be understood that any image with rich texture detail may be used as the preset image; the specific type and content of the preset image are not limited in this application.
It should be appreciated that the number of light source environments may be preset. Different light source environments can be simulated by adjusting the brightness of the light source. For example, a sunny light source environment is simulated by adjusting the light source brightness to a first brightness, a cloudy light source environment is simulated by adjusting the light source brightness to a second brightness, and a night light source environment is simulated by adjusting the light source brightness to a third brightness.
It should also be appreciated that when the frequency domain parameter is sharpness, the determined frequency domain composite score is the sharpness composite score; when the frequency domain parameter is sharpening strength, it is the sharpening strength composite score; and when the frequency domain parameter is the sharpening cutoff frequency, it is the sharpening cutoff frequency composite score. When the frequency domain parameters are sharpness and sharpening strength, the determined frequency domain composite scores are the sharpness composite score and the sharpening strength composite score, and so on: the frequency domain parameters and the frequency domain composite scores are in one-to-one correspondence, which is not repeated here.
Optionally, the first terminal device determines that the imaging sharpness of the second terminal device meets the requirement when the frequency domain integrated score is greater than or equal to a first preset threshold and the similarity integrated score is greater than or equal to a second preset threshold.
It should be understood that when the frequency domain composite score is any one of the sharpness composite score, the sharpening strength composite score or the sharpening cutoff frequency composite score, the first preset threshold contains a single threshold. Correspondingly, when that composite score is greater than or equal to the first preset threshold and the similarity composite score is greater than or equal to the second preset threshold, the first terminal device determines that the imaging definition of the second terminal device meets the requirement. When the frequency domain composite score includes the composite scores of two frequency domain parameters (i.e., any two of the sharpness composite score, the sharpening strength composite score and the sharpening cutoff frequency composite score), the first preset threshold contains two thresholds, corresponding respectively to the two composite scores. When the two composite scores are each greater than or equal to their corresponding thresholds and the similarity composite score is greater than or equal to the second preset threshold, the first terminal device determines that the imaging definition of the second terminal device meets the requirement.
By analogy, the number of thresholds in the first preset threshold matches the number of frequency domain parameters included in the frequency domain composite score. In each such case, the manner in which the first terminal device determines that the imaging definition of the second terminal device meets the requirement is similar to the cases of one or two composite scores described above, and is not repeated in this application.
Illustratively, the frequency domain composite score includes a sharpness composite score, a sharpening strength composite score and a sharpening cutoff frequency composite score, and the first preset threshold comprises threshold 1, threshold 2 and threshold 3. When the sharpness composite score is greater than or equal to threshold 1, the sharpening strength composite score is greater than or equal to threshold 2, the sharpening cutoff frequency composite score is greater than or equal to threshold 3, and the similarity composite score is greater than or equal to the second preset threshold, the first terminal device determines that the imaging definition of the second terminal device meets the requirement.
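The threshold gating just described can be sketched as follows; the score names and the dictionary layout are illustrative assumptions, not part of the patent:

```python
def meets_requirement(freq_composites, freq_thresholds, sim_composite, sim_threshold):
    """True when every frequency domain composite score reaches its own threshold
    (threshold 1/2/3 in the example above) and the similarity composite score
    reaches the second preset threshold."""
    freq_ok = all(freq_composites[name] >= freq_thresholds[name] for name in freq_thresholds)
    return freq_ok and sim_composite >= sim_threshold
```

The same function covers the one- and two-parameter cases simply by passing fewer entries in the two dictionaries.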
By setting the corresponding threshold for the composite score and the similarity score of each frequency domain parameter, the imaging definition of the device can be evaluated in a finer granularity, thereby being beneficial to improving the accuracy of the imaging definition evaluation of the device.
With reference to the first aspect, in certain implementation manners of the first aspect, evaluating the imaging sharpness of the second terminal device based on the frequency domain integrated score and the similarity integrated score includes: weighting and summing the frequency domain comprehensive score and the similarity comprehensive score to obtain image definition comprehensive scores of a plurality of images; and evaluating the imaging definition of the second terminal device based on the image definition composite score.
Optionally, when the frequency domain composite score includes the composite scores of all three frequency domain parameters (i.e., the sharpness composite score, the sharpening strength composite score and the sharpening cutoff frequency composite score), a weight coefficient is configured for each of the sharpness composite score, the sharpening strength composite score, the sharpening cutoff frequency composite score and the similarity composite score, and the first terminal device performs a weighted summation of the four composite scores based on the configured weight coefficients to obtain the image definition composite score of the plurality of images. It should further be understood that when the frequency domain composite score includes the composite scores of one or two frequency domain parameters, the weight configuration and the calculation of the image definition composite score of the plurality of images are similar to the three-parameter case and are not repeated here.
Optionally, determining that the imaging definition of the second terminal device meets the requirement under the condition that the image definition integrated score is greater than or equal to a third preset threshold.
Illustratively, the frequency domain composite score includes a sharpness composite score S_A, a sharpening strength composite score S_B and a sharpening cutoff frequency composite score S_C, and the similarity composite score is S_D. The weight coefficient of the sharpness composite score is configured as ω_1, that of the sharpening strength composite score as ω_2, that of the sharpening cutoff frequency composite score as ω_3, and that of the similarity composite score as ω_4. The first terminal device performs a weighted summation of the frequency domain composite scores and the similarity composite score to obtain the image definition composite score of the plurality of images: S_ISQ = ω_1×S_A + ω_2×S_B + ω_3×S_C + ω_4×S_D. When S_ISQ is greater than or equal to the third preset threshold, the imaging definition of the second terminal device is determined to meet the requirement.
With reference to the first aspect, in certain implementations of the first aspect, determining a frequency domain composite score for a plurality of images includes: calculating a frequency domain score of each of the plurality of images based on the frequency domain parameters; respectively calculating the average value of the frequency domain scores of at least one image corresponding to each light source environment in the plurality of light source environments to obtain the frequency domain score of each light source environment; and carrying out weighted summation on the frequency domain scores of each light source environment to obtain a frequency domain comprehensive score.
It should be appreciated that the frequency domain score of each light source environment includes at least one of a sharpness score, a sharpening strength score and a sharpening cutoff frequency score for that light source environment.
Optionally, the first terminal device may score the frequency domain parameter of each of the plurality of images according to a preset scoring function, so as to obtain a frequency domain score of each of the plurality of images.
Optionally, different light source environments may be given different weights when the frequency domain scores of the light source environments are weighted and summed. For example, in some scenarios users have higher requirements for shooting in a nighttime environment, so developers may set a higher weight for the light source environments with lower brightness when evaluating device definition; then, when the imaging definition evaluated under low-brightness light source environments does not meet the requirement, the device parameters are adjusted to improve the device's imaging definition in nighttime environments. In other scenarios users have higher requirements for shooting in outdoor environments, so developers may set a higher weight for the light source environments with higher brightness; then, when the imaging definition evaluated under high-brightness light source environments does not meet the requirement, the device parameters are adjusted to improve the device's imaging definition in outdoor environments.
Specifically, the plurality of light source environments may include a first light source environment and a second light source environment, the weight of the first light source environment may be a first preset weight, the weight of the second light source environment may be a second preset weight, the plurality of images may include a plurality of first images captured by the second terminal device in the first light source environment and a plurality of second images captured by the second terminal device in the second light source environment, and the frequency domain score may include a sharpness score and a sharpening strength score. The first terminal device may first calculate a frequency domain score for each of the plurality of images. Then, the first terminal device may calculate an average value of the frequency domain scores of the plurality of first images and an average value of the frequency domain scores of the plurality of second images from the frequency domain score of each image. Then, the first terminal device may multiply the average value of the frequency domain scores of the plurality of first images with the first preset weight to obtain a first product, and multiply the average value of the frequency domain scores of the plurality of second images with the second preset weight to obtain a second product. Finally, the first terminal device may determine a sum of the first product and the second product as a frequency domain composite score.
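The per-environment averaging followed by a weighted sum can be sketched as follows; the environment names and weight values are illustrative assumptions:

```python
def frequency_domain_composite(scores_by_environment, environment_weights):
    """Mean of the per-image frequency domain scores within each light source
    environment, then a weighted sum of the per-environment means (the first
    product plus the second product in the two-environment case above)."""
    composite = 0.0
    for env, scores in scores_by_environment.items():
        composite += environment_weights[env] * (sum(scores) / len(scores))
    return composite
```

For example, {"bright": [80, 90], "dim": [60, 70]} with weights 0.4 and 0.6 gives 0.4×85 + 0.6×65 = 73. The same routine applies to the similarity composite score described later, with similarity scores in place of frequency domain scores.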
By setting different weights for different light sources, the influence of the light sources on the frequency domain comprehensive score of the equipment imaging can be reduced, the influence of the light sources on the equipment imaging definition is further reduced, the accuracy of equipment imaging definition evaluation is improved, and equipment parameters can be adjusted according to the evaluation result, so that the adjusted equipment imaging effect is more suitable for the imaging requirement of a specific environment.
With reference to the first aspect, in certain implementations of the first aspect, determining a frequency domain composite score for a plurality of images includes: calculating a frequency domain score of each of the plurality of images based on the frequency domain parameters; and calculating the average value of the frequency domain scores of the plurality of images to obtain a frequency domain comprehensive score.
With reference to the first aspect, in certain implementation manners of the first aspect, determining a similarity composite score of the plurality of images includes: calculating a similarity score of each of the plurality of images based on the similarity; respectively calculating the average value of the similarity scores of at least one image corresponding to each light source environment in the plurality of light source environments to obtain the similarity score of each light source environment; and carrying out weighted summation on the similarity scores of the light source environments to obtain a similarity comprehensive score.
It should be understood that the weight setting of the light source environments when the similarity score of each light source environment is weighted and summed may be the same as or different from the weight setting of the light source environments when the frequency domain score of each light source environment is weighted and summed as described above, which is not limited in this application.
Specifically, the manner in which the first terminal device determines the similarity composite score of the multiple images is similar to the manner in which the first terminal device determines the frequency domain composite score of the multiple images, which is not described herein.
By setting different weights for different light sources, the influence of the light sources on the comprehensive score of the similarity of the imaging of the equipment can be reduced, the influence of the light sources on the imaging definition of the equipment is further reduced, the accuracy of the imaging definition evaluation of the equipment is improved, and the parameters of the equipment can be adjusted according to the evaluation result, so that the imaging effect of the adjusted equipment is more suitable for the imaging requirement of a specific environment.
With reference to the first aspect, in certain implementation manners of the first aspect, determining a similarity composite score of the plurality of images includes: calculating a similarity score of each of the plurality of images based on the similarity; and calculating the average value of the similarity scores of the plurality of images to obtain a similarity comprehensive score.
With reference to the first aspect, in certain implementation manners of the first aspect, before calculating the frequency domain parameter of each image in the plurality of images separately, the method further includes: detecting each image to obtain brightness information of each image; obtaining a two-dimensional modulation transfer function (modulation transfer function, MTF) of each image based on the brightness information of each image; processing the value of the two-dimensional modulation transfer function of each image to obtain a one-dimensional modulation transfer function of each image; the calculating the frequency domain parameters of each image in the plurality of images respectively includes: frequency domain parameters of each of the plurality of images are calculated based on the one-dimensional modulation transfer function of each image.
Optionally, performing two-dimensional discrete Fourier transform on the brightness information of each image to obtain a power spectrum of each image; dividing the power spectrum of each image by the power spectrum of a preset image to obtain a two-dimensional modulation transfer function of each image.
Optionally, the first terminal device may automatically detect each image through the YOLO (you only look once) algorithm to obtain a brightness distribution function psf(a, b) of the target area of each image, where a and b represent the pixel position. A two-dimensional discrete Fourier transform is performed on the brightness distribution function psf(a, b) of each target area to obtain the power spectrum G(m, n) of each image, where m and n are the spatial frequency coordinates. The power spectrum I(m, n) of the preset image is acquired, and the power spectrum G(m, n) of each image is divided by the power spectrum I(m, n) of the preset image to obtain the two-dimensional modulation transfer function MTF(u, v) of each image, where u is the spatial frequency and v represents the direction in the image; the two-dimensional modulation transfer function may be referred to as an MTF surface. The first terminal device averages the two-dimensional modulation transfer function of each image over the different directions to obtain the one-dimensional modulation transfer function MTF(u, 0)_1D, i.e. MTF(u), of each image, which may be referred to as an MTF curve; here, averaging over different directions means averaging the MTF values at the same spatial frequency across the different MTF curves.
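The power spectrum and two-dimensional MTF steps can be sketched with a naive 2-D DFT on a small grayscale patch. This is an illustrative reading of the description, not the patent's implementation: the direct pointwise ratio of power spectra (rather than, say, of their square roots) and the epsilon guard against division by zero are assumptions:

```python
import cmath

def power_spectrum(patch):
    """Naive 2-D DFT power spectrum |F(m, n)|^2 of a small grayscale patch
    (a list of rows), standing in for G(m, n) and I(m, n) above."""
    rows, cols = len(patch), len(patch[0])
    spec = [[0.0] * cols for _ in range(rows)]
    for m in range(rows):
        for n in range(cols):
            acc = 0j
            for a in range(rows):
                for b in range(cols):
                    acc += patch[a][b] * cmath.exp(-2j * cmath.pi * (m * a / rows + n * b / cols))
            spec[m][n] = abs(acc) ** 2
    return spec

def mtf_2d(captured, reference, eps=1e-12):
    """Two-dimensional MTF as the pointwise ratio of the captured image's power
    spectrum to the preset (reference) image's power spectrum."""
    g, i = power_spectrum(captured), power_spectrum(reference)
    return [[g[m][n] / (i[m][n] + eps) for n in range(len(g[0]))] for m in range(len(g))]
```

A practical implementation would use a fast transform such as numpy.fft.fft2 instead of the O(N^4) loop, and would then average the MTF surface over directions at equal spatial frequency to obtain the 1-D MTF curve described above.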
It should be appreciated that the target region described above may be a texture region of an image.
With reference to the first aspect, in some implementation manners of the first aspect, the calculating a similarity between each of the plurality of images and a preset image includes: calculating the standard deviation of each image by using the number of pixel points of each image, the pixel value of each image and the average value of each image; calculating standard deviation of the preset image by using the number of pixels of the preset image, the pixel value of the preset image and the average value of the preset image; calculating covariance between each image and a preset image by using the number of pixels of each image, the pixel value of each image, the average value of each image, the pixel value of the preset image and the average value of the preset image; and calculating the similarity between each image and the preset image by using the standard deviation of each image, the standard deviation of the preset image and the covariance between each image and the preset image.
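One plausible reading of the standard deviation/covariance combination above is the covariance normalised by the product of the two standard deviations (this normalised form resembles the structure comparison term of SSIM). The patent text does not spell out the final formula, so this sketch is an assumption:

```python
import math

def mean_std(pixels):
    """Mean and population standard deviation of a flattened grayscale image,
    computed from the pixel count, pixel values and average as described above."""
    n = len(pixels)
    mean = sum(pixels) / n
    std = math.sqrt(sum((p - mean) ** 2 for p in pixels) / n)
    return mean, std

def similarity(pixels, ref_pixels):
    """Covariance between an image and the preset image, normalised by the
    product of their standard deviations (range -1 to 1; 1 means the two
    images vary identically)."""
    n = len(pixels)
    mu_x, sigma_x = mean_std(pixels)
    mu_y, sigma_y = mean_std(ref_pixels)
    cov = sum((x - mu_x) * (y - mu_y) for x, y in zip(pixels, ref_pixels)) / n
    return cov / (sigma_x * sigma_y)
```

An image identical to the preset image scores 1.0; a tonally inverted image scores -1.0.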
In a second aspect, a device imaging definition evaluation system is provided, the system comprising a first terminal device, a second terminal device and a light box, wherein a light source is arranged in the light box. The first terminal device is used for: controlling the light source to turn on, adjusting the brightness of the light source to a first brightness, and sending a first shooting instruction to the second terminal device. The second terminal device is used for: receiving the first shooting instruction and photographing a preset image multiple times to obtain a plurality of first images. The first terminal device is further configured to: adjust the brightness of the light source to a second brightness and send a second shooting instruction to the second terminal device. The second terminal device is further configured to: receive the second shooting instruction and photograph the preset image multiple times to obtain a plurality of second images; and transmit a plurality of images to the first terminal device, the plurality of images comprising the plurality of first images and the plurality of second images. The first terminal device is further configured to: receive the plurality of images, calculate a frequency domain parameter of each of the plurality of images, and determine a frequency domain composite score of the plurality of images based on the frequency domain parameter of each image, wherein the frequency domain parameter comprises at least one of sharpness, sharpening strength and sharpening cutoff frequency, and the frequency domain composite score comprises at least one of a sharpness composite score, a sharpening strength composite score and a sharpening cutoff frequency composite score; calculate the similarity between each of the plurality of images and the preset image, and determine a similarity composite score of the plurality of images based on the similarity between each image and the preset image; and evaluate the imaging definition of the second terminal device based on the frequency domain composite score and the similarity composite score.
According to the device imaging definition evaluation system, the first terminal device can control the light box to adjust the brightness of the light source, so the second terminal device can capture a plurality of images in different light source environments; the first terminal device then calculates the frequency domain scores and similarity scores of the plurality of images and evaluates the imaging definition of the second terminal device from these scores. In this way, the influence of the light source environment on device imaging is reduced, and the evaluation integrates the frequency domain dimension and the spatial domain dimension (similarity), which improves the accuracy of device imaging definition evaluation.
In one possible implementation manner, after the imaging definition of the second terminal device is evaluated, developers may adjust parameters of the second terminal device according to the definition evaluation data of the device imaging, so as to improve the imaging quality of the second terminal device and improve user experience.
With reference to the second aspect, in some implementations of the second aspect, the plurality of first images are obtained by the second terminal device capturing a first preset number of times or a first preset duration of capturing the preset images.
It should be understood that the plurality of second images are obtained by photographing the preset image by the second terminal device for a second preset number of times or for a second preset time period. The first preset times and the second preset times can be the same or different, and the first preset time length and the second preset time length can be the same or different, which is not limited in the application.
It should also be understood that the second terminal device may automatically photograph the preset image throughout the first preset time period, the plurality of first images being the photos taken by the second terminal device within that period. For example, if the first preset duration is 20 s, the second terminal device may automatically take 10 photos within those 20 s, and those 10 photos are the plurality of first images.
With reference to the second aspect, in some implementations of the second aspect, the first shooting instruction carries information for indicating a first preset number of times or a first preset duration.
In a third aspect, a device imaging sharpness evaluation apparatus is provided for performing the method in an implementation of the first aspect described above. In particular, the apparatus comprises means for performing the method in an implementation of the first aspect described above.
In one design, the apparatus may include modules in one-to-one correspondence with the methods/operations/steps/actions described in the first aspect, where the modules may be hardware circuits, software, or a combination of hardware circuits and software.
In a fourth aspect, there is provided an apparatus for evaluating imaging sharpness of a device, comprising: a processor and a memory, the processor being configured to read instructions stored in the memory to perform the method of the first aspect described above.
Optionally, the processor is one or more and the memory is one or more.
Alternatively, the memory may be integrated with the processor or the memory may be separate from the processor.
In a specific implementation process, the memory may be a non-transient (non-transitory) memory, for example, a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips.
The apparatus in the fourth aspect may be a chip, and the processor may be implemented by hardware or software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented by software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, which may be integrated in the processor or may reside outside the processor and exist separately.
In a fifth aspect, there is provided a computer program product comprising: a computer program (which may also be referred to as code, or instructions) which, when executed, causes a computer to perform the method of the first aspect described above.
In a sixth aspect, there is provided a computer readable storage medium storing a computer program (which may also be referred to as code, or instructions) which, when run on a computer, causes the computer to perform the method of the first aspect described above.
Drawings
Fig. 1 is a schematic diagram of an apparatus imaging definition evaluation system according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a device imaging sharpness evaluation method provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart of generating a one-dimensional modulation transfer function from an image provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of determining a one-dimensional modulation transfer function provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of determining sharpening strength and sharpening cutoff frequency provided by an embodiment of the present application;
FIG. 6 is a schematic block diagram of an apparatus for evaluating imaging sharpness of a device according to an embodiment of the present application;
fig. 7 is a schematic block diagram of another apparatus imaging sharpness evaluation apparatus provided in an embodiment of the present application.
Detailed Description
The technical solutions in the present application will be described below with reference to the accompanying drawings.
In order to clearly describe the technical solutions of the embodiments of the present application, in the embodiments of the present application, the words "first", "second", etc. are used to distinguish identical or similar items having substantially the same function and effect. Those skilled in the art will appreciate that the words "first", "second", and the like do not limit the number or the order of execution, and that the objects they modify are not necessarily different.
In this application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or the like means any combination of these items, including any combination of a single item or a plurality of items. For example, at least one of a, b, and c may represent: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c may each be single or multiple.
In the embodiments of the present application, the descriptions "when …", "in the case of …", "if", and the like all mean that the device performs corresponding processing under some objective condition; they impose no limitation in time, do not require the device to perform a judging action when implemented, and imply no other limitation.
Currently, in the related art, the imaging definition of a device can be evaluated by evaluating the definition of a photograph taken by the device. The definition of an image is affected by various factors, such as the resolving power, contrast, and sharpness of the image. In the related art, the image definition can be evaluated by determining the sharpness, contrast, or resolving power of the image, but the evaluation is limited to the frequency domain dimension, so the accuracy of the image definition evaluation is low; that is, the accuracy of the device imaging definition evaluation is low.
Therefore, the present application provides a method, a system, and an apparatus for evaluating imaging definition of a device, which comprehensively evaluate both the frequency domain and spatial domain dimensions through a plurality of images shot by the device in different light source environments. In this way, the influence of the light source environment on the imaging of the device can be reduced, evaluation in both the frequency domain dimension and the spatial domain dimension (similarity) is integrated, and the accuracy of the device imaging definition evaluation is improved. In addition, after the imaging definition of the second terminal device is evaluated, developers can adjust parameters of the mobile phone camera according to the definition evaluation data of the device imaging, so as to improve the imaging quality of the terminal device and improve user experience.
The application provides a device imaging definition evaluation system, which comprises a first terminal device, a second terminal device, and a light box, wherein a light source is arranged in the light box. The first terminal device is used for: controlling the light source to be turned on, adjusting the brightness of the light source to a first brightness, and sending a first shooting instruction to the second terminal device. The second terminal device is used for: receiving the first shooting instruction, and shooting a preset image multiple times to obtain a plurality of first images. The first terminal device is further configured to: adjust the brightness of the light source to a second brightness, and send a second shooting instruction to the second terminal device. The second terminal device is further configured to: receive the second shooting instruction, shoot the preset image multiple times to obtain a plurality of second images, and transmit a plurality of images to the first terminal device, the plurality of images including the plurality of first images and the plurality of second images. The first terminal device is further configured to: receive the plurality of images and evaluate the imaging definition of the second terminal device based on the plurality of images.
It should be understood that the above-mentioned second terminal device may be a device capable of performing imaging (i.e. taking a picture through a camera), for example, may be a mobile phone, a tablet, a camera, etc., and the above-mentioned first terminal device may be a device capable of performing imaging definition evaluation of a device, for example, may be a mobile phone, a tablet, a computer, etc., which is not limited in this embodiment of the present application.
To facilitate an understanding of the present application, a device imaging definition evaluation system 100 according to an embodiment of the present application is described below in conjunction with fig. 1. As shown in FIG. 1, the system 100 includes a test chart 101, a light box 102, a computer 103, and mobile phones 104 to 106. Data can be transmitted between the computer 103 and the mobile phones 104 to 106, and the computer 103 can control the light box 102 to adjust the brightness of the light source. It should be understood that the computer 103 in fig. 1 corresponds to the first terminal device, the mobile phones 104 to 106 in fig. 1 correspond to the second terminal device, and the light box 102 in fig. 1 corresponds to the light box described above.
Optionally, in the system of the embodiment of the present application, the number of the second terminal devices may be one or multiple, and fig. 1 exemplarily shows 3 handsets, but the embodiment of the present application is not limited thereto.
In this embodiment, the test chart 101 is the preset image, where the preset image includes a texture area, and the texture area is the shaded area in the test chart 101. The computer 103 may include a setting module, where a user may set the number of light source environments and the light source brightness of each light source environment. The user can also set the shooting times and the shooting time interval of the mobile phones 104 to 106 in each light source environment in the setting module of the computer 103. The light box 102 can be illuminated with different light source brightness under the control of the computer 103. After the user starts the imaging definition evaluation of the mobile phones 104 to 106 on the computer 103, the mobile phones 104 to 106 shoot the test chart 101 according to the shooting times and the shooting time interval set as described above. After the mobile phones 104 to 106 complete shooting, they send the shot photos to the computer 103, so that the computer 103 can score the image definition of the shot photos and evaluate the imaging definition of the mobile phones 104 to 106 according to the scores of the images.
As an optional embodiment, the plurality of first images are obtained by the second terminal device shooting the preset image a first preset number of times or for a first preset duration.
It should be understood that the plurality of second images are obtained by photographing the preset image by the second terminal device for a second preset number of times or for a second preset time period. The first preset times and the second preset times can be the same or different, and the first preset time length and the second preset time length can be the same or different, which is not limited in the application.
It should also be understood that the second terminal device may automatically shoot the preset image within the first preset duration. The plurality of first images are the images shot by the second terminal device within the first preset duration. For example, if the first preset duration is one minute, the second terminal device may automatically take 10 photos within that minute, and the 10 photos are the plurality of first images.
As an optional embodiment, the first shooting instruction carries information for indicating a first preset number of times or a first preset duration.
In the following, it is described in detail how the first terminal device evaluates the imaging sharpness of the second terminal device based on a plurality of images.
Fig. 2 is a schematic flowchart of a device imaging sharpness evaluation method 200 provided in an embodiment of the present application. The method 200 may be performed by a first terminal device. The method 200 may include the steps of:
S201, acquiring a plurality of images, wherein the images are obtained by shooting preset images in a plurality of light source environments by a second terminal device respectively, and the light source environments have different light source brightness.
Alternatively, the preset image may be a dead leaf map. The dead leaf map contains rich texture details and is similar to a natural image, and can be used for evaluating the imaging definition of equipment. It should be understood that any image with rich texture details may be used as a preset image to evaluate the imaging sharpness of the device, and the specific image type and content of the preset image are not specifically limited in this application.
It should be appreciated that the number of light source environments may be preset. Different light source environments can be simulated by adjusting the brightness of the light source. For example, a sunny light source environment is simulated by adjusting the light source brightness to a first brightness, a cloudy light source environment is simulated by adjusting the light source brightness to a second brightness, and a night light source environment is simulated by adjusting the light source brightness to a third brightness.
S202, frequency domain parameters of each image in the plurality of images are calculated respectively, and frequency domain comprehensive scores of the plurality of images are determined based on the frequency domain parameters, wherein the frequency domain parameters comprise at least one of sharpness, sharpening intensity and sharpening cutoff frequency, and the frequency domain comprehensive scores comprise at least one of sharpness comprehensive scores, sharpening intensity comprehensive scores and sharpening cutoff frequency comprehensive scores.
It should be appreciated that when the frequency domain parameter is sharpness, the determined frequency domain composite score is a sharpness composite score. And when the frequency domain parameter is the sharpening intensity, the determined frequency domain comprehensive score is the sharpening intensity comprehensive score. And when the frequency domain parameter is the sharpening cut-off frequency, the determined frequency domain comprehensive score is the sharpening cut-off frequency comprehensive score. When the frequency domain parameters are sharpness and sharpening strength, the determined frequency domain composite score is a sharpness composite score and sharpening strength composite score. And so on, the frequency domain parameters and the frequency domain comprehensive scores are in one-to-one correspondence, and are not described in detail herein.
S203, calculating the similarity between each image in the plurality of images and the preset image respectively, and determining the similarity comprehensive score of the plurality of images based on the similarity.
And S204, evaluating the imaging definition of the second terminal equipment based on the frequency domain comprehensive score and the similarity comprehensive score.
According to the device imaging definition evaluation method, the first terminal device calculates the frequency domain scores and the similarity scores of the plurality of images shot by the second terminal device in different light source environments, and evaluates the imaging definition of the second terminal device through the frequency domain scores and the similarity scores. In this way, the influence of the light source environment on the imaging of the device can be reduced, evaluation in both the frequency domain dimension and the spatial domain dimension (similarity) is integrated, and the accuracy of the device imaging definition evaluation is improved. In addition, after the imaging definition of the second terminal device is evaluated, developers can adjust parameters of the mobile phone camera according to the definition evaluation data of the device imaging, so as to improve the imaging quality of the terminal device and improve user experience.
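The embodiment does not name the spatial-domain similarity metric used in S203; SSIM is one common choice in the literature. As a minimal, self-contained stand-in (the metric choice and all names below are assumptions, not the patent's method), zero-normalized cross-correlation between each shot image and the reference chart can be sketched as follows:

```python
import numpy as np

def spatial_similarity(captured: np.ndarray, reference: np.ndarray) -> float:
    """Zero-normalized cross-correlation between a captured photo and the
    reference chart, mapped to [0, 1]; 1.0 means identical structure and
    0.0 means perfectly anti-correlated structure."""
    a = captured.astype(np.float64).ravel()
    b = reference.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:           # at least one image is constant
        return 0.0
    return float((a @ b) / denom * 0.5 + 0.5)
```

A similarity score per image could then be obtained by scaling this value (e.g. multiplying by 100) with whatever preset scoring function the first terminal device applies.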
As an alternative embodiment, in S202, the first terminal device may determine the frequency domain composite score of the plurality of images based on the frequency domain parameters of each image.
In one possible implementation, the first terminal device calculates the frequency domain score of each image based on the frequency domain parameters of each image. The first terminal equipment calculates the average value of the frequency domain scores of at least one image corresponding to each light source environment in the plurality of light source environments respectively to obtain the frequency domain score of each light source environment. And the first terminal equipment performs weighted summation on the frequency domain scores of each light source environment to obtain a frequency domain comprehensive score.
It should be appreciated that the frequency domain score for each light source environment includes at least one of each light source environment sharpness score, sharpening intensity score, and sharpening cutoff frequency score.
Optionally, the first terminal device may score the frequency domain parameter of each of the plurality of images according to a preset scoring function, so as to obtain a frequency domain score of each of the plurality of images.
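The "preset scoring function" above is not defined in the text. One plausible minimal form, shown here purely as an assumed sketch (the endpoint values are hypothetical calibration constants, not from the patent), is a clipped linear map of a measured frequency domain parameter onto a 0–100 score:

```python
def score_parameter(value: float, worst: float, best: float) -> float:
    """Clipped linear map of a measured frequency-domain parameter onto a
    0-100 score: `worst` maps to 0, `best` maps to 100, and values beyond
    the endpoints are clipped.  Passing worst > best handles parameters
    where a smaller measured value is better."""
    if best == worst:
        raise ValueError("best and worst endpoints must differ")
    t = (value - worst) / (best - worst)
    return 100.0 * min(max(t, 0.0), 1.0)
```

For example, with hypothetical endpoints worst=0.2 and best=1.2, a measured sharpness of 0.8 would score `score_parameter(0.8, 0.2, 1.2)` = 60.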
Alternatively, different light source environments may be given different weights when the frequency domain scores of the light source environments are weighted and summed. For example, in some scenarios, users have higher requirements for shooting in a nighttime environment; developers may then set a higher weight for the light source environment with lower light source brightness when evaluating the definition of the device, so that when the imaging definition evaluated in the low-brightness light source environment does not meet the requirement, the parameters of the device are adjusted to improve its imaging definition in the nighttime environment. In other scenarios, users have higher requirements for shooting in an outdoor environment; developers may then set a higher weight for the light source environment with higher light source brightness, so that when the imaging definition evaluated in the high-brightness light source environment does not meet the requirement, the parameters of the device are adjusted to improve its imaging definition in the outdoor environment.
Specifically, the plurality of light source environments may include a first light source environment and a second light source environment, the weight of the first light source environment may be a first preset weight, the weight of the second light source environment may be a second preset weight, and the plurality of images may include a plurality of first images captured by the second terminal device in the first light source environment and a plurality of second images captured by the second terminal device in the second light source environment. The first terminal device may first calculate a frequency domain score for each of the plurality of images. Then, the first terminal device may calculate an average value of the frequency domain scores of the plurality of first images and an average value of the frequency domain scores of the plurality of second images from the frequency domain score of each image. Then, the first terminal device may multiply the average value of the frequency domain scores of the plurality of first images with the first preset weight to obtain a first product, and multiply the average value of the frequency domain scores of the plurality of second images with the second preset weight to obtain a second product. Finally, the first terminal device may determine a sum of the first product and the second product as a frequency domain composite score.
For example, the plurality of light source environments may include a first light source environment and a second light source environment, the weight coefficient of the first light source environment may be 0.7, and the weight coefficient of the second light source environment may be 0.3; the images shot by the second terminal device in the first light source environment may be images 1 and 2, the images shot by the second terminal device in the second light source environment may be images 3 and 4, and the frequency domain score may include a sharpness score and a sharpening intensity score. The frequency domain score of each of the 4 images, obtained by the first terminal device from the frequency domain parameters of the 4 images, may be as shown in Table 1. According to the scores in Table 1, the first terminal device calculates the frequency domain score of the first light source environment, which includes: sharpness score = (80 + 85)/2 = 82.5 and sharpening intensity score = (75 + 70)/2 = 72.5, and the frequency domain score of the second light source environment, which includes: sharpness score = (60 + 70)/2 = 65 and sharpening intensity score = (65 + 60)/2 = 62.5. Then, the first terminal device obtains the frequency domain composite score through weighted summation: sharpness composite score = 0.7 × 82.5 + 0.3 × 65 = 77.25 and sharpening intensity composite score = 0.7 × 72.5 + 0.3 × 62.5 = 69.5.
By setting different weights for different light sources, the influence of the light sources on the frequency domain comprehensive score of the equipment imaging can be reduced, the influence of the light sources on the equipment imaging definition is further reduced, the accuracy of equipment imaging definition evaluation is improved, and equipment parameters can be adjusted according to the evaluation result, so that the adjusted equipment imaging effect is more suitable for the imaging requirement of a specific environment.
Table 1

Image      Sharpness score    Sharpening intensity score
Image 1    80                 75
Image 2    85                 70
Image 3    60                 65
Image 4    70                 60
In another possible implementation manner, the first terminal device calculates a frequency domain score of each image in the plurality of images based on the frequency domain parameter; and calculating the average value of the frequency domain scores of the plurality of images to obtain a frequency domain comprehensive score.
Illustratively, the number of the plurality of images is 4, and the frequency domain scores of the 4 images, calculated by the first terminal device from their frequency domain parameters, are as shown in Table 1. According to the scores in Table 1, the first terminal device calculates the frequency domain composite score, which includes: sharpness composite score = (80 + 85 + 60 + 70)/4 = 73.75 and sharpening intensity composite score = (75 + 70 + 65 + 60)/4 = 67.5.
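Both aggregation modes in the worked examples above can be reproduced with a short sketch (the environment names and the tuple layout are illustrative only; the per-image scores are the ones from the example):

```python
from statistics import mean

# Per-image frequency-domain scores from the worked example:
# environment -> list of (sharpness, sharpening-intensity) score pairs.
scores = {
    "bright": [(80, 75), (85, 70)],   # images 1 and 2
    "dim":    [(60, 65), (70, 60)],   # images 3 and 4
}
weights = {"bright": 0.7, "dim": 0.3}

def weighted_composite(idx: int) -> float:
    """Mode 1: weighted sum of the per-environment mean scores."""
    return sum(w * mean(pair[idx] for pair in scores[env])
               for env, w in weights.items())

def plain_composite(idx: int) -> float:
    """Mode 2: plain mean over all images, ignoring the environment."""
    return mean(pair[idx] for env in scores for pair in scores[env])
```

Up to floating-point rounding, `weighted_composite(0)` and `weighted_composite(1)` reproduce the composite scores 77.25 and 69.5 of the first mode, and `plain_composite(0)` and `plain_composite(1)` reproduce the 73.75 and 67.5 of the second mode.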
As an alternative embodiment, in S203, the first terminal device may determine the similarity composite score of the plurality of images based on the similarity of each image.
In one possible implementation, the first terminal device calculates a similarity score for each of the plurality of images based on the similarity of each of the images. The first terminal equipment calculates the average value of the similarity scores of at least one image corresponding to each light source environment in the plurality of light source environments respectively to obtain the similarity score of each light source environment. And the first terminal equipment performs weighted summation on the similarity scores of the light source environments to obtain a similarity comprehensive score.
It should be understood that the weight setting of the light source environments when the similarity score of each light source environment is weighted and summed may be the same as or different from the weight setting of the light source environments when the frequency domain score of each light source environment is weighted and summed as described above, which is not limited in this application.
Specifically, the manner in which the first terminal device determines the similarity composite score of the multiple images is similar to the manner in which the first terminal device determines the frequency domain composite score of the multiple images, which is not described herein.
For example, the plurality of light source environments may include a first light source environment and a second light source environment, the weight coefficient of the first light source environment may be 0.6, and the weight coefficient of the second light source environment may be 0.4; the images shot by the second terminal device in the first light source environment may include images 1 and 2, and the images shot by the second terminal device in the second light source environment may include images 3 and 4. The similarity score of each of the 4 images, calculated by the first terminal device from the similarity of the 4 images, is shown in Table 2. According to the scores in Table 2, the first terminal device calculates the similarity score of the first light source environment = (90 + 80)/2 = 85 and the similarity score of the second light source environment = (80 + 85)/2 = 82.5. The first terminal device then obtains the similarity composite score = 0.6 × 85 + 0.4 × 82.5 = 84 through weighted summation.
Table 2

Image      Similarity score
Image 1    90
Image 2    80
Image 3    80
Image 4    85
By setting different weights for different light sources, the influence of the light sources on the comprehensive score of the similarity of the imaging of the equipment can be reduced, the influence of the light sources on the imaging definition of the equipment is further reduced, the accuracy of the imaging definition evaluation of the equipment is improved, and the parameters of the equipment can be adjusted according to the evaluation result, so that the imaging effect of the adjusted equipment is more suitable for the imaging requirement of a specific environment.
In another possible implementation manner, the first terminal device calculates a similarity score of each image in the plurality of images based on the similarity of each image; and calculating the average value of the similarity scores of the plurality of images to obtain a similarity comprehensive score.
Illustratively, the number of the plurality of images is 4, and the similarity score of each of the 4 images, calculated by the first terminal device from the similarity of the 4 images, is shown in Table 2. According to the scores in Table 2, the first terminal device calculates the similarity composite score = (90 + 80 + 80 + 85)/4 = 83.75.
As an optional embodiment, in S204, evaluating the imaging sharpness of the second terminal device based on the frequency domain integrated score and the similarity integrated score includes the following two implementations.
In one possible implementation manner, the first terminal device determines that the imaging sharpness of the second terminal device meets the requirement when the frequency domain integrated score is greater than or equal to a first preset threshold and the similarity integrated score is greater than or equal to a second preset threshold.
It should be understood that in the case where the frequency domain composite score is any one of the sharpness composite score, the sharpening strength composite score, or the sharpening cutoff frequency composite score, the threshold number of the first preset threshold is 1. Correspondingly, under the condition that the sharpness composite score, the sharpening intensity composite score or the sharpening cutoff frequency composite score is greater than or equal to a first preset threshold value and the similarity composite score is greater than or equal to a second preset threshold value, the first terminal device determines that the imaging definition of the second terminal device meets the requirement. In the case where the frequency domain composite score includes composite scores of 2 frequency domain parameters (i.e., any 2 of the sharpness composite score, the sharpening strength composite score, and the sharpening cutoff frequency composite score), the number of threshold values of the first preset threshold value is 2, and the 2 threshold values respectively correspond to the composite scores of 2 frequency domain parameters. And under the condition that the comprehensive scores of the 2 frequency domain parameters are respectively larger than or equal to the corresponding 2 thresholds and the similarity comprehensive score is larger than or equal to a second preset threshold, the first terminal equipment determines that the imaging definition of the second terminal equipment meets the requirement. 
And so on: the number of thresholds in the first preset threshold corresponds to the number of frequency domain parameters whose composite scores are included in the frequency domain composite score, with one threshold per composite score. In these cases, the manner in which the first terminal device determines that the imaging definition of the second terminal device meets the requirement is similar to the cases where the frequency domain composite score includes 1 or 2 composite scores, and is not repeated in this application.
Illustratively, the frequency domain composite score includes a sharpness composite score, a sharpening strength composite score, and a sharpening cut-off frequency composite score. The first preset threshold corresponds to a threshold 1, a threshold 2, and a threshold 3. Under the conditions that the sharpness composite score is greater than or equal to a threshold value 1, the sharpening strength composite score is greater than or equal to a threshold value 2, the sharpening cutoff frequency composite score is greater than or equal to a threshold value 3, and the similarity composite score is greater than or equal to a second preset threshold value, the first terminal equipment determines that the imaging definition of the second terminal equipment meets the requirement.
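The threshold-gating implementation above reduces to an all-of check, which can be sketched as follows (the score and threshold values are hypothetical, chosen only to illustrate the pass/fail logic):

```python
def meets_requirement(composites: dict[str, float],
                      thresholds: dict[str, float]) -> bool:
    """The imaging definition passes only if every composite score
    reaches its own threshold: one threshold per frequency-domain
    composite, plus one for the similarity composite."""
    return all(composites[name] >= t for name, t in thresholds.items())

# Hypothetical composite scores and thresholds, for illustration only.
composites = {"sharpness": 77.25, "sharpening_intensity": 69.5,
              "sharpening_cutoff": 72.0, "similarity": 84.0}
thresholds = {"sharpness": 70.0, "sharpening_intensity": 65.0,
              "sharpening_cutoff": 70.0, "similarity": 80.0}
```

With these values every score clears its threshold, so `meets_requirement(composites, thresholds)` is true; lowering any single score below its threshold makes it false.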
By setting a corresponding threshold for the composite score of each frequency domain parameter and for the similarity composite score, the imaging definition of the device can be evaluated at a finer granularity, so that the accuracy of the device imaging definition evaluation can be improved.
In another possible implementation manner, the first terminal device performs weighted summation on the frequency domain integrated score and the similarity integrated score to obtain image definition integrated scores of the multiple images. The first terminal device evaluates the imaging sharpness of the second terminal device based on the image sharpness composite score.
Optionally, in a case where the frequency domain composite score includes composite scores (i.e., sharpness composite score, sharpening intensity composite score, and sharpening cutoff frequency composite score) of 3 frequency domain parameters, the sharpness composite score, the sharpening intensity composite score, the sharpening cutoff frequency composite score, and the similarity composite score are respectively configured with weight coefficients, and the first terminal device performs weighted summation on the four composite scores based on the configured weight coefficients to obtain an image sharpness composite score of the plurality of images. It should be further understood that, in the case where the frequency domain integrated score includes an integrated score of 2 or 1 frequency domain parameters, the weight configuration of the frequency domain integrated score and the calculation manner of the image sharpness integrated score of the plurality of images are similar to the case where the frequency domain integrated score includes an integrated score of 3 frequency domain parameters, and will not be described herein.
Optionally, determining that the imaging definition of the second terminal device meets the requirement under the condition that the image definition integrated score is greater than or equal to a third preset threshold.
Illustratively, the frequency domain composite score includes the sharpness composite score S_A, the sharpening strength composite score S_B, and the sharpening cutoff frequency composite score S_C; the similarity composite score is S_D. The weight coefficient configured for the sharpness composite score is ω_1, the weight coefficient of the sharpening strength composite score is ω_2, the weight coefficient of the sharpening cutoff frequency composite score is ω_3, and the weight coefficient of the similarity composite score is ω_4. The first terminal device performs weighted summation on the frequency domain composite score and the similarity composite score to obtain the image definition composite score of the plurality of images: S_ISQ = ω_1×S_A + ω_2×S_B + ω_3×S_C + ω_4×S_D. In the case where S_ISQ is greater than or equal to the third preset threshold, it is determined that the imaging definition of the second terminal device meets the requirement.
As an alternative embodiment, before calculating the frequency domain parameters of each image of the plurality of images, the method further includes: the first terminal equipment detects each image to obtain brightness information of each image; obtaining a two-dimensional modulation transfer function of each image based on the brightness information of each image; processing the value of the two-dimensional modulation transfer function of each image to obtain a one-dimensional modulation transfer function of each image; the calculating the frequency domain parameters of each image in the plurality of images respectively includes: frequency domain parameters of each of the plurality of images are calculated based on the one-dimensional modulation transfer function of each image.
Optionally, the first terminal device performs two-dimensional discrete fourier transform on the brightness information of each image to obtain a power spectrum of each image; dividing the power spectrum of each image by the power spectrum of a preset image to obtain a two-dimensional modulation transfer function of each image.
Fig. 3 shows a schematic block diagram of deriving a one-dimensional modulation transfer function based on an image. As shown in fig. 3, the first terminal device may automatically detect each image through the YOLO (you only look once) algorithm to obtain the brightness distribution function psf(a, b) of the target area of each image, where a and b represent the position of a pixel. A two-dimensional discrete Fourier transform (2D DFT) is performed on the brightness distribution function psf(a, b) of each target area to obtain the power spectrum G(m, n) of each image, where m represents frequency and n represents the power corresponding to the frequency. The power spectrum I(m, n) of the preset image is acquired, and the power spectrum G(m, n) of each image is divided by the power spectrum I(m, n) of the preset image to obtain the two-dimensional modulation transfer function MTF(u, v) = G(m, n) / I(m, n) of each image, where u is the spatial frequency and v represents different directions of the image; the two-dimensional modulation transfer function may be referred to as an MTF curved surface. The first terminal device averages the two-dimensional modulation transfer function of each image over different directions to obtain the one-dimensional modulation transfer function MTF(u, 0)_1D of each image, i.e., MTF(u); the corresponding curve may also be referred to as an MTF curve. Averaging over different directions means averaging, at the same spatial frequency, the MTF values on the different directional MTF curves.
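As an illustrative sketch only (not part of the patent), the pipeline of fig. 3 — power spectra by 2D DFT, division by the preset image's power spectrum, and averaging over directions — could be prototyped as follows. NumPy, the epsilon guard, and radial binning as the direction-averaging scheme are assumptions here:

```python
import numpy as np

def mtf_curve(captured, reference, n_bins=32):
    """Sketch: 2-D power spectra, their ratio as an MTF surface,
    then an average over directions to get a 1-D MTF curve."""
    # Power spectra G(m, n) and I(m, n) via the 2-D discrete Fourier transform
    G = np.abs(np.fft.fft2(captured)) ** 2
    I = np.abs(np.fft.fft2(reference)) ** 2
    mtf2d = G / np.maximum(I, 1e-12)  # MTF surface; guard against division by 0

    # Average over directions: group frequency samples by their radial
    # spatial frequency (cycles/pixel) and average the MTF values per bin.
    h, w = mtf2d.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2).ravel()
    bins = np.linspace(0.0, 0.5, n_bins + 1)
    idx = np.digitize(radius, bins)
    flat = mtf2d.ravel()
    return np.array([flat[idx == k].mean()
                     for k in range(1, n_bins + 1) if np.any(idx == k)])
```

When the captured image equals the reference image, the MTF surface is 1 everywhere, so the averaged curve is flat at 1 — a quick sanity check for the binning logic.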
As an optional embodiment, the calculating the similarity between each of the plurality of images and the preset image includes: calculating the standard deviation of each image by using the number of pixel points of each image, the pixel value of each image and the average value of each image; calculating standard deviation of the preset image by using the number of pixels of the preset image, the pixel value of the preset image and the average value of the preset image; calculating covariance between each image and a preset image by using the number of pixels of each image, the pixel value of each image, the average value of each image, the pixel value of the preset image and the average value of the preset image; and calculating the similarity between each image and the preset image by using the standard deviation of each image, the standard deviation of the preset image and the covariance between each image and the preset image.
As an optional embodiment, the first terminal device may further calculate the similarity between each image and the preset image by any method such as peak signal-to-noise ratio (PSNR), mean squared error (MSE), cosine similarity, or learned perceptual image patch similarity (LPIPS), which is not limited in this application.
It should be understood that the method for the first terminal device to calculate the similarity between each image and the preset image through the PSNR, MSE, cosine similarity, or LPIPS may be an existing general method, which is not described in detail in this application.
The apparatus imaging sharpness evaluation method of the present application is described above. The overall process of evaluating the imaging definition of the device of the present application will be described in detail below with a first terminal device as a computer, a second terminal device as a mobile phone, 3 light source environments, and 10 photographs taken in each light source environment.
Step one, a computer responds to the starting operation of a user, and the mobile phone is started to automatically shoot according to a preset light source environment. After the mobile phone finishes shooting all light source environments, the computer acquires all images automatically shot by the mobile phone from the mobile phone.
Optionally, the computer may be configured with a setting interface through which a user may set the number of light source environments to 3, with a different light box brightness value for each light source environment. Through the setting interface, the user can also set the number of shots of the mobile phone in each light source environment to 10 and the interval between shots to 5 s. The computer interface can further be configured with a one-key start button. After the user clicks the one-key start button, the computer, in response to the clicking operation, controls the light box to turn on, adjusts the light box brightness to the first brightness, and sends a request for starting automatic shooting to the mobile phone. After receiving the request, the mobile phone starts the automatic shooting function, shoots the preset image every 5 s, and, after the number of shots reaches 10, sends a shooting completion instruction to the computer. After receiving the instruction, the computer determines that shooting of the 3 light source environments is not yet completed, controls the light box to adjust its brightness to the second brightness, and again sends an instruction for starting automatic shooting to the mobile phone. After receiving the instruction, the mobile phone starts the automatic shooting function, shoots the preset image every 5 s, and, after the number of shots reaches 10, sends a shooting completion instruction to the computer. After receiving the instruction, the computer determines that shooting of the 3 light source environments is still not completed, controls the light box to adjust its brightness to the third brightness, and sends the instruction for starting automatic shooting to the mobile phone a third time.
After the mobile phone receives the instruction for starting automatic shooting, it starts the automatic shooting function, shoots the preset image every 5 s, and, after the number of shots reaches 10, sends a shooting completion instruction to the computer. After receiving the instruction, the computer determines that shooting of the 3 light source environments is completed. The computer sends an image acquisition instruction to the mobile phone, and after receiving it, the mobile phone sends the 30 captured images to the computer.
And step two, the computer stores the images shot by the mobile phone in groups and calculates the frequency domain parameters and the spatial domain parameter of each image. The frequency domain parameters include sharpness, sharpening strength, and sharpening cutoff frequency; the spatial domain parameter is the similarity between the image shot by the mobile phone and the preset image.
Specifically, the computer groups 30 images. The 1 st to 10 th images are a first group of images, and the first group of images are shot by the mobile phone under a first brightness. The 11 th to 20 th images are a second group of images photographed by the mobile phone at a second brightness. The 21 st to 30 th images are a third group of images photographed by the mobile phone at a third brightness. Next, the computer calculates a frequency domain parameter and a spatial domain parameter for each of the 30 images.
The calculation of the frequency domain parameters and the spatial domain parameter is described in detail below.
First, calculation of the frequency domain parameters
The computer detects each of the 30 images through the YOLO algorithm to obtain the brightness distribution function psf(a, b) of the target area of each of the 30 images, where a and b represent the position of a pixel. A 2D discrete Fourier transform is performed on the brightness distribution function psf(a, b) of each target area to obtain the power spectrum G(m, n) of each image, where m represents frequency and n represents the power corresponding to the frequency. The power spectrum I(m, n) of the preset image is acquired, and the power spectrum G(m, n) of each image is divided by the power spectrum I(m, n) of the preset image to obtain the two-dimensional modulation transfer function MTF(u, v) = G(m, n) / I(m, n) of each image, where u is the spatial frequency and v represents different directions of the image; the two-dimensional modulation transfer function may be referred to as an MTF curved surface. The computer averages the two-dimensional modulation transfer function of each image over different directions to obtain the one-dimensional modulation transfer function MTF(u, 0)_1D of each image, i.e., MTF(u); the one-dimensional modulation transfer function may also be referred to as an MTF curve. Averaging over different directions means averaging, at the same spatial frequency, the MTF values on the different directional MTF curves.
Illustratively, fig. 4 shows a schematic diagram of averaging the two-dimensional modulation transfer function over different directions to obtain the one-dimensional modulation transfer function. As shown in fig. 4, the horizontal axis represents spatial frequency in cycles/pixel and the vertical axis represents the MTF value. On the MTF curve in direction v_1 (the thicker dotted curve in fig. 4), the point at spatial frequency u_1 has coordinates O(u_1, v_1, z_1); on the MTF curve in direction v_2 (the thinner dashed curve in fig. 4), the point at spatial frequency u_1 has coordinates P(u_1, v_2, z_2). The MTF curve obtained after averaging over the different directions is shown in the figure as the curve for direction v = 0 (the solid curve in fig. 4); on this MTF curve, the point at spatial frequency u_1 has coordinates (u_1, 0, (z_1 + z_2)/2).
After the one-dimensional modulation transfer function is obtained as described above, the computer determines the sharpness, sharpening intensity, and sharpening cutoff frequency of each image based on the one-dimensional modulation transfer function.
Specifically, the computer can calculate the sharpness A of each image as the CSF-weighted integral of the one-dimensional modulation transfer function:

A = ∫ MTF(u) · CSF(u) du / ∫ CSF(u) du,
where CSF(u) represents the contrast sensitivity function, calculated by a preset formula with the parameters a = 0.734, b = 0.2, and c = 0.8.
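The CSF-weighted sharpness computation can be sketched numerically. Since the patent's own CSF formula (with a = 0.734, b = 0.2, c = 0.8) is not reproduced here, the well-known Mannos–Sakrison CSF model is used purely as a stand-in, and the discrete weighted average is an assumed reading of the sharpness definition:

```python
import numpy as np

def csf_mannos(u, cycles_per_degree=60.0):
    """Mannos-Sakrison contrast sensitivity model. NOTE: a stand-in --
    the patent defines its own CSF with a=0.734, b=0.2, c=0.8."""
    f = cycles_per_degree * u  # assumed cycles/pixel -> cycles/degree scaling
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

def sharpness(mtf, u):
    """Acutance-style sharpness: the CSF-weighted average of the MTF
    curve, A = sum(MTF*CSF) / sum(CSF), on a uniform frequency grid."""
    w = csf_mannos(u)
    return float(np.sum(np.asarray(mtf) * w) / np.sum(w))
```

An ideal MTF curve that is 1.0 at every frequency yields a sharpness of exactly 1.0, while a curve that rolls off toward zero scores strictly lower — the weighting simply emphasizes the frequencies the eye is most sensitive to.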
Optionally, the computer calculates the sharpening strength B of each image by the following formula:

B = MTF_max − MTF_{u=0},

where MTF_max represents the maximum value of the one-dimensional modulation transfer function, and MTF_{u=0} represents the value of the one-dimensional modulation transfer function at spatial frequency 0.
Optionally, the computer calculates the sharpening cutoff frequency C of each image by the following formula:

C = MTF⁻¹(k),

where MTF⁻¹(·) represents the inverse function of the one-dimensional modulation transfer function, i.e., C is the spatial frequency at which the MTF curve falls back to the preset value k.
Illustratively, fig. 5 shows a schematic diagram of determining the sharpening strength and the sharpening cutoff frequency based on the one-dimensional modulation transfer function. As shown in fig. 5, the horizontal axis of the one-dimensional modulation transfer function (MTF curve) is the spatial frequency in cycles/pixel, and the vertical axis is the MTF value. The sharpening strength is B = 2 − 1 = 1, and the sharpening cutoff frequency is C = 0.36.
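A minimal sketch of reading the sharpening strength and sharpening cutoff frequency off a one-dimensional MTF curve; treating the cutoff as the first frequency past the peak where the MTF falls back to a preset value k is an assumed interpretation:

```python
import numpy as np

def sharpening_strength(mtf):
    """B = MTF_max - MTF_{u=0}: how far sharpening lifts the peak of the
    MTF curve above its zero-frequency value."""
    mtf = np.asarray(mtf, dtype=float)
    return float(mtf.max() - mtf[0])

def sharpening_cutoff(u, mtf, k=1.0):
    """C: the first frequency past the MTF peak where the curve falls
    back to the preset value k (an assumed reading of MTF^-1(k))."""
    mtf = np.asarray(mtf, dtype=float)
    peak = int(np.argmax(mtf))
    for i in range(peak, len(mtf)):
        if mtf[i] <= k:
            return float(u[i])
    return float(u[-1])
```

On a curve shaped like fig. 5 — starting at 1, peaking at 2, then falling — this yields B = 2 − 1 = 1 and a cutoff where the curve re-crosses 1.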
Second, calculation of the spatial domain parameter
The spatial domain parameter is the similarity between the captured image and the preset image. The computer may calculate the similarity S between the preset image and each of the plurality of images by the following formula:

S = (σ_xy + c_3) / (σ_x · σ_y + c_3),

where x represents the captured image and y represents the preset image; σ_x is the standard deviation of image x, which can measure the contrast of image x; σ_y is the standard deviation of the preset image y, which can measure the contrast of the preset image y; σ_xy is the covariance between image x and the preset image y, which can measure the structural similarity between image x and the preset image y; and c_3 is a constant used to keep S stable, with 0 < c_3 << 1, i.e., c_3 prevents the denominator in the above formula from being 0. σ_x, σ_y, and σ_xy can be obtained by the following formulas:

σ_x = ( (1/(N−1)) Σ_{p=1}^{N} (x_p − μ_x)² )^{1/2},
σ_y = ( (1/(N−1)) Σ_{p=1}^{N} (y_p − μ_y)² )^{1/2},
σ_xy = (1/(N−1)) Σ_{p=1}^{N} (x_p − μ_x)(y_p − μ_y),

where N represents the number of pixels in image x, x_p is the pixel value of the p-th pixel in image x, y_p is the pixel value of the p-th pixel in image y, μ_x is the mean of each image, and μ_y is the mean of the preset image.
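The covariance-based similarity described above can be sketched as follows; the concrete value of c3 is an assumption, since the patent only requires 0 < c3 << 1:

```python
import numpy as np

def similarity(x, y, c3=1e-4):
    """Structure-term similarity S = (sigma_xy + c3) / (sigma_x*sigma_y + c3);
    c3 here is an assumed small stabilizing constant."""
    x = np.asarray(x, dtype=np.float64).ravel()
    y = np.asarray(y, dtype=np.float64).ravel()
    n = x.size
    mu_x, mu_y = x.mean(), y.mean()
    sigma_x = np.sqrt(np.sum((x - mu_x) ** 2) / (n - 1))   # std dev of x
    sigma_y = np.sqrt(np.sum((y - mu_y) ** 2) / (n - 1))   # std dev of y
    sigma_xy = np.sum((x - mu_x) * (y - mu_y)) / (n - 1)   # covariance of x, y
    return float((sigma_xy + c3) / (sigma_x * sigma_y + c3))
```

An image compared with itself gives S = 1, and an inverted image (negatively correlated structure) drives S below zero, which is the behavior the covariance term is meant to capture.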
And step three, the computer calculates the frequency domain score of each image based on the frequency domain parameters of each image, and calculates the frequency domain composite score of the 30 images from the frequency domain score of each image. The computer calculates the similarity score of each image based on the similarity between each image and the preset image, and calculates the similarity composite score of the 30 images from the similarity score of each image.
Alternatively, the computer may calculate the sharpness score of the j-th image through a preset sharpness scoring function f(A): S_Aj = f(A_j), where j is a positive integer less than or equal to 30. The computer then calculates the average of the sharpness scores of each group of images (i.e., the images corresponding to each light source environment) and takes each group's average as the sharpness score of the images for the corresponding light source environment; the sharpness score of the images in the i-th light source environment is S_A'i, where i takes the value 1, 2, or 3. Finally, the computer can calculate the sharpness composite score of the 30 images as S_A = Σ_{i=1}^{3} α_i · S_A'i, where α_i represents the weight of the i-th light source environment's sharpness score in the sharpness composite score.

Alternatively, the computer may calculate the sharpening strength score of the j-th image through a preset sharpening strength scoring function f(B): S_Bj = f(B_j). The computer then calculates the average of the sharpening strength scores of each group of images, taking each group's average as the sharpening strength score of the images for the corresponding light source environment; the sharpening strength score of the images in the i-th light source environment is S_B'i. Finally, the computer can calculate the sharpening strength composite score of the 30 images as S_B = Σ_{i=1}^{3} β_i · S_B'i, where β_i represents the weight of the i-th light source environment's sharpening strength score in the sharpening strength composite score.

Alternatively, the computer may calculate the sharpening cutoff frequency score of the j-th image through a preset sharpening cutoff frequency scoring function f(C): S_Cj = f(C_j). The computer then calculates the average of the sharpening cutoff frequency scores of each group of images, taking each group's average as the sharpening cutoff frequency score of the images for the corresponding light source environment; the sharpening cutoff frequency score of the images in the i-th light source environment is S_C'i. Finally, the computer can calculate the sharpening cutoff frequency composite score of the 30 images as S_C = Σ_{i=1}^{3} ε_i · S_C'i, where ε_i represents the weight of the i-th light source environment's sharpening cutoff frequency score in the sharpening cutoff frequency composite score.

Alternatively, the computer may calculate the similarity score between the j-th image and the preset image through a preset similarity scoring function f(D): S_Dj = f(D_j). The computer then calculates the average of the similarity scores of each group of images, taking each group's average as the similarity score of the images for the corresponding light source environment; the similarity score of the images in the i-th light source environment is S_D'i. Finally, the computer can calculate the similarity composite score of the 30 images as S_D = Σ_{i=1}^{3} λ_i · S_D'i, where λ_i represents the weight of the i-th light source environment's similarity score in the similarity composite score.
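The group-then-weight scoring scheme — per-image scores averaged within each light-source group, then a weighted sum across groups — can be sketched as follows; the group size and weight values in the test are illustrative only:

```python
import numpy as np

def composite_score(per_image_scores, group_size, weights):
    """Average the per-image scores within each light-source group, then
    take a weighted sum of the group means (weights assumed to sum to 1)."""
    scores = np.asarray(per_image_scores, dtype=float)
    # One row per light source environment, e.g. 3 rows of 10 shots each
    group_means = scores.reshape(-1, group_size).mean(axis=1)
    return float(np.dot(np.asarray(weights, dtype=float), group_means))
```

The same helper applies unchanged to sharpness, sharpening strength, sharpening cutoff frequency, and similarity scores, since all four composites share this structure.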
Illustratively, Table 3 shows the scores of the 30 images under the 3 light source environments.

Table 3
And step four, the computer evaluates the imaging definition of the mobile phone according to the frequency domain scores of the 30 images and the similarity scores of the 30 images.
In one possible implementation manner, the computer determines that the imaging definition of the mobile phone meets the requirement when the sharpness composite score of the 30 images is greater than a first preset threshold, the sharpening strength composite score is greater than a second preset threshold, the sharpening cutoff frequency composite score is greater than a third preset threshold, and the similarity composite score is greater than a fourth preset threshold. Otherwise, the computer determines that the imaging definition of the mobile phone does not meet the requirement.
In another possible implementation, the computer may perform a weighted summation of the frequency domain composite score of the 30 images and the similarity composite score of the 30 images to obtain the definition composite score S_ISQ of the 30 images. Next, the computer compares the definition composite score S_ISQ with a fifth preset threshold; in the case where S_ISQ is greater than or equal to the fifth preset threshold, the computer determines that the imaging definition of the mobile phone meets the requirement. Otherwise, the computer determines that the imaging definition of the mobile phone does not meet the requirement.
Specifically, the computer may calculate the definition composite score S_ISQ of the above 30 images by the following formula:

S_ISQ = ω_1 × S_A + ω_2 × S_B + ω_3 × S_C + ω_4 × S_D,

where ω_1 is the weight coefficient of the sharpness composite score, ω_2 is the weight coefficient of the sharpening strength composite score, ω_3 is the weight coefficient of the sharpening cutoff frequency composite score, and ω_4 is the weight coefficient of the similarity composite score, with ω_1 + ω_2 + ω_3 + ω_4 = 1.
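A minimal sketch of the final weighted summation and threshold comparison; the equal default weights are placeholders that merely satisfy the constraint ω_1 + ω_2 + ω_3 + ω_4 = 1:

```python
def image_sharpness_score(s_a, s_b, s_c, s_d, w=(0.25, 0.25, 0.25, 0.25)):
    """S_ISQ = w1*S_A + w2*S_B + w3*S_C + w4*S_D; the weights here are
    placeholders -- the method only requires that they sum to 1."""
    assert abs(sum(w) - 1.0) < 1e-9, "weight coefficients must sum to 1"
    return w[0] * s_a + w[1] * s_b + w[2] * s_c + w[3] * s_d

def meets_requirement(s_isq, t_all):
    # Imaging definition meets the requirement iff S_ISQ >= T_all
    return s_isq >= t_all
```

In practice, the weights would be tuned to reflect how much each frequency domain or spatial domain aspect should influence the final verdict.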
optionally, aConfiguration T 1 T is a preset threshold for sharpness score 2 To sharpen the preset threshold of intensity score, T 3 To sharpen the preset threshold of the cut-off frequency score, T 4 T is a preset threshold value of similarity score all A threshold is preset for the sharpness composite score. The computer can determine whether the image is clear in a number of ways.
In one possible implementation, in the case where S_A ≥ T_1, S_B ≥ T_2, S_C ≥ T_3, and S_D ≥ T_4, the computer determines that the imaging definition of the mobile phone meets the requirement. Otherwise, the computer determines that the imaging definition of the mobile phone does not meet the requirement.
In another possible implementation, in the case where S_ISQ ≥ T_all, the computer determines that the imaging definition of the mobile phone meets the requirement. Otherwise, the computer determines that the imaging definition of the mobile phone does not meet the requirement.
The image evaluation manner is described in detail above. In one possible implementation, after the computer evaluates the images shot by the mobile phone, the parameters of the mobile phone can be adjusted according to the scores so as to improve the imaging definition of the mobile phone. In another possible implementation, the computer evaluates the definition of the images shot by a plurality of second terminal devices in the above manner, so as to obtain definition scores for the images shot by the plurality of second terminal devices. By comparing the scores of different second terminal devices, the parameters (such as exposure) of a second terminal device can be adjusted to improve the definition of the images it shoots, i.e., to improve the imaging quality of the terminal device. For example, in the same light source environment, if the sharpness score of device 2 is lower than that of device 1, the parameters of device 2 may be adjusted to increase the sharpness score, or the definition composite score, of the photos taken by device 2, and thus the imaging quality of device 2.
The device imaging definition evaluation method of the embodiments of the present application is described in detail above with reference to fig. 1-5. Next, the apparatus of the embodiments of the present application is described in detail with reference to fig. 6 and 7.
Fig. 6 is a device imaging sharpness evaluation apparatus 600 according to an embodiment of the present application, where the apparatus 600 includes: an acquisition module 601 and a processing module 602. The apparatus 600 is configured to implement steps corresponding to the first terminal device in the above method.
The acquisition module 601 is configured to: and acquiring a plurality of images, wherein the images are obtained by respectively shooting preset images in a plurality of light source environments by the second terminal equipment, and the light source environments have different light source brightness.
The processing module 602 is configured to: respectively calculating frequency domain parameters of each image in the plurality of images, and determining frequency domain comprehensive scores of the plurality of images based on the frequency domain parameters, wherein the frequency domain parameters comprise at least one of sharpness, sharpening strength and sharpening cut-off frequency, and the frequency domain comprehensive scores comprise at least one of sharpness comprehensive scores, sharpening strength comprehensive scores and sharpening cut-off frequency comprehensive scores; respectively calculating the similarity between each image in the plurality of images and a preset image, and determining the similarity comprehensive score of the plurality of images based on the similarity; and evaluating the imaging definition of the second terminal equipment based on the frequency domain comprehensive score and the similarity comprehensive score.
Optionally, the processing module 602 is further configured to: and under the condition that the frequency domain comprehensive score is larger than or equal to a first preset threshold value and the similarity comprehensive score is larger than or equal to a second preset threshold value, determining that the imaging definition of the second terminal equipment meets the requirement.
Optionally, the processing module 602 is further configured to: weighting and summing the frequency domain comprehensive score and the similarity comprehensive score to obtain image definition comprehensive scores of a plurality of images; and evaluating the imaging definition of the second terminal device based on the image definition composite score.
Optionally, the processing module 602 is further configured to: and under the condition that the image definition integrated score is larger than or equal to a third preset threshold value, determining that the imaging definition of the second terminal equipment meets the requirement.
Optionally, the processing module 602 is further configured to: calculating a frequency domain score of each of the plurality of images based on the frequency domain parameters; respectively calculating the average value of the frequency domain scores of at least one image corresponding to each light source environment in the plurality of light source environments to obtain the frequency domain score of each light source environment; and carrying out weighted summation on the frequency domain scores of each light source environment to obtain a frequency domain comprehensive score.
Optionally, the processing module 602 is further configured to: calculating a frequency domain score of each of the plurality of images based on the frequency domain parameters; and calculating the average value of the frequency domain scores of the plurality of images to obtain a frequency domain comprehensive score.
Optionally, the processing module 602 is further configured to: calculating a similarity score of each of the plurality of images based on the similarity; respectively calculating the average value of the similarity scores of at least one image corresponding to each light source environment in the plurality of light source environments to obtain the similarity score of each light source environment; and carrying out weighted summation on the similarity scores of the light source environments to obtain a similarity comprehensive score.
Optionally, the processing module 602 is further configured to: calculating a similarity score of each of the plurality of images based on the similarity; and calculating the average value of the similarity scores of the plurality of images to obtain a similarity comprehensive score.
Optionally, the processing module 602 is further configured to: detecting each image to obtain brightness information of each image; obtaining a two-dimensional modulation transfer function of each image based on the brightness information of each image; processing the value of the two-dimensional modulation transfer function of each image to obtain a one-dimensional modulation transfer function of each image; frequency domain parameters of each of the plurality of images are calculated based on the one-dimensional modulation transfer function of each image.
Optionally, the processing module 602 is further configured to: performing two-dimensional discrete Fourier transform on the brightness information of each image to obtain a power spectrum of each image; dividing the power spectrum of each image by the power spectrum of a preset image to obtain a two-dimensional modulation transfer function of each image.
Optionally, the processing module 602 is further configured to: calculating the standard deviation of each image by using the number of pixel points of each image, the pixel value of each image and the average value of each image; calculating standard deviation of the preset image by using the number of pixels of the preset image, the pixel value of the preset image and the average value of the preset image; calculating covariance between each image and a preset image by using the number of pixels of each image, the pixel value of each image, the average value of each image, the pixel value of the preset image and the average value of the preset image; and calculating the similarity between each image and the preset image by using the standard deviation of each image, the standard deviation of the preset image and the covariance between each image and the preset image.
It should be appreciated that the apparatus 600 herein is embodied in the form of functional modules. The term module herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an alternative example, it will be understood by those skilled in the art that the apparatus 600 may be specifically configured as the first terminal device in the foregoing embodiment, and the apparatus 600 may be configured to perform each flow and/or step corresponding to the first terminal device in the foregoing method embodiment, which is not repeated herein.
The apparatus 600 has a function of implementing the corresponding steps executed by the first terminal device in the method; the above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. For example, the processing module 602 may be configured to implement the respective steps and/or flows corresponding to the processing module for performing the processing actions.
In an embodiment of the present application, the apparatus 600 in fig. 6 may also be a chip or a chip system, for example: system On Chip (SOC). Correspondingly, the processing module 602 may be a processing circuit of the chip, which is not limited herein.
Fig. 7 illustrates another device imaging sharpness evaluation apparatus 700 provided in an embodiment of the present application. The apparatus 700 comprises a processor 701, a communication interface 702 and a memory 703. Wherein the processor 701, the communication interface 702 and the memory 703 are in communication with each other via an internal connection path, the memory 703 is configured to store instructions, the processor 701 is configured to execute the instructions stored in the memory 703, the communication interface 702 is configured to receive signals from other modules (e.g., the memory 703), and the communication interface 702 is also configured to send signals to other modules.
It should be understood that the apparatus 700 may be specifically configured as the first terminal device in the foregoing embodiment, and may be configured to perform the steps and/or flows corresponding to the first terminal device in the foregoing method embodiment. The memory 703 may optionally include read only memory and random access memory and provide instructions and data to the processor. A portion of the memory may also include non-volatile random access memory. For example, the memory may also store information of the device type. The processor 701 may be configured to execute instructions stored in a memory, and when the processor 701 executes instructions stored in the memory, the processor 701 is configured to perform the steps and/or flows of the method embodiments corresponding to the apparatus described above. Illustratively, the communication interface 702 may read instructions stored in the memory 703 and send the instructions to the processor 701. The instructions, when executed by the processor 701, may cause the apparatus to perform the steps performed by the first terminal device in the above embodiments.
It should be appreciated that in embodiments of the present application, the processor may be a central processing unit (central processing unit, CPU), and the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor executes instructions in the memory to perform the steps of the method described above in conjunction with its hardware. To avoid repetition, a detailed description is not provided herein.
The present application also provides a computer-readable storage medium for storing a computer program for implementing the method corresponding to the first terminal device in the above embodiment.
The present application also provides a computer program product comprising a computer program (which may also be referred to as code, or instructions) which, when run on a computer, is capable of performing the method corresponding to the first terminal device as shown in the above embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and the changes or substitutions are intended to be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A device imaging sharpness evaluation method, characterized in that it is applied to a first terminal device, the method comprising:
acquiring a plurality of images, wherein the plurality of images are obtained by a second terminal device shooting a preset image in a plurality of light source environments respectively, and different light source environments have different light source brightness;
respectively calculating frequency domain parameters of each image in the plurality of images, and determining a frequency domain composite score of the plurality of images based on the frequency domain parameters, wherein the frequency domain parameters comprise at least one of sharpness, sharpening strength and sharpening cut-off frequency, and the frequency domain composite score comprises at least one of a sharpness composite score, a sharpening strength composite score and a sharpening cut-off frequency composite score;
respectively calculating a similarity between each image in the plurality of images and the preset image, and determining a similarity composite score of the plurality of images based on the similarity;
evaluating the imaging sharpness of the second terminal device based on the frequency domain composite score and the similarity composite score;
wherein the evaluating the imaging sharpness of the second terminal device based on the frequency domain composite score and the similarity composite score comprises:
determining that the imaging sharpness of the second terminal device meets a requirement in a case where the frequency domain composite score is greater than or equal to a first preset threshold and the similarity composite score is greater than or equal to a second preset threshold; or,
performing weighted summation on the frequency domain composite score and the similarity composite score to obtain an image sharpness composite score of the plurality of images;
and determining that the imaging sharpness of the second terminal device meets the requirement in a case where the image sharpness composite score is greater than or equal to a third preset threshold.
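The two alternative decision criteria in claim 1 can be sketched in Python. The function name, weights, and threshold values below are illustrative assumptions for demonstration only; the patent does not disclose concrete numbers:

```python
def evaluate_sharpness(freq_score, sim_score,
                       freq_thresh=0.8, sim_thresh=0.8,
                       w_freq=0.5, w_sim=0.5, combined_thresh=0.8):
    """Decide whether imaging sharpness meets the requirement.

    Claim 1 gives two alternative criteria:
      (a) both composite scores clear their own preset thresholds, or
      (b) their weighted sum clears a third preset threshold.
    All weights and thresholds here are placeholder assumptions.
    """
    # Criterion (a): each composite score against its own threshold
    if freq_score >= freq_thresh and sim_score >= sim_thresh:
        return True
    # Criterion (b): weighted sum against a third threshold
    combined = w_freq * freq_score + w_sim * sim_score
    return combined >= combined_thresh
```

Under these placeholder values, a score pair of (0.9, 0.75) fails the per-score test but still passes via the weighted sum (0.825 ≥ 0.8).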
2. The method of claim 1, wherein the determining the frequency domain composite score for the plurality of images comprises:
calculating a frequency domain score of each image of the plurality of images based on the frequency domain parameters;
respectively calculating an average value of the frequency domain scores of at least one image corresponding to each light source environment in the plurality of light source environments, to obtain a frequency domain score of each light source environment;
and performing weighted summation on the frequency domain score of each light source environment to obtain the frequency domain composite score.
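The aggregation in claim 2 (and its similarity counterpart, claim 4) averages per-image scores within each light-source environment and then takes a weighted sum across environments. A minimal sketch, assuming the per-image scores are grouped in a dict and that the per-environment weights are supplied by the tester — the patent does not fix particular weight values:

```python
def composite_score(scores_by_env, env_weights):
    """Claims 2 and 4: average the per-image scores within each
    light-source environment, then weight-sum across environments.
    `scores_by_env` maps environment name -> list of per-image scores;
    `env_weights` maps environment name -> weight (assumed inputs)."""
    # Step 1: mean score per light-source environment
    env_means = {env: sum(s) / len(s) for env, s in scores_by_env.items()}
    # Step 2: weighted sum across environments
    return sum(env_weights[env] * m for env, m in env_means.items())
```

For example, with scores {"low": [0.6, 0.8], "high": [0.9, 1.0]} and equal weights of 0.5, the environment means are 0.7 and 0.95, giving a composite score of 0.825.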
3. The method of claim 1, wherein the determining the frequency domain composite score for the plurality of images comprises:
calculating a frequency domain score of each image of the plurality of images based on the frequency domain parameters;
and calculating an average value of the frequency domain scores of the plurality of images to obtain the frequency domain composite score.
4. The method of claim 1, wherein said determining a similarity composite score for the plurality of images comprises:
calculating a similarity score of each of the plurality of images based on the similarity;
respectively calculating an average value of the similarity scores of at least one image corresponding to each light source environment in the plurality of light source environments, to obtain a similarity score of each light source environment;
and performing weighted summation on the similarity score of each light source environment to obtain the similarity composite score.
5. The method of claim 1, wherein said determining a similarity composite score for the plurality of images comprises:
calculating a similarity score of each of the plurality of images based on the similarity;
and calculating an average value of the similarity scores of the plurality of images to obtain the similarity composite score.
6. The method of claim 1, wherein before the respectively calculating frequency domain parameters of each image in the plurality of images, the method further comprises:
detecting each image to obtain brightness information of each image;
obtaining a two-dimensional modulation transfer function of each image based on the brightness information of each image;
processing the value of the two-dimensional modulation transfer function of each image to obtain a one-dimensional modulation transfer function of each image;
the calculating frequency domain parameters of each image in the plurality of images respectively includes:
and calculating the frequency domain parameter of each image in the plurality of images based on the one-dimensional modulation transfer function of each image.
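Claim 6 reduces the two-dimensional modulation transfer function to a one-dimensional one but does not spell out the reduction. Radial averaging over rings of equal spatial frequency is one common choice; the sketch below assumes it, and the bin count and normalization are likewise assumptions:

```python
import numpy as np

def radial_average_mtf(mtf2d):
    """One plausible reading of claim 6's 2-D -> 1-D step: average
    the 2-D MTF values over rings of (roughly) equal spatial
    frequency. Radial averaging is an assumed choice; the claim only
    says the 2-D values are 'processed' into a 1-D function."""
    h, w = mtf2d.shape
    # Signed spatial-frequency coordinates of each DFT bin
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    r = np.sqrt(fx ** 2 + fy ** 2)
    # Quantize the radii into bins and average the MTF per ring
    nbins = max(h, w) // 2
    bins = np.minimum((r / r.max() * nbins).astype(int), nbins - 1)
    return np.array([mtf2d[bins == b].mean() for b in range(nbins)])
```

A flat 2-D MTF of all ones reduces, as expected, to a flat 1-D MTF of all ones.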
7. The method of claim 6, wherein the obtaining the two-dimensional modulation transfer function for each image based on the luminance information of each image comprises:
performing a two-dimensional discrete Fourier transform on the brightness information of each image to obtain a power spectrum of each image;
dividing the power spectrum of each image by the power spectrum of the preset image to obtain a two-dimensional modulation transfer function of each image.
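Claim 7's construction — a 2-D DFT of the luminance, then an element-wise ratio against the preset image's spectrum — can be sketched with NumPy. Treating the DFT magnitude as the "power spectrum" and adding a small epsilon to guard the division are my assumptions, not details quoted from the patent:

```python
import numpy as np

def two_dim_mtf(luma, ref_luma, eps=1e-12):
    """Claim 7: transform each image's luminance with a 2-D DFT,
    then divide element-wise by the preset (reference) image's
    spectrum to obtain a 2-D modulation transfer function.
    `eps` avoids division by zero (an added safeguard)."""
    spec = np.abs(np.fft.fft2(luma))          # captured image spectrum
    ref_spec = np.abs(np.fft.fft2(ref_luma))  # preset image spectrum
    return spec / (ref_spec + eps)
```

When the captured image equals the reference, the DC bin of the resulting MTF is 1, as expected for an unattenuated transfer function.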
8. The method according to claim 1, wherein the calculating the similarity between each of the plurality of images and the preset image includes:
calculating the standard deviation of each image by using the number of pixels of each image, the pixel value of each image and the average value of each image;
calculating the standard deviation of the preset image by using the number of pixels of the preset image, the pixel value of the preset image and the average value of the preset image;
calculating covariance between each image and the preset image by using the number of pixels of each image, the pixel value of each image, the average value of each image, the pixel value of the preset image and the average value of the preset image;
and calculating the similarity between each image and the preset image by using the standard deviation of each image, the standard deviation of the preset image and the covariance between each image and the preset image.
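The statistics enumerated in claim 8 — the standard deviations of the captured and preset images plus their covariance — are the ingredients of structural-similarity-style measures. The sketch below combines them as a correlation coefficient, which is one plausible instantiation rather than the patent's exact formula:

```python
import numpy as np

def similarity(img, ref, eps=1e-12):
    """Claim 8 statistics: per-image standard deviation, reference
    standard deviation, and the covariance between the two, combined
    into a normalized score. The final combination (covariance over
    the product of the standard deviations) is an assumed choice."""
    img = img.astype(float).ravel()
    ref = ref.astype(float).ravel()
    n = img.size                      # number of pixels (claim 8 input)
    mu_i, mu_r = img.mean(), ref.mean()
    sigma_i = np.sqrt(((img - mu_i) ** 2).sum() / n)   # image std dev
    sigma_r = np.sqrt(((ref - mu_r) ** 2).sum() / n)   # reference std dev
    cov = ((img - mu_i) * (ref - mu_r)).sum() / n      # cross-covariance
    return cov / (sigma_i * sigma_r + eps)
```

With this combination an identical pair scores ~1 and an inverted pair scores ~-1; a practical system would likely use the full SSIM formula, which also compares luminance means.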
9. A device imaging sharpness evaluation system, characterized in that the system comprises a first terminal device, a second terminal device and a lamp box, wherein a light source is arranged in the lamp box;
the first terminal device is configured to: controlling the light source to be turned on, adjusting the brightness of the light source to a first brightness, and sending a first shooting instruction to the second terminal device;
the second terminal device is configured to: receiving the first shooting instruction, and shooting a preset image multiple times to obtain a plurality of first images;
the first terminal device is further configured to: adjusting the brightness of the light source to a second brightness, and sending a second shooting instruction to the second terminal device;
the second terminal device is further configured to: receiving the second shooting instruction, and shooting the preset image multiple times to obtain a plurality of second images; and transmitting a plurality of images to the first terminal device, wherein the plurality of images comprise the plurality of first images and the plurality of second images;
the first terminal device is further configured to: receiving the plurality of images, respectively calculating frequency domain parameters of each image in the plurality of images, and determining a frequency domain composite score of the plurality of images based on the frequency domain parameters of each image, wherein the frequency domain parameters comprise at least one of sharpness, sharpening strength and sharpening cut-off frequency, and the frequency domain composite score comprises at least one of sharpness composite score, sharpening strength composite score and sharpening cut-off frequency composite score; respectively calculating the similarity between each image and the preset image in the plurality of images, and determining the similarity comprehensive score of the plurality of images based on the similarity between each image and the preset image; evaluating the imaging sharpness of the second terminal device based on the frequency domain composite score and the similarity composite score;
wherein the first terminal device is specifically configured to, when evaluating the imaging sharpness of the second terminal device based on the frequency domain composite score and the similarity composite score:
determining that the imaging sharpness of the second terminal device meets a requirement in a case where the frequency domain composite score is greater than or equal to a first preset threshold and the similarity composite score is greater than or equal to a second preset threshold; or,
performing weighted summation on the frequency domain composite score and the similarity composite score to obtain an image sharpness composite score of the plurality of images;
and determining that the imaging sharpness of the second terminal device meets the requirement in a case where the image sharpness composite score is greater than or equal to a third preset threshold.
10. The system of claim 9, wherein the plurality of first images are obtained by the second terminal device shooting the preset image a first preset number of times or for a first preset duration.
11. The system of claim 10, wherein the first shooting instruction carries information indicating the first preset number of times or the first preset duration.
12. An apparatus for evaluating imaging sharpness of a device, comprising: a processor coupled to a memory for storing a computer program which, when invoked by the processor, causes the apparatus to perform the method of any one of claims 1 to 8.
13. A computer readable storage medium storing a computer program comprising instructions for implementing the method of any one of claims 1 to 8.
CN202311099700.6A 2023-08-30 2023-08-30 Equipment imaging definition evaluation method, system and device Active CN116843683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311099700.6A CN116843683B (en) 2023-08-30 2023-08-30 Equipment imaging definition evaluation method, system and device


Publications (2)

Publication Number Publication Date
CN116843683A CN116843683A (en) 2023-10-03
CN116843683B true CN116843683B (en) 2024-03-05

Family

ID=88174615


Country Status (1)

Country Link
CN (1) CN116843683B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008301476A (en) * 2007-05-01 2008-12-11 Sharp Corp Image processing apparatus, image forming apparatus, image reading apparatus, image processing system, and image processing method, image processing program and recording medium therefor
CN101441768A (en) * 2008-11-28 2009-05-27 武汉大学 Image quality evaluating method based on structure distortion and image definition
CN102663764A (en) * 2012-04-25 2012-09-12 武汉大学 Image quality evaluation method based on structural distortion and spatial frequency index
CN103067735A (en) * 2011-09-30 2013-04-24 苹果公司 Full field sharpness test
CN105049838A (en) * 2015-07-10 2015-11-11 天津大学 Objective evaluation method for compressing stereoscopic video quality
CN105139404A (en) * 2015-08-31 2015-12-09 广州市幸福网络技术有限公司 Identification camera capable of detecting photographing quality and photographing quality detecting method
CN106575223A (en) * 2014-07-21 2017-04-19 宇龙计算机通信科技(深圳)有限公司 Image classification method and image classification apparatus
CN110889820A (en) * 2018-08-17 2020-03-17 奥普托斯股份有限公司 Image quality assessment
CN112153371A (en) * 2020-08-24 2020-12-29 珠海格力电器股份有限公司 Image quality detection method, device, storage medium and product detection method
CN114066857A (en) * 2021-11-18 2022-02-18 烟台艾睿光电科技有限公司 Infrared image quality evaluation method and device, electronic equipment and readable storage medium
CN115379208A (en) * 2022-10-19 2022-11-22 荣耀终端有限公司 Camera evaluation method and device
CN115546514A (en) * 2022-01-29 2022-12-30 荣耀终端有限公司 Picture noise calculation method and device and picture test system
CN115965889A (en) * 2022-11-24 2023-04-14 中国人民解放军61932部队 Video quality assessment data processing method, device and equipment
CN116055712A (en) * 2022-08-16 2023-05-02 荣耀终端有限公司 Method, device, chip, electronic equipment and medium for determining film forming rate

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9307230B2 (en) * 2012-06-29 2016-04-05 Apple Inc. Line pair based full field sharpness test


Also Published As

Publication number Publication date
CN116843683A (en) 2023-10-03

Similar Documents

Publication Publication Date Title
WO2019105154A1 (en) Image processing method, apparatus and device
US9307221B1 (en) Settings of a digital camera for depth map refinement
KR102566998B1 (en) Apparatus and method for determining image sharpness
CN113992861B (en) Image processing method and image processing device
US10645364B2 (en) Dynamic calibration of multi-camera systems using multiple multi-view image frames
US20190378294A1 (en) Stereo camera and height acquisition method thereof and height acquisition system
CN111083458B (en) Brightness correction method, system, equipment and computer readable storage medium
WO2021136386A1 (en) Data processing method, terminal, and server
WO2019029573A1 (en) Image blurring method, computer-readable storage medium and computer device
KR20160038460A (en) Electronic device and control method of the same
CN112672069B (en) Exposure method and apparatus
CN107832598B (en) Unlocking control method and related product
CN105915785A (en) Double-camera shadedness determining method and device, and terminal
EP4297395A1 (en) Photographing exposure method and apparatus for self-walking device
CN113177886B (en) Image processing method, device, computer equipment and readable storage medium
CN116843683B (en) Equipment imaging definition evaluation method, system and device
CN112504473A (en) Fire detection method, device, equipment and computer readable storage medium
WO2023071933A1 (en) Camera photographing parameter adjustment method and apparatus and electronic device
CN112883944B (en) Living body detection method, model training method, device, storage medium and equipment
CN115550558A (en) Automatic exposure method and device for shooting equipment, electronic equipment and storage medium
JP2023063807A (en) Image processing device, image processing method, program, and recording medium
CN114764771A (en) Image quality evaluation method, device, equipment, chip and storage medium
CN109660863B (en) Visual attention area detection method, device, equipment and computer storage medium
CN114339028A (en) Photographing method, electronic device and computer-readable storage medium
CN117726666B (en) Cross-camera monocular picture measurement depth estimation method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant