CN112529871B - Method and device for evaluating image and computer storage medium - Google Patents

Method and device for evaluating image and computer storage medium

Info

Publication number
CN112529871B
CN112529871B
Authority
CN
China
Prior art keywords
event
target
image
evaluation index
evaluating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011459266.4A
Other languages
Chinese (zh)
Other versions
CN112529871A (en)
Inventor
茅陈庆
杨海舟
梁晨华
刘名扬
楼炯
Current Assignee
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN202011459266.4A priority Critical patent/CN112529871B/en
Publication of CN112529871A publication Critical patent/CN112529871A/en
Application granted granted Critical
Publication of CN112529871B publication Critical patent/CN112529871B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/44Event detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The embodiments of this application disclose a method, an apparatus, and a computer storage medium for evaluating images, belonging to the technical field of image processing. When a target image is evaluated, an evaluation index and an analysis algorithm are adaptively matched to the event elements of the event associated with the image, and the target image is then evaluated using the matched index and algorithm. No human intervention is therefore required in evaluating event-related images, which greatly reduces the human resources needed. Because the evaluation process requires no manual participation, the efficiency of evaluating event-related images is also improved. Moreover, since the evaluation index is matched to the event itself, the objectivity of the quality result of the target image is improved, which in turn improves the accuracy of evaluating event-related images.

Description

Method and device for evaluating image and computer storage medium
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to a method, a device and a computer storage medium for evaluating images.
Background
When a worker performs an in-depth study and analysis of an event, the event often needs to be analyzed on the basis of images associated with it; such an image may be a video or a picture. To improve the accuracy of event analysis, after the images associated with an event are acquired, they must first be evaluated so that low-quality images can be filtered out.
In the related art, images associated with an event are generally evaluated manually. This approach consumes human resources, the evaluation results are not objective enough, and accurate evaluation results are difficult to obtain.
Disclosure of Invention
The embodiment of the application provides a method, a device and a computer storage medium for evaluating images, which can improve the accuracy of event-related image evaluation. The technical scheme is as follows:
in one aspect, a method of evaluating an image is provided, the method comprising:
acquiring a target image to be evaluated;
determining a target evaluation index according to event elements of a target event associated with the target image;
and determining the quality result of the target image according to the analysis result of the target image by an analysis algorithm corresponding to the target evaluation index.
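The three claimed steps can be sketched as a small pipeline. This is a minimal illustration only, not the patent's implementation; the function and parameter names (`evaluate_image`, `index_registry`, `algorithms`) and the use of a simple sum as the quality result are assumptions.

```python
# Minimal sketch of the claimed three-step flow; all names are illustrative.
def evaluate_image(image, event_elements, index_registry, algorithms):
    # Step 1: the target image to be evaluated is given (acquired elsewhere).
    # Step 2: determine target evaluation indexes from the event elements
    # of the target event associated with the image.
    indexes = [idx for elem in event_elements
               for idx in index_registry.get(elem, [])]
    # Step 3: determine the quality result from the analysis results of the
    # algorithms corresponding to the matched indexes (here: a plain sum).
    results = {idx: algorithms[idx](image) for idx in indexes}
    return sum(results.values()), results
```

Here the quality result is reduced to a sum of per-index scores; the patent instead leaves the combination rule to a configurable comprehensive quality evaluation model.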
Optionally, in a case where the event element of the target event includes an event occurrence site, the target evaluation index includes an evaluation index for evaluating whether a shooting area in an image is the event occurrence site; and/or
In the case where the event element of the target event includes an event occurrence period, the target evaluation index includes an evaluation index for evaluating a relationship between a collection time of an image and the event occurrence period; and/or
In the case that the event element of the target event includes an event object of interest, the target evaluation index includes an evaluation index for evaluating whether the event object of interest exists in the image, and/or for evaluating whether the event object of interest has been marked in the image, and/or for evaluating the sharpness of the event object of interest in the image, where the event object of interest refers to any object that needs to be focused on by analyzing the event; and/or
In the case that the event element of the target event includes event audio, the target evaluation index includes an evaluation index for evaluating whether the event audio corresponds to the image and/or for evaluating the sharpness of the event audio corresponding to the image.
Optionally, the method further comprises:
acquiring a target comprehensive quality evaluation model matched with the target image;
the determining the quality result of the target image according to the analysis result of the target image by the analysis algorithm corresponding to the target evaluation index comprises the following steps:
and determining the quality result of the target image based on the analysis results corresponding to the target evaluation indexes and the logic relation of the analysis results configured in the target comprehensive quality evaluation model.
Optionally, the method further comprises:
acquiring a target grading model matched with the target image;
and determining the quality grade of the target image based on the correspondence, configured in the target grading model, between quality results and quality grades.
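A grading model of the kind described can be as simple as a threshold lookup from quality score to quality grade. The thresholds and labels below are invented for illustration; the patent does not fix them.

```python
# Hypothetical grading model: map a quality score to a quality grade via
# configured thresholds (ranges and labels are illustrative assumptions).
def quality_grade(score, thresholds=((90, "excellent"), (60, "qualified"))):
    for lower_bound, label in thresholds:
        if score >= lower_bound:
            return label
    return "unqualified"
```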
In another aspect, there is provided an apparatus for evaluating an image, the apparatus comprising:
the acquisition module is used for acquiring a target image to be evaluated;
the determining module is used for determining a target evaluation index according to event elements of a target event associated with the target image;
the determining module is further configured to determine a quality result of the target image according to an analysis result of the target image by an analysis algorithm corresponding to the target evaluation index.
Optionally, in a case where the event element of the target event includes an event occurrence site, the target evaluation index includes an evaluation index for evaluating whether a shooting area in an image is the event occurrence site; and/or
In the case where the event element of the target event includes an event occurrence period, the target evaluation index includes an evaluation index for evaluating a relationship between a collection time of an image and the event occurrence period; and/or
In the case that the event element of the target event includes an event object of interest, the target evaluation index includes an evaluation index for evaluating whether the event object of interest exists in the image, and/or for evaluating whether the event object of interest has been marked in the image, and/or for evaluating the sharpness of the event object of interest in the image, where the event object of interest refers to any object that needs to be focused on by analyzing the event; and/or
In the case that the event element of the target event includes event audio, the target evaluation index includes an evaluation index for evaluating whether the event audio corresponds to the image and/or for evaluating the sharpness of the event audio corresponding to the image.
Optionally, the acquiring module is further configured to:
acquiring a target comprehensive quality evaluation model matched with the target image;
the determining module is used for:
and determining the quality result of the target image based on the analysis results corresponding to the target evaluation indexes and the logic relation of the analysis results configured in the target comprehensive quality evaluation model.
Optionally, the acquiring module is further configured to acquire a target grading model matched with the target image;
the determining module is further configured to:
and determining the quality grade of the target image based on the correspondence, configured in the target grading model, between quality results and quality grades.
In another aspect, there is provided an apparatus for evaluating an image, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the method of evaluating an image described above.
In another aspect, a computer readable storage medium is provided, characterized in that the computer readable storage medium has stored thereon instructions which, when executed by a processor, implement the steps of the method of evaluating an image described above.
In another aspect, a computer program product is provided comprising instructions which, when run on a computer, cause the computer to perform the method of evaluating an image described above.
The technical solutions provided by the embodiments of this application offer at least the following beneficial effects:
When the target image is evaluated, the target evaluation index can be determined from the event elements of the target event associated with the target image, and the quality result of the target image is then determined from the analysis result produced by the analysis algorithm corresponding to that index. In other words, the evaluation index and the analysis algorithm are adaptively matched to the event elements of the associated event, and the target image is evaluated with the matched index and algorithm. Because no manual participation is needed when evaluating event-related images, the human resources required are greatly reduced and the evaluation efficiency is improved. The objectivity of the quality result of the target image is improved as well, which in turn improves the accuracy of evaluating event-related images.
Drawings
To illustrate the technical solutions of the embodiments more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described here show only some embodiments of the present application; a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an image quality evaluation system according to an embodiment of the present application;
FIG. 2 is a flowchart of a method for evaluating an image according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a matching process according to an embodiment of the present application;
FIG. 4 is a flowchart of another method for evaluating an image provided by an embodiment of the present application;
fig. 5 is a schematic structural diagram of an apparatus for evaluating an image according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an evaluation server according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before explaining the embodiment of the present application in detail, an application scenario of the embodiment of the present application is described.
Currently, workers generally place event-related images under unified storage management and then evaluate their quality. The purpose of this quality evaluation is twofold: to prompt the relevant personnel to improve the quality of the uploaded event-related images, and to lay a foundation for in-depth research and judgment work, such as cross-event correlation analysis, based on those images.
The event-related images may be a video or a number of independent pictures, and may include all images associated with the event, such as images or video acquired at the scene of the event. An event in the embodiments of this application can be any event that a worker needs to analyze, such as a traffic incident, a factory emergency, or the loss of an item.
The method for evaluating the image provided by the embodiment of the application is applied to the scene for evaluating the event-related image. The purpose is to provide a method capable of improving the evaluation efficiency and accuracy of event-related images.
In order to implement the method provided by the embodiment of the application, the embodiment of the application provides an image quality evaluation system. The image quality evaluation system will be explained in detail first.
Fig. 1 is a schematic diagram of an image quality evaluation system 100 according to an embodiment of the present application. As shown in fig. 1, the image quality evaluation system 100 includes an input-output module 101, an evaluation index management module 102, an algorithm management module 103, and a comprehensive calculation module 104.
The input/output module 101 is used for interaction; a worker can upload event-related images through it. In this embodiment of the present application, an event-related image may be an independent image uploaded by a worker, or a video frame image in a video uploaded by a worker. For ease of understanding, fig. 1 takes as an example an event-related image or event-related video being input to the input-output module 101.
Optionally, based on the input-output module 101, the system may also output an evaluation analysis report of the event-related image for further review by the staff.
The evaluation index management module 102 is configured with the individual evaluation indexes. These may include evaluation indexes associated with event elements, image quality indexes, and the like. Event elements are elements associated with the event itself, including the event occurrence site, the event occurrence period, event objects of interest, event audio, and so forth. It should be noted that the foregoing event elements are merely illustrative; the event elements of the embodiments of this application include but are not limited to them, and any element associated with an event falls within their scope.
Based on the configured individual evaluation indicators, the evaluation indicator management module 102, when learning an event associated with a certain image, can determine an evaluation indicator matching the image based on the event element of the event.
The algorithm management module 103 is configured with each analysis algorithm and establishes a correspondence between analysis algorithms and evaluation indexes, so that once an evaluation index has been matched for a given image, the analysis algorithm can be matched in turn. Different evaluation indexes may correspond to the same analysis algorithm or to different ones. The analysis algorithms configured in the algorithm management module 103 may include a video analysis algorithm, an image analysis algorithm, a text analysis algorithm, an audio analysis algorithm, and the like. A video analysis algorithm analyzes videos; an image analysis algorithm analyzes independently acquired images; a text analysis algorithm analyzes text in images or videos; an audio analysis algorithm analyzes audio. It should be noted that the foregoing analysis algorithms are merely illustrative; the analysis algorithms of the embodiments of this application include but are not limited to them.
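The correspondence the module maintains between evaluation indexes and analysis algorithms can be pictured as a lookup table in which several indexes may share one algorithm. All identifiers below are illustrative assumptions, not names from the patent.

```python
# Illustrative index-to-algorithm correspondence; different evaluation
# indexes may map to the same analysis algorithm or to different ones.
ALGORITHM_FOR_INDEX = {
    "scene_matches_site": "image_analysis",
    "object_of_interest_present": "image_analysis",  # shared algorithm
    "marked_text_valid": "text_analysis",
    "event_audio_clarity": "audio_analysis",
}

def algorithm_for(index_name):
    return ALGORITHM_FOR_INDEX[index_name]
```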
In addition, the analysis algorithm configured in the algorithm management module 103 may be various algorithms obtained after training based on a deep learning method or the like. The method may be an algorithm obtained based on other methods, and the method for determining the analysis algorithm in the embodiments of the present application is not limited.
For a certain image, after matching with an analysis algorithm corresponding to a certain evaluation index, the algorithm management module 103 may be further configured to evaluate the evaluation index of the image based on the analysis algorithm, so as to obtain an analysis result of the target image under the evaluation index.
The comprehensive calculation module 104 is configured with different comprehensive quality assessment models and different grading models. The comprehensive quality evaluation model is used for determining the quality result of the target image based on the analysis results of the target image under each evaluation index. The grading model is used for determining a quality grade of the target image based on a quality result of the target image. The target image is any image to be evaluated.
Furthermore, the comprehensive quality evaluation model specifies how quality results are determined from the association logic between the individual evaluation indexes; the embodiments of this application do not limit the specific implementation of this. Likewise, the grading model sets the quality grade corresponding to different scores based on scoring logic, and the specific implementation of that mapping is not limited either. The functions of the comprehensive quality evaluation model and the grading model are described in detail in the method embodiments below and are not repeated here.
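One way to read the "association logic between the evaluation indexes" is as a configurable predicate over the per-index analysis results. The sketch below is only that reading, with assumed names; the patent leaves the concrete logic open.

```python
# Sketch: a comprehensive quality evaluation model as a configurable
# predicate over per-index pass/fail results (e.g. logic=all requires
# every index to pass; logic=any requires at least one to pass).
def combined_quality(index_results, logic=all):
    return "qualified" if logic(index_results.values()) else "unqualified"
```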
It should be noted that, the image quality evaluation system shown in fig. 1 may be disposed in a centralized manner in one terminal, or may be disposed in a centralized manner in one server, or alternatively, each module in the image quality evaluation system may be disposed in a distributed manner on different devices, which is not limited in this embodiment of the present application.
In addition, each module in the image quality evaluation system in fig. 1 is a software module, and the naming of each module is based on the function naming of the software module. When the embodiment of the application is applied, different naming can be performed based on requirements, for example, an input/output module can be named as a first module, an evaluation index management module can be named as a second module, and the like. The embodiments of the present application are not limited to the naming of the above modules.
It should be noted that, in the image quality evaluation system shown in fig. 1, evaluation indexes and analysis algorithms may be added, updated, or deleted. The system therefore makes the evaluation indexes and analysis algorithms manageable, configurable, and extensible; it can automatically score and grade the quality of event-related images and output a comprehensive analysis report with higher objectivity. This addresses the current problems that evaluating and verifying event-related images is time-consuming, labor-intensive, and inefficient, and that quality verification lacks an objective basis and offers no grasp of the overall quality situation.
The method for evaluating an image provided in the embodiments of this application is explained in detail below, based on the image quality evaluation system shown in fig. 1. As noted above, the execution subject of the method is not limited. For convenience, the following embodiments take as an example an image quality evaluation system deployed centrally on a terminal; that is, the system shown in fig. 1 is disposed on the terminal.
Fig. 2 is a flowchart of a method for evaluating an image according to an embodiment of the present application. As shown in fig. 2, the method includes the following steps.
Step 201: the terminal acquires the target image to be evaluated. For example, the terminal acquires the target image to be evaluated in response to a selection operation on an image analysis control on its display interface.
In the embodiment of the application, an image analysis control is displayed on the display interface of the terminal; the control is used to acquire the target image to be evaluated and the target event associated with that image. The user can therefore trigger the terminal to start the evaluation flow for the target image by selecting the image analysis control.
In one possible implementation, the staff may store the event-related images on a cloud server in advance, each stored image corresponding to an associated event. When the terminal detects a selection operation on the image analysis control, it can download a number of images and their associated events from the cloud server, select one of them as the target image in response to a user-triggered selection operation, and obtain the target event associated with that image.
In another possible implementation manner, when the terminal detects a selection operation for the image analysis control, an image input interface may be displayed, so that a user uploads an image to be evaluated through the image input interface and uploads an event associated with the image, thereby enabling the terminal to acquire a target image and a target event associated with the target image.
The image to be evaluated by the user can be uploaded to the terminal in the format of the file to be evaluated. The file to be evaluated may include a plurality of images, and/or a plurality of videos. Because the application scenario in the embodiment of the present application is to evaluate the event-related images, the multiple images and/or multiple videos are associated with the event in advance. For example, a worker may upload a file to be evaluated that includes images and/or videos associated with all traffic accident events that occurred in the last month.
In addition, when the terminal detects the selection operation of the image analysis control, the terminal can also create an evaluation analysis task based on the file to be evaluated uploaded by the staff. The assessment analysis task specifies which images and/or videos associated with which events need to be assessed. The terminal then triggers the evaluation analysis task, and for the images and/or videos associated with any of these events in the evaluation analysis task, the terminal can evaluate the images and/or videos associated with the event by steps 202 to 203. This part of the content is further described in the following steps 202 to 203, and will not be described here again.
The above two implementations are merely two alternative implementations of the terminal acquiring the target image and the target event associated with the target image, and the embodiment of the present application does not limit how the terminal acquires the target image and the target event associated with the target image.
Step 202: and determining a target evaluation index according to the event elements of the target event associated with the target image.
In one possible implementation, as shown in fig. 1, the evaluation index management module is preconfigured with multiple evaluation indexes associated with each event element, where the implementation of step 202 may be: the terminal determines one or more evaluation indexes matched with the target image from a plurality of evaluation indexes according to event elements of a target event associated with the target image to be evaluated, and takes the matched indexes as target evaluation indexes.
An event element of the target event may be an inherent element of the event itself, or an element that needs attention when the event is analyzed. For example, if the occurrence site and occurrence period of the target event have been determined in advance, its event elements may include the event occurrence site and the event occurrence period. If persons appearing in the target event must be attended to, its event elements may include an event object of interest. If sounds occurring in the target event are to be analyzed, its event elements may include event audio.
The event elements that need attention when analyzing the target event may be determined from the event type of the target event. The elements a worker needs to attend to usually differ between event types, so the corresponding elements can be configured in advance per event type; the elements needing attention when analyzing the target event can then be determined from its type. For example, for a resource-transaction event type, the elements that need attention typically include the event occurrence time, the event occurrence site, event video or images, and event audio. For an item-loss event type, they typically include the occurrence time, occurrence site, and video or images, but event audio need not be attended to.
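The per-event-type configuration described above amounts to a mapping from event type to the elements that need attention. The type names and element names below are illustrative assumptions based on the two examples in the text.

```python
# Illustrative configuration: which event elements need attention per
# event type (keys and values assumed from the examples above).
ELEMENTS_BY_EVENT_TYPE = {
    "resource_transaction": ["occurrence_time", "occurrence_site", "visual", "audio"],
    "item_loss": ["occurrence_time", "occurrence_site", "visual"],  # no audio
}

def elements_needing_attention(event_type):
    return ELEMENTS_BY_EVENT_TYPE.get(event_type, [])
```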
After determining the event elements of the target event, determining the target evaluation index matching the target image based on the event elements of the target event may be specifically exemplified as follows.
(1) In the case where the event element of the target event includes an event occurrence site, the target evaluation index includes an evaluation index for evaluating whether the photographed region in the image is the event occurrence site.
For example, for a target event, the event occurrence site for the target event has been defined to be indoor. In this case, the matched target evaluation index may be used to evaluate whether the shooting area in the target image is indoors. For another example, for a target event, the event occurrence site for which the target event has been specified is on the road. In this case, the matched target evaluation index may be used to evaluate whether the photographed region in the target image is on a road.
(2) In the case where the event element of the target event includes an event occurrence period, the target evaluation index includes an evaluation index for evaluating a relationship between the acquisition time of the image and the event occurrence period.
For example, for a target event, it has been clarified that the event occurrence period of the target event is evening. In this case, the matched target evaluation index may be used to evaluate whether the acquisition time of the target image is in the evening.
(3) In the case that the event element of the target event includes an event object of interest, the target evaluation index includes an evaluation index for evaluating whether there is an event object of interest in the image, and/or for evaluating whether there is an event object of interest already marked in the image, and/or for evaluating the sharpness of the event object of interest in the image, the event object of interest refers to any object that needs attention for analyzing the event.
For example, for a target event, it has been clarified that the object of interest in the target event is a vehicle. In this case, the matched target evaluation index may be used to evaluate whether there is a vehicle in the target image, and/or to evaluate whether a worker has marked a vehicle in the target image, and/or to evaluate the sharpness of the vehicle in the target image.
(4) In the case where the event element of the target event includes event audio, the target evaluation index includes an evaluation index for evaluating whether event audio corresponds to the image and/or for evaluating the sharpness of the event audio corresponding to the image.
For example, for a target event, the timbre of an involved person who needs attention in the target event has been specified. In this case, the matched target evaluation index may be used to evaluate whether event audio corresponds to the target image and/or to evaluate the sharpness of the event audio corresponding to the target image.
It should be noted that the above examples are merely for illustrating how the event elements of the target event match the target evaluation index, and the embodiments of the present application are not limited thereto. In applying the embodiments of the present application, how to match the target evaluation index based on the event element of the target event may be customized based on the requirement, which is not illustrated herein.
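As a concrete illustration of how event elements might drive index matching as in cases (1)-(4) above, the mapping can be sketched as a simple lookup. The element keys and index identifiers below are hypothetical, not terms defined by this application:

```python
def match_target_indexes(event_elements: dict) -> list:
    """Map each declared event element to the evaluation indexes it implies."""
    indexes = []
    if "site" in event_elements:                 # e.g. "indoor" or "road"
        indexes.append("site_match")             # is the shot region the declared site?
    if "period" in event_elements:               # e.g. "evening"
        indexes.append("capture_time_match")
    if "object_of_interest" in event_elements:   # e.g. "vehicle"
        indexes += ["object_present", "object_marked", "object_sharpness"]
    if "audio" in event_elements:
        indexes += ["audio_present", "audio_sharpness"]
    return indexes

matched = match_target_indexes({"site": "road", "object_of_interest": "vehicle"})
# matched: ['site_match', 'object_present', 'object_marked', 'object_sharpness']
```

In practice the mapping would be customized based on requirements, as the application notes below.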
Step 203: and determining the quality result of the target image according to the analysis result of the analysis algorithm corresponding to the target evaluation index on the target image.
In one possible implementation, one or more target evaluation indexes are matched in step 202. In this case, the implementation process of step 203 is: determining the scores corresponding to the one or more target evaluation indexes respectively according to the analysis algorithms corresponding to the one or more target evaluation indexes, where a score is the analysis result of the corresponding analysis algorithm on the target image, and determining the total score of the target image according to the scores corresponding to the one or more target evaluation indexes, where the quality result includes the total score.
The above-mentioned determining the scores corresponding to the one or more target evaluation indexes respectively according to the analysis algorithm corresponding to the one or more target evaluation indexes respectively specifically refers to: and each target evaluation index is provided with a corresponding analysis algorithm, and for any target evaluation index, the target image is analyzed based on the analysis algorithm corresponding to the target evaluation index, so that the score of the target image under the target evaluation index can be obtained.
For example, in step 202, 3 target evaluation indexes are matched, and are respectively marked as target evaluation index 1, target evaluation index 2, and target evaluation index 3. In this way, after analysis based on the respective corresponding analysis algorithms, the score corresponding to the target evaluation index 1, the score corresponding to the target evaluation index 2, and the score corresponding to the target evaluation index 3 are obtained.
In addition, the analysis algorithm corresponding to each target evaluation index may be preconfigured, that is, a correspondence between the evaluation index and the analysis algorithm is established in the image quality evaluation system in advance, so that when the target evaluation index is matched in step 202, the analysis algorithm corresponding to each target evaluation index may be obtained based on the correspondence. And will not be described in detail herein.
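A minimal sketch of this pre-configured correspondence between evaluation indexes and analysis algorithms, with stand-in callables in place of real analysis algorithms (all names and score values here are illustrative assumptions):

```python
# Pre-configured correspondence: evaluation index -> analysis algorithm.
# Real algorithms would be scene classifiers, detectors, etc.; these
# placeholders just return fixed scores for demonstration.
ALGORITHMS = {
    "site_match":     lambda image: 80.0,   # placeholder: scene classifier score
    "object_present": lambda image: 95.0,   # placeholder: detector confidence score
}

def score_indexes(image, target_indexes):
    """Run the pre-configured algorithm for each matched index (step 203)."""
    return {idx: ALGORITHMS[idx](image) for idx in target_indexes}

scores = score_indexes("target.jpg", ["site_match", "object_present"])
```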
The same evaluation index may have different importance in different events. Therefore, in the embodiment of the present application, the image quality evaluation system shown in fig. 1 may be further configured with a plurality of comprehensive quality evaluation models, where a logical relationship between each analysis result is indicated in any comprehensive quality evaluation model, so as to obtain a final quality result based on each analysis result and the logical relationship.
In one possible implementation, the comprehensive quality evaluation model may be a comprehensive score calculation model, where weights for different evaluation indexes are configured in any one of the multiple comprehensive score calculation models. In this case, after matching the target evaluation indexes in step 202, the terminal may also acquire, from the multiple comprehensive score calculation models, a target comprehensive score calculation model matching the target image. In this scenario, the total score of the target image may be determined based on the weights of the respective target evaluation indexes configured in the target comprehensive score calculation model and the scores corresponding to the one or more target evaluation indexes.
Wherein, based on the weight of each target evaluation index and the scores corresponding to one or more target evaluation indexes, determining the total score of the target image may refer to: and carrying out weighted summation or weighted product on the scores corresponding to the target evaluation indexes to obtain the total score.
The weights for different evaluation indexes configured in any of the above comprehensive score calculation models may be configured by a worker based on requirements, which is not limited in the embodiment of the present application.
Further, after the target evaluation indexes are matched in step 202, a specific implementation in which the terminal acquires the target comprehensive score calculation model matching the target image from the multiple comprehensive score calculation models may be: acquiring, from the multiple comprehensive score calculation models, the comprehensive score calculation model whose configured evaluation indexes are consistent with the one or more matched target evaluation indexes, thereby obtaining the target comprehensive score calculation model.
As can be seen, in the embodiment of the present application, after the target evaluation indexes are matched in step 202, the weights of the target evaluation indexes may also be matched. That is, in the embodiment of the present application, the weight of any evaluation index is not fixed, but is adaptively matched to a different value depending on the event associated with the image, thereby improving the accuracy of the subsequent evaluation of the image.
That is, in the embodiment of the present application, after the target image is input into the image quality evaluation system, the system may adaptively match not only the target evaluation indexes and the corresponding analysis algorithms, but also the comprehensive quality evaluation model based on the above implementation manner. In other words, which comprehensive quality evaluation model is used for determining the aforementioned quality result is determined based on the event elements of the target event.
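Selecting the comprehensive score calculation model whose configured indexes coincide with the matched target indexes, then computing the weighted sum of the per-index scores, can be sketched as follows (the model contents, index names, and weights are invented for illustration, not values from this application):

```python
# Pre-configured comprehensive score calculation models: each records the
# index set it covers and a weight per index (illustrative values only).
MODELS = [
    {"indexes": {"site_match", "object_present"},
     "weights": {"site_match": 0.4, "object_present": 0.6}},
    {"indexes": {"audio_present", "audio_sharpness"},
     "weights": {"audio_present": 0.5, "audio_sharpness": 0.5}},
]

def pick_model(target_indexes):
    """Return the model whose configured indexes equal the matched index set."""
    wanted = set(target_indexes)
    for model in MODELS:
        if model["indexes"] == wanted:
            return model
    raise LookupError("no comprehensive score calculation model for these indexes")

def total_score(scores: dict) -> float:
    """Weighted summation of per-index scores under the matched model."""
    model = pick_model(scores.keys())
    return sum(model["weights"][idx] * s for idx, s in scores.items())

total = total_score({"site_match": 80.0, "object_present": 95.0})
# 0.4 * 80 + 0.6 * 95 = 89.0
```

The weighted-product variant mentioned above would multiply weighted terms instead of summing them.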
Alternatively, instead of configuring multiple comprehensive quality evaluation models in advance, after the analysis results under the respective target evaluation indexes are obtained, the analysis results corresponding to the respective target evaluation indexes may be processed directly according to the same rule to obtain the quality result. For example, instead of configuring multiple comprehensive score calculation models in advance, after the scores under the respective target evaluation indexes are obtained, the scores corresponding to the respective target evaluation indexes may be directly summed or accumulated to obtain the total score. This will not be described in detail herein.
In addition, different quality gear division modes may be adopted for different events. Therefore, in the system shown in fig. 1, multiple grading models may be further configured, and a correspondence between quality results and quality gears is configured in any one of the multiple grading models. In this case, after the above-described target comprehensive quality evaluation model is matched, a target grading model matching the target image may also be acquired. In this way, after the quality result is obtained based on the target comprehensive quality evaluation model, the quality gear of the target image may also be determined based on the correspondence between quality results and quality gears configured in the target grading model. The evaluation of the target image in this case further includes evaluating the quality gear of the target image.
That is, in the embodiment of the present application, after the quality result is determined, the quality gear of the target image may also be determined according to the quality result of the target image. The quality gear of the target image is used for indicating the grade of the quality of the target image so as to be convenient for staff to roughly know the quality of the target image.
For example, the quality result of the target image is expressed in the form of the total score described above. In this case, n quality gears Q1, Q2, …, Qn are set in advance in the target grading model based on different score intervals, where the quality represented by Q1, Q2, …, Qn increases gradually. Thus, a worker can know the approximate grade of the quality of the target image based on its quality gear.
It should be noted that the embodiment of the present application does not limit how the score intervals corresponding to the respective quality gears of an image are set; when the embodiment of the present application is applied, they may be set based on different requirements.
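For instance, a grading model that maps score intervals to gears Q1…Qn could be expressed as an ordered list of lower bounds; the interval boundaries and the choice of three gears below are example values only, not requirements of this application:

```python
# Example grading model: (inclusive lower bound of score interval, gear label),
# ordered from the highest-quality gear downward. Q3 > Q2 > Q1 in quality.
GRADING_MODEL = [
    (90, "Q3"),
    (60, "Q2"),
    (0,  "Q1"),
]

def quality_gear(total: float) -> str:
    """Look up the quality gear whose score interval contains the total score."""
    for lower_bound, gear in GRADING_MODEL:
        if total >= lower_bound:
            return gear
    return GRADING_MODEL[-1][1]  # fall back to the lowest gear

gear = quality_gear(89.0)
# gear: "Q2"
```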
From this, after the target evaluation indexes are matched based on step 202, the weights of the target evaluation indexes and the quality gear division mode may also be matched based on the target event. That is, in the embodiment of the present application, the quality gear division manner is not fixed for all types of events, but differs based on the type of event, thereby improving the accuracy of the subsequent evaluation of the image.
That is, in the embodiment of the present application, after the target image is input into the image quality evaluation system, the system may match not only the target evaluation indexes and the corresponding analysis algorithms, but also the corresponding comprehensive quality evaluation model and the corresponding grading model based on the above implementation manner. In other words, which quality gear division method is used to determine the quality gear of the target image is determined according to the event elements of the target event. The above-described process may be represented by the matching process shown in fig. 3, and will not be described in detail herein.
Alternatively, instead of configuring a plurality of grading models in advance, a unified grading model may be directly used to determine the quality grade of the target image after obtaining the scores under the respective target indexes. And will not be described in detail herein.
Based on the system shown in fig. 1, the above implementations may be realized by a comprehensive calculation module, which may implement step 203 using any of the implementations described above.
Step 204: and displaying the quality result of the target image.
After the quality result of the target image is obtained through steps 201 to 203, the quality result may be displayed so that the user can intuitively feel the analysis result of the target image.
Further, the target image in the above steps 201 to 203 may be a single image. Optionally, when a video associated with a certain event needs to be evaluated, the evaluation can likewise be completed through steps 201 to 203.
Specifically, the video to be evaluated is acquired, where the video includes multiple frames of images; each frame of the multiple frames is taken as a target image, steps 201 to 203 are performed for it, and an evaluation report of the video is determined according to the quality result of each frame of the multiple frames.
For example, when the analysis result of an image is represented by a score and the quality result of an image is represented by a total score, the evaluation report may include the total number of images in the video, the total score of each image in the video, and the distribution of the video's images over different score intervals. The distribution over different score intervals may include the number of images in each score interval, the proportion of images in each score interval, and the like.
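Aggregating per-frame total scores into such an evaluation report — total image count, per-image scores, and the count and proportion per score interval — can be sketched as follows (the 20-point interval width is an assumption for illustration):

```python
from collections import Counter

def build_report(frame_scores, interval=20):
    """Summarize per-frame total scores into a video evaluation report."""
    n = len(frame_scores)
    # Bucket each score by its interval's lower bound, e.g. 89.0 -> 80.
    buckets = Counter(int(s) // interval * interval for s in frame_scores)
    return {
        "total_images": n,
        "scores": list(frame_scores),
        "distribution": {
            f"{lo}-{lo + interval}": {"count": c, "ratio": c / n}
            for lo, c in sorted(buckets.items())
        },
    }

report = build_report([89.0, 45.5, 92.0, 88.0])
# report["distribution"]: {"40-60": {"count": 1, "ratio": 0.25},
#                          "80-100": {"count": 3, "ratio": 0.75}}
```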
In the following, the embodiment of the present application will be further described with reference to the flowchart shown in fig. 4. As shown in fig. 4, a worker uploads a video file to be evaluated to the image quality evaluation system shown in fig. 1, and the image quality evaluation system creates a video analysis evaluation task, automatically matches evaluation indexes according to the video analysis evaluation task (the evaluation indexes may include a case-related element index (I1), a case-related video image quality index (I2), a case-related target feature index (I3), a case-related video audio index (I4), etc.), matches the corresponding analysis algorithms (an image analysis algorithm (A1), a text analysis algorithm (A2), an audio analysis algorithm (A3), etc.), and matches the comprehensive score calculation model and the grading model (Cn) of the video file. After the matching is completed, the image quality evaluation system analyzes each image in the video, invokes the various analysis algorithms to start analysis and calculate scores, obtains the scores (S1-Sn) for the case-related element index (I1), the case-related video image quality index (I2), the case-related target feature index (I3), the case-related video audio index (I4), and so on, and invokes the previously loaded comprehensive score calculation model and grading model to calculate and judge, respectively, the overall score (Stotal) and the quality gear Qn of the video image quality. Finally, an evaluation report of the case-related video of the event is output to the worker. The evaluation report content includes the total number of video images input to the model, the total score of each video image, the number and proportion of video images under each score, and the like. By reading the evaluation report, the worker can intuitively see the overall situation of the case-related video image quality, providing a basis for the worker's assessment.
In summary, when evaluating the target image, the evaluation indexes and analysis algorithms can be adaptively matched based on the event elements of the target event associated with the target image, so that the target image is evaluated based on the matched evaluation indexes and analysis algorithms. The method provided by the embodiment of the application requires no manual participation in the process of evaluating event-related images, which greatly reduces the human resources required for the evaluation. In addition, because no manual participation is required, the efficiency of evaluating event-related images can be improved, the objectivity of the quality result of the target image can be improved, and thus the accuracy of evaluating event-related images is improved.
All the above optional technical solutions may be combined in any manner to form optional embodiments of the present application, which are not described in detail herein.
Fig. 5 is a schematic structural diagram of an apparatus for evaluating an image according to an embodiment of the present application, and as shown in fig. 5, an apparatus 500 for evaluating an image may include the following modules.
An acquiring module 501, configured to acquire a target image to be evaluated; for example, in response to a selection operation for an image analysis control on a display interface, acquiring a target image to be evaluated and a target event associated with the target image;
A determining module 502, configured to determine a target evaluation index according to an event element of a target event associated with the target image;
the determining module 502 is further configured to determine a quality result of the target image according to an analysis result of the target image by an analysis algorithm corresponding to the target evaluation index.
Optionally, in a case where the event element of the target event includes an event occurrence site, the target evaluation index includes an evaluation index for evaluating whether a shooting area in the image is the event occurrence site; and/or
In the case where the event element of the target event includes an event occurrence period, the target evaluation index includes an evaluation index for evaluating a relationship between the acquisition time of the image and the event occurrence period; and/or
In the case where the event element of the target event includes an event object of interest, the target evaluation index includes an evaluation index for evaluating whether the event object of interest exists in the image, and/or for evaluating whether the event object of interest has already been marked in the image, and/or for evaluating the sharpness of the event object of interest in the image, where the event object of interest refers to any object that needs to be focused on when analyzing the event; and/or
In the case where the event element of the target event includes event audio, the target evaluation index includes an evaluation index for evaluating whether or not event audio corresponds to the image and/or for evaluating sharpness of the event audio corresponding to the image.
Optionally, the obtaining module is further configured to:
acquiring a target comprehensive quality evaluation model matched with a target image;
the determining module is used for:
and determining the quality result of the target image based on the analysis results corresponding to the target evaluation indexes and the logic relation of the analysis results configured in the target comprehensive quality evaluation model.
Optionally, the obtaining module is further configured to obtain a target grading model matched with the target image;
the determination module is also for:
and determining the quality gear of the target image based on the corresponding relation between the quality result configured in the target gear model and the quality gear.
In summary, in the embodiment of the present application, when the target image is evaluated, the evaluation indexes and analysis algorithms may be adaptively matched based on the event elements of the target event associated with the target image, so that the target image is evaluated based on the matched evaluation indexes and analysis algorithms. The method provided by the embodiment of the application requires no manual participation in the process of evaluating event-related images, which greatly reduces the human resources required for the evaluation. In addition, because no manual participation is required, the efficiency of evaluating event-related images can be improved, the objectivity of the quality result of the target image can be improved, and thus the accuracy of evaluating event-related images is improved.
It should be noted that the division of the functional modules in the image evaluation apparatus provided in the above embodiment is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to perform all or part of the functions described above. In addition, the apparatus for evaluating an image provided in the above embodiment and the method embodiment for evaluating an image belong to the same concept; the specific implementation process of the apparatus is detailed in the method embodiment and is not repeated herein.
Fig. 6 is a block diagram of a terminal 600 according to an embodiment of the present application. Any of the foregoing image quality evaluation systems may be implemented by the terminal shown in fig. 6. The terminal 600 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 600 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
In general, the terminal 600 includes: a processor 601 and a memory 602.
The processor 601 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 601 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 601 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 601 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 602 may include one or more computer-readable storage media, which may be non-transitory. The memory 602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 602 is used to store at least one instruction for execution by processor 601 to implement the method of evaluating an image provided by the method embodiments herein.
In some embodiments, the terminal 600 may further optionally include: a peripheral interface 603, and at least one peripheral. The processor 601, memory 602, and peripheral interface 603 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 603 via buses, signal lines or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 604, a display 605, a camera assembly 606, audio circuitry 607, a positioning assembly 608, and a power supply 609.
Peripheral interface 603 may be used to connect at least one Input/Output (I/O) related peripheral to processor 601 and memory 602. In some embodiments, the processor 601, memory 602, and peripheral interface 603 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 601, memory 602, and peripheral interface 603 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 604 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 604 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 604 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 604 includes: antenna systems, RF transceivers, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and so forth. The radio frequency circuit 604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity ) networks. In some embodiments, the radio frequency circuitry 604 may also include NFC (Near Field Communication, short range wireless communication) related circuitry, which is not limited in this application.
The display screen 605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 605 is a touch display, the display 605 also has the ability to collect touch signals at or above the surface of the display 605. The touch signal may be input as a control signal to the processor 601 for processing. At this point, the display 605 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 605 may be one, providing a front panel of the terminal 600; in other embodiments, the display 605 may be at least two, respectively disposed on different surfaces of the terminal 600 or in a folded design; in other embodiments, the display 605 may be a flexible display, disposed on a curved surface or a folded surface of the terminal 600. Even more, the display 605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display 605 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 606 is used to capture images or video. Optionally, the camera assembly 606 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so as to realize a background blurring function by fusing the main camera and the depth camera, panoramic shooting and Virtual Reality (VR) shooting functions by fusing the main camera and the wide-angle camera, or other fusion shooting functions. In some embodiments, the camera assembly 606 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 607 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 601 for processing, or inputting the electric signals to the radio frequency circuit 604 for voice communication. For the purpose of stereo acquisition or noise reduction, a plurality of microphones may be respectively disposed at different portions of the terminal 600. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 601 or the radio frequency circuit 604 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 607 may also include a headphone jack.
The positioning component 608 is used to locate the current geographic location of the terminal 600 to enable navigation or LBS (Location Based Service). The positioning component 608 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
A power supply 609 is used to power the various components in the terminal 600. The power source 609 may be alternating current, direct current, disposable battery or rechargeable battery. When the power source 609 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 600 further includes one or more sensors 610. The one or more sensors 610 include, but are not limited to: acceleration sensor 611, gyroscope sensor 612, pressure sensor 613, fingerprint sensor 614, optical sensor 615, and proximity sensor 616.
The acceleration sensor 611 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 600. For example, the acceleration sensor 611 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 601 may control the display screen 605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 611. The acceleration sensor 611 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 612 may detect a body direction and a rotation angle of the terminal 600, and the gyro sensor 612 may collect a 3D motion of the user on the terminal 600 in cooperation with the acceleration sensor 611. The processor 601 may implement the following functions based on the data collected by the gyro sensor 612: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 613 may be disposed at a side frame of the terminal 600 and/or at a lower layer of the display 605. When the pressure sensor 613 is disposed at a side frame of the terminal 600, a grip signal of the terminal 600 by a user may be detected, and a left-right hand recognition or a shortcut operation may be performed by the processor 601 according to the grip signal collected by the pressure sensor 613. When the pressure sensor 613 is disposed at the lower layer of the display screen 605, the processor 601 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 605. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 614 is used for collecting the fingerprint of the user, and the processor 601 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 614, or the fingerprint sensor 614 identifies the identity of the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 601 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for and changing settings, etc. The fingerprint sensor 614 may be provided on the front, back, or side of the terminal 600. When a physical key or vendor Logo is provided on the terminal 600, the fingerprint sensor 614 may be integrated with the physical key or vendor Logo.
The optical sensor 615 is used to collect ambient light intensity. In one embodiment, the processor 601 may control the display brightness of the display screen 605 according to the ambient light intensity collected by the optical sensor 615: when the ambient light intensity is high, the display brightness of the display screen 605 is increased; when the ambient light intensity is low, the display brightness of the display screen 605 is decreased. In another embodiment, the processor 601 may also dynamically adjust the shooting parameters of the camera assembly 606 according to the ambient light intensity collected by the optical sensor 615.
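The brightness adjustment described above can be sketched as a simple mapping from ambient light intensity to display brightness. The lux clamping points and the linear interpolation are illustrative assumptions; the embodiment only states that brightness rises with ambient light:

```python
def adjust_display_brightness(ambient_lux, min_lux=10.0, max_lux=1000.0,
                              min_brightness=0.1, max_brightness=1.0):
    """Map ambient light intensity (lux) to a display brightness level.

    Below `min_lux` the dimmest level is used; above `max_lux` the
    brightest; in between, brightness is interpolated linearly.
    """
    if ambient_lux <= min_lux:
        return min_brightness
    if ambient_lux >= max_lux:
        return max_brightness
    ratio = (ambient_lux - min_lux) / (max_lux - min_lux)
    return min_brightness + ratio * (max_brightness - min_brightness)
```

For example, at 505 lux the ratio is 0.5, giving a mid-range brightness of 0.55.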
The proximity sensor 616, also referred to as a distance sensor, is typically provided on the front panel of the terminal 600. The proximity sensor 616 is used to collect the distance between the user and the front of the terminal 600. In one embodiment, when the proximity sensor 616 detects that the distance between the user and the front of the terminal 600 gradually decreases, the processor 601 controls the display screen 605 to switch from the screen-on state to the screen-off state; when the proximity sensor 616 detects that the distance gradually increases, the processor 601 controls the display screen 605 to switch from the screen-off state to the screen-on state.
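The screen-state switching described above can be sketched as a small controller. The two hysteresis thresholds are an assumption added so the screen does not flicker around a single boundary; the embodiment itself only describes switching on decreasing/increasing distance:

```python
class ProximityScreenController:
    """Toggle the screen state from successive proximity-sensor distances.

    When the distance falls below `near_cm` (e.g. the face approaches
    during a call), the screen switches off; when it rises above
    `far_cm`, it switches back on. Separate thresholds give hysteresis.
    """

    def __init__(self, near_cm=3.0, far_cm=5.0):
        self.near_cm, self.far_cm = near_cm, far_cm
        self.screen_on = True

    def update(self, distance_cm):
        if self.screen_on and distance_cm < self.near_cm:
            self.screen_on = False  # user is close: turn the screen off
        elif not self.screen_on and distance_cm > self.far_cm:
            self.screen_on = True   # user moved away: turn it back on
        return self.screen_on
```

A reading between the two thresholds leaves the current state unchanged, which is what prevents rapid on/off toggling.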
Those skilled in the art will appreciate that the structure shown in fig. 6 does not constitute a limitation of the terminal 600, which may include more or fewer components than shown, combine certain components, or employ a different arrangement of components.
The present embodiments also provide a non-transitory computer-readable storage medium storing instructions that, when executed by a processor of a terminal, enable the terminal to perform the method of evaluating an image provided in the above embodiments.
The present embodiments also provide a computer program product containing instructions that, when run on a terminal, cause the terminal to perform the method of evaluating an image provided by the above embodiments.
Fig. 7 is a schematic diagram of a server structure according to an embodiment of the present application. Any of the foregoing image quality evaluation systems may be implemented by the server shown in fig. 7, which may be a server in a backend server cluster.
The server 700 includes a Central Processing Unit (CPU) 701, a system memory 704 including a Random Access Memory (RAM) 702 and a Read Only Memory (ROM) 703, and a system bus 705 connecting the system memory 704 and the central processing unit 701. The server 700 also includes a basic input/output system (I/O system) 706, which helps to transfer information between various devices within the computer, and a mass storage device 707 for storing an operating system 713, application programs 714, and other program modules 715.
The basic input/output system 706 includes a display 708 for displaying information and an input device 709, such as a mouse or keyboard, for a user to input information. Both the display 708 and the input device 709 are connected to the central processing unit 701 through an input/output controller 710 coupled to the system bus 705. The basic input/output system 706 may also include the input/output controller 710 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input/output controller 710 also provides output to a display screen, a printer, or other types of output devices.
The mass storage device 707 is connected to the central processing unit 701 through a mass storage controller (not shown) connected to the system bus 705. The mass storage device 707 and its associated computer readable media provide non-volatile storage for the server 700. That is, the mass storage device 707 may include a computer readable medium (not shown) such as a hard disk or CD-ROM drive.
Without loss of generality, computer-readable media may include computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will recognize that computer storage media are not limited to those described above. The system memory 704 and the mass storage device 707 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 700 may also be operated as a remote computer connected to a network such as the Internet. That is, the server 700 may be connected to the network 712 through the network interface unit 711 coupled to the system bus 705, or the network interface unit 711 may be used to connect to other types of networks or remote computer systems (not shown).
The memory also includes one or more programs, which are stored in the memory and configured to be executed by the CPU. The one or more programs include instructions for performing the method of evaluating an image provided by the embodiments of the present application.
The present embodiment also provides a non-transitory computer-readable storage medium storing instructions that, when executed by a processor of a server, enable the server to perform the method of evaluating an image provided in the above embodiments.
The present embodiments also provide a computer program product containing instructions that, when run on a server, cause the server to perform the method of evaluating an image provided by the above embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk.
The foregoing description of the preferred embodiments is merely illustrative of the present application and is not intended to limit the embodiments of the present application, and any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the embodiments of the present application are intended to be included within the scope of the present application.

Claims (10)

1. A method of evaluating an image, the method comprising:
acquiring a target image to be evaluated, wherein the target image is one of a plurality of images associated with a target event;
determining a target evaluation index corresponding to each event element in at least one event element according to at least one event element of the target event;
and determining the quality result of the target image according to the analysis result of the target image by an analysis algorithm corresponding to the target evaluation index.
2. The method of claim 1, wherein,
in the case that the event element of the target event includes an event occurrence site, the target evaluation index includes an evaluation index for evaluating whether a shooting area in an image is the event occurrence site; and/or
in the case where the event element of the target event includes an event occurrence period, the target evaluation index includes an evaluation index for evaluating the relationship between the collection time of an image and the event occurrence period; and/or
in the case that the event element of the target event includes an event object of interest, the target evaluation index includes an evaluation index for evaluating whether the event object of interest exists in the image, and/or for evaluating whether the event object of interest has been marked in the image, and/or for evaluating the sharpness of the event object of interest in the image, where the event object of interest refers to any object that needs attention when the event is analyzed; and/or
in the case that the event element of the target event includes event audio, the target evaluation index includes an evaluation index for evaluating whether the event audio corresponds to the image and/or for evaluating the clarity of the event audio corresponding to the image.
3. The method of claim 1, wherein the method further comprises:
acquiring a target comprehensive quality evaluation model matched with the target image;
the determining the quality result of the target image according to the analysis result of the target image by the analysis algorithm corresponding to the target evaluation index comprises the following steps:
and determining the quality result of the target image based on the analysis results corresponding to the target evaluation indexes and the logic relation of the analysis results configured in the target comprehensive quality evaluation model.
4. The method of claim 1, wherein the method further comprises:
acquiring a target grading model matched with the target image; and
determining the quality grade of the target image based on the correspondence between quality results and quality grades configured in the target grading model.
5. An apparatus for evaluating an image, the apparatus comprising:
the acquisition module is used for acquiring a target image to be evaluated, wherein the target image is one of a plurality of images associated with a target event;
the determining module is used for determining a target evaluation index corresponding to each event element in at least one event element according to at least one event element of the target event;
the determining module is further configured to determine a quality result of the target image according to an analysis result of the target image by an analysis algorithm corresponding to the target evaluation index.
6. The apparatus of claim 5, wherein,
in the case that the event element of the target event includes an event occurrence site, the target evaluation index includes an evaluation index for evaluating whether a shooting area in an image is the event occurrence site; and/or
in the case where the event element of the target event includes an event occurrence period, the target evaluation index includes an evaluation index for evaluating the relationship between the collection time of an image and the event occurrence period; and/or
in the case that the event element of the target event includes an event object of interest, the target evaluation index includes an evaluation index for evaluating whether the event object of interest exists in the image, and/or for evaluating whether the event object of interest has been marked in the image, and/or for evaluating the sharpness of the event object of interest in the image, where the event object of interest refers to any object that needs attention when the event is analyzed; and/or
in the case that the event element of the target event includes event audio, the target evaluation index includes an evaluation index for evaluating whether the event audio corresponds to the image and/or for evaluating the clarity of the event audio corresponding to the image.
7. The apparatus of claim 5, wherein the acquisition module is further configured to:
acquire a target comprehensive quality evaluation model matched with the target image;
and the determining module is configured to:
determine the quality result of the target image based on the analysis results corresponding to the target evaluation indexes and the logical relation of the analysis results configured in the target comprehensive quality evaluation model.
8. The apparatus of claim 5, wherein,
the acquisition module is further configured to acquire a target grading model matched with the target image;
and the determining module is further configured to:
determine the quality grade of the target image based on the correspondence between quality results and quality grades configured in the target grading model.
9. An apparatus for evaluating an image, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the method of any of the preceding claims 1 to 4.
10. A computer readable storage medium having stored thereon instructions which, when executed by a processor, implement the steps of the method of any of the preceding claims 1 to 4.
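The method of claims 1 to 3 can be illustrated with a minimal sketch. The event-element names, the evaluation-index names, and the use of a simple conjunction as the configured logical relation are all hypothetical; the claims do not prescribe any concrete implementation:

```python
# Hypothetical mapping from event elements (claim 2 lists the element
# types) to evaluation indexes; the index names are illustrative only.
ELEMENT_TO_INDEX = {
    "occurrence_site": "is_shot_at_site",
    "occurrence_period": "capture_time_in_period",
    "object_of_interest": "object_present_and_sharp",
    "event_audio": "audio_matches_and_clear",
}


def evaluate_image(target_image, event_elements, analyzers):
    """Evaluate one image associated with a target event.

    `analyzers` maps each evaluation index to an analysis function that
    takes the image and returns True/False. Here the quality result is
    the conjunction of the per-index results, standing in for the
    logical relation configured in the comprehensive quality evaluation
    model of claim 3.
    """
    indexes = [ELEMENT_TO_INDEX[e] for e in event_elements]      # step 2
    results = {i: analyzers[i](target_image) for i in indexes}   # step 3
    return all(results.values()), results
```

For example, an image shot at the event site but with unusable audio would yield an overall result of `False` together with the per-index breakdown.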
CN202011459266.4A 2020-12-11 2020-12-11 Method and device for evaluating image and computer storage medium Active CN112529871B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011459266.4A CN112529871B (en) 2020-12-11 2020-12-11 Method and device for evaluating image and computer storage medium


Publications (2)

Publication Number Publication Date
CN112529871A CN112529871A (en) 2021-03-19
CN112529871B CN112529871B (en) 2024-02-23

Family

ID=74999182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011459266.4A Active CN112529871B (en) 2020-12-11 2020-12-11 Method and device for evaluating image and computer storage medium

Country Status (1)

Country Link
CN (1) CN112529871B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113923444B (en) * 2021-10-08 2024-04-30 广州辰达精密光电科技有限公司 Zoom lens quality evaluation method and device
CN117370602A (en) * 2023-04-24 2024-01-09 深圳云视智景科技有限公司 Video processing method, device, equipment and computer storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103098088A (en) * 2011-07-13 2013-05-08 松下电器产业株式会社 Image evaluation device, image evaluation method, program, and integrated circuit
CN110246110A (en) * 2018-03-01 2019-09-17 腾讯科技(深圳)有限公司 Image evaluation method, device and storage medium
CN111475613A (en) * 2020-03-06 2020-07-31 深圳壹账通智能科技有限公司 Case classification method and device, computer equipment and storage medium
CN111612657A (en) * 2020-05-22 2020-09-01 创新奇智(重庆)科技有限公司 Client type identification method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108090210A (en) * 2017-12-29 2018-05-29 广州酷狗计算机科技有限公司 The method and apparatus for searching for audio


Also Published As

Publication number Publication date
CN112529871A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN110795236B (en) Method, device, electronic equipment and medium for adjusting capacity of server
CN110278464B (en) Method and device for displaying list
CN113204298B (en) Method and device for displaying release progress, electronic equipment and storage medium
CN111083516B (en) Live broadcast processing method and device
CN111327953B (en) Live broadcast voting method and device and storage medium
CN110147503B (en) Information issuing method and device, computer equipment and storage medium
CN111104980B (en) Method, device, equipment and storage medium for determining classification result
CN111062824B (en) Group member processing method, device, computer equipment and storage medium
CN111078521A (en) Abnormal event analysis method, device, equipment, system and storage medium
CN111338910A (en) Log data processing method, log data display method, log data processing device, log data display device, log data processing equipment and log data storage medium
CN112529871B (en) Method and device for evaluating image and computer storage medium
CN110929159B (en) Resource release method, device, equipment and medium
CN111857793B (en) Training method, device, equipment and storage medium of network model
CN111796990A (en) Resource display method, device, terminal and storage medium
CN113506086A (en) Task issuing method and device, computer equipment and medium
CN111064657B (en) Method, device and system for grouping concerned accounts
CN112100528A (en) Method, device, equipment and medium for training search result scoring model
CN114827651B (en) Information processing method, information processing device, electronic equipment and storage medium
CN114143280B (en) Session display method and device, electronic equipment and storage medium
CN114153963A (en) Document recommendation method and device, computer equipment and medium
CN112237743B (en) User data statistics method, device, computer equipment and storage medium
CN112990424B (en) Neural network model training method and device
CN109344284B (en) Song file playing method, device, equipment and storage medium
CN114238859A (en) Data processing system, method, electronic device, and storage medium
CN113408809A (en) Automobile design scheme evaluation method and device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant