CN112581001A - Device evaluation method and device, electronic device and readable storage medium - Google Patents


Info

Publication number
CN112581001A
CN112581001A (application CN202011556717.6A)
Authority
CN
China
Prior art keywords
marked
equipment
sample
evaluated
quality score
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011556717.6A
Other languages
Chinese (zh)
Other versions
CN112581001B (en)
Inventor
袁霖
田野
何世伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Anyixun Technology Co ltd
Original Assignee
Chengdu Anyixun Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Anyixun Technology Co ltd filed Critical Chengdu Anyixun Technology Co ltd
Priority to CN202011556717.6A priority Critical patent/CN112581001B/en
Publication of CN112581001A publication Critical patent/CN112581001A/en
Application granted granted Critical
Publication of CN112581001B publication Critical patent/CN112581001B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395Quality analysis or management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Quality & Reliability (AREA)
  • General Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Game Theory and Decision Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Tourism & Hospitality (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The application provides a device evaluation method and apparatus, an electronic device, and a readable storage medium. The evaluation method comprises: acquiring an image to be identified, where the image to be identified contains the device to be evaluated; and inputting the image to be identified into a pre-trained evaluation model to obtain defect information and quality information of the device to be evaluated. The defect information comprises the defect regions of the device to be evaluated and information describing those defect regions, and the quality information comprises a quality score of the device to be evaluated. The evaluation method improves both the efficiency of device evaluation and the accuracy of the evaluation result.

Description

Device evaluation method and device, electronic device and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a device evaluation method and apparatus, an electronic device, and a readable storage medium.
Background
Existing equipment defect assessment (for example, defect assessment of second-hand equipment) mainly depends on manual processing.
This approach has two problems: first, manual processing is inefficient; second, manual assessment is subjective, which results in poor accuracy of the evaluation result.
Disclosure of Invention
An object of the embodiments of the present application is to provide a device evaluation method and apparatus, an electronic device, and a readable storage medium, so as to improve the device evaluation efficiency and the device evaluation result accuracy.
In a first aspect, an embodiment of the present application provides a device evaluation method, including: acquiring an image to be identified, where the image to be identified contains the device to be evaluated; and inputting the image to be identified into a pre-trained evaluation model to obtain defect information and quality information of the device to be evaluated, where the defect information comprises a defect region of the device to be evaluated and information describing that defect region, and the quality information comprises a quality score of the device to be evaluated.
In the embodiment of the application, with a pre-trained evaluation model, when device defect evaluation is required, an image to be identified containing the device to be evaluated is input into the evaluation model, and the evaluation model can directly output an evaluation result that includes both the defect information and the quality information of the device. Compared with the prior art, defect evaluation no longer needs to be carried out manually, which improves evaluation efficiency; moreover, the subjective influence of manual evaluation is avoided, so the evaluation result is more accurate. In addition, because the evaluation result also includes quality information, the result is more directly applicable: for example, assuming the device is a second-hand device, the second-hand device can be reasonably priced based on the quality information.
As a possible implementation manner, the architecture of the pre-trained evaluation model is the YOLO architecture, the backbone network of the pre-trained evaluation model is ResNet-50, and the pre-trained evaluation model includes a loss function for calculating the quality score. Inputting the image to be recognized into the pre-trained evaluation model to obtain the defect information and quality information of the device to be evaluated comprises: obtaining the defect information of the device to be evaluated through the YOLO architecture and ResNet-50; and obtaining the quality information of the device to be evaluated through the YOLO architecture, ResNet-50, and the loss function.
In the embodiment of the application, the defect information of the device to be evaluated is accurately acquired through the YOLO architecture and ResNet-50, and its quality information is accurately acquired through the YOLO architecture, ResNet-50, and the loss function for calculating the quality score, improving the accuracy of the device evaluation result.
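The patent states only that the model "includes a loss function for calculating a quality score" without giving its form. A minimal sketch, assuming the quality-score term is a squared-error regression loss added to the YOLO detection loss (the function and weight names here are illustrative, not from the patent):

```python
def combined_loss(detection_loss, predicted_score, labeled_score, score_weight=1.0):
    """Total training loss: YOLO detection loss plus a quality-score term.

    A squared-error regression term is one common choice for a scalar
    score head; it is used here purely as an illustration.
    """
    score_loss = (predicted_score - labeled_score) ** 2
    return detection_loss + score_weight * score_loss

# Example: detection loss 0.8, model predicts a score of 7.5 against a label of 8.0
total = combined_loss(0.8, 7.5, 8.0)  # 0.8 + 0.25 = 1.05
```

With `score_weight` the two objectives can be balanced so that score regression does not dominate box localization during training.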
As a possible implementation manner, before the acquiring the image to be identified, the evaluation method further includes: acquiring a training sample set; the training sample set comprises a plurality of marked sample images, each marked sample image comprises a sample device, each sample device has marked defect information and marked quality information, each marked defect information comprises a marked defect area and information of the marked defect area, and each marked quality information comprises a quality score of the sample device; and inputting the training sample set into an initial evaluation model for training until the model converges, and obtaining a trained evaluation model.
In the embodiment of the application, the training of the initial evaluation model based on the training sample set is realized by acquiring the training sample set, so that the trained evaluation model can be used for evaluating the equipment.
As a possible implementation manner, the obtaining a training sample set includes: acquiring a plurality of sample images to be marked; the multiple sample images to be marked comprise sample equipment to be marked; marking a defect area of the sample device to be marked; determining information of a marked defect area and marking; marking the quality score of the sample device to be marked according to the marked defect area to obtain the training sample set.
In the embodiment of the application, when the training sample set is obtained, the training sample set is effectively obtained by correspondingly marking the to-be-marked sample image.
As a possible implementation manner, the marking the quality score of the sample device to be marked according to the marked defect area to obtain the training sample set includes: determining a ratio of an area of the marked defect region to an area of the sample device to be marked; determining a first quality score of the sample equipment to be marked according to the ratio and a preset quality score range; determining and marking the quality score of the sample equipment to be marked according to the first quality score and a preset quality coefficient of the sample equipment to be marked; the quality coefficient is used to represent the degree of ageing of the sample device to be marked.
In the embodiment of the application, when the quality score is marked, the first quality score is determined, and then the quality score is determined according to the first quality score and a preset quality coefficient, so that the quality score is effectively and accurately marked.
As a possible implementation manner, after determining the first quality score of the sample device to be labeled according to the ratio and a preset quality score range, the evaluation method further includes: converting the first quality score through a preset rounding function to obtain a converted first quality score; correspondingly, the determining and marking the quality score of the sample device to be marked according to the first quality score and a preset quality coefficient of the sample device to be marked comprises: and determining and marking the quality score of the sample equipment to be marked according to the converted first quality score and a preset quality coefficient of the sample equipment to be marked.
In the embodiment of the application, the first quality score is converted, and then the quality score of the mark is determined through the converted first quality score, so that the quality scores of the mark are all integers, and the consistency of the quality scores is improved.
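The labeling steps above (area ratio, preset score range, rounding function, quality coefficient) can be sketched as follows. The mapping from ratio to first score, the use of `math.floor` as the rounding function, and all names are assumptions for illustration; the patent does not fix these details:

```python
import math

def label_quality_score(defect_area, device_area, quality_coefficient,
                        score_range=(0, 10)):
    """Sketch: ratio -> first quality score -> rounding -> final score.

    quality_coefficient in [0, 1] represents the degree of aging of the
    sample device (1.0 = like new). All specifics are illustrative.
    """
    lo, hi = score_range
    ratio = defect_area / device_area             # share of the device that is defective
    first_score = lo + (hi - lo) * (1.0 - ratio)  # larger defect area -> lower score
    first_score = math.floor(first_score)         # preset rounding function
    return first_score * quality_coefficient      # discount for aging

# A device with 20% of its surface defective and moderate aging:
score = label_quality_score(defect_area=20.0, device_area=100.0,
                            quality_coefficient=0.9)  # floor(8.0) * 0.9 = 7.2
```

Rounding the first score before applying the coefficient keeps the intermediate labels consistent across annotators, as the paragraph above suggests.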
As a possible implementation manner, the information of the defect area of the device to be evaluated includes: the area name corresponding to the defect area of the equipment to be evaluated and the defect element in the defect area of the equipment to be evaluated.
In the embodiment of the application, through two items of information, namely the area name corresponding to the defect area of the equipment to be evaluated and the defect element in the defect area of the equipment to be evaluated, the defect of the equipment can be accurately described, and the comprehensiveness and the referential property of the evaluation result are improved.
In a second aspect, an embodiment of the present application provides an apparatus for evaluating a device, where the apparatus includes functional modules for implementing the method for evaluating a device described in the first aspect and any one of the possible implementation manners of the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory, a processor; the memory has stored therein computer program instructions which, when read and executed by the processor, perform the method of evaluating a device as described in the first aspect and any one of its possible implementations.
In a fourth aspect, the present application provides a readable storage medium, on which a computer program is stored, where the computer program is executed by a computer to perform the method for evaluating an apparatus according to the first aspect and any one of the possible implementation manners of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic diagram of an electronic device provided in an embodiment of the present application;
fig. 2 is a flowchart of an evaluation method of an apparatus provided in an embodiment of the present application;
FIG. 3 is a diagram illustrating a defective area provided in an embodiment of the present application;
fig. 4 is a block diagram of an evaluation apparatus of a device according to an embodiment of the present application.
Icon: 100-an electronic device; 110-a memory; 120-a bus; 130-a processor; 300 — evaluation means of the device; 310-an acquisition module; 320-processing module.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings. The particular methods of operation in the method embodiments may also be applied to apparatus or system embodiments. In the description of the present application, "at least one" means one or more unless otherwise specified, and "plurality" means two or more. For example, "at least one of A, B, and C" includes: A alone, B alone, A and B, A and C, B and C, and A, B, and C together. In this application, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between objects and covers three cases: for example, "A and/or B" may mean that A exists alone, A and B exist simultaneously, or B exists alone.
The technical scheme provided by the embodiments of the present application can be applied to defect assessment of equipment, where the equipment may be old equipment (such as second-hand equipment) or new equipment, for example: a second-hand mobile phone, a second-hand computer, a new mobile phone, a new computer, etc.
The device evaluation method provided by the embodiment of the application can be applied to an evaluation platform of a device, and the evaluation platform of the device can be various implementable expression forms such as an application program and an applet. For the evaluation platform, the hardware operating environment may include: the front end can be understood as a client, and the back end can be understood as a server.
The evaluation method can be applied to the front end as well as to the back end. If the evaluation method is applied to the front-end, the evaluation model required in the evaluation method may be stored on the back-end, from which the front-end retrieves the evaluation model when it is required to use the evaluation model. Of course, if the conditions of the front end allow, the evaluation model can be stored in both the front end and the back end, and the front end can also perform the evaluation method separately. If the evaluation method is applied to the back end, the front end serves as a user interaction end and sends various requests or commands to the back end, the back end processes data based on the requests or commands, the data are fed back to the front end after corresponding processing results are obtained, and the front end feeds back the data to the user.
The hardware environment applied by the evaluation method can be reasonably configured in combination with actual hardware conditions, and is only described as an example in the embodiment of the present application, and does not constitute a limitation on the embodiment of the present application.
Referring to fig. 1, an electronic device 100 provided in the embodiment of the present application includes: memory 110, bus 120, and processor 130. Wherein the processor 130 and the memory 110 are connected by a bus 120.
In the embodiment of the present application, the electronic device 100 may be a server (i.e., a back end) or a front end. When the electronic device 100 is a server, for example, it may be a web server, a database server, a cloud server, or a server assembly composed of a plurality of sub servers; alternatively, when the electronic device 100 is a front end, it may be, for example, a personal computer, a tablet computer, a smart phone, a personal digital assistant, and the like. Of course, the above-mentioned devices are provided for facilitating understanding of the embodiments of the present application, and should not be taken as limiting the embodiments of the present application.
In the embodiment of the present application, the memory 110 stores a program required to implement the evaluation method of the apparatus provided in the embodiment of the present application.
The Memory 110 may include, but is not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and the like.
The bus 120 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Enhanced Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 1, but this does not indicate only one bus or one type of bus.
The processor 130 is used to execute executable modules, such as computer programs, stored in the memory 110. The methods disclosed in the embodiments of the present application may be implemented in, or carried out by, the processor 130. After the processor 130 receives an execution instruction and calls the program stored in the memory 110 through the bus 120, the processor 130 implements the flow of the device evaluation method by executing the program.
The processor 130 may be an integrated circuit chip having signal processing capabilities. The Processor 130 may be a general-purpose Processor including a CPU (Central Processing Unit), an NP (Network Processor), and the like; but may also be a digital signal processor, an application specific integrated circuit, an off-the-shelf programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components. Which may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The components and configurations of electronic device 100 shown in FIG. 1 are for example, and not for limitation, and electronic device 100 may have other components and configurations as desired. For example, in the embodiment of the present application, if the electronic device 100 is a front end, the electronic device 100 may further include components such as a camera, a display, and an input/output module (e.g., a keyboard and a mouse).
With reference to fig. 2, a flowchart of an evaluation method for a device according to an embodiment of the present application is shown based on the introduction of the hardware operating environment and the application scenario, where the evaluation method includes: step 210, step 220.
Step 210: and acquiring an image to be identified. The image to be identified comprises the equipment to be evaluated.
Step 220: and inputting the image to be identified into a pre-trained evaluation model to obtain the defect information and the quality information of the equipment to be evaluated. The defect information comprises the defect area of the equipment to be evaluated and the information of the defect area of the equipment to be evaluated, and the quality information comprises the quality score of the equipment to be evaluated.
In the embodiment of the application, with a pre-trained evaluation model, when device defect evaluation is required, an image to be identified containing the device to be evaluated is input into the evaluation model, and the evaluation model can directly output an evaluation result that includes both the defect information and the quality information of the device. Compared with the prior art, defect evaluation no longer needs to be carried out manually, which improves evaluation efficiency; moreover, the subjective influence of manual evaluation is avoided, so the evaluation result is more accurate. In addition, because the evaluation result also includes quality information, the result is more directly applicable: for example, assuming the device to be evaluated is a second-hand device, it can be reasonably priced based on the quality information.
A detailed implementation of steps 210 and 220 is described next.
In step 210, the device to be evaluated is included in the image to be identified, and for the image to be identified, the image is obtained by shooting the device to be evaluated.
During shooting, all parts of the device to be evaluated should be captured as completely as possible, taking the characteristics of the device into account. For example, when shooting the front of a device such as a mobile phone or tablet, one edge of the device can be kept parallel to the horizontal or vertical axis of the image to be recognized, ensuring a head-on shot.
In order to ensure that the area occupied by the equipment to be evaluated in the image to be identified is as large as possible, when the equipment to be evaluated is shot, each edge of the equipment to be evaluated is made to be as close to the boundary of the viewing frame as possible.
In addition, during shooting, conditions such as illumination, shooting brightness, shooting focal length and the like can be adaptively adjusted, so that the effect of the shot image to be identified is ensured as much as possible.
In addition, if the device to be evaluated has multiple faces, such as the front and back of a mobile phone, and a single image to be identified cannot capture both at once, the phone may correspond to two images to be identified: one containing the front of the phone and one containing the back. That is, one device to be evaluated may correspond to one or more images to be recognized.
The above-mentioned shooting process of the image to be recognized may be completed by the user, or may be completed by the front-end assistant user, for example: when the user shoots, the front end designates various limiting conditions so that the user shoots an image to be recognized with good effect according to the limiting conditions.
Furthermore, in step 210, if the execution subject of the evaluation method is the front end, the front end may obtain the image to be recognized determined by the user, and if the execution subject of the evaluation method is the back end, the back end obtains the image to be recognized sent by the front end.
As another embodiment, the image to be recognized acquired in step 210 may also be a pre-stored image to be recognized, such as: after the user shoots the image to be identified each time, the image is uploaded to the front end or the rear end to be stored. In this case, the execution condition of step 210 may be: when the front end or the back end receives a processing request of a user, the corresponding stored image to be identified is obtained based on the processing request, and then processing is carried out. Or the front end or the back end sets a processing period (for example, 1 day), and the stored unprocessed image to be recognized is actively acquired and then processed every time one processing period passes. And when executing other business processes, the front end or the back end acquires the corresponding stored image to be identified through the calling of other business processes and then processes the image.
In the embodiment of the application, the image to be identified may be one or more images corresponding to one device to be evaluated; or may be one or more images corresponding to a plurality of devices to be evaluated.
After the image to be recognized is acquired in step 210, in step 220, the image to be recognized is input into a pre-trained evaluation model, and defect information and quality information of the device to be evaluated are obtained.
The defect information includes a defect area of the device to be evaluated and defect information of the defect area. For the defect area, the image can be embodied by combining the recognized image, such as: in the identified image, the evaluation completed device is marked with a defective area. For ease of understanding, please refer to fig. 3, which is a schematic diagram of a defect region, in fig. 3, a total of three defect regions are included on the device in the identified image: defective area 1, defective area 2, and defective area 3.
The information of a defect region may include the region name corresponding to the defect region and the defect elements in that region. The region name characterizes the location of the defect region on the device, and the defect elements describe the defect itself. Taking a mobile phone as an example, the defect information may be: "shell fractured, screen damaged, painted surface scratched." Here terms such as "screen", "sensor", "camera", "microphone", "key", "phone body", and "housing" are region names, while items such as "shell fracture", "screen damage", and "paint scratch" are defect elements.
In the embodiment of the application, through two items of information, namely the area name corresponding to the defect area of the equipment to be evaluated and the defect element in the defect area of the equipment to be evaluated, the defect of the equipment can be accurately described, and the comprehensiveness and the referential property of the evaluation result are improved.
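A simple data structure can hold the two items of information per defect region described above. This schema (the class and field names) is an illustration, not a format specified by the patent:

```python
from dataclasses import dataclass, field

@dataclass
class DefectRegion:
    """One detected defect region: where it is, and what is wrong there."""
    region_name: str                                     # e.g. "screen", "housing"
    defect_elements: list = field(default_factory=list)  # e.g. ["screen damage"]

# The mobile-phone example from the text, encoded as structured output:
phone_defects = [
    DefectRegion("housing", ["shell fracture"]),
    DefectRegion("screen", ["screen damage"]),
    DefectRegion("painted surface", ["paint scratch"]),
]
```

Keeping region name and defect elements as separate fields makes the evaluation result easy to render per region and to aggregate per device.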
The quality information includes a quality score of the device; the implementation of the quality score is described below together with the training of the model.
In step 220, the images input into the pre-trained evaluation model may be provided with numbers, and when the evaluation model outputs the evaluation result, the evaluation result is correspondingly identified according to the numbers.
In combination with the introduction of the image to be recognized, when the evaluation result is output, the evaluation result corresponding to one device may be output, or the evaluation results corresponding to a plurality of devices may be output.
In step 220, the model applied is a pre-trained evaluation model, and the training process of the evaluation model is described next.
As an optional implementation manner, before step 210, the evaluation method further includes: acquiring a training sample set; the training sample set comprises a plurality of marked sample images, each marked sample image comprises sample equipment, each sample equipment has marked defect information and marked quality information, each marked defect information comprises a marked defect area and information of the marked defect area, and each marked quality information comprises a quality score of each sample equipment; and inputting the training sample set into an initial evaluation model for training until the model converges, and obtaining a trained evaluation model.
In the implementation of the present application, the obtaining process of the training sample set may include: acquiring a plurality of sample images to be marked; the multiple sample images to be marked comprise sample equipment to be marked; marking a defect area of a sample device to be marked; determining information of a marked defect area and marking; and marking the quality score of the sample equipment to be marked according to the marked defect area to obtain a training sample set.
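The marked sample described above could be sketched, under assumed type and field names (not fixed by the patent), as:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical record types for a marked sample image; the names below
# are illustrative assumptions, not part of the patent text.
@dataclass
class MarkedDefectRegion:
    box: Tuple[int, int, int, int]   # rectangular mark: (x, y, width, height)
    area_name: str                   # e.g. "screen"
    defect_elements: List[str]       # e.g. ["broken"]

@dataclass
class MarkedSample:
    image_path: str
    regions: List[MarkedDefectRegion] = field(default_factory=list)
    quality_score: int = 0           # marked according to the defect regions

sample = MarkedSample(
    image_path="phone_001.jpg",
    regions=[MarkedDefectRegion((40, 60, 120, 30), "screen", ["broken"])],
    quality_score=4,
)
print(len(sample.regions), sample.quality_score)  # 1 4
```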
Wherein the sample device is the same as the device to be evaluated in its embodiment. The training sample set may include a plurality of labeled sample images, the plurality of labeled sample images may be a plurality of labeled sample images corresponding to a plurality of sample devices, and each sample device may correspond to one labeled sample image or a plurality of labeled sample images.
Correspondingly, when a plurality of sample images to be marked are obtained, a plurality of sample images to be marked corresponding to a plurality of sample devices to be marked can be obtained, and each sample device to be marked can correspond to one sample image to be marked and can also correspond to a plurality of sample images to be marked.
For the process of obtaining a sample image to be marked by shooting the sample device to be marked, reference may be made to the embodiment of obtaining the image to be identified by shooting the device to be evaluated; the description is not repeated here.
After the sample image to be marked is obtained, the defect area of the sample device to be marked can be marked. As an alternative embodiment, the marking process comprises: displaying the sample image to be marked, and letting a user mark it through a marking tool (which may be a marking tool provided at the front end); after the user finishes marking, identifying the defect area marked by the user (at the front end or the back end) by means of target detection, and then adding a mark to the identified defect area. The marking tool used by the user may itself have a target detection function, in which case the defect area of the sample image is marked as soon as the user finishes marking. Of course, as another alternative embodiment, the user may also mark the defect area in the sample image offline and then upload the marked sample image to the front end or the back end, which performs subsequent processing based on the user's marks.
The embodiments of the labeling means, target detection, etc. may be implemented using techniques that are well known in the art and will not be described in detail herein.
In the embodiment of the present application, the defect area may be provided with corresponding mark forms, such as: the defective area is marked with a rectangular frame or other fixed-shape frame.
Based on the marked defect area, the information of the defect area may be determined from description information input by a user. For example, the user inputs the area name corresponding to the defect area and the defect elements in the defect area; the front end or the back end then attaches this information to the defect area as its identification information, completing the marking of the information of the defect area.
Based on the marked defective areas, the quality scores of the sample devices to be marked may also be marked. As an alternative embodiment, the process comprises: determining the ratio of the area of the marked defect region to the area of the sample device to be marked; determining a first quality score of the sample device to be marked according to the ratio and a preset quality score range; determining and marking the quality score of the sample equipment to be marked according to the first quality score and a preset quality coefficient of the sample equipment to be marked; the mass coefficient is used to represent the degree of ageing of the sample device to be labelled.
The area of the defect region corresponds to the shape of its mark and can be calculated accordingly; for example, if the defect region is rectangular, its area is the product of its length and width. For the area of the sample device to be marked, the device may first be detected in the sample image: in the sample image to be marked, the pixel values of the area where the device is located differ from those of the blank background, and the device can be detected based on this difference. After detection, the area of the sample device to be marked is calculated from the detected region.
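A minimal sketch of the area computation just described, assuming a rectangular defect mark and a grayscale image in which blank background pixels are zero (both assumptions for illustration):

```python
import numpy as np

def defect_area(box):
    """Area of a rectangular defect mark given as (x, y, width, height)."""
    _, _, w, h = box
    return w * h

def device_area(image):
    """Approximate the device area by counting pixels whose value differs
    from the blank background (assumed here to be 0)."""
    return int(np.count_nonzero(image))

# Toy image: a 10x20 device region on a 100x100 blank background.
img = np.zeros((100, 100), dtype=np.uint8)
img[10:20, 30:50] = 255             # device occupies 10 * 20 = 200 pixels

print(device_area(img))             # 200
print(defect_area((32, 12, 5, 4)))  # 20
```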
For the quality score, there may be different implementations. For example: the quality scores are [1, 2, 3, 4, 5], and the range of the quality score is [1, 5]; or the quality scores are [1, 2, 3, 4, 5, 6, 7, 8, 9, 10], and the range is [1, 10]. The quality score and its range may be set appropriately according to the actual application scenario and are not limited in the embodiment of the present application.
Since the area of the defect region is smaller than or equal to the area of the sample device to be marked, the area ratio lies in the range (0, 1]. Therefore, to convert the area ratio into a quality score, the ratio is multiplied by the maximum quality score in the quality score range, for example multiplied by 5.
In the embodiment of the present application, after obtaining the first quality score, the quality score of the sample device to be labeled may be calculated based on the first quality score directly, or the quality score of the sample device to be labeled may be calculated after processing the first quality score. As an optional processing manner, after the first quality score is obtained, the first quality score is converted through a preset rounding function to obtain a converted first quality score, and correspondingly, when labeling, the quality score of the sample device to be labeled is determined and labeled based on the converted first quality score.
The preset rounding function may be a rounding function or a float-to-integer conversion function that implements a rounding operation. By way of example, if the first quality score before conversion is 3.85, then after conversion the first quality score is 4.
Of course, instead of the rounding function, other functions or algorithms may be used to process the first quality score to make the quality scores of the final labels consistent.
Whether the final quality score is determined directly from the first quality score or from the converted first quality score, the preset quality coefficient represents the degree of aging of the sample device to be marked, and its value range may be (0, 1.0]. The preset quality coefficient may be information that the user inputs together with the marks on the sample image to be marked.
When determining the final quality score, the product of the first quality score (or the converted first quality score) and the quality coefficient may be obtained first, and the product is then converted by the rounding function to obtain the final quality score.
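The scoring steps in the preceding paragraphs can be sketched as follows. The formula mirrors the text (ratio of defect area to device area, scaled by the maximum score, rounded, then weighted by the aging coefficient and rounded again); the function name and default values are assumptions:

```python
def quality_score(defect_area, device_area, max_score=5, quality_coeff=1.0):
    """Sketch of the quality-score marking described above (assumed form):
    area ratio, scaled by the maximum score in the quality score range,
    rounded, then weighted by the aging coefficient and rounded again."""
    ratio = defect_area / device_area            # in (0, 1]
    first_score = ratio * max_score              # first quality score
    first_score = round(first_score)             # preset rounding function
    return round(first_score * quality_coeff)    # weight by quality coefficient

# Example: ratio 0.77 -> first score 3.85 -> rounded to 4;
# with quality coefficient 0.9: 4 * 0.9 = 3.6 -> rounded to 4.
print(quality_score(77, 100, max_score=5, quality_coeff=0.9))  # 4
```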
In the embodiment of the application, when the quality score is marked, the first quality score is determined, and then the quality score is determined according to the first quality score and a preset quality coefficient, so that the quality score is effectively and accurately marked.
When the quality score is marked, it may be attached as identification information in a manner similar to the marking of the information of the defect area; the marking of the quality score is then complete.
After each item of content is marked, the resulting image is a marked sample image, in which the device is a sample device with corresponding defect information and quality information.
In the embodiment of the present application, the process of obtaining the training sample set may be in real time or non-real time. If the model is real-time, the training sample set is acquired in real time by adopting the embodiment when the model training is needed. If the training sample set is not real-time, the training sample set can be collected and stored when model training is not needed, and when the model training is needed, part or all of the sample set can be directly obtained from the collected training sample set.
Based on the training sample set, the model can be trained. In the embodiment of the present application, the initial evaluation model may adopt the yolo architecture (an architecture of a deep neural network model), with Resnet50 as its backbone network. On the basis of the yolo architecture and the Resnet50 backbone, a loss function for calculating the quality score can be added, which can be understood as an additional branch of the Resnet50 backbone. The yolo architecture with the Resnet50 backbone already achieves detection of target areas and identification of their information, namely determining the defect area and the information of the defect area; the calculation of the quality score is achieved by adding the branch with the loss function for calculating the quality score.
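The added quality-score branch can be illustrated with a minimal combined-loss sketch in plain Python. The squared-error term and the weighting factor are illustrative assumptions, since the patent does not fix the form of the added loss function:

```python
def combined_loss(detection_loss, predicted_score, labeled_score, weight=1.0):
    """Total training loss: the detection loss of the yolo/Resnet50 part
    plus an added branch loss for the quality score. The squared-error
    term and the weighting factor are illustrative assumptions."""
    quality_loss = (predicted_score - labeled_score) ** 2
    return detection_loss + weight * quality_loss

# Example: detection loss 1.0, predicted score 3 against labeled score 4.
print(combined_loss(1.0, 3.0, 4.0, weight=0.5))  # 1.5
```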
The training sample set is input into the initial evaluation model for training; the trained evaluation model can then detect the defect region, determine the information of the defect region, and determine the quality score. The specific training process of the neural network model is a mature technique in the field and is not described in detail here.
After the trained evaluation model is obtained, the accuracy of the trained evaluation model can be tested, and if the test result shows that the accuracy is low, the evaluation model can be optimized or trained continuously to improve the evaluation accuracy of the evaluation model. The evaluation model can be applied directly if the test results indicate a high accuracy.
In the embodiments of the present application, the testing and the accuracy optimization of the model are well-known in the art, and are not described in detail in the embodiments of the present application.
In connection with the above embodiment of training the evaluation model, it can be understood that, in step 220, the defect information of the device to be evaluated is obtained through the yolo architecture and Resnet50, and the quality information of the device to be evaluated is obtained through the yolo architecture, Resnet50 and the loss function.
In the embodiment of the application, the defect information of the device to be evaluated is accurately acquired through the yolo architecture and Resnet50; the quality information of the device to be evaluated is accurately acquired through the yolo architecture, Resnet50 and the loss function for calculating the quality score; the accuracy of the evaluation result of the device is thereby further improved.
In the embodiment of the present application, in addition to the intuitive quality score, the quality information may also include a rating of the quality score. For example, when the quality score range is [1, 5]: a quality score between 4 and 5 is rated A; between 3 and 4, B; between 2 and 3, C; and between 0 and 2, D. The implementation of the quality information may thus be flexibly varied according to the actual application scenario and is not limited in the embodiment of the present application.
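Under the example thresholds above (range [1, 5]), the grade assignment could be sketched as follows; how the exact boundary values 2, 3 and 4 are assigned is an assumption, since the text leaves the boundaries open:

```python
def quality_grade(score):
    """Map a quality score in the range [1, 5] to the example grades above.
    The grade given to the exact boundary values 2, 3 and 4 is an
    assumption; the patent text only gives open intervals."""
    if score >= 4:
        return "A"
    if score >= 3:
        return "B"
    if score >= 2:
        return "C"
    return "D"

print(quality_grade(4.5), quality_grade(3.5), quality_grade(2.5), quality_grade(1.5))
# A B C D
```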
If the quality information comprises the quality score rating, the final output evaluation result also comprises the quality rating, and for the evaluation model, the quality score rating can be realized by adding a classifier on the basis of the loss function of the quality score.
In the embodiment of the application, if there is no defect area in the final evaluation result of the image to be recognized, the defect area is marked as none or null, and the information of the defect area is correspondingly marked as none or null.
In the embodiment of the present application, the evaluation result of the device may be applied in a plurality of ways. For example, in the evaluation scenario of a used device, the user may appraise the used device based on the quality score (and the rating of the quality score) in the evaluation result; in the evaluation scenario of a new device, the quality score in the evaluation result is usually high, and the new device may be returned for repair based on the defect area and the information of the defect area.
Of course, there may be more applications based on different application scenarios and evaluation results of different devices, and the embodiments of the present application are not limited.
Based on the same inventive concept, please refer to fig. 4, an embodiment of the present application further provides an apparatus 300 for evaluating a device, including: an acquisition module 310 and a processing module 320.
An obtaining module 310, configured to obtain an image to be identified; the image to be identified comprises equipment to be evaluated. The processing module 320 is configured to input the image to be recognized into a pre-trained evaluation model, and obtain defect information and quality information of the device to be evaluated; the defect information comprises the defect area of the equipment to be evaluated and the information of the defect area of the equipment to be evaluated, and the quality information comprises the quality score of the equipment to be evaluated.
In the embodiment of the present application, the architecture of the pre-trained evaluation model is a yolo architecture, the backbone network of the pre-trained evaluation model is Resnet50, and the pre-trained evaluation model includes a loss function for calculating a quality score. The processing module 320 is specifically configured to: obtaining defect information of the equipment to be evaluated through the yolo architecture and the Resnet 50; and obtaining the quality information of the equipment to be evaluated through the yolo architecture, the Resnet50 and the loss function.
In this embodiment of the present application, the obtaining module 310 is further configured to obtain a training sample set; the training sample set comprises a plurality of marked sample images, the marked sample images comprise sample equipment, the sample equipment is provided with marked defect information and marked quality information, the marked defect information comprises a marked defect area and the information of the marked defect area, and the marked quality information comprises the quality score of the sample equipment. The processing module 320 is further configured to input the training sample set into an initial evaluation model for training until the model converges, so as to obtain a trained evaluation model.
In this embodiment of the application, the obtaining module 310 is specifically configured to: acquiring a plurality of sample images to be marked; the multiple sample images to be marked comprise sample equipment to be marked; marking a defect area of the sample device to be marked; determining information of a marked defect area and marking; marking the quality score of the sample device to be marked according to the marked defect area to obtain the training sample set.
In this embodiment of the application, the processing module 320 is specifically configured to: determining a ratio of an area of the marked defect region to an area of the sample device to be marked; determining a first quality score of the sample equipment to be marked according to the ratio and a preset quality score range; determining and marking the quality score of the sample equipment to be marked according to the first quality score and a preset quality coefficient of the sample equipment to be marked; the quality coefficient is used to represent the degree of ageing of the sample device to be marked.
In this embodiment of the application, the processing module 320 is further configured to: converting the first quality score through a preset rounding function to obtain a converted first quality score; and in particular for: and determining and marking the quality score of the sample equipment to be marked according to the converted first quality score and a preset quality coefficient of the sample equipment to be marked.
The embodiment of the present application further provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a computer, the method for evaluating the device in the embodiment of the present application is executed.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method for evaluating a device, comprising:
acquiring an image to be identified; the image to be identified comprises equipment to be evaluated;
inputting the image to be identified into a pre-trained evaluation model to obtain defect information and quality information of the equipment to be evaluated; the defect information comprises the defect area of the equipment to be evaluated and the information of the defect area of the equipment to be evaluated, and the quality information comprises the quality score of the equipment to be evaluated.
2. The evaluation method according to claim 1, wherein the architecture of the pre-trained evaluation model is yolo architecture, the backbone network of the pre-trained evaluation model is Resnet50, and the pre-trained evaluation model comprises a loss function for calculating the quality score; the inputting the image to be recognized into a pre-trained evaluation model to obtain the defect information and the quality information of the equipment to be evaluated comprises the following steps:
obtaining defect information of the equipment to be evaluated through the yolo architecture and the Resnet 50;
and obtaining the quality information of the equipment to be evaluated through the yolo architecture, the Resnet50 and the loss function.
3. The evaluation method according to claim 1, wherein prior to said acquiring an image to be identified, the evaluation method further comprises:
acquiring a training sample set; the training sample set comprises a plurality of marked sample images, each marked sample image comprises a sample device, each sample device has marked defect information and marked quality information, each marked defect information comprises a marked defect area and information of the marked defect area, and each marked quality information comprises a quality score of the sample device;
and inputting the training sample set into an initial evaluation model for training until the model converges, and obtaining a trained evaluation model.
4. The evaluation method of claim 3, wherein the obtaining a training sample set comprises:
acquiring a plurality of sample images to be marked; the multiple sample images to be marked comprise sample equipment to be marked;
marking a defect area of the sample device to be marked;
determining information of a marked defect area and marking;
marking the quality score of the sample device to be marked according to the marked defect area to obtain the training sample set.
5. The evaluation method according to claim 4, wherein the marking the quality score of the sample device to be marked according to the marked defect region to obtain the training sample set comprises:
determining a ratio of an area of the marked defect region to an area of the sample device to be marked;
determining a first quality score of the sample equipment to be marked according to the ratio and a preset quality score range;
determining and marking the quality score of the sample equipment to be marked according to the first quality score and a preset quality coefficient of the sample equipment to be marked; the quality coefficient is used to represent the degree of ageing of the sample device to be marked.
6. The evaluation method of claim 5, wherein after determining the first quality score of the sample device to be labeled according to the ratio and a preset quality score range, the evaluation method further comprises:
converting the first quality score through a preset rounding function to obtain a converted first quality score;
correspondingly, the determining and marking the quality score of the sample device to be marked according to the first quality score and a preset quality coefficient of the sample device to be marked comprises:
and determining and marking the quality score of the sample equipment to be marked according to the converted first quality score and a preset quality coefficient of the sample equipment to be marked.
7. The evaluation method according to claim 1, wherein the information of the defective area of the device to be evaluated includes: the area name corresponding to the defect area of the equipment to be evaluated and the defect element in the defect area of the equipment to be evaluated.
8. An apparatus for evaluating a device, comprising:
the acquisition module is used for acquiring an image to be identified; the image to be identified comprises equipment to be evaluated;
the processing module is used for inputting the image to be identified into a pre-trained evaluation model to obtain the defect information and the quality information of the equipment to be evaluated; the defect information comprises the defect area of the equipment to be evaluated and the information of the defect area of the equipment to be evaluated, and the quality information comprises the quality score of the equipment to be evaluated.
9. An electronic device, comprising: a memory, a processor;
the memory has stored therein computer program instructions which, when read and executed by the processor, perform the method of any one of claims 1-7.
10. A readable storage medium, having stored thereon a computer program which, when executed by a computer, performs the method of any one of claims 1-7.
CN202011556717.6A 2020-12-24 2020-12-24 Evaluation method and device of equipment, electronic equipment and readable storage medium Active CN112581001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011556717.6A CN112581001B (en) 2020-12-24 2020-12-24 Evaluation method and device of equipment, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112581001A true CN112581001A (en) 2021-03-30
CN112581001B CN112581001B (en) 2024-06-07

Family

ID=75140642

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011556717.6A Active CN112581001B (en) 2020-12-24 2020-12-24 Evaluation method and device of equipment, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112581001B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113793332A (en) * 2021-11-15 2021-12-14 山东德普检测技术有限公司 Experimental instrument defect identification and classification method and system

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2810219A1 (en) * 2012-02-01 2014-12-10 Ecoatm Inc. Method and apparatus for recycling electronic devices
CN204116894U (en) * 2014-09-19 2015-01-21 杭州车水马龙信息科技有限公司 Used car intelligent detection device
CN104461682A (en) * 2014-10-31 2015-03-25 武汉钢铁(集团)公司 Steel rolling steel plate simulation slitting quality judgment system and method
CN105809821A (en) * 2014-12-12 2016-07-27 埃科亚特姆公司 Systems and methods for recycling consumer electronic devices
CN109117958A (en) * 2017-06-26 2019-01-01 深圳回收宝科技有限公司 Portable electronic device valuation methods, recovery method, terminal, cloud and system
CN109632814A (en) * 2019-02-01 2019-04-16 东莞中科蓝海智能视觉科技有限公司 Part defect detection method
CN109816626A (en) * 2018-12-13 2019-05-28 深圳高速工程检测有限公司 Road surface crack detection method, device, computer equipment and storage medium
CN109886964A (en) * 2019-03-29 2019-06-14 北京百度网讯科技有限公司 Circuit board defect detection method, device and equipment
CN110009433A (en) * 2019-04-26 2019-07-12 阿里巴巴集团控股有限公司 Assess the method and device of product price
CN110047073A (en) * 2019-05-05 2019-07-23 北京大学 A kind of X-ray weld image fault grading method and system
CN110349126A (en) * 2019-06-20 2019-10-18 武汉科技大学 A kind of Surface Defects in Steel Plate detection method based on convolutional neural networks tape label
CN110400099A (en) * 2019-08-09 2019-11-01 马鞍山钢铁股份有限公司 A kind of belt steel product surface quality stage division
CN111179229A (en) * 2019-12-17 2020-05-19 中信重工机械股份有限公司 Industrial CT defect detection method based on deep learning
CN111192262A (en) * 2020-01-03 2020-05-22 腾讯云计算(北京)有限责任公司 Product defect classification method, device, equipment and medium based on artificial intelligence
CN111242896A (en) * 2019-12-31 2020-06-05 电子科技大学 Color printing label defect detection and quality rating method
CN111311546A (en) * 2020-01-19 2020-06-19 上海箱云物流科技有限公司 Container detection method, device and computer readable storage medium

Also Published As

Publication number Publication date
CN112581001B (en) 2024-06-07

Similar Documents

Publication Publication Date Title
CN110660066B (en) Training method of network, image processing method, network, terminal equipment and medium
CN110428475B (en) Medical image classification method, model training method and server
CN109871845B (en) Certificate image extraction method and terminal equipment
CN111369550A (en) Image registration and defect detection method, model, training method, device and equipment
CN111461101A (en) Method, device and equipment for identifying work clothes mark and storage medium
CN110473181A (en) Screen content image based on edge feature information without ginseng quality evaluating method
CN113643260A (en) Method, apparatus, device, medium and product for detecting image quality
CN112328822A (en) Picture pre-labeling method and device and terminal equipment
CN105488470A (en) Method and apparatus for determining character attribute information
CN113177397B (en) Table adjusting method, device, equipment and storage medium
CN112581001B (en) Evaluation method and device of equipment, electronic equipment and readable storage medium
CN113298161A (en) Image recognition model testing method and device, computer equipment and storage medium
CN111179245B (en) Image quality detection method, device, electronic equipment and storage medium
CN117197479A (en) Image analysis method, device, computer equipment and storage medium applying corn ear outer surface
CN117078670A (en) Production control system of cloud photo frame
CN116596903A (en) Defect identification method, device, electronic equipment and readable storage medium
CN116597246A (en) Model training method, target detection method, electronic device and storage medium
CN108074240B (en) Recognition method, recognition apparatus, computer-readable storage medium, and program product
CN115984786A (en) Vehicle damage detection method and device, terminal and storage medium
CN113450381B (en) System and method for evaluating accuracy of image segmentation model
CN113537407B (en) Image data evaluation processing method and device based on machine learning
CN113610856B (en) Method and device for training image segmentation model and image segmentation
CN112668637B (en) Training method, recognition method and device of network model and electronic equipment
CN113870210A (en) Image quality evaluation method, device, equipment and storage medium
CN112200222A (en) Model training apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant