CN109584208B - Inspection method for intelligent identification model of industrial structure defects

Info

Publication number
CN109584208B
Authority
CN
China
Prior art keywords
identification
threshold value
frame
confidence
coincidence
Prior art date
Legal status
Active
Application number
CN201811237173.XA
Other languages
Chinese (zh)
Other versions
CN109584208A (en)
Inventor
王铁军
范学领
李鸿宇
张钰
蒋昊南
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University
Priority to CN201811237173.XA
Publication of CN109584208A
Application granted
Publication of CN109584208B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an inspection method for an intelligent identification model of industrial structure defects, which comprises the following steps: 1) a preparation stage: an inspection data set is formed through image acquisition, sorting and screening, the objects to be detected in each picture are marked with rectangular frames, and the images in the inspection data set are then fed into the identification network under inspection to obtain its identification results; 2) a confidence threshold is selected according to the engineering requirements, and the coincidence value CAr between each identification mark frame whose confidence exceeds the threshold and each preset answer mark frame in a single image is calculated; 3) the coincidence matrix of the single image is constructed; 4) a coincidence threshold is selected according to the engineering requirements, each identification frame is judged correct or incorrect, and the counts are taken; 5) the identification detection rate and accuracy of the model under evaluation are calculated. The method solves the distortion of model evaluation results caused by large-range jitter and overlap of identification frames when the traditional mAP inspection method is used to score the recognition of multi-scale divisible targets.

Description

Inspection method for intelligent identification model of industrial structure defects
Technical Field
The invention belongs to the technical field of artificial intelligence, and particularly relates to a method for inspecting an intelligent identification model for industrial structural defects.
Background
Artificial intelligence has become a focus of global research and industrial development, and progress in the field is driving an unprecedented wave of technological innovation across industries; at present, artificial intelligence technology is beginning to permeate many aspects of production and daily life. In traditional industry, safety monitoring consumes enormous manpower and material resources, and artificial intelligence technology, in particular intelligent recognition models, has become a new and efficient solution to this problem.
The paper "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks", published in 2015 by Ren et al. [1], proposed a method that locates and identifies multiple classes of objects directly from an image with high speed and high accuracy; its structure is shown in FIG. 1. The model first extracts features and information from the image with a deep convolutional neural network to form a feature map, then introduces the concepts of anchors and candidate boxes and uses a Region Proposal Network (RPN) to divide the candidate boxes into two classes (object and background), and finally uses a Softmax classifier, within the boundaries of the selected candidate boxes, to classify and identify the features extracted at the corresponding positions of the feature map. Since then, several intelligent recognition networks with similar functionality have been proposed. Their common features are: each identified object is marked with a rectangular box fully defined in the image by four parameters, and a single object gives rise to several identification boxes with small-range jitter. In scientific research and engineering practice, a non-maximum suppression (NMS) processing algorithm is commonly used to overcome the repeated identification of a single object.
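The NMS step mentioned above can be sketched as follows in Python; this is a minimal illustration rather than the patent's own procedure, and the (x1, y1, x2, y2) box format and the 0.5 IoU threshold are assumptions chosen for the example.

```python
# Minimal non-maximum suppression (NMS) sketch: keep the highest-confidence box
# and drop any remaining box whose IoU with it exceeds the threshold.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """boxes: list of (x1, y1, x2, y2); scores: matching confidences.
    Returns the indices of the boxes that survive suppression."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep
```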
Whether an intelligent identification model can complete the identification task effectively and reliably is the foremost concern when it is applied to industrial defect identification. Model inspection indices are therefore crucial for algorithm development, scientific research and engineering application, and an objective inspection method is a precondition for model optimization and for the acceptance of industrial products. The mean average precision (mAP) commonly used in the intelligent identification field is an objective and effective inspection index, but in practice the mAP obtained when inspecting industrial defect identification is far below the level the same algorithm reaches on general object recognition, even when the visual results differ little. Repeated network training and inspection studies show that the problem is ubiquitous for a special class of targets, the multi-scale divisible objects: any part of the image of such an object can itself be regarded as an independent object, a feature illustrated schematically in FIG. 2. As a result, the coverage of the jittered identification boxes for a single object becomes extremely large, the non-maximum suppression (NMS) processing algorithm is no longer effective, and the model inspection index obtained from it is no longer objective or reliable.
Reference documents:
[1] Ren S, He K, Girshick R, et al. Faster R-CNN: towards real-time object detection with region proposal networks[C]//International Conference on Neural Information Processing Systems. MIT Press, 2015: 91-99.
disclosure of Invention
The invention aims to provide a simple, efficient and practically useful inspection method for intelligent identification models of industrial structure defects, which solves the distortion of model evaluation results caused by large-range jitter and overlap of identification frames when the traditional mAP inspection method is used to score the recognition of multi-scale divisible targets.
The invention is realized by the following technical scheme:
an inspection method for an industrial structure defect intelligent identification model comprises the following steps:
1) a preparation stage: an inspection data set is formed through image acquisition, sorting and screening; in each picture the objects to be detected are marked with rectangular frames, and the set of marked coordinates forms the preset answer used to evaluate the model; the images in the inspection data set are then fed into the identification network under inspection and its identification result is obtained, in the form of a set of mark-frame coordinates together with their confidences;
2) selecting confidence threshold values according to engineering requirements, and calculating the coincidence value CAr of each recognition mark frame with the confidence coefficient higher than the threshold value and each preset answer mark frame in a single image;
3) constructing a coincidence degree matrix of a single image;
4) selecting a coincidence degree threshold value according to engineering requirements, judging whether the identification frame is correct or incorrect and counting the number;
5) and calculating the identification detection rate and accuracy of the evaluation model according to the following formula:
detection rate = (number of correctly detected preset answer mark frames) / (total number of preset answer mark frames)
accuracy = (number of accurate identification mark frames) / (number of identification mark frames with confidence above the threshold)
the invention is further improved in that in the step 2), the confidence threshold value is 0.5 for the general task, and the confidence threshold value is 0.1 for the high-standard task.
A further improvement of the present invention is that the calculation formula adopted in step 2) is as follows:
(the formula defining CAr(Ai, Dj) is reproduced as an image in the original document)
in the formula, Ai denotes any preset answer mark frame, Dj denotes any identification mark frame with confidence above the threshold, S_Ai and S_Dj denote their areas, and S(Ai ∩ Dj) denotes the area of the region where the two intersect.
The invention further improves that in the step 3), a coincidence ratio matrix of a single image is constructed according to the following format:
CArMatrix = [ CAr(Ai, Dj) ], with one row for each identification mark frame Dj whose confidence exceeds the threshold and one column for each preset answer mark frame Ai
each row of the matrix represents an identification mark frame with the confidence coefficient higher than a threshold value, each column represents a preset answer mark frame, and each element represents the coincidence value of the current row and the current column mark frame.
The invention is further improved in that in the step 4), the identification frames are judged to be correct and incorrect according to the following criteria and the number of the identification frames is counted:
401) for each column of the matrix, if at least one element is greater than or equal to the coincidence threshold, the preset answer mark frame corresponding to that column is considered to have been correctly detected;
402) for each row of the matrix, if at least one element is greater than or equal to the coincidence threshold, the identification mark frame (with confidence above the threshold) corresponding to that row is considered accurate, i.e. it does not fall on blank background; the number of correctly detected answer frames and the number of accurate identification frames are counted accordingly.
A further improvement of the invention is that for a general task the overlap ratio threshold is taken to be 0.5 and for a high standard task the overlap ratio threshold is taken to be 0.8.
The invention has the following beneficial technical effects:
the invention provides a new algorithm which can be widely used for detecting the industrial structure defect recognition model according to the common multi-scale divisible characteristics of the industrial structure defects by aiming at objectively evaluating the actual use effect of the intelligent recognition model. It has the following advantages:
First: the algorithm is simple and clear and easy to implement as a computer program; the matrix construction and calculation it involves are well suited to multi-threaded or high-performance GPU acceleration, so it is efficient.
Second: by defining the coincidence value, the algorithm relaxes the scale requirements on the identification frame and the mark frame; this change matches the multi-scale divisible character common to industrial structure defects and avoids the numerical distortion that arises when the overlap calculation of the traditional mAP inspection method is applied instead of the coincidence value.
Third: by defining the coincidence matrix, the algorithm simplifies both the practical interpretation and the computation, relaxes the one-to-one correspondence between preset answers and recognition results required by various classical models, matches the multi-scale divisible character common to industrial structure defects, and avoids the numerical distortion that such inspection methods introduce for these reasons.
Fourth: the two key indices provided by the algorithm, the detection rate and the accuracy, follow the classical way such indices are defined for similar problems, so the method has very high practical value.
In conclusion, the inspection method for intelligent identification models of industrial structure defects provided by the invention has been verified in practice, which ensures the feasibility and effectiveness of the algorithm. The two key indices it provides can easily be combined with traditional algorithms into a single-value evaluation, so the algorithm serves as a core with ample room for extension. Moreover, the scope of application of the invention is not limited to inspecting recognition models for objects with multi-scale divisible features: it can also be used for traditional object inspection, where it gives results whose trend is similar to that of the classical inspection methods.
Drawings
FIG. 1 is a schematic structural diagram of the advanced target recognition network Faster R-CNN.
FIG. 2 is a schematic diagram of the multi-scale divisibility of industrial damage objects, in which FIG. 2(a) is an example of crack identification (a typical multi-scale divisible industrial damage to be detected) and FIG. 2(b) is an example of animal identification (a general object without multi-scale divisible features).
FIG. 3 is a diagram illustrating a comparison between the redefined overlap ratio index and the overlap ratio index used in the conventional algorithm, wherein FIG. 3(a) is the overlap ratio index IoU defined by the conventional algorithm, and FIG. 3(b) is the overlap ratio index CAr newly defined by the present invention.
FIG. 4 is a diagram illustrating the calculation process of the detection rate and the accuracy rate by defining the coincidence degree matrix according to the present invention.
Fig. 5 shows three verification images prepared in the embodiment, including the original image and the preset answer mark visualization box, which are consistent with the object coordinates listed in the embodiment.
FIG. 6 shows the recognition result obtained by the network for automatic recognition in the embodiment, and the blocks and their confidence levels are labeled in the figure.
Detailed Description
The invention is further described below with reference to the following figures and examples.
The invention provides a method for inspecting an intelligent industrial defect identification model, which comprises the following specific steps:
step 1, in a preparation stage, an inspection data set is formed through image acquisition, sorting and screening, an object to be inspected is marked by using a rectangular frame for each picture, and a model formed by a plurality of marked coordinates is evaluated to form a preset answer (Grountruth). And then, inputting the images in the inspection data set into an identification network to be inspected, and acquiring the identification result of the images, wherein the identification result is in the form of coordinates of a plurality of marking frames and Confidence degrees (Confidence). After a group of preset answers, corresponding mark coordinates and confidence degrees of the test are obtained through the steps, the test is completed according to the following 5 steps.
Step 2, a suitable confidence threshold is selected according to the engineering requirements, generally 0.5 (general task) or 0.1 (high-standard task), and the coincidence value (CAr) between each identification mark frame whose confidence exceeds the threshold and each preset answer mark frame in a single image is calculated according to the following formula:
(the formula defining CAr(Ai, Dj) is reproduced as an image in the original document)
in the formula, Ai denotes any preset answer mark frame, Dj denotes any identification mark frame with confidence above the threshold, S_Ai and S_Dj denote their areas, and S(Ai ∩ Dj) denotes the area of the region where the two intersect.
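As a reading aid, the CAr computation can be sketched in Python as below. Since the formula itself appears only as an image in the original document, the normalisation by the smaller of the two box areas is an assumption, chosen because it matches the stated intent of relaxing the scale requirement relative to IoU; the (x1, y1, x2, y2) box format is likewise an assumption of the example.

```python
# Sketch of the coincidence value CAr between a preset answer box A and an
# identification box D. ASSUMPTION: CAr = intersection area / min(area_A, area_D);
# the patent's own formula is given as an image and is not reproduced here.
def box_area(box):
    x1, y1, x2, y2 = box
    return max(0.0, x2 - x1) * max(0.0, y2 - y1)

def coincidence_value(a, d):
    """CAr between answer box `a` and identification box `d`, both (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], d[0]), max(a[1], d[1])
    ix2, iy2 = min(a[2], d[2]), min(a[3], d[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    denom = min(box_area(a), box_area(d))
    return inter / denom if denom > 0 else 0.0
```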
Step 3, the coincidence matrix (CArMatrix) of a single image is constructed in the following format:
CArMatrix = [ CAr(Ai, Dj) ], with one row for each identification mark frame Dj whose confidence exceeds the threshold and one column for each preset answer mark frame Ai
each row of the matrix represents an identification mark frame with the confidence coefficient higher than a threshold value, each column represents a preset answer mark frame, and each element represents the coincidence value of the current row and the current column mark frame.
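A sketch of step 3, reusing the `coincidence_value` function from the sketch above; representing a detection as a (box, confidence) pair is an assumption of the example.

```python
# Build the coincidence matrix for one image: one row per identification box
# whose confidence exceeds the confidence threshold, one column per preset
# answer box; each element is the CAr of that row/column pair.
def car_matrix(answer_boxes, detections, conf_threshold=0.5):
    """detections: list of (box, confidence). Returns (matrix, kept_boxes)."""
    kept = [box for box, conf in detections if conf > conf_threshold]
    matrix = [[coincidence_value(a, d) for a in answer_boxes] for d in kept]
    return matrix, kept
```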
Step 4, a suitable coincidence threshold (representing the localization accuracy required of the identified objects) is selected according to the engineering requirements, generally 0.5 (general task) or 0.8 (high-standard task). The identification frames are then judged correct or incorrect and counted according to the following criteria:
401) for each column of the matrix, if at least one element is greater than or equal to the coincidence threshold, the preset answer mark frame corresponding to that column is considered to have been correctly detected;
402) for each row of the matrix, if at least one element is greater than or equal to the coincidence threshold, the identification mark frame (with confidence above the threshold) corresponding to that row is considered accurate, i.e. it does not fall on blank background; the number of correctly detected answer frames and the number of accurate identification frames are counted accordingly.
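A sketch of the counting rule in step 4, operating on the matrix produced by `car_matrix` above (rows are identification boxes, columns are preset answer boxes).

```python
# Count detections and accurate identifications from the coincidence matrix:
# a column (answer box) is "detected" if any element in it reaches the
# coincidence threshold; a row (identification box) is "accurate" if any
# element in it reaches the threshold.
def count_hits(matrix, car_threshold=0.5):
    if not matrix or not matrix[0]:
        return 0, 0
    n_cols = len(matrix[0])
    detected = sum(any(row[j] >= car_threshold for row in matrix) for j in range(n_cols))
    accurate = sum(any(value >= car_threshold for value in row) for row in matrix)
    return detected, accurate
```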
Step 5, calculating the identification detection rate and the accuracy of the evaluation model according to the following formula:
detection rate = (number of correctly detected preset answer mark frames) / (total number of preset answer mark frames)
accuracy = (number of accurate identification mark frames) / (number of identification mark frames with confidence above the threshold)
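Putting the steps together for one image, a sketch using the functions from the previous sketches; the thresholds 0.5/0.5 correspond to the "general task" settings mentioned in the text, and the handling of empty denominators is a convention chosen for the example, not specified by the patent.

```python
# Per-image evaluation: build the coincidence matrix, count hits, and compute
# the two indices of step 5.
def evaluate_image(answer_boxes, detections, conf_threshold=0.5, car_threshold=0.5):
    matrix, kept = car_matrix(answer_boxes, detections, conf_threshold)
    detected, accurate = count_hits(matrix, car_threshold)
    # Convention for the degenerate cases (no answers / no kept detections) is
    # an assumption of this sketch.
    detection_rate = detected / len(answer_boxes) if answer_boxes else 0.0
    accuracy = accurate / len(kept) if kept else 0.0
    return detection_rate, accuracy
```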
and 6, comprehensively analyzing the target identification effects of the network used in scientific research, technical development and engineering application according to the specific engineering task and requirements and on the basis of the calculated detectable rate and accuracy, and performing the operations of model optimization, parameter setting, model structure adjustment, engineering model acceptance and the like. Specific examples are as follows:
For model development work, the quality of the model being built can be evaluated with the above indices.
For engineering development, application and acceptance inspection, target values of the detection rate and the accuracy are set according to the actual situation and compared with the values obtained by the method, so as to complete the development and acceptance of intelligent detection equipment.
FIG. 3 is a diagram illustrating a comparison between the redefined overlap ratio index and the overlap ratio index used in the conventional algorithm, wherein FIG. 3(a) is the overlap ratio index IoU defined by the conventional algorithm, and FIG. 3(b) is the overlap ratio index CAr newly defined by the present invention. FIG. 4 is a diagram illustrating the calculation process of the detection rate and the accuracy rate by defining the coincidence degree matrix according to the present invention.
Example (b):
testing the trained intelligent crack recognition network based on the fasterR-CNN
First, an inspection set containing 88 images and their preset answer labels was prepared in the following form:
(the table of preset answer label coordinates is reproduced as an image in the original document)
the image is illustrated in fig. 5, which shows three verification images prepared in the embodiment, including the original image and the visual box of the preset answer mark thereof, where the visual box is consistent with the object coordinates listed in the embodiment.
The inspection images were then fed into the network and the identification results were obtained in the form described above, each identification frame comprising five parameters (four positioning parameters and a confidence); the visualized results are shown in FIG. 6.
The detection rate and accuracy of each image were then obtained according to the calculation steps above, and the results of the 88 images were averaged to give the overall performance of the network; the final result is shown in the table below.
              Average detection rate   Average accuracy   Traditional mAP test
Result (%)    96.63                    90.46              41.6
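The averaging over the inspection set can be sketched as follows, reusing `evaluate_image` from the detailed description above; pairing each image's answer boxes with its detections as a tuple is an assumption of the example.

```python
# Average the per-image detection rate and accuracy over the whole inspection
# set to obtain overall figures of the kind reported in the table.
def evaluate_dataset(samples, conf_threshold=0.5, car_threshold=0.5):
    """samples: iterable of (answer_boxes, detections) pairs, one per image."""
    rates, accuracies = [], []
    for answer_boxes, detections in samples:
        rate, acc = evaluate_image(answer_boxes, detections, conf_threshold, car_threshold)
        rates.append(rate)
        accuracies.append(acc)
    n = len(rates)
    return (sum(rates) / n, sum(accuracies) / n) if n else (0.0, 0.0)
```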
Taking the recognition results visualized in FIG. 6 as a reference and comparing the score of the evaluation method of the invention with the traditional mAP score, the underestimation of the model's capability by the traditional mAP test and the superiority of the evaluation method of the invention are evident.

Claims (2)

1. An inspection method for an industrial structure defect intelligent identification model is characterized by comprising the following steps:
1) a preparation stage, forming an inspection data set through image acquisition, sorting and screening, marking an object to be inspected by using a rectangular frame for each image to obtain a plurality of preset answer mark frames, then inputting the images in the inspection data set into an identification network to be inspected, and obtaining an identification result of the images to obtain a plurality of identification mark frames with confidence;
2) selecting confidence threshold values according to engineering requirements, and calculating the coincidence value CAr of each recognition mark frame with the confidence coefficient higher than the threshold value and each preset answer mark frame in a single image; for a common task, the confidence coefficient threshold value is 0.5, and for a high-standard task, the confidence coefficient threshold value is 0.1; the calculation formula adopted by the coincidence value CAr is as follows:
(the formula defining the coincidence value CAr(Ai, Dj) is reproduced as an image in the original document)
in the formula, Ai denotes any preset answer mark frame, Dj denotes any identification mark frame with confidence above the threshold, S_Ai and S_Dj denote their areas, and S(Ai ∩ Dj) denotes the area of the region where the two intersect;
3) constructing a coincidence degree table of a single image; the coincidence degree table of a single image is constructed according to the following format:
coincidence table: one row for each identification mark frame Dj whose confidence exceeds the threshold, one column for each preset answer mark frame Ai, with each entry equal to CAr(Ai, Dj)
in the table, each row label represents an identification mark frame with confidence higher than the threshold, each column label represents a preset answer mark frame, the internal entries are the matrix elements, and each element represents the coincidence value between the mark frames of its row and its column;
4) selecting a coincidence degree threshold value according to engineering requirements, and judging whether the identification mark frames are correct and counting their number according to the following criteria:
401) regarding the elements in each column in the table, if at least one value is greater than or equal to the coincidence degree threshold value, determining that the preset answer mark frames corresponding to the columns in the table are correctly detected, and counting the number of the correctly detected preset answer mark frames as the detection number;
402) regarding the elements in each row in the table, if at least one value is greater than or equal to the contact ratio threshold value, the identification mark frames with the confidence degrees higher than the threshold value corresponding to the rows in the table are considered to be accurately detected, and the number of the identification mark frames with the accurately detected confidence degrees higher than the threshold value is counted as the accurate detection number;
5) and calculating the identification detection rate and accuracy of the evaluation model according to the following formula:
detection rate = (detection number) / (total number of preset answer mark frames)
accuracy = (accurate detection number) / (total number of identification mark frames with confidence higher than the threshold)
2. an inspection method for an intelligent identification model of industrial structural defects according to claim 1, wherein in the step 4), the threshold value of the contact ratio is 0.5 for general tasks, and 0.8 for high-standard tasks.
CN201811237173.XA 2018-10-23 2018-10-23 Inspection method for intelligent identification model of industrial structure defects Active CN109584208B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811237173.XA CN109584208B (en) 2018-10-23 2018-10-23 Inspection method for intelligent identification model of industrial structure defects

Publications (2)

Publication Number Publication Date
CN109584208A CN109584208A (en) 2019-04-05
CN109584208B true CN109584208B (en) 2021-02-02

Family

ID=65920383

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811237173.XA Active CN109584208B (en) 2018-10-23 2018-10-23 Inspection method for intelligent identification model of industrial structure defects

Country Status (1)

Country Link
CN (1) CN109584208B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110689538B (en) * 2019-10-12 2022-03-29 太原科技大学 Tunnel lining crack image detection method
CN111710412B (en) * 2020-05-29 2023-07-25 北京百度网讯科技有限公司 Diagnostic result verification method and device and electronic equipment
CN113034498B (en) * 2021-04-28 2023-11-28 江苏欧密格光电科技股份有限公司 LED lamp bead defect detection and assessment method, device, computer equipment and medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6260483B2 (en) * 2014-07-16 2018-01-17 株式会社デンソー Target detection device
CN105488478B (en) * 2015-12-02 2020-04-07 深圳市商汤科技有限公司 Face recognition system and method
CN108108887A (en) * 2017-12-18 2018-06-01 广东广业开元科技有限公司 A kind of Internet of Things based on multidimensional data is traveled out the intelligent evaluation model of row index
CN108460341B (en) * 2018-02-05 2020-04-07 西安电子科技大学 Optical remote sensing image target detection method based on integrated depth convolution network
CN108681693B (en) * 2018-04-12 2022-04-12 南昌大学 License plate recognition method based on trusted area

Also Published As

Publication number Publication date
CN109584208A (en) 2019-04-05

Similar Documents

Publication Publication Date Title
CN111080622B (en) Neural network training method, workpiece surface defect classification and detection method and device
CN109584208B (en) Inspection method for intelligent identification model of industrial structure defects
CN108257114A (en) A kind of transmission facility defect inspection method based on deep learning
CN108346144B (en) Automatic bridge crack monitoring and identifying method based on computer vision
CN110473173A (en) A kind of defect inspection method based on deep learning semantic segmentation
CN101576956B (en) On-line character detection method based on machine vision and system thereof
CN108711148B (en) Tire defect intelligent detection method based on deep learning
CN111401419A (en) Improved RetinaNet-based employee dressing specification detection method
CN110490842A (en) A kind of steel strip surface defect detection method based on deep learning
CN108830332A (en) A kind of vision vehicle checking method and system
CN109544522A (en) A kind of Surface Defects in Steel Plate detection method and system
CN104268538A (en) Online visual inspection method for dot matrix sprayed code characters of beverage cans
CN107292310A (en) A kind of circular pointer dial plate vision positioning and automatic reading method
CN110232379A (en) A kind of vehicle attitude detection method and system
CN109284779A (en) Object detection method based on deep full convolution network
CN111103307A (en) Pcb defect detection method based on deep learning
CN115187527A (en) Separation and identification method for multi-source mixed ultrahigh frequency partial discharge spectrum
CN111815573B (en) Coupling outer wall detection method and system based on deep learning
CN113962951A (en) Training method and device for detecting segmentation model, and target detection method and device
CN109615610B (en) Medical band-aid flaw detection method based on YOLO v2-tiny
CN115082444A (en) Copper pipe weld defect detection method and system based on image processing
CN115100188A (en) Steel plate surface quality automatic grading and judging method for hierarchical defect analysis
CN117114420B (en) Image recognition-based industrial and trade safety accident risk management and control system and method
CN114219753A (en) Power equipment surface defect detection method based on deep learning and terminal
CN111414855B (en) Telegraph pole sign target detection and identification method based on end-to-end regression model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant