CN111340783A - Real-time cloth defect detection method - Google Patents

Real-time cloth defect detection method

Info

Publication number
CN111340783A
Authority
CN
China
Prior art keywords
cloth
image
defect
pixels
pixel
Prior art date
Legal status
Pending
Application number
CN202010122300.2A
Other languages
Chinese (zh)
Inventor
张发恩
陈冰
张泽覃
黄泽
Current Assignee
Alnnovation Guangzhou Technology Co ltd
Original Assignee
Alnnovation Guangzhou Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Alnnovation Guangzhou Technology Co ltd filed Critical Alnnovation Guangzhou Technology Co ltd
Priority to CN202010122300.2A
Publication of CN111340783A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/0006Industrial image inspection using a design-rule based approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30124Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention discloses a real-time cloth defect detection method, which relates to cloth defect detection technology and comprises the following steps: collecting a cloth image to be detected with a preset first size; performing feature extraction on the cloth image to be detected with a feature extractor to obtain a cloth feature map corresponding to the cloth image to be detected; sampling the cloth feature map with a sliding window of a preset second size and a preset step size to obtain a plurality of sampling feature maps; classifying each sampling feature map with a cloth classification model to obtain a cloth classification result corresponding to each sampling feature map; and generating a classification result heat map from the cloth classification results, and processing the classification result heat map to obtain the cloth defect areas in the cloth image to be detected and the defect type of each cloth defect area. The invention achieves real-time detection while maintaining accuracy.

Description

Real-time cloth defect detection method
Technical Field
The invention relates to the technical field of cloth defect detection, in particular to a real-time cloth defect detection method.
Background
In the textile production process, surface defects are a key factor affecting cloth quality. Cloth inspection has long been performed manually, but manual inspection depends on the experience and proficiency of the inspector, and its evaluation standard is unstable and inconsistent, so false detections and missed detections occur frequently. Replacing manual inspection with machine-vision-based automatic defect recognition is therefore an inevitable trend in the development of the textile industry.
Conventional machine vision methods fall into the following categories: statistics-based (GLCM, LBP, MEAN-STD), machine-learning-model-based (GMM, MRF), frequency-domain-based (FFT, wavelet transform), filter-based (Sobel, Gabor), and texture-unit-based (self-similarity calculation, etc.). These methods are either too computationally complex, taking tens of seconds per image; or require parameter tuning for each material; or are overly sensitive to variations in the material and therefore lack robustness.
Methods based on object detection also fall short: because defect shapes vary greatly, representing defects with rectangular boxes marks large areas of normal cloth as defective; because the defect proportion is extremely low, detection accuracy struggles to meet production requirements; and most object detection algorithms cannot meet real-time requirements.
Semantic segmentation methods based on deep learning impose high labeling requirements, perform poorly on overlapping defects, and generally cannot meet real-time requirements either.
Disclosure of Invention
The invention aims to provide a real-time cloth defect detection method.
In order to achieve the purpose, the invention adopts the following technical scheme:
The real-time cloth defect detection method comprises a process of generating a feature extractor and a cloth classification model in advance, which specifically comprises the following steps:
step A1, collecting an original large cloth image with a preset first size;
step A2, performing pixel-level labeling of the cloth defect areas in the original large cloth image to obtain a labeled large cloth image;
step A3, randomly sampling the labeled large cloth image to obtain a plurality of cloth sample images with a preset second size;
step A4, performing data preprocessing on each cloth sample image to obtain a plurality of preprocessed cloth images;
step A5, training on the preprocessed cloth images to obtain the feature extractor;
and step A6, training on the preprocessed cloth images to obtain the cloth classification model.
The method also comprises a real-time cloth defect detection process, which specifically comprises the following steps:
step S1, collecting a cloth image to be detected with the preset first size;
step S2, performing feature extraction on the cloth image to be detected with the feature extractor to obtain a cloth feature map corresponding to the cloth image to be detected;
step S3, sampling the cloth feature map with a sliding window of the preset second size and a preset step size to obtain a plurality of sampling feature maps;
step S4, performing cloth classification on each sampling feature map with the cloth classification model to obtain a cloth classification result corresponding to each sampling feature map;
and step S5, generating a classification result heat map from the cloth classification results, and processing the classification result heat map to obtain the cloth defect areas in the cloth image to be detected and the defect type of each cloth defect area.
As a preferred aspect of the present invention, the preset first size is 8192 pixels by 8192 pixels.
As a preferred aspect of the present invention, the predetermined second size is 256 pixels by 256 pixels.
In a preferred embodiment of the present invention, in step A4, the data preprocessing includes performing random color perturbation, and/or random cropping, and/or histogram equalization, and/or random horizontal flipping, and/or random vertical flipping, and/or random erasing, and/or normalization on each cloth sample image.
As a preferred scheme of the invention, the feature extractor is the convolutional neural network formed by the layers of SE-ResNet50, a network based on residual connections and a channel attention mechanism, before its global average pooling layer.
As a preferred aspect of the present invention, the network structure of the cloth classification model includes a global pooling layer, a fully connected layer connected to the global pooling layer, and a softmax layer connected to the fully connected layer.
As a preferred scheme of the present invention, in the training process of the cloth classification model, the loss function used is a weighted loss function, whose calculation formula is:

TotalLoss = λ1·L1 + λ2·L2

where, with the feature vectors and class weight vectors normalized,

$$L_1 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos\theta_{y_i}-m}}{e^{\cos\theta_{y_i}-m}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

$$L_2 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos(\theta_{y_i}+m)}}{e^{\cos(\theta_{y_i}+m)}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

wherein,
TotalLoss represents the weighted loss function;
L1 represents the cosface loss function;
L2 represents the arcface loss function;
λ1 and λ2 are the weights of the cosface loss function and the arcface loss function, respectively;
M represents the number of samples in the current training batch;
m represents a model hyper-parameter (the margin) of the cloth classification model;
yi represents the label of the i-th training sample in the current training batch;
θj represents the angle between a sample's feature vector and the weight vector of class j;
i indexes the training samples in the current training batch;
j indexes the classes other than yi in the denominator sum.
As a preferred embodiment of the present invention, λ1 is set to 0.5 and λ2 is set to 0.5.
As a preferable solution of the present invention, before performing step S2, the method further includes performing image preprocessing on the cloth image to be detected, where the image preprocessing includes histogram equalization and/or normalization.
As a preferable embodiment of the present invention, in the step S3, the preset step size is 128 pixels.
In a preferred embodiment of the present invention, in step S4, the cloth classification result is normal cloth, miscellaneous threads, wrinkles, drawn threads, stains, or holes.
As a preferable embodiment of the present invention, step S5 specifically includes:
step S51, constructing an initial heat map, and adding one to the pixel values of the corresponding area of the initial heat map for each cloth classification result, to obtain a classification result heat map containing a pixel value for each pixel position;
step S52, comparing each pixel value with a preset pixel threshold:
if the pixel value is smaller than the pixel threshold, discarding the pixel position;
if the pixel value is not smaller than the pixel threshold, extracting the corresponding pixel position;
step S53, outputting the connected regions formed by the extracted pixel positions as the cloth defect areas in the cloth image to be detected;
step S54, for each cloth defect area, counting the number of pixels belonging to each cloth classification result within that cloth defect area, and sorting these pixel counts to obtain a pixel sequence;
and step S55, outputting the cloth classification result with the largest pixel count in the pixel sequence as the defect type of that cloth defect area in the cloth image to be detected.
The invention has the beneficial effects that:
1) high-resolution large images of 8192 pixels by 8192 pixels can be processed in real time on an RTX 2080 (two such images per second);
2) by classifying small patches and sliding a window over the feature map of the large image, defects are represented at a finer granularity and normal samples are better utilized, which effectively improves cloth defect detection accuracy, so real-time detection is achieved while maintaining precision.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments of the present invention will be briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a flowchart illustrating a process of pre-generating a feature extractor and a cloth classification model according to an embodiment of the present invention.
Fig. 2 is a flowchart illustrating a process of detecting a defect in cloth in real time according to an embodiment of the present invention.
FIG. 3 is a block flow diagram of the prediction phase according to an embodiment of the invention.
Fig. 4 is a schematic flow chart illustrating a process of performing post-processing on a cloth classification result according to an embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further explained by the specific implementation mode in combination with the attached drawings.
The drawings are for the purpose of illustration only, are schematic rather than actual representations, and are not to be construed as limiting the present patent; to better illustrate the embodiments of the present invention, some parts of the drawings may be omitted, enlarged, or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings, and their descriptions, may be omitted.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if the terms "upper", "lower", "left", "right", "inner", "outer", etc. are used for indicating the orientation or positional relationship based on the orientation or positional relationship shown in the drawings, it is only for convenience of description and simplification of description, but it is not indicated or implied that the referred device or element must have a specific orientation, be constructed in a specific orientation and be operated, and therefore, the terms describing the positional relationship in the drawings are only used for illustrative purposes and are not to be construed as limitations of the present patent, and the specific meanings of the terms may be understood by those skilled in the art according to specific situations.
In the description of the present invention, unless otherwise explicitly specified or limited, the term "connected" or the like, if appearing to indicate a connection relationship between the components, is to be understood broadly, for example, as being fixed or detachable or integral; can be mechanically or electrically connected; they may be directly connected or indirectly connected through intervening media, or may be connected through one or more other components or may be in an interactive relationship with one another. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Based on the technical problems in the prior art, the invention provides a real-time cloth defect detection method, as shown in fig. 1, which comprises a process of pre-generating a feature extractor and a cloth classification model, and specifically comprises the following steps:
step A1, collecting an original large cloth image with a preset first size;
step A2, performing pixel-level labeling of the cloth defect areas in the original large cloth image to obtain a labeled large cloth image;
step A3, randomly sampling the labeled large cloth image to obtain a plurality of cloth sample images with a preset second size;
step A4, performing data preprocessing on each cloth sample image to obtain a plurality of preprocessed cloth images;
step A5, training on the preprocessed cloth images to obtain a feature extractor;
and step A6, training on the preprocessed cloth images to obtain a cloth classification model.
The method also comprises a real-time cloth defect detection process, as shown in fig. 2, which specifically comprises the following steps:
step S1, collecting a cloth image to be detected with the preset first size;
step S2, performing feature extraction on the cloth image to be detected with the feature extractor to obtain a cloth feature map corresponding to the cloth image to be detected;
step S3, sampling the cloth feature map with a sliding window of the preset second size and a preset step size to obtain a plurality of sampling feature maps;
step S4, performing cloth classification on each sampling feature map with the cloth classification model to obtain a cloth classification result corresponding to each sampling feature map;
and step S5, generating a classification result heat map from the cloth classification results, and processing the classification result heat map to obtain the cloth defect areas in the cloth image to be detected and the defect type of each cloth defect area.
Specifically, in this embodiment, the real-time cloth defect detection method of the present invention is a real-time method for detecting multiple defect types on fast-moving cloth. It can process two high-resolution cloth images of 8192 pixels by 8192 pixels per second, where an 8192 pixel by 8192 pixel image corresponds to an actual cloth area of 0.8 m by 0.8 m.
The method comprises a training stage in which the feature extractor and the cloth classification model are generated in advance. Specifically, in the training stage, defect samples and normal samples are obtained by randomly sampling patches of the preset second size, preferably a fixed small size of 256 pixels by 256 pixels, at the cloth defect locations and in the normal areas of the original large cloth image respectively; random sampling keeps the classes of samples balanced. The feature extractor and the cloth classification model are then trained, where the cloth classification model distinguishes normal cloth from five defect types: miscellaneous threads, wrinkles, drawn threads, stains, and holes.
More specifically, in the training stage, pixel-level labeling is first performed on the defects of the original large cloth image, and 256 pixel by 256 pixel cloth sample images, each corresponding to an actual cloth area of about 2.5 cm by 2.5 cm, are then obtained from the original large cloth image by random sampling. A cloth sample image is preferably used as a defect sample when its defect pixel proportion exceeds 20%, and as a normal sample when its defect pixel proportion is below 1%. Each 256 pixel by 256 pixel cloth sample image is then preprocessed, including but not limited to random color perturbation, random cropping, histogram equalization, random horizontal or vertical flipping, random erasing, and normalization, to obtain a plurality of preprocessed cloth images.
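The following is a minimal sketch (in Python with NumPy) of how the random patch sampling described above could look, assuming the pixel-level annotation is available as a binary defect mask aligned with the original large image. The function and variable names are illustrative; uniform sampling plus filtering is used here as a simplification, whereas the text above samples separately at defect locations and in normal areas to keep the classes balanced.

```python
import numpy as np

def sample_training_patch(image, defect_mask, patch_size=256, rng=None):
    """Crop one random patch and label it by its defect-pixel ratio.

    image: H x W x C array of the original large cloth image.
    defect_mask: H x W binary array from the pixel-level annotation.
    Patches with more than 20% defect pixels become defect samples, patches
    with less than 1% become normal samples, and anything in between is
    discarded (returned with label None).
    """
    rng = rng or np.random.default_rng()
    h, w = defect_mask.shape
    y = int(rng.integers(0, h - patch_size + 1))
    x = int(rng.integers(0, w - patch_size + 1))
    patch = image[y:y + patch_size, x:x + patch_size]
    ratio = defect_mask[y:y + patch_size, x:x + patch_size].mean()
    if ratio > 0.20:
        return patch, "defect"
    if ratio < 0.01:
        return patch, "normal"
    return patch, None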
the above feature extractor preferably uses a neural network model before the global average pooling layer of se-renet 50 based on residual and channel attention mechanisms to extract features, and adds a spatial attention mechanism to the feature extraction to focus on the important region.
In order to make the extracted features more discriminative, that is, to increase the inter-class distance and decrease the intra-class distance, the present invention preferably uses a weighted combination of the cosface and arcface loss functions during training of the cloth classification model. Both are improved versions of the softmax cross-entropy loss. With the feature vector and the weight vector normalized, cosface subtracts a margin constant from the cosine of the angle between the two vectors, with the formula:

$$L_1 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos\theta_{y_i}-m}}{e^{\cos\theta_{y_i}-m}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

while arcface adds a margin constant to the angle between the two vectors, with the formula:

$$L_2 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos(\theta_{y_i}+m)}}{e^{\cos(\theta_{y_i}+m)}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$
Finally, the total loss function is defined as TotalLoss = λ1·L1 + λ2·L2, where λ1 and λ2 can be adjusted dynamically and are preferably both set to 0.5 in the present invention. Further, m is an adjustable model hyper-parameter, and in both the cosface loss L1 and the arcface loss L2, m is preferably set to 0.5.
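A hedged sketch of the weighted cosface plus arcface loss described above is shown below. Following the text, the margin m and the weights λ1 and λ2 are all set to 0.5; the scale factor s applied to the cosine logits is an implementation detail assumed here, as it is not named in this document. The loss operates on the pooled feature vectors rather than on the softmax output of the classification head.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightedMarginLoss(nn.Module):
    """0.5 * cosface + 0.5 * arcface on normalized features and class weights."""
    def __init__(self, feat_dim=2048, num_classes=6,
                 m=0.5, lambda1=0.5, lambda2=0.5, s=30.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.m, self.l1, self.l2, self.s = m, lambda1, lambda2, s

    def forward(self, features, labels):
        # Cosine of the angle between each normalized feature vector
        # and each normalized class weight vector.
        cosine = F.linear(F.normalize(features), F.normalize(self.weight))
        onehot = F.one_hot(labels, cosine.size(1)).float()

        # cosface: subtract the margin from the cosine of the target class.
        cos_logits = self.s * (cosine - self.m * onehot)
        loss_cos = F.cross_entropy(cos_logits, labels)

        # arcface: add the margin to the angle of the target class.
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
        arc_logits = self.s * torch.cos(theta + self.m * onehot)
        loss_arc = F.cross_entropy(arc_logits, labels)

        return self.l1 * loss_cos + self.l2 * loss_arc
```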
The invention also comprises a prediction stage, in which the feature extractor and the cloth classification model generated in the training stage are used to predict the cloth defect areas of the cloth to be detected and their defect types. Specifically, for an 8192 pixel by 8192 pixel original large cloth image, the feature extractor is preferably used to obtain a cloth feature map; a window is then slid over the feature map with a small step to obtain a number of small feature maps as sampling feature maps, and the cloth classification model is called on each of them. The window size preferably corresponds to 256 pixels by 256 pixels of the original cloth image, and the cloth classification results are used to construct a classification result heat map over the 8192 pixel by 8192 pixel image, in which the hot-spot regions are the cloth defect areas.
Furthermore, the method combines the advantages of classification and object detection by classifying at a small scale and sliding a window over the feature map of the large image. Compared with conventional whole-image classification, which performs very poorly because the image resolution is high and the defect proportion is extremely small, the small sliding window focuses on the defect area, greatly increasing the defect proportion within each classified patch and thus greatly improving defect classification performance. Compared with object detection, the small-patch classification training used in the training stage represents defects at a finer granularity and makes use of the large proportion of normal cloth, which is the main reason its precision is higher than that of object detection. In the prediction stage, sliding over the feature map of the original large image greatly simplifies the candidate-box generation (anchors, key points, and the like) used in object detection, which speeds up detection, and the heat-map post-processing reduces the defect localization error. The method can therefore run in real time while maintaining precision.
In a preferred embodiment of the present invention, as shown in fig. 3, the real-time cloth defect detection method of the present invention can process cloth with a width of 1.6 meters at a speed of 1 meter per second; preferably, two line-scan cameras each output one 8192 pixel by 8192 pixel original large cloth image per second as the cloth images to be detected. Before detection, each cloth image to be detected is preferably preprocessed with histogram equalization and normalization, and, to improve concurrency and fit within video memory limits, the two 8192 pixel by 8192 pixel cloth images to be detected are preferably cut into 16 images of 1024 pixels by 8192 pixels. The feature extractor is then used to extract a cloth feature map from each 1024 pixel by 8192 pixel image. A window is then slid over the cloth feature map, preferably a window whose size corresponds to 256 pixels by 256 pixels of the cloth image to be detected, moving from left to right and from top to bottom with a step corresponding to 128 pixels of the cloth image to be detected, to obtain a plurality of sampling feature maps. The cloth classification model classifies the sampling feature map cut out by each sliding window to obtain the corresponding cloth classification result. Finally, the cloth classification results are post-processed to obtain the cloth defect areas in the cloth images to be detected and their defect types: preferably, a 1024 pixel by 8192 pixel initial heat map is initialized, the pixel positions of the region corresponding to each cloth classification result are incremented by one, the connected regions of high values in the heat map are extracted with a threshold as defect regression boxes, and the type of each defect is determined by a voting mechanism.
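The sketch below illustrates, under stated assumptions, the sliding-window prediction on one 1024 pixel by 8192 pixel strip: assuming the backbone downsamples by a factor of 32, a 256 pixel by 256 pixel window of the original image corresponds to an 8 by 8 window on the feature map and a 128-pixel step corresponds to a stride of 4 feature cells; class index 0 is taken to mean normal cloth. The extractor and head are the hypothetical modules sketched earlier.

```python
import torch

@torch.no_grad()
def predict_strip(extractor, head, strip, window=8, stride=4, scale=32):
    """strip: (1, 3, H, W) tensor, e.g. a 1024 x 8192 crop of the large image.
    Returns a per-class heat map of shape (num_classes, H, W)."""
    feat = extractor(strip)                               # (1, C, H/32, W/32)
    _, _, fh, fw = feat.shape
    num_classes = head.fc.out_features
    heat = torch.zeros(num_classes, strip.shape[2], strip.shape[3])

    for fy in range(0, fh - window + 1, stride):
        for fx in range(0, fw - window + 1, stride):
            probs = head(feat[:, :, fy:fy + window, fx:fx + window])
            cls = int(probs.argmax(dim=1))
            if cls == 0:                                   # normal cloth
                continue
            # Add one to the region of the heat map covered by this window.
            y, x = fy * scale, fx * scale
            heat[cls, y:y + window * scale, x:x + window * scale] += 1
    return heat
```

In practice the window crops would be batched before calling the classification head, but the explicit loop keeps the correspondence with the heat-map construction easy to follow.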
As a preferable aspect of the present invention, the predetermined first size is 8192 pixels by 8192 pixels.
As a preferable aspect of the present invention, the predetermined second size is 256 pixels by 256 pixels.
In a preferred embodiment of the present invention, in step A4, the data preprocessing includes subjecting each cloth sample image to random color perturbation, and/or random cropping, and/or histogram equalization, and/or random horizontal flipping, and/or random vertical flipping, and/or random erasing, and/or normalization.
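A hedged sketch of this data preprocessing using torchvision transforms might look as follows; the jitter, crop, and erasing parameters are illustrative assumptions, histogram equalization uses transforms.RandomEqualize (available in recent torchvision versions), and the normalization statistics are placeholders.

```python
from torchvision import transforms

train_preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),  # random color perturbation
    transforms.RandomResizedCrop(256, scale=(0.8, 1.0)),                   # random cropping
    transforms.RandomEqualize(p=1.0),                                      # histogram equalization
    transforms.RandomHorizontalFlip(p=0.5),                                # random horizontal flipping
    transforms.RandomVerticalFlip(p=0.5),                                  # random vertical flipping
    transforms.ToTensor(),                                                  # also scales to [0, 1]
    transforms.RandomErasing(p=0.3),                                        # random erasing
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),        # normalization
])
```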
As a preferred aspect of the present invention, the feature extractor is the convolutional neural network formed by the layers of SE-ResNet50, a network based on residual connections and a channel attention mechanism, before its global average pooling layer.
As a preferable scheme of the invention, the network structure of the cloth classification model comprises a global pooling layer, a fully connected layer connected to the global pooling layer, and a softmax layer connected to the fully connected layer.
As a preferred scheme of the invention, in the training process of the cloth classification model, the loss function adopted is a weighted loss function, whose calculation formula is:

TotalLoss = λ1·L1 + λ2·L2

where, with the feature vectors and class weight vectors normalized,

$$L_1 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos\theta_{y_i}-m}}{e^{\cos\theta_{y_i}-m}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

$$L_2 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos(\theta_{y_i}+m)}}{e^{\cos(\theta_{y_i}+m)}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

wherein,
TotalLoss represents the weighted loss function;
L1 represents the cosface loss function;
L2 represents the arcface loss function;
λ1 and λ2 are the weights of the cosface loss function and the arcface loss function, respectively;
M represents the number of samples in the current training batch;
m represents a model hyper-parameter (the margin) of the cloth classification model;
yi represents the label of the i-th training sample in the current training batch;
θj represents the angle between a sample's feature vector and the weight vector of class j;
i indexes the training samples in the current training batch;
j indexes the classes other than yi in the denominator sum.
As a preferred embodiment of the present invention, λ1 is set to 0.5 and λ2 is set to 0.5.
As a preferred embodiment of the present invention, before performing step S2, the method further includes performing image preprocessing on the cloth image to be detected, where the image preprocessing includes histogram equalization and/or normalization.
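A small sketch of this per-image preprocessing with OpenCV and NumPy is given below; a single-channel (grayscale) input and a simple scale-and-shift normalization are assumptions, since the document does not fix these details.

```python
import cv2
import numpy as np

def preprocess_for_detection(gray_image):
    """gray_image: uint8 array of shape (H, W)."""
    equalized = cv2.equalizeHist(gray_image)           # histogram equalization
    normalized = equalized.astype(np.float32) / 255.0  # scale to [0, 1]
    return (normalized - 0.5) / 0.5                    # normalize to roughly [-1, 1]
```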
In a preferred embodiment of the present invention, in step S3, the preset step size is 128 pixels.
In a preferred embodiment of the present invention, in step S4, the cloth classification result is normal cloth, miscellaneous threads, wrinkles, drawn threads, stains, or holes.
As a preferable aspect of the present invention, as shown in fig. 4, step S5 specifically includes the following steps (a hedged code sketch of this post-processing is given after the steps):
step S51, constructing an initial heat map, and adding one to the pixel values of the corresponding area of the initial heat map for each cloth classification result, to obtain a classification result heat map containing a pixel value for each pixel position;
step S52, comparing each pixel value with a preset pixel threshold:
if the pixel value is smaller than the pixel threshold, discarding the pixel position;
if the pixel value is not smaller than the pixel threshold, extracting the corresponding pixel position;
step S53, outputting the connected regions formed by the extracted pixel positions as the cloth defect areas in the cloth image to be detected;
step S54, for each cloth defect area, counting the number of pixels belonging to each cloth classification result within that cloth defect area, and sorting these pixel counts to obtain a pixel sequence;
and step S55, outputting the cloth classification result with the largest pixel count in the pixel sequence as the defect type of that cloth defect area in the cloth image to be detected.
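The following sketch of steps S51 to S55 assumes the per-class heat map accumulated during the sliding-window stage; scipy.ndimage.label is used to find the connected regions above the pixel threshold, and each region's defect type is decided by a vote over the per-class accumulated values, which approximates the pixel-count vote described above.

```python
import numpy as np
from scipy import ndimage

def heatmap_to_defects(heat, pixel_threshold):
    """heat: (num_classes, H, W) array of accumulated window votes,
    where class 0 is normal cloth. Returns a list of (bounding_box, class)."""
    total = heat[1:].sum(axis=0)                  # combined defect evidence
    mask = total >= pixel_threshold
    labeled, n_regions = ndimage.label(mask)      # connected regions

    defects = []
    for region_id in range(1, n_regions + 1):
        region = labeled == region_id
        ys, xs = np.nonzero(region)
        box = (xs.min(), ys.min(), xs.max(), ys.max())
        # Vote: the defect class with the most accumulated evidence wins.
        votes = [heat[c][region].sum() for c in range(1, heat.shape[0])]
        defects.append((box, 1 + int(np.argmax(votes))))
    return defects
```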
It should be understood that the above-described embodiments are merely preferred embodiments of the invention and the technical principles applied thereto. It will be understood by those skilled in the art that various modifications, equivalents, changes, and the like can be made to the present invention. However, such variations are within the scope of the invention as long as they do not depart from the spirit of the invention. In addition, certain terms used in the specification and claims of the present application are not limiting, but are used merely for convenience of description.

Claims (12)

1. A real-time cloth defect detection method, characterized by comprising a process of pre-generating a feature extractor and a cloth classification model, which specifically comprises the following steps:
step A1, collecting an original large cloth image with a preset first size;
step A2, performing pixel-level labeling of the cloth defect areas in the original large cloth image to obtain a labeled large cloth image;
step A3, randomly sampling the labeled large cloth image to obtain a plurality of cloth sample images with a preset second size;
step A4, performing data preprocessing on each cloth sample image to obtain a plurality of preprocessed cloth images;
step A5, training on the preprocessed cloth images to obtain the feature extractor;
step A6, training on the preprocessed cloth images to obtain the cloth classification model;
the method also comprising a real-time cloth defect detection process, which specifically comprises the following steps:
step S1, collecting a cloth image to be detected with the preset first size;
step S2, performing feature extraction on the cloth image to be detected with the feature extractor to obtain a cloth feature map corresponding to the cloth image to be detected;
step S3, sampling the cloth feature map with a sliding window of the preset second size and a preset step size to obtain a plurality of sampling feature maps;
step S4, performing cloth classification on each sampling feature map with the cloth classification model to obtain a cloth classification result corresponding to each sampling feature map;
and step S5, generating a classification result heat map from the cloth classification results, and processing the classification result heat map to obtain the cloth defect areas in the cloth image to be detected and the defect type of each cloth defect area.
2. The method of claim 1, wherein the predetermined first size is 8192 pixels by 8192 pixels.
3. The method of claim 1, wherein the predetermined second size is 256 pixels by 256 pixels.
4. The real-time cloth defect detection method of claim 1, wherein in step A4, the data preprocessing comprises subjecting each cloth sample image to random color perturbation, and/or random cropping, and/or histogram equalization, and/or random horizontal flipping, and/or random vertical flipping, and/or random erasing, and/or normalization.
5. The real-time cloth defect detection method of claim 1, wherein the feature extractor is the convolutional neural network formed by the layers of SE-ResNet50, a network based on residual connections and a channel attention mechanism, before its global average pooling layer.
6. The real-time cloth defect detection method of claim 1, wherein the network structure of the cloth classification model comprises a global pooling layer, a fully-connected layer connected with the global pooling layer, and a softmax layer connected with the fully-connected layer.
7. The real-time cloth defect detection method of claim 6, wherein the loss function adopted in the training of the cloth classification model is a weighted loss function, whose calculation formula is:

TotalLoss = λ1·L1 + λ2·L2

where, with the feature vectors and class weight vectors normalized,

$$L_1 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos\theta_{y_i}-m}}{e^{\cos\theta_{y_i}-m}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

$$L_2 = -\frac{1}{M}\sum_{i=1}^{M}\log\frac{e^{\cos(\theta_{y_i}+m)}}{e^{\cos(\theta_{y_i}+m)}+\sum_{j\neq y_i}e^{\cos\theta_j}}$$

wherein,
TotalLoss represents the weighted loss function;
L1 represents the cosface loss function;
L2 represents the arcface loss function;
λ1 and λ2 are the weights of the cosface loss function and the arcface loss function, respectively;
M represents the number of samples in the current training batch;
m represents a model hyper-parameter (the margin) of the cloth classification model;
yi represents the label of the i-th training sample in the current training batch;
θj represents the angle between a sample's feature vector and the weight vector of class j;
i indexes the training samples in the current training batch;
j indexes the classes other than yi in the denominator sum.
8. The real-time cloth defect detection method of claim 7, wherein λ1 is 0.5 and λ2 is 0.5.
9. The real-time cloth defect detection method of claim 1, further comprising, before step S2, performing image preprocessing on the cloth image to be detected, wherein the image preprocessing includes histogram equalization and/or normalization.
10. The real-time cloth defect detection method of claim 1, wherein in step S3, the preset step size is 128 pixels.
11. The real-time cloth defect detection method of claim 1, wherein in step S4, the cloth classification result is normal cloth, miscellaneous threads, wrinkles, drawn threads, stains, or holes.
12. The real-time cloth defect detection method of claim 1, wherein step S5 specifically includes:
step S51, constructing an initial heat map, and adding one to the pixel values of the corresponding area of the initial heat map for each cloth classification result, to obtain a classification result heat map containing a pixel value for each pixel position;
step S52, comparing each pixel value with a preset pixel threshold:
if the pixel value is smaller than the pixel threshold, discarding the pixel position;
if the pixel value is not smaller than the pixel threshold, extracting the corresponding pixel position;
step S53, outputting the connected regions formed by the extracted pixel positions as the cloth defect areas in the cloth image to be detected;
step S54, for each cloth defect area, counting the number of pixels belonging to each cloth classification result within that cloth defect area, and sorting these pixel counts to obtain a pixel sequence;
and step S55, outputting the cloth classification result with the largest pixel count in the pixel sequence as the defect type of that cloth defect area in the cloth image to be detected.
CN202010122300.2A 2020-02-27 2020-02-27 Real-time cloth defect detection method Pending CN111340783A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010122300.2A CN111340783A (en) 2020-02-27 2020-02-27 Real-time cloth defect detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010122300.2A CN111340783A (en) 2020-02-27 2020-02-27 Real-time cloth defect detection method

Publications (1)

Publication Number Publication Date
CN111340783A true CN111340783A (en) 2020-06-26

Family

ID=71183711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010122300.2A Pending CN111340783A (en) 2020-02-27 2020-02-27 Real-time cloth defect detection method

Country Status (1)

Country Link
CN (1) CN111340783A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113327621A (en) * 2021-06-09 2021-08-31 携程旅游信息技术(上海)有限公司 Model training method, user identification method, system, device and medium
CN114235759A (en) * 2022-02-25 2022-03-25 季华实验室 Defect detection method, device, equipment and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106996935A (en) * 2017-02-27 2017-08-01 华中科技大学 A kind of multi-level fuzzy judgment Fabric Defects Inspection detection method and system
CN109509187A (en) * 2018-11-05 2019-03-22 中山大学 A kind of efficient check algorithm for the nibs in big resolution ratio cloth image
CN109635920A (en) * 2018-11-12 2019-04-16 北京市商汤科技开发有限公司 Neural network optimization and device, electronic equipment and storage medium
CN110175988A (en) * 2019-04-25 2019-08-27 南京邮电大学 Cloth defect inspection method based on deep learning
CN110827260A (en) * 2019-11-04 2020-02-21 燕山大学 Cloth defect classification method based on LBP (local binary pattern) features and convolutional neural network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106996935A (en) * 2017-02-27 2017-08-01 华中科技大学 A kind of multi-level fuzzy judgment Fabric Defects Inspection detection method and system
CN109509187A (en) * 2018-11-05 2019-03-22 中山大学 A kind of efficient check algorithm for the nibs in big resolution ratio cloth image
CN109635920A (en) * 2018-11-12 2019-04-16 北京市商汤科技开发有限公司 Neural network optimization and device, electronic equipment and storage medium
CN110175988A (en) * 2019-04-25 2019-08-27 南京邮电大学 Cloth defect inspection method based on deep learning
CN110827260A (en) * 2019-11-04 2020-02-21 燕山大学 Cloth defect classification method based on LBP (local binary pattern) features and convolutional neural network

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113327621A (en) * 2021-06-09 2021-08-31 携程旅游信息技术(上海)有限公司 Model training method, user identification method, system, device and medium
CN114235759A (en) * 2022-02-25 2022-03-25 季华实验室 Defect detection method, device, equipment and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN106875381B (en) Mobile phone shell defect detection method based on deep learning
CN113592845A (en) Defect detection method and device for battery coating and storage medium
CN109191421B (en) Visual detection method for pits on circumferential surface of cylindrical lithium battery
WO2020048248A1 (en) Textile defect detection method and apparatus, and computer device and computer-readable medium
CN101930549B (en) Second generation curvelet transform-based static human detection method
CN110889838A (en) Fabric defect detection method and device
CN112862744B (en) Intelligent detection method for internal defects of capacitor based on ultrasonic image
CN112488211A (en) Fabric image flaw classification method
CN110334594A (en) A kind of object detection method based on batch again YOLO algorithm of standardization processing
CN111340783A (en) Real-time cloth defect detection method
CN109858570A (en) Image classification method and system, computer equipment and medium
CN116664565A (en) Hidden crack detection method and system for photovoltaic solar cell
CN111783885A (en) Millimeter wave image quality classification model construction method based on local enhancement
CN110781913A (en) Zipper cloth belt defect detection method
CN115775236A (en) Surface tiny defect visual detection method and system based on multi-scale feature fusion
An et al. Fabric defect detection using deep learning: An Improved Faster R-approach
CN115984186A (en) Fine product image anomaly detection method based on multi-resolution knowledge extraction
Bennamoun et al. Automatic visual inspection and flaw detection in textile materials: Past, present and future
Gui et al. A fast caption detection method for low quality video images
CN108257148A (en) The target of special object suggests window generation method and its application in target following
Juang et al. Inspection of lead frame defects using deep CNN and cycle-consistent GAN-based defect augmentation
CN106943116A (en) A kind of infant eyesight automatic testing method
Alimohamadi et al. Defect detection in textiles using morphological analysis of optimal Gabor wavelet filter response
CN107679528A (en) A kind of pedestrian detection method based on AdaBoost SVM Ensemble Learning Algorithms
Niskanen et al. Experiments with SOM based inspection of wood

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200626
