CN114913168A - Fabric texture anomaly detection method - Google Patents

Fabric texture anomaly detection method

Info

Publication number
CN114913168A
CN114913168A (application CN202210638094.XA)
Authority
CN
China
Prior art keywords
texture
fabric
detection method
image
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210638094.XA
Other languages
Chinese (zh)
Inventor
应志平
李建强
胡旭东
吴震宇
向忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sci Tech University ZSTU
Original Assignee
Zhejiang Sci Tech University ZSTU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sci Tech University ZSTU filed Critical Zhejiang Sci Tech University ZSTU
Priority to CN202210638094.XA priority Critical patent/CN114913168A/en
Publication of CN114913168A publication Critical patent/CN114913168A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/086Learning methods using evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/54Extraction of image or video features relating to texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30124Fabrics; Textile; Paper
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Physiology (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a fabric texture anomaly detection method comprising the following process steps: 1) collecting fabric image information; 2) decomposing texture feature information of the fabric texture; 3) extracting texture feature vectors; 4) determining and classifying texture anomalies; 5) sending a corresponding control instruction to the lower computer according to the texture anomaly information. The method extracts texture feature information based on the scale-space distribution features of the image, thereby classifying and identifying fabric texture anomalies; it comprehensively considers the factors behind texture anomalies, improving the detection rate and ensuring accurate detection results.

Description

Fabric texture anomaly detection method
[ technical field ]
The invention relates to a detection method for fabrics, and in particular to a fabric texture anomaly detection method, belonging to the technical field of textile inspection.
[ background of the invention ]
Textile fabrics fall into two categories, woven and knitted. Woven fabric is interlaced in the warp and weft directions to form various weave structures (plain, twill, and satin weaves); knitted fabric has more complex and diverse weave structures. Texture anomalies inevitably arise during weaving and degrade fabric quality.
Fabric texture anomaly detection is a key and difficult point of textile quality inspection; fabric texture anomalies include dropped stitches, patterned stitches, loose density, horizontal streaks, broken yarns, and the like. Existing fabric defect detection methods lack a method aimed specifically at fabric texture anomalies; because they do not understand or comprehensively consider weaving technologies such as the knitting mechanism, fabric structure, and loom control, their detection rate is low, and the detection results are difficult to feed back to the production floor to improve fabric quality.
Therefore, to solve the above technical problems, it is necessary to provide an innovative fabric texture anomaly detection method that overcomes the above drawbacks of the prior art.
[ summary of the invention ]
The object of the invention is to provide a fabric texture anomaly detection method that has a simple process, considers anomaly features comprehensively, and yields highly accurate detection results.
To achieve this object, the invention adopts the following technical scheme: a fabric texture anomaly detection method comprising the following process steps:
1) collecting fabric image information;
2) decomposing texture feature information of the fabric texture;
3) extracting texture feature vectors;
4) determining and classifying texture anomalies;
5) sending a corresponding control instruction to the lower computer according to the texture anomaly information.
The fabric texture anomaly detection method of the invention further provides that the image information collection of step 1) comprises the following steps:
1-1) combining a plurality of high-speed industrial cameras arranged at equal intervals across the fabric width;
1-2) adopting back-lighting to highlight the texture anomaly features of the fabric;
1-3) collecting the image information with a high-bandwidth image acquisition card.
The fabric texture anomaly detection method of the invention further provides that in step 1-2) the fabric texture anomaly features include holes, dropped stitches, patterned stitches, hanging stitches, dense streaks, horizontal streaks, broken yarns, and triangular eyes.
The fabric texture anomaly detection method of the invention further provides that step 2) is specifically as follows:
2-1) expanding the fabric image by windowed empirical mode decomposition to obtain time-domain and frequency-domain texture feature signals with periodic fluctuation; the image signal is decomposed and expanded as:
x(t) = Σ_{i=1}^{n} c_i(t) + r_n(t)
2-2) finding the local maxima and minima of the signal, the modal component being expressed as:
h_1(t) = x(t) − m_1(t);
2-3) enveloping the extreme points with cubic spline curves;
2-4) computing the mean m_1(t) of the extreme-point envelopes;
2-5) screening the mean signal against the original signal, specifically:
r_1(t) = x(t) − c_1(t)
2-6) checking the sifted signal against the standard-deviation criterion:
SD = Σ_{t=0}^{T} |h_{k−1}(t) − h_k(t)|² / h_{k−1}²(t)
the sifting condition being satisfied when SD lies between 0.2 and 0.3;
2-7) obtaining a new signal as the difference between the original signal and the sifted components, namely:
r_n(t) = x(t) − Σ_{i=1}^{n} c_i(t)
where the residual r_n has fewer than two extrema.
The fabric texture anomaly detection method of the invention further provides that step 3) is specifically as follows:
3-1) dividing the fabric texture characteristics into loops (texture elements) and interlooping relations (spatial dependency relations);
3-2) obtaining a feature vector of the uniformity of the image gray-level distribution and a feature vector of its complexity through the gray-level co-occurrence matrix: for an M × N image with S gray levels, set direction parameters a and b, each taking the value 0 or 1; if the gray value of any point (x, y) in the image is i and the gray value of point (x + a, y + b) is j, then (i, j) is a gray-level co-occurrence pair;
the image gray-level distribution uniformity being described by:
A = Σ_{i=0}^{S−1} Σ_{j=0}^{S−1} P(i, j)²
and the image gray-level distribution complexity by:
E = −Σ_{i=0}^{S−1} Σ_{j=0}^{S−1} P(i, j) log P(i, j)
3-3) expressing the feature vector of the texture variation trajectory through directionality, with the formula:
F = Σ_{p=1}^{n_p} Σ_{φ ∈ w_p} (φ − φ_p)² H_D(φ)
where n_p is the number of peaks in the probability histogram, p is a peak, φ_p is the center of the p-th peak, w_p is the range between the valleys on either side of the peak, and H_D(φ) is the edge-direction probability;
3-4) fusing the groups of feature vectors, the fused feature CWI being expressed as CWI = A + E + F.
The fabric texture anomaly detection method of the invention further provides that step 4) is specifically as follows:
4-1) building an evolutionary neural network model;
4-2) performing global optimization in the evolutionary neural network to obtain the initialization population of the optimal Harris hawk model, the position vector of the Harris hawk model being expressed as:
X(t+1) = X_rand(t) − r_1 |X_rand(t) − 2 r_2 X(t)|, if q ≥ 0.5;
X(t+1) = (X_prey(t) − X_m(t)) − r_3 (LB + r_4 (UB − LB)), if q < 0.5,
where X_prey(t) is the optimal (prey) position, X(t) is the current position, X_rand(t) is a random position, r_i and q are random numbers in [0, 1], UB and LB are the upper and lower bounds of the variables, and X_m(t) is the mean position of the individuals, expressed as:
X_m(t) = (1/N) Σ_{i=1}^{N} X_i(t)
4-3) optimizing the weights (w) and biases (b) with the optimal Harris hawk model to obtain an optimal regularized extreme learning machine, expressed as:
h_i(x) = g(w_i, b_i, x) = g(w_i·x + b_i),
where g(·) is the sigmoid activation function;
4-4) performing texture anomaly determination and classification detection based on the improved regularized extreme learning machine.
The fabric texture anomaly detection method of the invention may further provide that the specific procedure for building the evolutionary neural network model is as follows:
randomly generating an initial population of size M × N, whose individuals are represented as:
X_i(0) = [x_{i,1}(0), x_{i,2}(0), x_{i,3}(0), …, x_{i,N}(0)], i = 1, 2, 3, …, M,
where X_i(0) denotes the generation-0 population, the initial population size being 50;
using the DE algorithm to realize individual mutation, expressed as:
V_i(g) = X_best(g) + F·[X_{p1}(g) − X_{p2}(g)] + F·[X_{p3}(g) − X_{p4}(g)],
where X_best(g) denotes the best individual, X_{pi}(g) denote random individuals, and the scaling factor F takes the value 1;
crossover being expressed as:
u_{i,j}(g) = v_{i,j}(g), if rand(0, 1) ≤ cr or j = j_rand; otherwise u_{i,j}(g) = x_{i,j}(g),
where the crossover probability cr ∈ [0, 1] is chosen as 0.8;
and selection into the next generation being expressed as:
X_i(g+1) = U_i(g), if f(U_i(g)) ≤ f(X_i(g)); otherwise X_i(g+1) = X_i(g).
Compared with the prior art, the invention has the following beneficial effects:
1. The fabric texture anomaly detection method extracts texture feature information based on the scale-space distribution features of the image, thereby classifying and identifying fabric texture anomalies.
2. The method comprehensively considers the factors behind texture anomalies, improving the detection rate and ensuring accurate detection results.
3. The method can issue corresponding control instructions to production-floor equipment according to the texture anomaly information, thereby improving fabric production quality.
[ description of the drawings ]
Fig. 1 is an overall flowchart of the fabric texture anomaly detection method of the invention.
Fig. 2 is a flowchart of decomposing the fabric texture feature information in step 2) of the invention.
Fig. 3 is a flowchart of extracting the fabric texture feature vectors in step 3) of the invention.
Fig. 4 is a flowchart of the texture anomaly determination and classification detection in step 4) of the invention.
[ detailed description of the embodiments ]
Referring to Figs. 1 to 4 of the specification, the invention relates to a fabric texture anomaly detection method comprising the following process steps:
1) collecting fabric image information, specifically comprising the following steps:
1-1) combining a plurality of high-speed industrial cameras arranged at equal intervals across the fabric width;
1-2) adopting back-lighting to highlight the texture anomaly features of the fabric;
1-3) collecting the image information with a high-bandwidth image acquisition card.
2) decomposing the texture feature information of the fabric texture; referring to Fig. 2 of the specification, this specifically comprises the following steps:
2-1) expanding the fabric image by windowed empirical mode decomposition to obtain time-domain and frequency-domain texture feature signals with periodic fluctuation; the image signal is decomposed and expanded as:
x(t) = Σ_{i=1}^{n} c_i(t) + r_n(t)
The fabric texture anomaly features include holes, dropped stitches, patterned stitches, hanging stitches, dense streaks, horizontal streaks, broken yarns, and triangular eyes.
2-2) finding the local maxima and minima of the signal, the modal component being expressed as:
h_1(t) = x(t) − m_1(t).
2-3) enveloping the extreme points with cubic spline curves.
2-4) computing the mean m_1(t) of the extreme-point envelopes.
2-5) screening the mean signal against the original signal, specifically:
r_1(t) = x(t) − c_1(t).
2-6) checking the sifted signal against the standard-deviation criterion:
SD = Σ_{t=0}^{T} |h_{k−1}(t) − h_k(t)|² / h_{k−1}²(t)
The sifting condition is satisfied when SD lies between 0.2 and 0.3.
2-7) obtaining a new signal as the difference between the original signal and the sifted components, namely:
r_n(t) = x(t) − Σ_{i=1}^{n} c_i(t)
where the residual r_n has fewer than two extrema.
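The sifting loop of steps 2-1) to 2-7) can be sketched in Python as follows. The one-dimensional input signal, the envelope construction details, and the mode count are illustrative assumptions; only the cubic-spline envelopes, the envelope-mean subtraction, the SD stopping criterion, and the fewer-than-two-extrema residual test follow the description above:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, sd_max=0.3, max_iter=50):
    """Extract one intrinsic mode function h from signal x by sifting."""
    h = x.copy()
    for _ in range(max_iter):
        # locate local maxima and minima (step 2-2)
        d = np.diff(h)
        maxima = np.where((np.hstack([d, -1]) < 0) & (np.hstack([1, d]) > 0))[0]
        minima = np.where((np.hstack([d, 1]) > 0) & (np.hstack([-1, d]) < 0))[0]
        if len(maxima) < 2 or len(minima) < 2:
            break
        t = np.arange(len(h))
        upper = CubicSpline(maxima, h[maxima])(t)   # step 2-3: spline envelopes
        lower = CubicSpline(minima, h[minima])(t)
        m = (upper + lower) / 2                     # step 2-4: envelope mean
        h_new = h - m                               # step 2-2: modal component
        # step 2-6: SD criterion between successive sifts
        sd = np.sum((h - h_new) ** 2 / (h ** 2 + 1e-12))
        h = h_new
        if sd < sd_max:
            break
    return h

def emd(x, n_modes=4):
    """Decompose x into IMFs c_i and a residual r_n (steps 2-1, 2-5, 2-7)."""
    imfs, r = [], x.astype(float)
    for _ in range(n_modes):
        c = sift(r)
        imfs.append(c)
        r = r - c                                   # subtract the sifted component
        # stop when the residual has fewer than two extrema (step 2-7)
        if np.count_nonzero(np.diff(np.sign(np.diff(r)))) < 2:
            break
    return imfs, r

# usage: decompose a synthetic periodic "texture profile"
t = np.linspace(0, 1, 512)
signal = np.sin(40 * np.pi * t) + 0.5 * np.sin(6 * np.pi * t)
imfs, residual = emd(signal)
# x(t) is recovered as the sum of the IMFs plus the residual
assert np.allclose(sum(imfs) + residual, signal)
```

In practice, x(t) would be a row- or column-wise gray-level profile taken from the captured fabric image rather than the synthetic signal used here.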
3) extracting texture feature vectors; referring to Fig. 3 of the specification, this specifically comprises the following steps:
3-1) dividing the fabric texture characteristics into loops (texture elements) and interlooping relations (spatial dependency relations);
3-2) obtaining a feature vector of the uniformity of the image gray-level distribution and a feature vector of its complexity through the gray-level co-occurrence matrix: for an M × N image with S gray levels, set direction parameters a and b, each taking the value 0 or 1; if the gray value of any point (x, y) in the image is i and the gray value of point (x + a, y + b) is j, then (i, j) is a gray-level co-occurrence pair;
the image gray-level distribution uniformity being described by:
A = Σ_{i=0}^{S−1} Σ_{j=0}^{S−1} P(i, j)²
and the image gray-level distribution complexity by:
E = −Σ_{i=0}^{S−1} Σ_{j=0}^{S−1} P(i, j) log P(i, j)
3-3) expressing the feature vector of the texture variation trajectory through directionality, with the formula:
F = Σ_{p=1}^{n_p} Σ_{φ ∈ w_p} (φ − φ_p)² H_D(φ)
where n_p is the number of peaks in the probability histogram, p is a peak, φ_p is the center of the p-th peak, w_p is the range between the valleys on either side of the peak, and H_D(φ) is the edge-direction probability;
3-4) fusing the groups of feature vectors, the fused feature CWI being expressed as CWI = A + E + F.
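The feature extraction of steps 3-2) to 3-4) can be sketched in Python as follows. The quantization to 16 gray levels, the placeholder edge-direction histogram, and the simplified peak handling (summing over all φ instead of only the valley-to-valley window w_p) are assumptions not fixed by the description:

```python
import numpy as np

rng = np.random.default_rng(0)

def glcm(img, a=1, b=0, levels=16):
    """Gray-level co-occurrence matrix P(i, j) for offset (a, b) (step 3-2)."""
    img = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - b):
        for x in range(w - a):
            P[img[y, x], img[y + b, x + a]] += 1
    return P / P.sum()                    # normalize counts to probabilities

def uniformity(P):
    """A: energy / angular second moment of the co-occurrence matrix."""
    return np.sum(P ** 2)

def complexity(P):
    """E: entropy of the gray-level co-occurrence distribution."""
    nz = P[P > 0]
    return -np.sum(nz * np.log(nz))

def directionality(hist, peaks):
    """F: spread of the edge-direction histogram H_D around its peaks (step 3-3).
    Simplified: sums over all directions, not only the w_p window per peak."""
    phi = np.arange(len(hist))
    return sum(np.sum((phi - p) ** 2 * hist) for p in peaks)

img = rng.integers(0, 256, (64, 64))      # stand-in for a fabric image patch
P = glcm(img, a=1, b=0)
A, E = uniformity(P), complexity(P)
hist = np.ones(16) / 16                   # placeholder edge-direction histogram
F = directionality(hist, peaks=[8])       # peaks: assumed precomputed indices
CWI = A + E + F                           # step 3-4: fused feature
```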
4) determining and classifying texture anomalies; referring to Fig. 4 of the specification, this specifically comprises the following steps:
4-1) building an evolutionary neural network model; the specific procedure is as follows:
randomly generating an initial population of size M × N, whose individuals are represented as:
X_i(0) = [x_{i,1}(0), x_{i,2}(0), x_{i,3}(0), …, x_{i,N}(0)], i = 1, 2, 3, …, M,
where X_i(0) denotes the generation-0 population, the initial population size being 50;
using the DE algorithm to realize individual mutation, expressed as:
V_i(g) = X_best(g) + F·[X_{p1}(g) − X_{p2}(g)] + F·[X_{p3}(g) − X_{p4}(g)],
where X_best(g) denotes the best individual, X_{pi}(g) denote random individuals, and the scaling factor F takes the value 1;
crossover being expressed as:
u_{i,j}(g) = v_{i,j}(g), if rand(0, 1) ≤ cr or j = j_rand; otherwise u_{i,j}(g) = x_{i,j}(g),
where the crossover probability cr ∈ [0, 1] is chosen as 0.8;
and selection into the next generation being expressed as:
X_i(g+1) = U_i(g), if f(U_i(g)) ≤ f(X_i(g)); otherwise X_i(g+1) = X_i(g).
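The DE loop of step 4-1) can be sketched in Python as follows. The population size 50, F = 1, and cr = 0.8 follow the description; the generation count, variable bounds, and sphere fitness function are placeholder assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def differential_evolution(f, dim, pop_size=50, F=1.0, cr=0.8,
                           generations=100, lb=-5.0, ub=5.0):
    # generation-0 population X_i(0)
    X = rng.uniform(lb, ub, (pop_size, dim))
    fit = np.array([f(x) for x in X])
    for g in range(generations):
        best = X[np.argmin(fit)]
        for i in range(pop_size):
            p1, p2, p3, p4 = rng.choice(pop_size, 4, replace=False)
            # mutation: V_i(g) = X_best + F*(X_p1 - X_p2) + F*(X_p3 - X_p4)
            V = best + F * (X[p1] - X[p2]) + F * (X[p3] - X[p4])
            # binomial crossover with probability cr (j_rand forces one gene)
            j_rand = rng.integers(dim)
            mask = rng.random(dim) <= cr
            mask[j_rand] = True
            U = np.where(mask, V, X[i])
            # greedy selection: keep the trial vector if it is no worse
            fu = f(U)
            if fu <= fit[i]:
                X[i], fit[i] = U, fu
    return X[np.argmin(fit)], fit.min()

sphere = lambda x: float(np.sum(x ** 2))   # placeholder fitness function
best_x, best_f = differential_evolution(sphere, dim=5)
```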
4-2) performing global optimization in the evolutionary neural network to obtain the initialization population of the optimal Harris hawk model, the position vector of the Harris hawk model being expressed as:
X(t+1) = X_rand(t) − r_1 |X_rand(t) − 2 r_2 X(t)|, if q ≥ 0.5;
X(t+1) = (X_prey(t) − X_m(t)) − r_3 (LB + r_4 (UB − LB)), if q < 0.5,
where X_prey(t) is the optimal (prey) position, X(t) is the current position, X_rand(t) is a random position, r_i and q are random numbers in [0, 1], UB and LB are the upper and lower bounds of the variables, and X_m(t) is the mean position of the individuals, expressed as:
X_m(t) = (1/N) Σ_{i=1}^{N} X_i(t)
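One exploration-phase update of the Harris hawk position vector in step 4-2) can be sketched in Python as follows. The flock size, dimensionality, bounds, and prey position are placeholder assumptions, and only this exploration phase of the full Harris hawks optimizer is shown:

```python
import numpy as np

rng = np.random.default_rng(1)

def hho_exploration_step(X, X_prey, lb, ub):
    """One exploration-phase update of every hawk's position vector."""
    N, dim = X.shape
    X_mean = X.mean(axis=0)                       # X_m(t): mean position
    X_new = np.empty_like(X)
    for i in range(N):
        r1, r2, r3, r4, q = rng.random(5)
        X_rand = X[rng.integers(N)]               # a randomly chosen hawk
        if q >= 0.5:
            # perch based on a random hawk's position
            X_new[i] = X_rand - r1 * np.abs(X_rand - 2 * r2 * X[i])
        else:
            # perch relative to the prey and the mean flock position
            X_new[i] = (X_prey - X_mean) - r3 * (lb + r4 * (ub - lb))
    return np.clip(X_new, lb, ub)                 # keep within [LB, UB]

lb, ub = -1.0, 1.0
hawks = rng.uniform(lb, ub, (20, 8))              # 20 hawks, 8-dim positions
prey = np.zeros(8)                                # assumed current best position
hawks = hho_exploration_step(hawks, prey, lb, ub)
```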
4-3) optimizing the weights (w) and biases (b) with the optimal Harris hawk model to obtain an optimal regularized extreme learning machine, expressed as:
h_i(x) = g(w_i, b_i, x) = g(w_i·x + b_i),
where g(·) is the sigmoid activation function;
4-4) performing texture anomaly determination and classification detection based on the improved regularized extreme learning machine.
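The regularized extreme learning machine of steps 4-3) and 4-4) can be sketched in Python as follows. The random hidden parameters stand in for the w_i and b_i that the Harris hawk model would optimize, the ridge term 1/C is the usual regularized-ELM closed form, and the toy two-class data are a placeholder for the fused CWI features:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class RegularizedELM:
    def __init__(self, n_hidden=64, C=1.0):
        self.n_hidden, self.C = n_hidden, C

    def fit(self, X, Y):
        d = X.shape[1]
        # hidden-layer parameters w_i, b_i (candidates for hawk optimization)
        self.W = rng.normal(size=(d, self.n_hidden))
        self.b = rng.normal(size=self.n_hidden)
        H = sigmoid(X @ self.W + self.b)          # h_i(x) = g(w_i . x + b_i)
        # regularized least squares: beta = (H'H + I/C)^-1 H'Y
        self.beta = np.linalg.solve(
            H.T @ H + np.eye(self.n_hidden) / self.C, H.T @ Y)
        return self

    def predict(self, X):
        return sigmoid(X @ self.W + self.b) @ self.beta

# toy anomaly classification on fused features (placeholder data)
X = rng.normal(size=(200, 3))                     # e.g. [A, E, F] per patch
y = (X[:, 0] + X[:, 1] > 0).astype(float)
Y = np.stack([1 - y, y], axis=1)                  # one-hot: normal vs. anomalous
model = RegularizedELM().fit(X, Y)
acc = np.mean(model.predict(X).argmax(axis=1) == y)
```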
5) sending a corresponding control instruction to the lower computer according to the texture anomaly information.
The fabric texture anomaly detection method comprehensively considers the factors behind texture anomalies, improving the detection rate and ensuring accurate detection results; at the same time, corresponding control commands can be issued to production-floor equipment according to the texture anomaly information, thereby improving fabric production quality.
The above embodiments are merely preferred embodiments of the invention and are not intended to limit it; any modifications, equivalents, improvements, and the like made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (7)

1. A fabric texture anomaly detection method, characterized by comprising the following process steps:
1) collecting fabric image information;
2) decomposing texture feature information of the fabric texture;
3) extracting texture feature vectors;
4) determining and classifying texture anomalies;
5) sending a corresponding control instruction to the lower computer according to the texture anomaly information.
2. The fabric texture anomaly detection method according to claim 1, characterized in that the image information collection of step 1) comprises the following steps:
1-1) combining a plurality of high-speed industrial cameras arranged at equal intervals across the fabric width;
1-2) adopting back-lighting to highlight the texture anomaly features of the fabric;
1-3) collecting the image information with a high-bandwidth image acquisition card.
3. The fabric texture anomaly detection method according to claim 2, characterized in that in step 1-2) the fabric texture anomaly features include holes, dropped stitches, patterned stitches, hanging stitches, dense streaks, horizontal streaks, broken yarns, and triangular eyes.
4. The fabric texture anomaly detection method according to claim 1, characterized in that step 2) is specifically:
2-1) expanding the fabric image by windowed empirical mode decomposition to obtain time-domain and frequency-domain texture feature signals with periodic fluctuation; the image signal is decomposed and expanded as:
x(t) = Σ_{i=1}^{n} c_i(t) + r_n(t)
2-2) finding the local maxima and minima of the signal, the modal component being expressed as:
h_1(t) = x(t) − m_1(t);
2-3) enveloping the extreme points with cubic spline curves;
2-4) computing the mean m_1(t) of the extreme-point envelopes;
2-5) screening the mean signal against the original signal, specifically:
r_1(t) = x(t) − c_1(t)
2-6) checking the sifted signal against the standard-deviation criterion:
SD = Σ_{t=0}^{T} |h_{k−1}(t) − h_k(t)|² / h_{k−1}²(t)
the sifting condition being satisfied when SD lies between 0.2 and 0.3;
2-7) obtaining a new signal as the difference between the original signal and the sifted components, namely:
r_n(t) = x(t) − Σ_{i=1}^{n} c_i(t)
where the residual r_n has fewer than two extrema.
5. The fabric texture anomaly detection method according to claim 1, characterized in that step 3) is specifically:
3-1) dividing the fabric texture characteristics into loops and interlooping relations;
3-2) obtaining a feature vector of the uniformity of the image gray-level distribution and a feature vector of its complexity through the gray-level co-occurrence matrix: for an M × N image with S gray levels, set direction parameters a and b, each taking the value 0 or 1; if the gray value of any point (x, y) in the image is i and the gray value of point (x + a, y + b) is j, then (i, j) is a gray-level co-occurrence pair;
the image gray-level distribution uniformity being described by:
A = Σ_{i=0}^{S−1} Σ_{j=0}^{S−1} P(i, j)²
and the image gray-level distribution complexity by:
E = −Σ_{i=0}^{S−1} Σ_{j=0}^{S−1} P(i, j) log P(i, j)
3-3) expressing the feature vector of the texture variation trajectory through directionality, with the formula:
F = Σ_{p=1}^{n_p} Σ_{φ ∈ w_p} (φ − φ_p)² H_D(φ)
where n_p is the number of peaks in the probability histogram, p is a peak, φ_p is the center of the p-th peak, w_p is the range between the valleys on either side of the peak, and H_D(φ) is the edge-direction probability;
3-4) fusing the groups of feature vectors, the fused feature CWI being expressed as CWI = A + E + F.
6. The fabric texture anomaly detection method according to claim 1, characterized in that step 4) is specifically:
4-1) building an evolutionary neural network model;
4-2) performing global optimization in the evolutionary neural network to obtain the initialization population of the optimal Harris hawk model, the position vector of the Harris hawk model being expressed as:
X(t+1) = X_rand(t) − r_1 |X_rand(t) − 2 r_2 X(t)|, if q ≥ 0.5;
X(t+1) = (X_prey(t) − X_m(t)) − r_3 (LB + r_4 (UB − LB)), if q < 0.5,
where X_prey(t) is the optimal (prey) position, X(t) is the current position, X_rand(t) is a random position, r_i and q are random numbers in [0, 1], UB and LB are the upper and lower bounds of the variables, and X_m(t) is the mean position of the individuals, expressed as:
X_m(t) = (1/N) Σ_{i=1}^{N} X_i(t)
4-3) optimizing the weights (w) and biases (b) with the optimal Harris hawk model to obtain an optimal regularized extreme learning machine, expressed as:
h_i(x) = g(w_i, b_i, x) = g(w_i·x + b_i),
where g(·) is the sigmoid activation function;
4-4) performing texture anomaly determination and classification detection based on the improved regularized extreme learning machine.
7. The fabric texture anomaly detection method according to claim 6, characterized in that the specific procedure for building the evolutionary neural network model is:
randomly generating an initial population of size M × N, whose individuals are represented as:
X_i(0) = [x_{i,1}(0), x_{i,2}(0), x_{i,3}(0), …, x_{i,N}(0)], i = 1, 2, 3, …, M,
where X_i(0) denotes the generation-0 population, the initial population size being 50;
using the DE algorithm to realize individual mutation, expressed as:
V_i(g) = X_best(g) + F·[X_{p1}(g) − X_{p2}(g)] + F·[X_{p3}(g) − X_{p4}(g)],
where X_best(g) denotes the best individual, X_{pi}(g) denote random individuals, and the scaling factor F takes the value 1;
crossover being expressed as:
u_{i,j}(g) = v_{i,j}(g), if rand(0, 1) ≤ cr or j = j_rand; otherwise u_{i,j}(g) = x_{i,j}(g),
where the crossover probability cr ∈ [0, 1] is chosen as 0.8;
and selection into the next generation being expressed as:
X_i(g+1) = U_i(g), if f(U_i(g)) ≤ f(X_i(g)); otherwise X_i(g+1) = X_i(g).
CN202210638094.XA 2022-06-08 2022-06-08 Fabric texture anomaly detection method Pending CN114913168A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210638094.XA CN114913168A (en) 2022-06-08 2022-06-08 Fabric texture anomaly detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210638094.XA CN114913168A (en) 2022-06-08 2022-06-08 Fabric texture anomaly detection method

Publications (1)

Publication Number Publication Date
CN114913168A true CN114913168A (en) 2022-08-16

Family

ID=82771381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210638094.XA Pending CN114913168A (en) 2022-06-08 2022-06-08 Fabric texture anomaly detection method

Country Status (1)

Country Link
CN (1) CN114913168A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116156157A (en) * 2023-04-24 2023-05-23 Changsha Hisense Intelligent Systems Research Institute Co., Ltd. Camera occlusion anomaly detection method and electronic equipment
CN116156157B (en) * 2023-04-24 2023-08-18 Changsha Hisense Intelligent Systems Research Institute Co., Ltd. Camera occlusion anomaly detection method and electronic equipment

Similar Documents

Publication Publication Date Title
Bu et al. Fabric defect detection based on multiple fractal features and support vector data description
CN105701507B (en) Image classification method based on dynamic random pond convolutional neural networks
CN103996209B (en) Infrared vessel object segmentation method based on salient region detection
CN109711437A (en) A kind of transformer part recognition methods based on YOLO network model
CN114913168A (en) Fabric texture abnormity detection method
Schneider et al. Vision-based on-loom measurement of yarn densities in woven fabrics
CN110414571A (en) A kind of website based on Fusion Features reports an error screenshot classification method
Shahrabadi et al. Defect detection in the textile industry using image-based machine learning methods: a brief review
Xin et al. Investigation on the classification of weave pattern based on an active grid model
Anila et al. Fabric texture analysis and weave pattern recognition by intelligent processing
Sanchez et al. Analyzing the influence of contrast in large-scale recognition of natural images
CN111445484A (en) Image-level labeling-based industrial image abnormal area pixel level segmentation method
Jia et al. Fabric defect inspection based on lattice segmentation and lattice templates
Xiang et al. Yarn-dyed woven fabric density measurement method and system based on multi-directional illumination image fusion enhancement technology
Meng et al. Automatic recognition of woven fabric structural parameters: a review
Jing et al. Automatic recognition of weave pattern and repeat for yarn-dyed fabric based on KFCM and IDMF
Li et al. Yarn-dyed woven defect characterization and classification using combined features and support vector machine
Chen et al. Image-based textile decoding
Chandra et al. Neural network trained morphological processing for the detection of defects in woven fabric
CN106875459A (en) Self-adaptive equalization method for color jacquard weave structure based on image segmentation
Wei et al. Multi-stage unsupervised fabric defect detection based on DCGAN
Zhang et al. A review of woven fabric pattern recognition based on image processing technology
Liu et al. Objective evaluation of fabric pilling based on multi-view stereo vision
Lu et al. Deep adversarial data augmentation for fabric defect classification with scarce defect data
Basu et al. Sub image based eigen fabrics method using multi-class SVM classifier for the detection and classification of defects in woven fabric

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination