CN111862008A - Yarn defect detection method based on machine vision - Google Patents

Yarn defect detection method based on machine vision Download PDF

Info

Publication number
CN111862008A
CN111862008A (application CN202010631741.5A)
Authority
CN
China
Prior art keywords
yarn
image
diameter
evenness
machine vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010631741.5A
Other languages
Chinese (zh)
Inventor
张缓缓
赵妍
景军锋
李鹏飞
苏泽斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Polytechnic University
Original Assignee
Xian Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Polytechnic University filed Critical Xian Polytechnic University
Priority to CN202010631741.5A priority Critical patent/CN111862008A/en
Publication of CN111862008A publication Critical patent/CN111862008A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/08Measuring arrangements characterised by the use of optical techniques for measuring diameters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20036Morphological image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30124Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Signal Processing (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Probability & Statistics with Applications (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a yarn defect detection method based on machine vision, comprising the following steps: step 1, preprocessing the yarn image to be detected to obtain a preprocessed image; step 2, applying a spatially fused fuzzy C-means clustering to the preprocessed image to obtain a yarn evenness image; step 3, processing the yarn evenness image with a morphological opening operation to obtain a more accurate yarn evenness image, and calculating the average diameter and the measured diameters of the yarn from that image; and step 4, detecting and counting the types and numbers of yarn defects in the image according to the yarn defect judgment criteria. The method reduces the influence of yarn hairiness, burrs around the yarn strand and similar interference on defect detection, and improves the detection accuracy of the algorithm.

Description

Yarn defect detection method based on machine vision
Technical Field
The invention belongs to the technical field of yarn defect detection, and relates to a yarn defect detection method based on machine vision.
Background
Yarn defects directly determine the quality of the woven and knitted fabrics produced later; the types and numbers of defects are important indices for evaluating yarn quality, and their detection and analysis are prerequisites for controlling and improving yarn quality. Yarn defects may arise during production from various factors, such as the raw spinning material, the spinning equipment and the environmental conditions. They appear mainly as sudden changes in yarn diameter: an excessively large diameter can cause fabric defects such as neps and horizontal or vertical stripes, while an excessively small diameter can cause defects such as broken warp and broken weft. Detection of yarn defects is therefore indispensable.
Current yarn defect detection methods mainly include visual inspection [5], capacitive detection [6], image analysis [7-8] and similar approaches. Visual inspection relies on human eyesight and suffers from strong interference from human factors, low accuracy and slow judgement. Capacitive detection is easily affected by factors such as ambient temperature and humidity and the surface hairiness of the yarn; the most widely used instrument at present is USTER's yarn fault classifier, which is expensive, a single machine costing about 300,000 yuan. Image analysis evaluates yarn appearance quality by machine vision. For example, Sengupta et al. proposed a low-cost system for detecting yarn-related indices; its detection items are relatively comprehensive and its investment cost is low, but its yarn defect results fall far short of those of USTER. Jing Junfeng et al. proposed detecting yarn defects with a saliency algorithm, but the results were compared only with visual inspection, so the validity of the algorithm could not be accurately verified. Zhou Guoqing et al. proposed a yarn defect detection method based on a line-scan camera; although this method can detect the presence of a yarn defect, it cannot distinguish the defect type.
Disclosure of Invention
The purpose of the invention is to provide a yarn defect detection method that reduces the influence of yarn hairiness, burrs around the yarn strand and similar interference on defect detection and improves the detection accuracy of the algorithm.
The technical solution adopted by the invention is as follows.
A yarn defect detection method based on machine vision comprises the following steps:
step 1, preprocessing the yarn image to be detected to obtain a preprocessed image;
step 2, applying a spatially fused fuzzy C-means clustering to the preprocessed image to obtain a yarn evenness image;
step 3, processing the yarn evenness image with a morphological opening operation to obtain a more accurate yarn evenness image, and calculating the average diameter and the measured diameters of the yarn from the more accurate yarn evenness image;
step 4, detecting and counting the types and numbers of yarn defects in the image according to the yarn defect judgment criteria.
The invention is further characterized as follows.
In step 1, the yarn image to be detected is scaled to 256 × 256 pixels and then converted to a grayscale image to obtain the preprocessed image.
Step 2 comprises the following specific steps:
step 2.1, denote the preprocessed image by X = (x_1, x_2, ..., x_i, ..., x_N), so that its N pixels are classified into C classes, where x_i represents the spectral feature of pixel i;
step 2.2, defining a minimization objective function as:
J = \sum_{i=1}^{C} \sum_{j=1}^{N} \left(u'_{ij}\right)^{m} \left\| x_{j} - v_{i} \right\|^{2} \qquad (1)
in the formula: u'_{ij} is the membership function fused with spatial information, representing the membership of pixel x_j to class i; v_i denotes the i-th cluster center, updated according to the formula below; ||·|| denotes a norm metric; and m controls the fuzziness of the resulting partition;
the updating formula of the clustering center is as follows:
v_{i} = \frac{\sum_{j=1}^{N} \left(u'_{ij}\right)^{m} x_{j}}{\sum_{j=1}^{N} \left(u'_{ij}\right)^{m}} \qquad (2)
the membership function fused with spatial information is as follows:
u'_{ij} = \frac{u_{ij}^{p} \, h_{ij}^{q}}{\sum_{k=1}^{C} u_{kj}^{p} \, h_{kj}^{q}} \qquad (3)
where k indexes the classes, and p and q are two constants that control the fuzziness of the generated partition.
Here u_{ij} is the membership function of the standard fuzzy C-means algorithm, and h_{ij} is the spatial function, representing the probability that pixel x_j belongs to class i; u_{ij} and h_{ij} are given by formula (4) and formula (5):
u_{ij} = \frac{1}{\sum_{k=1}^{C} \left( \left\| x_{j} - v_{i} \right\| / \left\| x_{j} - v_{k} \right\| \right)^{2/(m-1)}} \qquad (4)
in the formula, NB(x_j) denotes the 5 × 5 spatial-neighborhood window centered on pixel x_j;
h_{ij} = \sum_{k \in NB(x_{j})} u_{ik} \qquad (5)
step 2.3, starting from an initial guess of each cluster center (iteration number set to 0), iteratively minimize the objective function to obtain a new spatially mapped image and updated cluster centers; whether the iteration has finished is judged from the difference between two successive cluster centers, and after the iteration finishes the final spatially mapped image is taken as the yarn evenness image.
In step 2.3, the iteration is judged to have finished as follows: if the difference between two successive cluster centers satisfies v_{i+1} - v_i < 0.02, the iteration stops; otherwise it continues.
The specific method of step 3 is as follows:
step 3.1, apply a morphological opening operation with a 7 × 7 disc structuring element to the yarn evenness image to obtain a more accurate yarn evenness image;
step 3.2, calculate the measured diameters and the average diameter of the yarn from the extracted yarn evenness: the measured diameter is the distance between the upper and lower edge points of the yarn strand, and the average diameter is the mean of the measured diameters.
Step 4 specifically comprises the following steps:
step 4.1, input the average diameter of the yarn;
step 4.2, set the yarn defect thresholds; yarn defects are divided into neps, slubs and details (thin places): a slub is defined as a segment whose diameter is greater than 130% and less than 200% of the average diameter and whose length is not less than 4 mm; a detail is defined as a segment whose diameter is less than 50% of the average diameter and whose length is not less than 4 mm; a nep is defined as a segment whose diameter is greater than 200% of the average diameter and whose length is 1 to 4 mm;
step 4.3, input the measured diameters and judge the type of each yarn defect according to the thresholds set in step 4.2;
step 4.4, count the yarn defects of each type.
The invention has the following advantages.
The yarn defect detection method can effectively detect the types and numbers of yarn defects and is not affected by factors such as ambient temperature and humidity. In extracting the yarn evenness, the spatially fused FCM algorithm adds spatial information, which overcomes the tendency of standard FCM to fall into local optima and effectively filters out interference such as yarn hairiness, so that the resulting evenness image is more accurate and the types and numbers of yarn defects can be distinguished accurately.
Drawings
FIG. 1 is an algorithmic block diagram of a machine vision based yarn defect detection method of the present invention;
FIG. 2 is a flow chart of step 4 of a machine vision based yarn defect detection method of the present invention;
FIG. 3 is the image to be detected in example 1 of a machine vision-based yarn defect detection method of the present invention;
FIG. 4 is an image of yarn evenness for example 1 in a machine vision based yarn defect detection method of the present invention;
FIG. 5 is a more accurate yarn evenness image of example 1 in a machine vision based yarn defect detection method of the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The invention discloses a yarn defect detection method based on machine vision, the specific steps of which, as shown in FIG. 1, are as follows:
step 1, preprocessing the yarn image to be detected to obtain a preprocessed image;
step 2, applying a spatially fused fuzzy C-means clustering to the preprocessed image to obtain a yarn evenness image;
step 3, processing the yarn evenness image with a morphological opening operation to obtain a more accurate yarn evenness image, and calculating the average diameter and the measured diameters of the yarn from the more accurate yarn evenness image;
step 4, detecting and counting the types and numbers of yarn defects in the image according to the yarn defect judgment criteria.
Step 1 specifically comprises: scaling the yarn image to be detected to 256 × 256 pixels and then converting it to a grayscale image to obtain the preprocessed image.
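By way of illustration only, this preprocessing could be implemented as in the minimal sketch below, assuming OpenCV is used for reading, resizing and gray conversion; the patent does not prescribe any particular library, and the function name and file path are hypothetical.

import cv2

def preprocess(path):
    # Step 1 sketch: scale the yarn image to 256 x 256 pixels and convert it to grayscale.
    img = cv2.imread(path)                        # read the yarn image to be detected (BGR)
    img = cv2.resize(img, (256, 256))             # scale to 256 x 256 pixels
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # grayscale preprocessed image
    return gray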
Step 2 comprises the following specific steps:
step 2.1, denote the preprocessed image by X = (x_1, x_2, ..., x_i, ..., x_N), so that its N pixels are classified into C classes, where x_i represents the spectral feature of pixel i;
step 2.2, defining a minimization objective function as:
J = \sum_{i=1}^{C} \sum_{j=1}^{N} \left(u'_{ij}\right)^{m} \left\| x_{j} - v_{i} \right\|^{2} \qquad (1)
in the formula: u'_{ij} is the membership function fused with spatial information, representing the membership of pixel x_j to class i; v_i denotes the i-th cluster center, updated according to the formula below; ||·|| denotes a norm metric; and m controls the fuzziness of the resulting partition. The objective function is minimized when pixels near a cluster center have high membership in that class and pixels far from it have low membership. The membership function represents the probability that a pixel belongs to a particular class, and this probability depends on the distance between the pixel and each cluster center in the feature domain.
The updating formula of the clustering center is as follows:
v_{i} = \frac{\sum_{j=1}^{N} \left(u'_{ij}\right)^{m} x_{j}}{\sum_{j=1}^{N} \left(u'_{ij}\right)^{m}} \qquad (2)
An important property of an image is that adjacent pixels are highly correlated: they have similar feature values and a high probability of belonging to the same class. This spatial relationship is important for clustering but is not used in the standard C-means method, so the invention fuses spatial information into the C-means clustering and adopts a membership function that incorporates this spatial information. The membership function fused with spatial information is as follows:
u'_{ij} = \frac{u_{ij}^{p} \, h_{ij}^{q}}{\sum_{k=1}^{C} u_{kj}^{p} \, h_{kj}^{q}} \qquad (3)
where k indexes the classes, and p and q are two constants that control the fuzziness of the generated partition.
Here u_{ij} is the membership function of the standard fuzzy C-means algorithm, and h_{ij} is the spatial function, representing the probability that pixel x_j belongs to class i; u_{ij} and h_{ij} are given by formula (4) and formula (5):
u_{ij} = \frac{1}{\sum_{k=1}^{C} \left( \left\| x_{j} - v_{i} \right\| / \left\| x_{j} - v_{k} \right\| \right)^{2/(m-1)}} \qquad (4)
in the formula, NB(x_j) denotes the 5 × 5 spatial-neighborhood window centered on pixel x_j;
h_{ij} = \sum_{k \in NB(x_{j})} u_{ik} \qquad (5)
step 2.3, starting from an initial guess of each cluster center (iteration number set to 0), iteratively minimize the objective function to obtain a new spatially mapped image and updated cluster centers; whether the iteration has finished is judged from the difference between two successive cluster centers: if v_{i+1} - v_i < 0.02, the iteration stops, otherwise it continues. After the iteration finishes, the final spatially mapped image is taken as the yarn evenness image.
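For concreteness, a minimal numerical sketch of steps 2.1 to 2.3 follows. It assumes NumPy and SciPy are available and operates on the grayscale preprocessed image; the class count C, the random initialization of the centers and the use of a mean filter over the 5 × 5 window (proportional to the neighborhood sum, the constant factor cancelling in the normalization of formula (3)) are illustrative choices, not details fixed by the patent.

import numpy as np
from scipy.ndimage import uniform_filter

def spatial_fcm(gray, C=2, m=2.0, p=1, q=2, eps=0.02, max_iter=100):
    # Spatially fused fuzzy C-means sketch for a grayscale image (H x W).
    # Returns the per-pixel label map (spatially mapped picture) and the cluster centers.
    x = gray.astype(np.float64).ravel()              # N pixel intensities (spectral features)
    H, W = gray.shape
    rng = np.random.default_rng(0)
    v = rng.uniform(x.min(), x.max(), size=C)        # initial guess of the C cluster centers

    for _ in range(max_iter):
        # standard FCM membership u_ij (formula (4)), guarded against zero distances
        d = np.abs(x[None, :] - v[:, None]) + 1e-10
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)

        # spatial function h_ij (formula (5)): mean of memberships over the 5 x 5 window NB(x_j),
        # proportional to the neighborhood sum (the scale cancels in the normalization below)
        h = np.stack([uniform_filter(u[i].reshape(H, W), size=5).ravel() for i in range(C)])

        # fused membership u'_ij (formula (3))
        uh = (u ** p) * (h ** q)
        u_fused = uh / np.sum(uh, axis=0, keepdims=True)

        # cluster-center update (formula (2)) and convergence test on successive centers
        v_new = np.sum((u_fused ** m) * x[None, :], axis=1) / np.sum(u_fused ** m, axis=1)
        if np.max(np.abs(v_new - v)) < eps:          # |v_{i+1} - v_i| < 0.02 stops the iteration
            v = v_new
            break
        v = v_new

    labels = np.argmax(u_fused, axis=0).reshape(H, W)   # class of each pixel
    return labels, v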
The specific method of step 3 is as follows:
step 3.1, apply a morphological opening operation with a 7 × 7 disc structuring element to the yarn evenness image to obtain a more accurate yarn evenness image;
step 3.2, calculate the measured diameters and the average diameter of the yarn from the extracted yarn evenness: the measured diameter is the distance between the upper and lower edge points of the yarn strand, and the average diameter is the mean of the measured diameters.
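A corresponding sketch of step 3 is given below, again assuming OpenCV. The binary input is taken to be the yarn-evenness segmentation from step 2 with the yarn class marked non-zero, the yarn is assumed to run horizontally across the image, and the calibration factor mm_per_pixel (millimetres represented by one pixel) is an assumed parameter that the patent does not specify.

import cv2
import numpy as np

def yarn_diameters(evenness, mm_per_pixel):
    # Step 3 sketch: 7 x 7 disc opening, then per-column measured diameter
    # (distance from upper to lower yarn edge) and the average diameter.
    mask = (evenness > 0).astype(np.uint8)                       # binary yarn-evenness image
    disc = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))  # 7 x 7 disc structuring element
    opened = cv2.morphologyEx(mask, cv2.MORPH_OPEN, disc)        # more accurate evenness image

    measured = []
    for col in opened.T:                                         # scan each image column
        rows = np.flatnonzero(col)
        if rows.size:                                            # upper edge to lower edge
            measured.append((rows[-1] - rows[0] + 1) * mm_per_pixel)
    measured = np.array(measured)
    return opened, measured, measured.mean()                     # average diameter is the mean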
Step 4, shown in FIG. 2, comprises the following steps:
step 4.1, input the average diameter of the yarn;
step 4.2, set the yarn defect thresholds; yarn defects are divided into neps, slubs and details (thin places): a slub is defined as a segment whose diameter is greater than 130% and less than 200% of the average diameter and whose length is not less than 4 mm; a detail is defined as a segment whose diameter is less than 50% of the average diameter and whose length is not less than 4 mm; a nep is defined as a segment whose diameter is greater than 200% of the average diameter and whose length is 1 to 4 mm;
step 4.3, input the measured diameters and judge the type of each yarn defect according to the thresholds set in step 4.2;
step 4.4, count the yarn defects of each type.
Example 1
The yarn image to be detected, shown in FIG. 3, is input and step 1 is performed;
step 2 is performed with m = 2; the resulting yarn evenness image is shown in FIG. 4.
Step 3 is performed with p = 0 and q = 2; the more accurate yarn evenness image is shown in FIG. 5. The calculated average diameter and measured diameter are 0.212 mm and 0.214 mm.
Step 4 is performed; the result shows that one nep exists in the yarn image to be detected.
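Chaining the sketches above, such a run could look roughly like the fragment below; the file name, the choice of the brighter cluster as the yarn class and the calibration value are all hypothetical.

# Hypothetical end-to-end run using the sketch functions defined earlier.
gray = preprocess("yarn_example1.png")                              # step 1
labels, centers = spatial_fcm(gray, C=2, m=2.0)                     # step 2
yarn = (labels == np.argmax(centers)).astype(np.uint8)              # assume the brighter class is yarn
opened, measured, avg_d = yarn_diameters(yarn, mm_per_pixel=0.01)   # step 3
print(classify_defects(measured, avg_d, mm_per_pixel=0.01))         # step 4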
Using the method of the invention, this example also measured the average diameter of yarns of three counts, 27.8 tex, 18.2 tex and 14.5 tex, with 1320 yarn images per group.
The measured average diameters, shown in Table 1, are close to the theoretical diameters, which illustrates the feasibility of the method of the invention.
TABLE 1 Average yarn diameter versus theoretical diameter
(Table 1 appears only as an image in the original publication; its values are not reproduced here.)
The detected types and numbers of yarn defects are shown in Table 2. In this example the result of a standard capacitive detector is used as the reference for comparison; the proposed method is highly consistent with the capacitive result, showing that the algorithm obtains accurate results.
TABLE 2 Comparison of yarn defect detection results
(Table 2 appears only as an image in the original publication; its values are not reproduced here.)

Claims (6)

1. A yarn defect detection method based on machine vision is characterized by comprising the following steps:
step 1, preprocessing the yarn image to be detected to obtain a preprocessed image;
step 2, applying a spatially fused fuzzy C-means clustering to the preprocessed image to obtain a yarn evenness image;
step 3, processing the yarn evenness image with a morphological opening operation to obtain a more accurate yarn evenness image, and calculating the average diameter and the measured diameters of the yarn from the more accurate yarn evenness image;
step 4, detecting and counting the types and numbers of the yarn defects in the image according to the yarn defect judgment criteria.
2. The machine vision-based yarn defect detection method according to claim 1, characterized in that step 1 comprises: scaling the yarn image to be detected to 256 × 256 pixels and then converting it to a grayscale image to obtain the preprocessed image.
3. The machine vision-based yarn defect detection method according to claim 1, characterized in that step 2 comprises the following specific steps:
step 2.1, denoting the preprocessed image by X = (x_1, x_2, ..., x_i, ..., x_N), so that its N pixels are classified into C classes, where x_i represents the spectral feature of pixel i;
step 2.2, defining a minimization objective function as:
J = \sum_{i=1}^{C} \sum_{j=1}^{N} \left(u'_{ij}\right)^{m} \left\| x_{j} - v_{i} \right\|^{2} \qquad (1)
in the formula: u'_{ij} is the membership function fused with spatial information, representing the membership of pixel x_j to class i; v_i denotes the i-th cluster center, updated according to the formula below; ||·|| denotes a norm metric; and m controls the fuzziness of the resulting partition;
the updating formula of the clustering center is as follows:
v_{i} = \frac{\sum_{j=1}^{N} \left(u'_{ij}\right)^{m} x_{j}}{\sum_{j=1}^{N} \left(u'_{ij}\right)^{m}} \qquad (2)
the membership function fused with spatial information is as follows:
u'_{ij} = \frac{u_{ij}^{p} \, h_{ij}^{q}}{\sum_{k=1}^{C} u_{kj}^{p} \, h_{kj}^{q}} \qquad (3)
wherein k indexes the classes, and p and q are two constants that control the fuzziness of the generated partition;
u_{ij} is the membership function of the standard fuzzy C-means algorithm; h_{ij} is the spatial function, representing the probability that pixel x_j belongs to class i; u_{ij} and h_{ij} are given by formula (4) and formula (5):
u_{ij} = \frac{1}{\sum_{k=1}^{C} \left( \left\| x_{j} - v_{i} \right\| / \left\| x_{j} - v_{k} \right\| \right)^{2/(m-1)}} \qquad (4)
in the formula, NB(x_j) denotes the 5 × 5 spatial-neighborhood window centered on pixel x_j;
h_{ij} = \sum_{k \in NB(x_{j})} u_{ik} \qquad (5)
step 2.3, starting from an initial guess of each cluster center (iteration number set to 0), iteratively minimizing the objective function to obtain a new spatially mapped image and updated cluster centers, judging whether the iteration has finished from the difference between two successive cluster centers, and after the iteration finishes taking the final spatially mapped image as the yarn evenness image.
4. The machine vision-based yarn defect detection method according to claim 3, characterized in that in step 2.3 the iteration is judged to have finished as follows: if the difference between two successive cluster centers satisfies v_{i+1} - v_i < 0.02, the iteration stops, otherwise it continues.
5. The machine vision-based yarn defect detection method according to claim 1, characterized in that the specific method of step 3 is as follows:
step 3.1, applying a morphological opening operation with a 7 × 7 disc structuring element to the yarn evenness image to obtain a more accurate yarn evenness image;
step 3.2, calculating the measured diameters and the average diameter of the yarn from the extracted yarn evenness, the measured diameter being the distance between the upper and lower edge points of the yarn strand, and the average diameter being the mean of the measured diameters.
6. The machine vision-based yarn defect detection method according to claim 1, characterized in that step 4 comprises the following steps:
step 4.1, inputting the average diameter of the yarn;
step 4.2, setting the yarn defect thresholds; yarn defects are divided into neps, slubs and details (thin places), wherein a slub is defined as a segment whose diameter is greater than 130% and less than 200% of the average diameter and whose length is not less than 4 mm; a detail is defined as a segment whose diameter is less than 50% of the average diameter and whose length is not less than 4 mm; and a nep is defined as a segment whose diameter is greater than 200% of the average diameter and whose length is 1 to 4 mm;
step 4.3, inputting the measured diameters and judging the type of each yarn defect according to the thresholds set in step 4.2;
step 4.4, counting the yarn defects of each type.
CN202010631741.5A 2020-07-03 2020-07-03 Yarn defect detection method based on machine vision Pending CN111862008A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010631741.5A CN111862008A (en) 2020-07-03 2020-07-03 Yarn defect detection method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010631741.5A CN111862008A (en) 2020-07-03 2020-07-03 Yarn defect detection method based on machine vision

Publications (1)

Publication Number Publication Date
CN111862008A true CN111862008A (en) 2020-10-30

Family

ID=73152998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010631741.5A Pending CN111862008A (en) 2020-07-03 2020-07-03 Yarn defect detection method based on machine vision

Country Status (1)

Country Link
CN (1) CN111862008A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855633A (en) * 2012-09-05 2013-01-02 山东大学 Anti-noise quick fuzzy-clustering digital image segmentation method
CN104535004A (en) * 2015-01-29 2015-04-22 江南大学 Image processing-based yarn diameter detection method
CN106845457A (en) * 2017-03-02 2017-06-13 西安电子科技大学 Method for detecting infrared puniness target based on spectrum residual error with fuzzy clustering
CN109472779A (en) * 2018-10-24 2019-03-15 上海工程技术大学 A kind of yarn appearance characteristic parameter extraction and analysis method based on morphosis
CN109509196A (en) * 2018-12-24 2019-03-22 广东工业大学 A kind of lingual diagnosis image partition method of the fuzzy clustering based on improved ant group algorithm
CN109919939A (en) * 2019-03-27 2019-06-21 王合山 A kind of yarn faults detection method and device based on genetic algorithm
CN109978830A (en) * 2019-02-28 2019-07-05 西安工程大学 A kind of fabric defect detection method
CN110458809A (en) * 2019-07-16 2019-11-15 西安工程大学 A kind of yarn evenness detection method based on sub-pixel edge detection

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855633A (en) * 2012-09-05 2013-01-02 山东大学 Anti-noise quick fuzzy-clustering digital image segmentation method
CN104535004A (en) * 2015-01-29 2015-04-22 江南大学 Image processing-based yarn diameter detection method
CN106845457A (en) * 2017-03-02 2017-06-13 西安电子科技大学 Method for detecting infrared puniness target based on spectrum residual error with fuzzy clustering
CN109472779A (en) * 2018-10-24 2019-03-15 上海工程技术大学 A kind of yarn appearance characteristic parameter extraction and analysis method based on morphosis
CN109509196A (en) * 2018-12-24 2019-03-22 广东工业大学 A kind of lingual diagnosis image partition method of the fuzzy clustering based on improved ant group algorithm
CN109978830A (en) * 2019-02-28 2019-07-05 西安工程大学 A kind of fabric defect detection method
CN109919939A (en) * 2019-03-27 2019-06-21 王合山 A kind of yarn faults detection method and device based on genetic algorithm
CN110458809A (en) * 2019-07-16 2019-11-15 西安工程大学 A kind of yarn evenness detection method based on sub-pixel edge detection

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
ZHAO F et al.: "Kernel generalized fuzzy c-means clustering with spatial information for image segmentation", Digital Signal Processing, vol. 23, pp. 184-199 *
张婉婉; 景军锋; 苏泽斌; 张蕾; 李鹏飞: "Research on a yarn evenness detection algorithm", Journal of Xi'an Polytechnic University, vol. 31, no. 1, pp. 59-61 *
景军锋 et al.: "Fast FCM yarn appearance evenness detection based on spatial constraints", Cotton Textile Technology, no. 5, pp. 1-3 *
王芳: "Research on a printed fabric image segmentation method based on C-means color clustering in RGB space", Industrial Control Computer, no. 3, pp. 4-5 *
赵妍 et al.: "Yarn defect detection algorithm fusing spatial fuzzy C-means clustering", Laser & Optoelectronics Progress, vol. 58, 2021, pp. 207-213 *
邵东锋; 张一心: "Yarn evenness detection based on image processing", Shanghai Textile Science & Technology, no. 8, pp. 37-38 *

Similar Documents

Publication Publication Date Title
CN115351598B (en) Method for detecting bearing of numerical control machine tool
CN106529559A (en) Pointer-type circular multi-dashboard real-time reading identification method
CN109035195A (en) A kind of fabric defect detection method
CN101996405A (en) Method and device for rapidly detecting and classifying defects of glass image
CN103759662A (en) Dynamic textile yarn diameter rapid-measuring device and method
CN116309671B (en) Geosynthetic fabric quality detection system
CN114841938A (en) Fabric snagging defect detection method
CN111652883B (en) Glass surface defect detection method based on deep learning
CN109523540A (en) Yarn break detection method based on Hough transformation
CN111415349B (en) Polyester filament yarn detection method based on image processing technology
CN113643276B (en) Textile texture defect automatic detection method based on statistical analysis
KR101782363B1 (en) Vision inspection method based on learning data
CN114998321B (en) Textile material surface hairiness degree identification method based on optical means
CN116008289B (en) Nonwoven product surface defect detection method and system
CN116977358A (en) Visual auxiliary detection method for corrugated paper production quality
CN110458809B (en) Yarn evenness detection method based on sub-pixel edge detection
CN115063424B (en) Textile bobbin yarn detection method based on computer vision
CN116523913B (en) Intelligent detection method for quality of screw rod
CN113936001B (en) Textile surface flaw detection method based on image processing technology
CN116894840B (en) Spinning proofing machine product quality detection method and system
CN115311278B (en) Yarn segmentation method for yarn detection
CN111862008A (en) Yarn defect detection method based on machine vision
CN102520028B (en) Method for digitally processing raw silk defects
CN113554604B (en) Melt-blown cloth defect area detection method based on machine vision
CN115294100A (en) Loom parking control method and system based on data processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination