CN114998321B - Textile material surface hairiness degree identification method based on optical means - Google Patents

Textile material surface hairiness degree identification method based on optical means

Info

Publication number
CN114998321B
CN114998321B (application CN202210844374.6A; published as CN114998321A)
Authority
CN
China
Prior art keywords: hairiness, bright, pixel points, area, areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210844374.6A
Other languages
Chinese (zh)
Other versions
CN114998321A (en)
Inventor
陈雯旻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Bona Textile Co ltd
Original Assignee
Nantong Bona Textile Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Bona Textile Co ltd filed Critical Nantong Bona Textile Co ltd
Priority to CN202210844374.6A priority Critical patent/CN114998321B/en
Publication of CN114998321A publication Critical patent/CN114998321A/en
Application granted granted Critical
Publication of CN114998321B publication Critical patent/CN114998321B/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30124 Fabrics; Textile; Paper
    • G06T 2207/30204 Marker
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/761 Proximity, similarity or dissimilarity measures
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the field of data processing, and in particular to a method for identifying the degree of hairiness on the surface of a textile material based on an optical means. The method comprises the following steps: clustering the pixel points whose attention degree is greater than or equal to a threshold value to obtain connected domains, and calculating the main direction vector of each connected domain; for any connected domain, first calculating the similarity between its main direction and that of the nearest adjacent connected domain, and then calculating the consistency between the gradient directions of the pixel points around the two main directions and the main directions of the bright areas; then judging whether the pixel points lying between the connected domains belong to the same hairiness as the pixel points inside them; and finally calculating a severity index from the quantity and length of the hairiness. The method detects textile materials by optical means, in particular the presence of hairiness flaws on the surface of textile materials. It can be applied to new-material-related services such as new material detection, metering, related standardization, certification and approval. The invention improves detection efficiency.

Description

Textile material surface hairiness degree identification method based on optical means
Technical Field
The invention relates to the technical field of data processing, in particular to a textile material surface hairiness degree identification method based on an optical means.
Background
Yarn is formed by twisting and winding one or more long or short fibers, so some fibers extend out of the yarn body; yarn hairiness forms under the combined influence of fiber movement, process configuration and mechanical action during production. Yarn hairiness is one of the characteristics that describe the basic structure and appearance of a yarn, and it is also an important reference index for assessing yarn quality. A proper amount of short hairiness makes the fabric appear fuller and the garment softer and more comfortable to wear; long hairiness, however, makes the yarn look rough and lets adjacent warp yarns entangle, which reduces the effective shed height of the loom, leaves the shed unclear, causes false warp hanging, more end breaks and difficult weft insertion, and blocks the flight of the weft so that the machine stops.
The traditional new-material detection method for textile yarn hairiness is as follows: a yarn image is acquired by optical means, the magnified yarn image is measured manually, the microscopic yarn image is projected onto a large screen or photographed, and the defect degree of the yarn hairiness flaw is judged from the physical properties of the hairiness in the image, such as color and length. This method is time-consuming and labor-intensive, and the yarn boundary is difficult to define.
Disclosure of Invention
In order to solve the problem that the existing method cannot automatically detect hairiness flaws, the invention aims to provide a textile material surface hairiness degree identification method based on optical means, and the adopted technical scheme is as follows:
the invention provides a textile material surface hairiness degree identification method based on optical means, which comprises the following steps:
acquiring a surface gray level image of the textile yarn, and segmenting the image of the yarn trunk area in the surface gray level image of the textile yarn by a graph cut algorithm to obtain an image of a target area;
dividing the image of the target area into a set number of area images: for any region image: according to the gray values of the pixel points in the region image, calculating the attention degree of the pixel point corresponding to any gray value in the region, and marking the pixel point with the attention degree larger than or equal to a set threshold value; clustering the marked pixel points according to the attention degree of the marked pixel points in the area to obtain the communication areas of the bright areas corresponding to the area;
calculating the main direction vector of each bright region connected region according to the direction vector of the gradient change direction of each marked pixel point in each bright region connected region; for any bright region connected region in any region image: calculating the similarity of the main direction of the bright area connected domain and the main direction of the adjacent bright area connected domain closest to the bright area connected domain according to the main direction vector of the bright area connected domain and the main direction vector of the adjacent bright area connected domain closest to the bright area connected domain;
judging whether the similarity is greater than a first threshold; if so, calculating a consistency index between the gradient directions of the pixel points within a set range around the main directions of the two corresponding bright area connected domains and the main directions of those connected domains; judging whether the consistency index is greater than a second threshold; and if so, judging, according to the gradient amplitude of each pixel point between the two bright area connected domains, whether each pixel point between the two connected domains belongs to the same hairiness as the pixel points inside them;
and calculating the severity index of hairiness flaws according to the quantity of hairiness and the length of hairiness in each area image.
Preferably, the following formula is adopted to calculate the attention degree of the pixel point corresponding to any gray value in the area, including:
wherein Q is the attention degree of the pixel point corresponding to any gray value, x is the gray value of the pixel point, ΔA is the maximum gray value of the pixel points in the region, e is the base of the natural logarithm, and c is the gray threshold.
Preferably, the principal direction vector of each bright region connected region is calculated using the following formula:
wherein M is the main direction vector of any bright area connected domain, k is the number of marked pixel points in the connected domain, and F_J is the direction vector of the gradient change direction of the J-th marked pixel point.
Preferably, the following formula is adopted to calculate the consistency index of the gradient direction of the pixel points in the set range around the main directions of the two connected areas of the bright areas and the main directions of the bright areas:
wherein S is the consistency index between the gradient directions of the pixel points within the set range around the main directions of the two corresponding bright area connected domains and those main directions; l_1 is the number of pixel points within the set range around the main direction of one of the bright area connected domains, x_i is the direction vector of the gradient change direction of the i-th pixel point within that range, and M_1 is the main direction vector of that connected domain; l_2 is the number of pixel points within the set range around the main direction of the other bright area connected domain, y_i is the direction vector of the gradient change direction of the i-th pixel point within that range, and M_2 is the main direction vector of that connected domain.
Preferably, if the similarity is less than or equal to the first threshold, it is determined that the pixel points in the two bright area connected domains do not belong to the same hairiness.
Preferably, the determining whether each pixel point between two bright area connected domains belongs to the same hairiness as the pixel points in the two bright area connected domains comprises:
detecting the pixel points between the main directions of the two corresponding bright area connected domains with a sliding window: connecting the two closest pixel points on the main direction vectors of the two bright area connected domains, and taking the midpoint of the connecting line as the initial center point of the sliding window;
calculating the variance of the gradient amplitude of the pixel points in the sliding window, and taking the variance as the fluctuation degree of the corresponding pixel points in the sliding window;
judging whether the fluctuation degree is greater than or equal to a fluctuation degree threshold, and if so, judging that the pixel points in the sliding window and the pixel points in the two corresponding bright area connected domains belong to the same hairiness.
Preferably, if the consistency index is smaller than or equal to a second threshold, it is determined that the pixels in the connected areas of the two corresponding bright areas do not belong to the same hairiness.
Preferably, the severity index of hairiness flaws is calculated using the following formula:
wherein T is the severity index of hairiness flaws, s1_a is the number of hairiness longer than the set length in the a-th region, s2_a is the number of hairiness of the set length or shorter in the a-th region, and l is the total number of regions.
The invention has the following beneficial effects: the invention detects the severity of flaws on the surface of textile yarns, i.e. it performs new-material flaw detection on textile yarns. Because the length of hairiness and the proportion of long hairiness affect yarn quality, the flaw detection mainly measures the length and quantity of hairiness and judges the defect degree of hairiness flaws from them. The invention first acquires a surface gray image of the textile yarn by optical means, then segments the image of the yarn trunk area to obtain the image of the target area, which removes the interference of useless pixel points on the metering result and reduces the amount of calculation. On a hairiness there are brighter and darker pixel points, so the invention first obtains the connected domains formed by the brighter pixel points; if several bright area connected domains lie on the same hairiness, their main directions have high similarity. The invention therefore calculates the similarity of the main directions of adjacent bright area connected domains and judges whether it exceeds a threshold; if so, the two connected domains are likely to belong to the same hairiness, and the consistency index between the pixel points around the two adjacent connected domains and their main directions is calculated. If the consistency index also exceeds its threshold, the invention judges whether the pixel points with smaller gray values lying between the two bright area connected domains belong to the same hairiness as the pixel points inside them, and after this judgment the hairiness skeleton is extracted with a skeleton extraction algorithm. The quantity and length of the hairiness are then measured, and the severity index of hairiness flaws is calculated from them. The method detects textile materials by optical means (specifically visible-light images), in particular the presence of hairiness flaws on the surface of textile materials. It can be applied to new-material-related services such as new material detection, metering, related standardization, certification and approval. The method provided by the invention can detect hairiness flaws automatically, without manually assessing the severity of yarn hairiness flaws, which saves time and improves detection efficiency.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for identifying the hairiness degree of the surface of a textile material based on an optical means;
FIG. 2 is a schematic diagram of an image acquisition system according to the present invention;
in the figure: 1. a yarn drive system; 2. a first light source; 3. a second light source; 4. a yarn; 5. a background plate; 6. an image acquisition device.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following detailed description is given to a method for identifying the hairiness degree of the surface of the textile material based on the optical means according to the invention by combining the attached drawings and the preferred embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The scheme of the method for identifying the hairiness degree of the surface of the textile material based on an optical means is described in detail below with reference to the accompanying drawings.
Method for identifying hairiness degree of textile material surface based on optical means
The existing method has the problem that hairiness flaws cannot be automatically detected. In order to solve the above-mentioned problems, the present embodiment proposes a method for identifying the surface hairiness of a textile material based on optical means, as shown in fig. 1, the method for identifying the surface hairiness of the textile material based on optical means of the present embodiment includes the following steps:
step S1, acquiring a surface gray level image of the textile yarns, and dividing an image of a yarn evenness area in the surface gray level image of the textile yarns by adopting a graph cutting algorithm to obtain an image of a target area.
Yarn hairiness defects are the portions of fibers that extend beyond the yarn body because they were not wound into the yarn during production, or fibers that break because improper mechanical control causes excessive tension. In actual production, the shape, length, quantity and distribution of yarn hairiness strongly affect the production efficiency of subsequent processes and the quality of the end product. The length of yarn hairiness is generally between 1 mm and 3 mm, and different products place different requirements on it: for soft fabrics, slightly longer hairiness increases softness, but hairiness above 3 mm makes the product look rough and lowers its quality and value, so hairiness above 3 mm is not allowed.
The specific scenario of this embodiment is as follows: images of the produced textile yarn are acquired; the yarn enters a detection area through a transmission device, the detection area is illuminated and imaged to obtain gray images of the yarn surface, and the acquired images are analyzed to detect the content of long hairiness and to calculate the severity index of hairiness flaws.
Because the lengths and inclination angles of yarn hairiness differ, the hairiness captured in the yarn images takes many forms and is disordered. Capturing real, complete and clear images of the hairiness is the basis of image-based yarn hairiness detection. Like other machine vision detection systems, the yarn hairiness image measurement system provided in this embodiment consists of several components that work together.
In this embodiment, hairiness defects of different forms on textile yarns need to be identified, so an image of the yarn surface is collected by optical means. As shown in fig. 2, the yarn 4 passes through a yarn transmission system 1 and enters an image acquisition system that acquires an image of the yarn surface. The image acquisition system comprises an electrostatic device, a first light source 2, a second light source 3, a background plate 5 and an image acquisition device 6 (the image acquisition device comprises an industrial CCD camera, a video microscope and a high-speed digital camera). Most common hairiness is end hairiness; the electrostatic device spreads the yarn hairiness as much as possible under the action of static electricity, so that several hairiness are less likely to wind around each other, which helps the subsequent detection of the true hairiness length and the counting of the hairiness. The industrial CCD camera, video microscope and high-speed digital camera are located directly above the yarn and acquire the yarn surface image from a top view. An OPT machine vision light source, located obliquely above the yarn, provides illumination compensation for the yarn surface to be detected. A black background plate is arranged directly below the yarn, and the field of view of the camera is adjusted in advance so that the acquired image covers the black background plate area.
The embodiment processes the acquired yarn surface image, removes the influence of other environmental interference and noise, and performs graying processing on the acquired image to obtain a surface gray image of the textile yarn.
A skeleton extraction algorithm works well for obtaining the main skeleton of a target: analyzing the skeleton reduces unnecessary calculation and makes the length calculation of the target more accurate. A skeleton extraction algorithm based on the distance field works well when the target boundary is clear, but when the target is incomplete because its boundary is blurred or its gray levels differ greatly, the extracted skeleton easily becomes inaccurate and the same target is split into several small skeletons, so the skeleton loses its value for analysis; the distance-field-based skeleton extraction algorithm therefore demands a well-defined target boundary. Because the lengths, radiation directions and inclination angles of yarn hairiness differ, the collected yarn image contains brighter and darker parts, and since the gray values of the darker parts are low, threshold segmentation splits the same hairiness into several small segments. However, the directions perpendicular to the local gradient changes of the pixel points on the same hairiness are similar, and the gradient changes follow certain similarity rules, so the approximate area where a hairiness lies is determined first and skeleton extraction is then performed to obtain the hairiness.
The purpose of this embodiment is to determine whether the yarn hairiness is defective. The pixel points on the yarn trunk would interfere with hairiness detection and increase the amount of calculation, so the yarn trunk is first segmented from the yarn surface image. Hairiness generally exists within a certain range on both sides of the yarn trunk, and segmenting the trunk determines the range of the image in which hairiness exists, which reduces the amount of calculation. Specifically, a graph cut algorithm is used to distinguish the pixel points on the yarn trunk from the other pixel points: the image is constructed as an undirected graph in which nodes represent pixels and edge weights represent boundary energy, and the boundary between object and background is found according to the max-flow/min-cut theorem. When the maximum flow is pushed from the source S to the sink T through the graph, the saturated edges determine the boundary between object and background. The graph cut algorithm is prior art and is not described in detail here. The image of the yarn trunk area is thus segmented.
Because yarn hairiness only exists within a certain range near the yarn trunk, the acquired yarn image contains a large amount of background plate, which would greatly increase the amount of calculation and is also affected by noise points and illumination. Taking the segmented yarn trunk as the image center, the region is extended by x pixel points above and below the edge pixel points of the yarn trunk; this segments the rough region where hairiness exists, and this region, which does not contain the yarn trunk, is the target region on whose pixel points the subsequent calculations are performed. In this embodiment the value of x is set to 300; in a specific application it is set according to actual needs.
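For illustration only, the trunk segmentation and target-region extraction described above can be prototyped as in the following Python sketch. OpenCV's grabCut is used here as a stand-in for the patent's graph cut step, and the rectangle initialization and the function name extract_hairiness_roi are assumptions; the ±band of rows follows the x = 300 used in this embodiment.

```python
import cv2
import numpy as np

def extract_hairiness_roi(gray, band=300):
    """Segment the yarn trunk with a graph-cut style method and return the band
    of rows around it where hairiness may appear (trunk pixels blanked out).
    grabCut stands in for the graph cut step; a sketch, not the patent's exact procedure."""
    bgr = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)
    mask = np.zeros(gray.shape, np.uint8)
    # Rough rectangle around the bright horizontal yarn trunk as initialization.
    rows = np.where(gray.mean(axis=1) > gray.mean())[0]
    if rows.size == 0:
        rows = np.arange(gray.shape[0])
    rect = (0, int(rows.min()), gray.shape[1], int(rows.max() - rows.min() + 1))
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    cv2.grabCut(bgr, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    trunk = np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD))
    trunk_rows = np.where(trunk.any(axis=1))[0]
    top = max(int(trunk_rows.min()) - band, 0)
    bottom = min(int(trunk_rows.max()) + band, gray.shape[0] - 1)
    roi = gray[top:bottom + 1].copy()
    roi[trunk[top:bottom + 1]] = 0   # exclude the trunk itself from the target region
    return roi
```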
Step S2, dividing the image of the target area into a set number of area images: for any region image: according to the gray values of the pixel points in the region image, calculating the attention degree of the pixel point corresponding to any gray value in the region, and marking the pixel point with the attention degree larger than or equal to a set threshold value; clustering the marked pixel points according to the attention degree of the marked pixel points in the area to obtain the connected areas of the bright areas corresponding to the area.
This embodiment divides the image of the target area obtained in step S1 into a set number of region images, analyzes each region image separately, and finally judges the severity of the yarn hairiness flaw according to the length and density of the hairiness in each region image. The number of region images is set according to the specific situation.
For yarn hairiness, under illumination the gray values of hairiness pixel points are larger than those of the pixel points on the background plate. Because the lengths, radiation directions and inclination angles of yarn hairiness differ, the gray values of the pixel points on the same hairiness also differ greatly, i.e. there are bright areas with larger gray values and dark areas with smaller gray values, so the gray levels most likely to belong to hairiness are determined first. For any region image, the attention degree of the pixel point corresponding to any gray value in the region is calculated, namely:
wherein Q is the attention degree of the pixel point corresponding to any gray value, ΔA is the maximum gray value of the pixel points in the region, x is the gray value of the pixel point, e is the base of the natural logarithm, and c is the gray threshold, which is set according to the specific situation in a specific application.
Considering that white noise may exist in the target area image and that the background, although black, contains pixel points whose gray values differ slightly, background pixel points could be mistaken for hairiness pixel points. This embodiment therefore first focuses on the brighter pixel points on the hairiness and assigns them a larger attention degree: the larger the value of Q, the more attention is paid to the pixel points of the corresponding gray value. An attention degree threshold Q_0 is set; when the attention degree of a pixel point is greater than or equal to Q_0, the pixel point is judged to be a pixel point with a larger gray value on the yarn hairiness and is marked, and the marked pixel points are the bright-area pixel points on the yarn hairiness. In this embodiment the value of Q_0 is 0.8; in a specific application it is set according to the specific situation.
The pixel points of a hairiness area usually occur in clusters, i.e. they form a connected domain, so this embodiment clusters the marked pixel points by region growing based on their attention degree. Specifically, the marked pixel point with the largest attention degree is selected as the growth seed point (if several pixel points share the maximum value, one of them is selected at random). The 8-neighborhood of the selected pixel point is searched, the neighboring pixel points whose attention degree satisfies Q ≥ Q_0 are kept and merged into a region, the region is taken as the new growth seed, and its neighborhood is searched again; the neighboring pixel points with Q ≥ Q_0 are kept and the region is updated, yielding a new seed region. This is iterated until the neighborhood no longer contains any pixel point with Q ≥ Q_0, at which point the first connected domain is obtained. The same procedure is applied to the remaining pixel points: the pixel point with the largest attention degree is selected as the growth seed point, grown in the same way, and a second connected domain is obtained. The operation is repeated on the remaining pixel points until all pixel points with Q ≥ Q_0 in the region have been clustered, at which point the bright area connected domains corresponding to the region are obtained.
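A minimal Python sketch of the 8-neighborhood region growing used to cluster the marked pixel points is given below. The attention map Q and the threshold Q_0 are assumed to have been computed beforehand (the attention formula itself is not reproduced here), and the function name grow_bright_regions is illustrative.

```python
import numpy as np
from collections import deque

def grow_bright_regions(Q, q0):
    """Cluster pixels with attention Q >= q0 into 8-connected bright-area
    connected domains, seeding each domain at the remaining pixel of maximum attention."""
    marked = Q >= q0
    labels = np.zeros(Q.shape, dtype=np.int32)      # 0 = unassigned
    current = 0
    h, w = Q.shape
    while True:
        candidates = np.where(marked & (labels == 0))
        if candidates[0].size == 0:
            break
        # Seed: remaining marked pixel with the largest attention degree.
        idx = np.argmax(Q[candidates])
        seed = (candidates[0][idx], candidates[1][idx])
        current += 1
        labels[seed] = current
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w and marked[rr, cc] and labels[rr, cc] == 0:
                        labels[rr, cc] = current
                        queue.append((rr, cc))
    return labels, current
```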
Step S3, calculating the main direction vector of each bright area connected area according to the direction vector of the gradient change direction of each marked pixel point in each bright area connected area; for any bright region connected region in any region image: and calculating the similarity of the main direction of the bright area connected domain and the adjacent bright area connected domain closest to the bright area connected domain according to the main direction vector of the bright area connected domain and the main direction vector of the adjacent bright area connected domain closest to the bright area connected domain.
Considering that the gradient change direction of the pixel points in the bright areas of a hairiness is always perpendicular to the hairiness trunk, the hairiness trunk direction is obtained from the gradient changes of the marked pixel points. The Sobel operator is used to calculate, for every pixel point in the target area image, the gradient component in the x direction and the gradient component in the y direction; the gradient amplitude of a pixel point is the square root of the sum of the squares of the two components, and the gradient direction of the pixel point is the arctangent of the ratio of the y-direction component to the x-direction component.
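For reference, the standard Sobel-based gradient computation described above can be written as the following sketch; the function name and the 3x3 kernel size are assumptions.

```python
import cv2
import numpy as np

def sobel_gradients(gray):
    """Per-pixel horizontal/vertical Sobel gradients, plus magnitude and direction."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)   # x-direction gradient
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)   # y-direction gradient
    magnitude = np.sqrt(gx ** 2 + gy ** 2)            # gradient amplitude
    direction = np.arctan2(gy, gx)                    # gradient direction in radians
    return gx, gy, magnitude, direction
```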
Next, the principal direction vector of each bright region connected region is calculated, that is:
wherein M is the main direction vector of any bright area connected domain, k is the number of marked pixel points in the connected domain, and F_J is the direction vector of the gradient change direction of the J-th marked pixel point.
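The exact form of the main-direction formula is not reproduced above; the sketch below assumes that M is the mean of the gradient-direction unit vectors of the marked pixel points in the connected domain, which is one plausible reading of the definitions of M, k and F_J.

```python
import numpy as np

def main_direction_vector(direction, labels, label):
    """Assumed form of the main direction vector M of one bright-area connected domain:
    the mean of the unit vectors of the gradient directions of its marked pixels."""
    theta = direction[labels == label]                 # gradient directions of marked pixels
    vectors = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    return vectors.mean(axis=0)                        # 2-D main direction vector M
```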
When the main directions of two adjacent bright area connected domains are similar, the two connected domains are more likely to belong to the same hairiness. Hairiness is generally fine, its quantity is not particularly large, and its length is about 3 mm. If more than two bright area connected domains lie on the same hairiness, the adjacent connected domains on that hairiness are very close together, whereas the distance between two adjacent hairiness is generally larger than the distance between two adjacent bright area connected domains on the same hairiness. The distance between two adjacent bright area connected domains on the same hairiness is therefore mostly smaller than that between connected domains on different hairiness, so the two nearest adjacent bright area connected domains may belong to the same hairiness, and the main directions of the bright area connected domains on the same hairiness are similar. The adjacent bright area connected domains mentioned later in this embodiment are always the nearest adjacent ones. For any bright area connected domain, the similarity between its main direction and that of the nearest adjacent bright area connected domain is calculated, namely:
wherein the similarity of the main directions of the two adjacent connected domains is computed from M_1, the main direction vector of the gradient change of the bright area connected domain, and M_2, the main direction vector of the gradient change of the nearest adjacent bright area connected domain.
So far, the similarity of the main directions of the connected domains of the adjacent bright areas is obtained.
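The similarity behaves like a cosine similarity of the two main direction vectors (values near 1 mean near-parallel main directions); the following sketch computes it under that assumption.

```python
import numpy as np

def main_direction_similarity(m1, m2):
    """Assumed cosine-similarity form of the main-direction similarity."""
    denom = np.linalg.norm(m1) * np.linalg.norm(m2)
    return float(np.dot(m1, m2) / denom) if denom > 0 else 0.0
```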
And S4, judging whether the similarity is greater than a first threshold; if so, calculating a consistency index between the gradient directions of the pixel points within a set range around the main directions of the two corresponding bright area connected domains and the main directions of those connected domains; judging whether the consistency index is greater than a second threshold; and if so, judging, according to the gradient amplitude of each pixel point between the two bright area connected domains, whether each pixel point between the two connected domains belongs to the same hairiness as the pixel points inside them.
The more similar the main directions of the gradient changes of two adjacent bright area connected domains, i.e. the closer their similarity is to 1, the greater the likelihood that the two connected domains belong to the same hairiness. This embodiment sets a similarity threshold (the first threshold) and judges whether the similarity of the main directions of the two adjacent bright area connected domains exceeds it; if so, the two connected domains are more likely to belong to the same hairiness. However, because of noise points and other factors, judging by the main directions of the two adjacent connected domains alone is not accurate. Analysis of the transition from the bright areas of a hairiness to its dark areas shows that the gradient change direction of the pixel points between the two bright areas is similar to the direction of the hairiness trunk. The pixel points around the main direction of any bright area connected domain are obtained as follows: a neighborhood range of a set size is established around the main direction; in this embodiment its value is 7, and in a specific application it is set according to the specific situation. The consistency index between the gradient directions of the pixel points around the main directions of the two adjacent bright areas and those main directions is then calculated, namely:
wherein S is the consistency index between the gradient directions of the pixel points around the main directions of the two adjacent bright area connected domains and those main directions; l_1 is the number of pixel points around the main direction of one of the bright area connected domains, x_i is the direction vector of the gradient change direction of the i-th pixel point around that main direction, l_2 is the number of pixel points around the main direction of the other bright area connected domain, and y_i is the direction vector of the gradient change direction of the i-th pixel point around that main direction. The larger the value of S, the greater the probability that the two bright area connected domains and the pixel points around their main directions belong to the same hairiness. This embodiment sets a consistency index threshold, i.e. the second threshold, and judges whether the consistency index exceeds it; if so, the subsequent processing is carried out, and if not, the two bright area connected domains are judged not to belong to the same hairiness.
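Since the exact formula for S is not reproduced above, the sketch below assumes one plausible form: the average cosine alignment between the gradient-direction unit vectors around each main direction and the corresponding main direction vector, averaged over the two connected domains. The inputs dirs_a, dirs_b, m_a and m_b are illustrative names.

```python
import numpy as np

def consistency_index(dirs_a, m_a, dirs_b, m_b):
    """Assumed form of the consistency index S.
    dirs_a / dirs_b: gradient directions (radians) of the pixels in the set range
    around each main direction; m_a / m_b: the two main direction vectors."""
    def mean_alignment(dirs, m):
        units = np.stack([np.cos(dirs), np.sin(dirs)], axis=1)
        m_unit = m / (np.linalg.norm(m) + 1e-12)
        return float(np.abs(units @ m_unit).mean())    # average |cos| alignment
    return 0.5 * (mean_alignment(dirs_a, m_a) + mean_alignment(dirs_b, m_b))
```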
When the consistency index is greater than the second threshold, this embodiment uses a sliding window to detect the pixel points with smaller gray values between the two adjacent bright areas that may belong to hairiness, and to judge whether they belong to the same hairiness as the pixel points in the two bright areas. Specifically, a sliding window of a set size is built. The pixel point on the main direction vector of the first bright area connected domain that is closest to the main direction vector of the second bright area connected domain is marked as the first datum point, and the pixel point on the main direction vector of the second bright area connected domain that is closest to the main direction vector of the first bright area connected domain is marked as the second datum point. The first and second datum points are connected, the midpoint of the connecting line is taken as the initial center point of the sliding window, and the fluctuation of the gradient amplitudes of the pixel points in the window is calculated, namely:
wherein the fluctuation degree is the variance of the gradient amplitudes of the pixel points in the sliding window, computed from the side length of the sliding window, the gradient amplitude of each pixel point in the window and the mean gradient amplitude of the pixel points in the window. The larger the fluctuation degree, the more likely the pixel points in the sliding window belong to hairiness: because the background plate is black, the gradient amplitudes of background pixel points fluctuate little, and noise points are usually isolated, so their fluctuation degree is also small. When the fluctuation degree is greater than or equal to the fluctuation degree threshold, the pixel points in the window are judged to belong to the hairiness; in this embodiment the threshold is set to 3, and in a specific application it is set according to the specific situation.
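A minimal sketch of the fluctuation degree, taken as the variance of the gradient amplitudes inside a square sliding window as described above; the default window side of 3 and the function name are assumptions.

```python
import numpy as np

def window_fluctuation(magnitude, center, size=3):
    """Fluctuation degree of one sliding window: variance of the gradient
    magnitudes of the pixels inside the window centred at `center` (row, col)."""
    r, c = center
    half = size // 2
    patch = magnitude[max(r - half, 0):r + half + 1, max(c - half, 0):c + half + 1]
    return float(patch.var())
```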
In this embodiment, a region growing method is used to obtain the hairiness region based on the membership of the pixel points: the bright target points are used as seed points for region growing, sliding windows are examined in the eight-neighborhood of the seed region, and when the fluctuation degree of the pixel points in a sliding window meets the requirement described above, the pixel points in that window are judged to belong to the hairiness and are merged with the bright area. The seed points are then updated and the process is iterated until the seed region no longer meets the growth requirement, at which point the hairiness region is obtained.
And S5, calculating the severity index of the hairiness flaw according to the quantity of hairiness and the length of hairiness in each area image.
In this embodiment, in order to obtain the length of each hairiness, a skeleton extraction algorithm based on the distance field is used to extract the hairiness skeleton: for each internal point P of a hairiness region, the distance field value is D(P) = min(d(P, O)) over the boundary points O of the region, where d(P, O) is the Euclidean distance from point P to point O. The region growing method and the distance-field-based skeleton extraction algorithm are both existing methods and are not described in detail here. The hairiness skeleton is thus obtained.
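A sketch of the distance field and skeleton step is given below; cv2.distanceTransform provides D(P), skimage's skeletonize stands in for the distance-field-based skeleton extraction, and the pixel count of the skeleton is used as a crude length estimate. These library choices are assumptions, not the patent's prescribed implementation.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

def hairiness_skeleton_and_length(hairiness_mask):
    """Distance field D(P) (Euclidean distance of each interior point to the region
    boundary) and a skeleton-based length estimate for one hairiness region."""
    mask8 = hairiness_mask.astype(np.uint8)
    dist = cv2.distanceTransform(mask8, cv2.DIST_L2, 3)   # D(P) = min_O d(P, O)
    skeleton = skeletonize(mask8.astype(bool))
    length_px = int(skeleton.sum())                        # crude length in pixels
    return dist, skeleton, length_px
```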
In this embodiment, after obtaining the skeleton of yarn hairiness in each region, the length of each hairiness is obtained, and the severity index of yarn hairiness flaws is calculated according to the lengths of all hairiness and the density of long hairiness in each region, namely:
wherein T is the severity index of yarn hairiness flaws, s1_a is the number of long hairiness in the a-th region, s2_a is the number of short hairiness in the a-th region, and l is the total number of regions; long hairiness is hairiness longer than 3 mm, and short hairiness is hairiness of 3 mm or less. The denser the distribution of long hairiness, i.e. the larger the number of long hairiness, the greater the severity of the yarn hairiness flaw.
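Because the exact severity formula is not reproduced above, the sketch below only counts long and short hairiness per region (3 mm threshold, assuming lengths already converted to millimetres) and aggregates them with an assumed stand-in, the mean fraction of long hairiness over all regions.

```python
def severity_index(lengths_per_region, long_threshold_mm=3.0):
    """Count long (> threshold) and short hairiness per region and aggregate into a
    severity-style score; the mean long-hairiness fraction is an assumption standing
    in for the patent's exact formula."""
    ratios = []
    for lengths in lengths_per_region:        # one list of hairiness lengths per region
        long_n = sum(1 for v in lengths if v > long_threshold_mm)
        total = len(lengths)
        ratios.append(long_n / total if total else 0.0)
    return sum(ratios) / len(ratios) if ratios else 0.0
```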
According to the embodiment, quality classification and negative feedback adjustment are carried out on hairiness according to the severity index of the hairiness flaw, when the hairiness flaw is serious, the yarns are recovered and twisted again, and when the hairiness flaw is slight, corresponding singeing treatment is carried out after spinning is finished.
The purpose of this embodiment is to detect the severity of flaws on the surface of textile yarns, i.e. to perform new-material flaw detection on textile yarns. Because the length of hairiness and the density of long hairiness affect yarn quality, the flaw detection in this embodiment mainly measures the length of hairiness and the number of long hairiness and judges the defect degree of hairiness flaws from them. This embodiment first acquires a surface gray image of the textile yarn by optical means, then segments the image of the yarn trunk area to obtain the image of the target area, which removes the interference of useless pixel points on the metering result and reduces the amount of calculation. This embodiment then obtains the connected domains formed by the brighter pixel points on the hairiness; if several bright area connected domains lie on the same hairiness, their main directions have high similarity, so this embodiment calculates the similarity of the main directions of adjacent bright area connected domains and judges whether it exceeds a threshold. If so, the two connected domains are likely to belong to the same hairiness, and the consistency index between the pixel points around the two adjacent connected domains and their main directions is calculated; if the consistency index also exceeds its threshold, this embodiment judges whether the pixel points with smaller gray values between the two bright area connected domains belong to the same hairiness as the pixel points inside them, and after this judgment the hairiness skeleton is extracted with a skeleton extraction algorithm. The quantity and length of the hairiness are measured, and the severity index of hairiness flaws is calculated from them. The method detects textile materials by optical means (specifically visible-light images), in particular the presence of hairiness flaws on the surface of textile materials. It can be applied to new-material-related services such as new material detection, metering, related standardization, certification and approval. The method provided by this embodiment can detect hairiness flaws automatically, without manually assessing the severity of yarn hairiness flaws, which saves time and improves detection efficiency.
It should be noted that: the foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the invention are intended to be included within the scope of the invention.

Claims (7)

1. An optical means-based textile material surface hairiness degree identification method is characterized by comprising the following steps:
acquiring a surface gray level image of the textile yarn, and segmenting an image of the yarn trunk area in the surface gray level image of the textile yarn by adopting a graph cut algorithm to obtain an image of a target area;
dividing the image of the target area into a set number of area images: for any region image: according to the gray values of the pixel points in the region image, calculating the attention degree of the pixel point corresponding to any gray value in the region, and marking the pixel point with the attention degree larger than or equal to a set threshold value; clustering the marked pixel points according to the attention degree of the marked pixel points in the area to obtain the communication areas of the bright areas corresponding to the area;
calculating the main direction vector of each bright region connected region according to the direction vector of the gradient change direction of each marked pixel point in each bright region connected region; for any bright region connected region in any region image: calculating the similarity of the main direction of the bright area connected domain and the main direction of the adjacent bright area connected domain closest to the bright area connected domain according to the main direction vector of the bright area connected domain and the main direction vector of the adjacent bright area connected domain closest to the bright area connected domain;
judging whether the similarity is greater than a first threshold; if so, calculating a consistency index between the gradient directions of the pixel points within a set range around the main directions of the two corresponding bright area connected domains and the main directions of those connected domains; judging whether the consistency index is greater than a second threshold; and if so, judging, according to the gradient amplitude of each pixel point between the two bright area connected domains, whether each pixel point between the two connected domains belongs to the same hairiness as the pixel points inside them;
calculating the severity index of hairiness flaws according to the quantity of hairiness and the length of hairiness in each area image;
the principal direction vector of each bright region connected region is calculated by adopting the following formula:
wherein M is the main direction vector of any bright area connected domain, k is the number of marked pixel points in the connected domain, and F_J is the direction vector of the gradient change direction of the J-th marked pixel point in the connected domain;
selecting the marked pixel point with the largest attention degree as the growth seed point, and if several pixel points share the maximum attention degree, randomly selecting one of them; searching in the 8-neighborhood of the selected pixel point, keeping the neighboring pixel points whose attention degree satisfies Q ≥ Q_0 and merging them into a region; taking the region as the new growth seed, searching in its neighborhood again, keeping the neighboring pixel points with Q ≥ Q_0 and updating the region to obtain a new seed region; iterating in this way until the neighborhood contains no pixel point with Q ≥ Q_0, thereby obtaining the first connected domain; repeating the same steps on the remaining pixel points and iterating to obtain further connected domains until all pixel points with Q ≥ Q_0 in the region have been clustered, thereby obtaining the connected domains corresponding to the region, i.e. the bright area connected domains; wherein Q_0 is the attention degree threshold and Q is the attention degree of a pixel point.
2. The method for recognizing the hairiness on the surface of the textile material based on the optical means according to claim 1, wherein the following formula is adopted to calculate the attention degree of the pixel point corresponding to any gray value in the region, comprising:
wherein Q is the attention degree of the pixel point corresponding to any gray value, x is the gray value of the pixel point, ΔA is the maximum gray value of the pixel points in the region, e is the base of the natural logarithm, and c is the gray threshold.
3. The method for identifying the surface hairiness of the textile material based on the optical means according to claim 1, wherein the following formula is adopted to calculate the consistency index of the gradient direction of the pixel points in the set range around the main directions of the two connected areas of the bright areas and the main directions of the bright areas:
s is a consistency index of the gradient direction and the main direction of the pixel points in a set range around the main direction of the two light areas, and l 1 Setting the number of pixel points in a range around the main direction of one of the bright area connected regions, and x i Setting a direction vector of gradient change direction of the ith pixel point in a range around the main direction of the bright area communication domain, M 1 Is the main direction vector of the connected domain of the bright area, l 2 Setting the number of pixel points in a range around the main direction of the other bright area connected domain, y i Setting a direction vector of gradient change direction of the ith pixel point in a range around the main direction of the bright area communication domain, M 2 Is the principal direction vector of the connected domain of the bright area.
4. The method for identifying the hairiness on the surface of the textile material based on the optical means according to claim 1, wherein if the similarity is less than or equal to the first threshold, it is determined that the pixel points in the two bright area connected domains do not belong to the same hairiness.
5. The method for identifying the surface hairiness of a textile material based on an optical means according to claim 1, wherein the determining whether each pixel point between two bright area connected domains belongs to the same hairiness as the pixel points in the two bright area connected domains comprises:
detecting the pixel points between the main directions of the two corresponding bright area connected domains with a sliding window: connecting the two closest pixel points on the main direction vectors of the two bright area connected domains, and taking the midpoint of the connecting line as the initial center point of the sliding window;
calculating the variance of the gradient amplitude of the pixel points in the sliding window, and taking the variance as the fluctuation degree of the corresponding pixel points in the sliding window;
judging whether the fluctuation degree is greater than or equal to a fluctuation degree threshold, and if so, judging that the pixel points in the sliding window and the pixel points in the two corresponding bright area connected domains belong to the same hairiness.
6. The method for identifying the surface hairiness of a textile material based on an optical means according to claim 1, wherein if the consistency index is less than or equal to the second threshold, it is determined that the pixel points in the two corresponding bright area connected domains do not belong to the same hairiness.
7. The method for recognizing the surface hairiness of a textile material based on an optical means according to claim 1, wherein the severity index of hairiness flaws is calculated by using the following formula:
wherein T is the severity index of hairiness flaws, s1_a is the number of hairiness greater than the set length in the a-th region, s2_a is the number of hairiness less than or equal to the set length in the a-th region, and l is the total number of regions.
CN202210844374.6A 2022-07-19 2022-07-19 Textile material surface hairiness degree identification method based on optical means Active CN114998321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210844374.6A CN114998321B (en) 2022-07-19 2022-07-19 Textile material surface hairiness degree identification method based on optical means

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210844374.6A CN114998321B (en) 2022-07-19 2022-07-19 Textile material surface hairiness degree identification method based on optical means

Publications (2)

Publication Number Publication Date
CN114998321A CN114998321A (en) 2022-09-02
CN114998321B true CN114998321B (en) 2023-12-29

Family

ID=83021021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210844374.6A Active CN114998321B (en) 2022-07-19 2022-07-19 Textile material surface hairiness degree identification method based on optical means

Country Status (1)

Country Link
CN (1) CN114998321B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115237083B (en) * 2022-09-23 2024-01-12 南通沐沐兴晨纺织品有限公司 Textile singeing process control method and system based on computer vision
CN117173162B (en) * 2023-11-01 2024-02-13 南通杰元纺织品有限公司 Textile flaw detection method and system
CN117422716B (en) * 2023-12-19 2024-03-08 沂水友邦养殖服务有限公司 Ecological early warning method and system for broiler chicken breeding based on artificial intelligence

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114627111A (en) * 2022-05-12 2022-06-14 南通英伦家纺有限公司 Textile defect detection and identification device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6650779B2 (en) * 1999-03-26 2003-11-18 Georgia Tech Research Corp. Method and apparatus for analyzing an image to detect and identify patterns
CN113012105B (en) * 2021-02-08 2024-04-26 武汉纺织大学 Yarn hairiness detection and rating method based on image processing
CN114897890B (en) * 2022-07-08 2022-09-30 南通华烨塑料工业有限公司 Artificial intelligence-based modified plastic production regulation and control method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114627111A (en) * 2022-05-12 2022-06-14 南通英伦家纺有限公司 Textile defect detection and identification device

Also Published As

Publication number Publication date
CN114998321A (en) 2022-09-02

Similar Documents

Publication Publication Date Title
CN114998321B (en) Textile material surface hairiness degree identification method based on optical means
CN113643289B (en) Fabric surface defect detection method and system based on image processing
CN114842007B (en) Textile wear defect detection method based on image processing
CN116309671B (en) Geosynthetic fabric quality detection system
CN103759662A (en) Dynamic textile yarn diameter rapid-measuring device and method
CN110403232A (en) A kind of cigarette quality detection method based on second level algorithm
CN101424680A (en) Computer automatic recognition apparatus and method for profile fiber
CN114897894A (en) Method for detecting defects of cheese chrysanthemum core
CN115266732B (en) Carbon fiber tow defect detection method based on machine vision
CN114842013B (en) Textile fiber strength detection method and system
CN113936001B (en) Textile surface flaw detection method based on image processing technology
CN109919939B (en) Yarn defect detection method and device based on genetic algorithm
CN116563276B (en) Chemical fiber filament online defect detection method and detection system
CN116894840B (en) Spinning proofing machine product quality detection method and system
TWI417437B (en) Yarn detecting method
CN115496762B (en) Textile technology-based dyeing defect identification method
CN115082489B (en) Colored silk evaluation method
CN114897788B (en) Yarn package hairiness detection method based on guided filtering and discrete difference
Jeffrey Kuo et al. Self-organizing map network for automatically recognizing color texture fabric nature
CN115294165A (en) Intelligent operation method for textile singeing process based on machine vision
CN110838113B (en) Method for detecting monofilament count and monofilament thickness consistency in multifilament synthesis
Ramakrishnan et al. A Novel Fabric Defect Detection Network in textile fabrics based on DLT
Niles et al. A system for analysis, categorisation and grading of fabric defects using computer vision
CN113393444B (en) Polyester DTY network point detection method based on image processing technology
CN117392132B (en) Visual detection method for sewing defects of garment fabric

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant