CN117495798A - Defect detection method for flame-retardant glass fiber mesh cloth - Google Patents


Info

Publication number: CN117495798A (application CN202311444552.7A; granted and published as CN117495798B)
Authority: CN (China); original language Chinese (zh)
Inventors: 宋丙玉 (Song Bingyu), 卜令芳 (Bu Lingfang), 钟红生 (Zhong Hongsheng)
Applicant and current assignee: Weishan Brother Glass Products Co., Ltd.
Legal status: Active (granted). The legal status and the listed assignee are assumptions made by Google Patents, not legal conclusions.
Prior art keywords: pixel point, gray, target pixel, value, target

Classifications

    • G06T 7/0004 — Image analysis; inspection of images, e.g. flaw detection; industrial image inspection
    • G06T 7/13 — Segmentation; edge detection
    • G06T 7/60 — Analysis of geometric attributes
    • G06T 7/77 — Determining position or orientation of objects or cameras using statistical methods
    • G06T 7/90 — Determination of colour characteristics
    • G01N 21/8851 — Scan or image signal processing specially adapted for detecting different kinds of defects
    • G01N 21/8983 — Irregularities in textured or patterned surfaces: testing textile webs, i.e. woven material
    • G01N 2021/8887 — Scan or image signal processing based on image processing techniques

Abstract

The invention relates to the technical field of defect detection, and in particular to a defect detection method for flame-retardant glass fiber mesh cloth. The method collects optical images of the glass fiber mesh cloth with a camera fitted with a visible light source and analyses the target information to be processed according to the optical characteristics reflected in the images. By combining the pixel-value information and the gradient information of the target information under the visible light source, a gray image of the target flame-retardant glass fiber mesh cloth with distinct optical characteristics is obtained, from which the defect region is then extracted. The invention can accurately locate defect regions on flame-retardant glass fiber mesh cloth.

Description

Defect detection method for flame-retardant glass fiber mesh cloth
Technical Field
The invention relates to the technical field of defect detection of flame-retardant glass fiber mesh cloth, in particular to a defect detection method of flame-retardant glass fiber mesh cloth.
Background
Flame-retardant glass fiber mesh cloth, made by coating the surface of a glass fiber base cloth with a polymer alkali-resistant coating, is widely embedded in plastering layers to improve the mechanical strength and crack resistance of the protective layer. During production, however, defects arise on the mesh cloth, and these defects degrade its quality and performance in use. Detecting defects on flame-retardant glass fiber mesh cloth is therefore very important and carries significant social and economic value.
In production, the conveyor belt is usually run at a high speed for the sake of in-line inspection or production efficiency, so the flame-retardant glass fiber mesh cloth is in high-speed motion. The acquired images therefore suffer motion blur, the linear features of the warps and wefts in the original image are lost, and the resulting defect regions become inconspicuous because they are covered by blurred regions. Edge detection and segmentation then cannot accurately identify the defect regions, so the detection results are inaccurate or even wrong.
Disclosure of Invention
The invention provides a defect detection method for flame-retardant glass fiber mesh cloth to address the low accuracy of existing methods in detecting defects of such cloth. The adopted technical scheme is as follows:
the embodiment of the invention provides a defect detection method of flame-retardant glass fiber mesh cloth, which comprises the following steps:
acquiring a gray image of the flame-retardant glass fiber mesh cloth on its production line using a camera fitted with a visible light source;
dividing gray values on a gray histogram according to peaks on the gray histogram corresponding to the flame-retardant glass fiber mesh cloth gray image to obtain a target gray value range;
obtaining windows corresponding to all pixel points in the target gray scale range according to a preset sliding window; screening each pixel point in the target gray scale range according to the gray value of each pixel point in a window corresponding to each pixel point in the target gray scale range to obtain each target pixel point;
obtaining a first weight value corresponding to each target pixel point according to the pixel value of each target pixel point; obtaining a second weight value corresponding to each target pixel point according to the gradient direction of each target pixel point;
according to the first weight value and the second weight value corresponding to each target pixel point, the weight of each target pixel point in the structural element is obtained; according to the weight of the target pixel point in the structural element, obtaining a gray image of the target flame-retardant glass fiber mesh cloth; and obtaining a defect area according to the gray level image of the target flame-retardant glass fiber mesh cloth.
Preferably, the method for dividing the gray value on the gray histogram according to the peak on the gray histogram corresponding to the gray image of the flame-retardant glass fiber mesh cloth to obtain the target gray value range comprises the following steps:
according to the gray level histogram, gray level values corresponding to two wave crest positions and gray level values corresponding to one wave trough position on the gray level histogram are obtained; the gray values of the two wave peak positions are respectively marked as a 1 And a 3 The gray value of the trough position is marked as a 2 And a 1 >a 2 >a 3
Fitting the gray level histogram to obtain a fitting curve;
calculating a slope value corresponding to each gray value on the fitting curve;
at 0 to a 3 Selecting the gray value with the largest slope value in the range, and marking the gray value as a 4 The method comprises the steps of carrying out a first treatment on the surface of the At a 1 And a 2 Selecting the gray value with the largest slope value in the range, and marking the gray value as a 5
According to a 1 、a 2 、a 3 、a 4 、a 5 Dividing gray values on the gray histogram to obtain a first gray valueRange b 1 Second gray value range b 2 And a third gray value range b 3 ,b 1 =(0,a 4 ],b 2 =(a 4 ,a 5 ],b 3 =(a 5 ,255];
Second gray value range b 2 And is noted as a target gray scale range.
Preferably, screening the pixel points in the target gray range according to the gray values of the pixel points in the window corresponding to each of them to obtain the target pixel points comprises the following steps:
for any pixel point in the target gray scale range:
judging whether the gray value of each pixel point in the window corresponding to the pixel point is in the target gray range, if so, marking the pixel point as a target pixel point.
Preferably, the method for obtaining the first weight value corresponding to each target pixel point comprises:
for any target pixel point:
acquiring a target pixel point with the largest ordinate value and a target pixel point with the smallest ordinate value on the column where the target pixel point is located;
obtaining a target pixel point set on the row of the target pixel point according to all the target pixel points on the row of the target pixel point;
calculating the gray average value of eight neighborhood pixel points corresponding to each target pixel point in the target pixel point set on the row where the target pixel point is located;
selecting the maximum gray average value and the minimum gray average value in the gray average values of eight adjacent pixel points corresponding to each target pixel point in the target pixel point set on the row of the target pixel point;
and obtaining the first weight value corresponding to the target pixel point from the target pixel points with the largest and the smallest ordinate on its column, together with the maximum and minimum values among the gray means of the eight-neighborhood pixels of each target pixel point in the target pixel point set on its row.
Preferably, the first weight value corresponding to the target pixel point is calculated according to the following formula [rendered as an image in the source and not reproduced here]:
where w1 is the first weight value corresponding to the target pixel point; Y_eu is the ordinate of the target pixel point with the largest ordinate on the column of the target pixel point; Y_ed is the ordinate of the target pixel point with the smallest ordinate on that column; Y_e is the ordinate of the target pixel point itself; the remaining three symbols [also images in the source] are, respectively, the maximum and the minimum among the gray means of the eight-neighborhood pixels of each target pixel point in the target pixel point set on the row of the target pixel point, and the gray mean of the eight-neighborhood pixels of the target pixel point itself.
Preferably, for any target pixel point, the second weight value corresponding to it is calculated according to the following formula [rendered as an image in the source and not reproduced here]:
where w2 is the second weight value corresponding to the target pixel point, θ is the gradient direction of the target pixel point, the remaining symbol [also an image in the source] represents the direction opposite to the transport direction of the conveyor belt, and cos is a cosine similarity calculation function.
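Since the exact formula is not reproduced in this text, the sketch below illustrates one plausible reading of the second weight: the cosine similarity between a target pixel's gradient direction and the direction opposite the conveyor's transport direction, with both directions given as angles. Whether the patent uses this raw cosine or some transformation of it (e.g. its absolute value) is an assumption.

```python
import math

def second_weight(grad_dir_rad: float, transport_dir_rad: float) -> float:
    """Sketch of the second weight w2: cosine similarity between a target
    pixel's gradient direction and the direction opposite the conveyor
    transport direction.  Treating w2 as this raw cosine is an assumption;
    the patent's exact formula appears only as an image in the source."""
    opposite = transport_dir_rad + math.pi  # direction opposite the belt
    # cosine similarity of two unit direction vectors = cos of the angle
    # between them
    return math.cos(grad_dir_rad - opposite)
```

For a belt moving vertically (transport direction π/2), a gradient pointing straight down the return direction scores 1, while a gradient aligned with the transport direction scores -1.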
Preferably, the method for obtaining the weight of each target pixel point in the structural element according to the first weight value and the second weight value corresponding to each target pixel point comprises the following steps:
taking the product of the first weight value corresponding to each target pixel point and the second weight value corresponding to each target pixel point as the weight of each target pixel point in the structural element.
Preferably, the method for obtaining the gray level image of the target flame-retardant glass fiber mesh cloth according to the weight of the target pixel point in the structural element comprises the following steps:
marking the product of the gray value of each target pixel point and the weight of the corresponding target pixel point in the structural element as the adjustment gray value of each target pixel point;
denoting as the reconstructed image the image formed by the gray values of all pixel points other than the target pixel points on the gray image of the flame-retardant glass fiber mesh cloth together with the adjusted gray values of the target pixel points;
and carrying out gray morphological corrosion operation on the reconstructed image, and marking the image after the corrosion operation as a gray image of the target flame-retardant glass fiber mesh cloth.
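The reconstruction and erosion of this step can be sketched as follows: each target pixel's gray value is multiplied by its weight, all other pixels are left unchanged, and a grayscale morphological erosion (a 3×3 minimum filter) is then applied. The flat 3×3 structuring element is an assumption of this sketch; in the patent the weights inside the structural element are derived per target pixel.

```python
import numpy as np

def reconstruct_and_erode(gray: np.ndarray, target_mask: np.ndarray,
                          weights: np.ndarray) -> np.ndarray:
    """Sketch: adjusted gray = gray * weight at target pixels, unchanged
    elsewhere, followed by a 3x3 grayscale erosion (minimum filter).
    The flat 3x3 structuring element and the edge padding are assumptions."""
    recon = gray.astype(float).copy()
    recon[target_mask] = gray[target_mask] * weights[target_mask]
    # grayscale erosion: each output pixel = min of its 3x3 neighborhood
    padded = np.pad(recon, 1, mode="edge")
    h, w = recon.shape
    shifts = [padded[dy:dy + h, dx:dx + w]
              for dy in range(3) for dx in range(3)]
    eroded = np.min(np.stack(shifts), axis=0)
    return np.clip(np.rint(eroded), 0, 255).astype(np.uint8)
```

Down-weighting a blurred target pixel lowers its gray value, and the subsequent erosion propagates that darker value through its 3×3 neighborhood, suppressing the bright haze left by motion blur.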
Preferably, the method for obtaining the defect area according to the gray level image of the target flame-retardant glass fiber mesh cloth comprises the following steps:
and carrying out edge detection on the gray level image of the target flame-retardant glass fiber mesh cloth to obtain a defect area.
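The final step can be sketched with a Sobel gradient-magnitude detector followed by thresholding. The patent only says "edge detection"; the Sobel operator and the threshold value are assumptions of this sketch.

```python
import numpy as np

def detect_edges(gray: np.ndarray, thresh: float = 100.0) -> np.ndarray:
    """Sketch of the final step: Sobel gradient magnitude thresholded to a
    boolean edge map.  Operator choice and threshold are assumptions; the
    patent does not name a specific edge detector."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    img = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            patch = img[dy:dy + h, dx:dx + w]  # shifted neighborhood view
            gx += kx[dy, dx] * patch
            gy += ky[dy, dx] * patch
    return np.hypot(gx, gy) > thresh  # True where the gradient is strong
```

On the deblurred target gray image, the surviving strong gradients trace the boundaries of the defect region, e.g. the gap left by a broken weft.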
The beneficial effects are that: the method analyses the optical characteristics of the flame-retardant glass fiber mesh cloth under a visible light source. According to the peaks on the gray histogram corresponding to the gray image of the mesh cloth, the gray values on the histogram are divided to obtain a target gray range. Windows corresponding to the pixel points in the target gray range are obtained with a preset sliding window, and the pixel points in the target gray range are screened according to the gray values of the pixel points in their windows to obtain the target pixel points. A first weight value is obtained for each target pixel point, and a second weight value is obtained from its gradient direction. From the first and second weight values, the weight of each target pixel point in the structural element is obtained; from these weights, a target gray image of the flame-retardant glass fiber mesh cloth with distinct features is produced, and the defect region is then obtained from that image. The invention can accurately locate defect regions on flame-retardant glass fiber mesh cloth.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of the defect detection method for flame-retardant glass fiber mesh cloth according to the present invention;
FIG. 2 is a gray value division diagram of a gray histogram according to the present invention;
fig. 3 is a schematic diagram of pixel location distribution according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by those skilled in the art based on the embodiments of the present invention are within the scope of protection of the embodiments of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The embodiment provides a defect detection method of flame-retardant glass fiber mesh fabric, which is described in detail as follows:
as shown in FIG. 1, the defect detection method of the flame-retardant glass fiber mesh fabric comprises the following steps:
and S001, acquiring a gray level image of the flame-retardant glass fiber mesh cloth on the production line of the flame-retardant glass fiber mesh cloth.
During production of the flame-retardant glass fiber mesh cloth, the conveyor belt is normally set to a high transport speed for the sake of in-line production efficiency, and the mesh cloth is then in high-speed motion. The acquired image is therefore blurred, the linear features of the warps and wefts in the original image are lost, broken-weft defects in the mesh cloth become inconspicuous because they are covered by blurred regions, and edge detection and image segmentation cannot accurately identify the broken-weft defect region. This embodiment therefore combines the gray distribution information of the acquired blurred mesh cloth image with the interlaced weaving characteristics of the wefts, adaptively adjusts the internal weights of the structural element within each region, and performs grayscale morphological operations to remove the blur caused by motion, thereby improving the accuracy of detecting and identifying defect regions; in other words, flame-retardant glass fiber mesh cloth on a high-speed production line can be detected accurately.
First, this embodiment uses an industrial camera fitted with a visible light source to collect images of the flame-retardant glass fiber mesh cloth on the production line. The viewing direction of the camera is perpendicular to the surface of the mesh cloth on the conveyor belt, each collected frame shows the mutually perpendicular warp-weft structure, and the line direction of the warps is the same as the transport direction of the belt. The collected images are then converted to grayscale to obtain the gray image of the flame-retardant glass fiber mesh cloth.
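As a concrete illustration of the graying step, the sketch below converts a collected color frame to an 8-bit gray image. The BT.601 luminance weights are an assumption; the patent does not specify a particular graying formula.

```python
import numpy as np

def to_gray(bgr: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 color image (BGR channel order assumed) to an
    8-bit gray image using the common BT.601 luminance weights.  The
    weighting is an assumption; the source names no specific formula."""
    b, g, r = bgr[..., 0], bgr[..., 1], bgr[..., 2]
    gray = 0.114 * b + 0.587 * g + 0.299 * r
    return np.clip(np.rint(gray), 0, 255).astype(np.uint8)
```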
And step S002, dividing the gray value on the gray histogram according to the peak on the gray histogram corresponding to the gray image of the flame-retardant glass fiber mesh cloth to obtain a target gray value range.
During transport, the acquired image is blurred by the high-speed motion of the flame-retardant glass fiber mesh cloth. The warp-weft arrangement of the cloth is regular, and the warps, i.e. the yarns in the vertical direction, run in the same direction as the conveyor belt. The blur produced by the vertical motion therefore disturbs the warps only slightly, while the wefts are strongly affected and their lines turn into blurred regions. The blur has an extremely large influence on detecting broken-weft defects, which occur in the transverse direction, whereas its influence on broken warps is very small and may even make them more obvious; this scenario therefore mainly targets broken wefts. On the gray image of the mesh cloth, the vertical lines are warps and the horizontal lines are wefts.
In the blurred image of the mesh cloth, the motion in the vertical direction affects the wefts far more than the warps, and the extent of each blurred region depends on the orientation of the warps and wefts relative to the transport direction of the belt. Visually, while the square structure formed by the interlaced warps and wefts is retained, the edges formed by the wefts become wide blurred regions; the warps are less affected, keeping their brighter central region with only weak blurring at the edges. In the acquired gray image of the mesh cloth, the warps and wefts have high gray values, and the blurred regions they produce do not have particularly low gray values either. Since the warps are not strongly affected, the high-gray part of the warp body is preserved and only slight blur appears at its edges; the wefts, by contrast, lose the high-gray part of their body and turn entirely into blurred parts, and the low-gray mesh openings formed between warps and wefts are partially covered by these blurred parts.
Therefore, this embodiment traverses the image with a preset sliding window to determine the gray distribution information of the pixel points within the window, which reflects the position of the window in the weave. In a specific application, the window size can be set according to parameters such as the resolution of the imaging device and the aperture size of the mesh; here the sliding window size is set to 3×3. This embodiment first acquires the gray histogram corresponding to the gray image of the flame-retardant glass fiber mesh cloth and obtains, from the histogram, the gray values at the peak positions and the trough position. There are two peaks and one trough; the gray values of the two peaks are denoted a1 and a3 respectively, the trough lies between the two peaks and its gray value is denoted a2, with a1 > a2 > a3. The gray values on the histogram are then divided based on the peaks and the trough.
Next, curve fitting is performed on the gray histogram to obtain a fitting curve whose abscissa is the gray value and whose ordinate is the frequency with which that gray value appears in the image, and the slope value at each position of the fitting curve, i.e. the slope corresponding to each gray value, is calculated. Considering the cause of the blur effect, blurring does not produce a gradual edge transition but only a region whose gray level is close to that of the blur, so the corresponding regions of the image divide roughly into three parts: pixel points of the mesh openings of the flame-retardant glass fiber mesh cloth, pixel points of the blurred regions, and pixel points of the warp body regions. The gray values of these three parts differ greatly, so in the range to the left of the peak a3 the gray value with the largest slope is selected and denoted a4, and between the peak a1 and the trough a2 the gray value with the largest slope is selected and denoted a5. The gray values on the gray histogram are then divided according to a1, a2, a3, a4, a5.
This embodiment divides the gray values on the gray histogram into three gray value ranges: the first gray value range b1 = (0, a4], the second gray value range b2 = (a4, a5], and the third gray value range b3 = (a5, 255], where (0, a4] denotes the gray range formed by all gray levels between gray level 0 and gray level a4, and likewise for the other two ranges. The division of the gray values on the histogram in this embodiment is shown in FIG. 2. Range b1 corresponds to the pixel points of the mesh openings of the flame-retardant glass fiber mesh cloth, range b2 corresponds to the pixel points of the blurred regions (comprising the edge-blur regions produced by the warps and the wholly blurred weft regions), and range b3 corresponds to the pixel points of the warp body regions.
Each gray value range has thus been obtained. Since range b2 = (a4, a5] corresponds to the pixel points of the blurred regions, and only this range is analysed in what follows, the second gray value range b2 is denoted the target gray range.
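The histogram division described in this step can be sketched as follows. The moving-average smoothing (standing in for the unspecified curve fitting), the minimum peak-separation threshold, and the synthetic test data are all assumptions of this sketch, not details stated by the patent.

```python
import numpy as np

def divide_gray_ranges(gray: np.ndarray):
    """Sketch of step S002: locate the two histogram peaks a1 > a3, the
    trough a2 between them, and the maximum-slope gray levels a4 (left of
    a3) and a5 (between a2 and a1), giving the ranges b1 = (0, a4],
    b2 = (a4, a5], b3 = (a5, 255].  Returns (a1, a2, a3, a4, a5)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    # A moving average stands in for the unspecified curve fitting.
    fit = np.convolve(hist, np.ones(9) / 9.0, mode="same")
    slope = np.gradient(fit)
    # Local maxima, taken in order of height; assume the two real peaks
    # are at least 30 gray levels apart (an assumption of this sketch).
    maxima = [i for i in range(1, 255)
              if fit[i] >= fit[i - 1] and fit[i] >= fit[i + 1]]
    maxima.sort(key=lambda i: fit[i], reverse=True)
    first = maxima[0]
    second = next(i for i in maxima[1:] if abs(i - first) > 30)
    a3, a1 = sorted((first, second))            # a1 > a3 by gray value
    a2 = a3 + int(np.argmin(fit[a3:a1 + 1]))    # trough between the peaks
    a4 = 1 + int(np.argmax(slope[1:a3]))        # steepest point left of a3
    a5 = a2 + int(np.argmax(slope[a2:a1 + 1]))  # steepest point in (a2, a1)
    return a1, a2, a3, a4, a5
```

On a bimodal image (dark mesh openings plus bright warp bodies, with a smaller blurred population in between), the returned values satisfy 0 < a4 < a3 < a2 < a5 < a1, matching the ordering used in the text.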
Step S003, obtaining windows corresponding to all pixel points in the target gray scale range according to a preset sliding window; and screening each pixel point in the target gray scale range according to the gray value of each pixel point in the window corresponding to each pixel point in the target gray scale range to obtain each target pixel point.
For the gray scale range of the divided region, b is obvious 1 And b 3 The corresponding pixel points are expressed as areas with higher gray level and lower gray level, and the pixel points in the two gray level ranges can be selected to belong to the gray level range b only by judging whether the pixel points in the two gray level ranges appear in the structural element 1 And b 3 The converted gray value is just; but the blurred region includes warpIf the part of the area is deblurred directly, the edge blurring of the warp is removed, and meanwhile, the blurring part formed by the weft is removed completely, so that the blurring area is required to be divided separately; the method comprises the following steps:
firstly traversing a gray level image of the flame-retardant glass fiber mesh cloth by using a preset sliding window size to obtain windows corresponding to all pixel points on the gray level image of the flame-retardant glass fiber mesh cloth; and this embodiment mainly deals with the following three case windows: the first is that the gray value of each pixel point in the window belongs to b 1 Of also b 2 The positions of the pixel points in the window conforming to the phenomenon are fuzzy areas generated by the rest mesh areas and warps and wefts in the meshes of the flame-retardant glass fiber mesh cloth; the second is that the gray value of each pixel point in the window belongs to b 2 Of also b 3 The positions of the pixel points in the window conforming to the phenomenon are the junction areas of the warp main body part and the warp edge blurring; third, the gray values of all pixel points in the window are b 2 The pixel points in the window conforming to the phenomenon are all fuzzy area pixel points; specifically, as shown in FIG. 3, the direction area inside the dotted arc in FIG. 3 is b 3 Representing warp main body area and high gray scale range, the remainder being b 2 I.e. blurred regions, wz 1 In the first case, wz 2 In the second case, wz 3 Corresponding to the third case.
Besides these three cases there are two others, in which the pixel points in the sliding window all belong to b1 or all belong to b3. Because their gray ranges are so distinctive, both cases can be determined in the image as main parts of the figure, i.e. the warp main body or the black area inside a mesh, and therefore no adjustment of the center pixel is required for either case.
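The window classification above can be sketched in a few lines. The following Python fragment is an illustrative aid, not part of the patented method, and the thresholds A4 and A5 standing in for the dividing gray values a4 and a5 are hypothetical values:

```python
import numpy as np

# Hypothetical thresholds dividing the gray axis into
# b1 = (0, A4], b2 = (A4, A5], b3 = (A5, 255] (values chosen for illustration).
A4, A5 = 80, 170

def gray_range(v):
    """Return 1, 2 or 3 for the range b1/b2/b3 a gray value falls in."""
    if v <= A4:
        return 1
    return 2 if v <= A5 else 3

def classify_window(window):
    """Classify a sliding window into the three handled cases:
    1 -> mixture of b1 and b2 (mesh interior bordering blur),
    2 -> mixture of b2 and b3 (warp body bordering its edge blur),
    3 -> pure b2 (fully blurred region),
    None -> any other composition (e.g. pure b1 or pure b3)."""
    ranges = {gray_range(int(v)) for v in np.asarray(window).ravel()}
    if ranges == {1, 2}:
        return 1
    if ranges == {2, 3}:
        return 2
    if ranges == {2}:
        return 3
    return None
```

A window that is purely b1 or purely b3 returns None here, matching the text's statement that those windows need no adjustment of the center pixel.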
Step S004, obtaining a first weight value corresponding to each target pixel point according to the gray value corresponding to each target pixel point; and obtaining a second weight value corresponding to each target pixel point according to the gradient direction of each target pixel point.
When the gray values of all the pixel points in a window belong to b2, the window lies in a fully blurred area. The composition of the blurred area falls into two types: a small amount of blur caused by the warp edges, and a larger-range blur formed after the wefts lose their main structural characteristics. Therefore, if the pixel points in the gray range of the blurred area are directly transformed or eliminated, the weft areas are lost completely. Meanwhile, the warps and the wefts contribute differently to the blur: the transport direction of the conveyor belt is the same as the warp direction, so the blur generated by the warps is produced by their left-and-right shaking and its direction is transverse, whereas the linear direction of the wefts is perpendicular to the transport direction of the conveyor belt, so the blur generated by the wefts has an obvious vertical direction (the blur direction here denotes the gray gradient direction produced by the blurred area).
For any target pixel point:
If the target pixel point is vertically adjacent to another target pixel point, the two are judged to be in the same blur range. A target pixel point in such a blur range is marked as U_e. The target pixel point with the largest ordinate value and the target pixel point with the smallest ordinate value on the column of U_e are acquired and marked as U_eu and U_ed respectively; that is, U_ed is the lower-edge pixel point and U_eu is the upper-edge pixel point. On this column, the length of the blur range containing U_e is Y_eu - Y_ed, where Y_eu and Y_ed are the ordinates of the target pixel points U_eu and U_ed respectively.
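Finding U_ed and U_eu on a column reduces to scanning the vertically connected run of target pixels around the given pixel. A minimal Python sketch, assuming the column is supplied as a boolean mask of target pixels (that mask representation is an assumption of this sketch, not stated in the patent):

```python
def blur_run_extent(is_target_col, y):
    """Given a boolean column mask marking target (blurred) pixels and the
    row index y of one target pixel, return (y_ed, y_eu): the smallest and
    largest row index of the vertically connected blur run containing y."""
    assert is_target_col[y], "y must index a target pixel"
    y_ed = y
    while y_ed > 0 and is_target_col[y_ed - 1]:
        y_ed -= 1          # walk down to the lower edge U_ed
    y_eu = y
    while y_eu < len(is_target_col) - 1 and is_target_col[y_eu + 1]:
        y_eu += 1          # walk up to the upper edge U_eu
    return y_ed, y_eu
```

The length of the blur range on that column is then simply y_eu - y_ed, as in the text.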
Then, obtaining a target pixel point set on the row of the target pixel point according to all the target pixel points on the row of the target pixel point; then calculating the gray average value of eight neighborhood pixel points corresponding to each target pixel point in the target pixel point set on the row where the target pixel point is located; selecting the maximum gray average value and the minimum gray average value in the gray average values of eight adjacent pixel points corresponding to each target pixel point in the target pixel point set on the row of the target pixel point; calculating a first weight value corresponding to the target pixel point according to the following formula:
wherein w1 is the first weight value corresponding to the target pixel point, Y_eu is the ordinate value corresponding to the target pixel point with the largest ordinate value on the column of the target pixel point, Y_ed is the ordinate value corresponding to the target pixel point with the smallest ordinate value on that column, and Y_e is the ordinate value corresponding to the target pixel point; the remaining terms are the maximum gray average value and the minimum gray average value among the gray average values of the eight-neighborhood pixel points corresponding to each target pixel point in the target pixel point set on the row of the target pixel point, and the gray average value of the eight-neighborhood pixel points corresponding to the target pixel point itself. The higher the position of this point above the lower edge, the closer it is to the main body of the weft blur area; the larger its eight-neighborhood gray average, the greater the overlapping degree of the warp and weft blurred regions at this pixel point.
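The formula for w1 is reproduced only as an image in the source, so the following Python sketch is a plausible reconstruction from the variable descriptions above: the normalised position of the pixel above the lower edge of its blur run, multiplied by its normalised eight-neighborhood gray mean over the row. The exact patented expression may differ.

```python
def first_weight(y_e, y_ed, y_eu, g_mean, g_min, g_max):
    """Plausible reconstruction of the first weight w1 (an assumption, since
    the patent shows the formula only as an image): the normalised position
    inside the vertical blur run, times the normalised eight-neighborhood
    gray mean.  Both factors, and hence w1, lie in [0, 1]."""
    pos = (y_e - y_ed) / (y_eu - y_ed) if y_eu != y_ed else 1.0
    tone = (g_mean - g_min) / (g_max - g_min) if g_max != g_min else 1.0
    return pos * tone
```

Both factors increase monotonically with the quantities the text says should enlarge w1, which is the property any faithful form of the formula must share.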
The blur produced by the weft is caused by unidirectional movement, so along the movement direction the gray level changes gradually from the main body area to the blurred area; the position inside the blur range, i.e. the distance from the lower edge, therefore reflects the probability that a pixel is close to the weft main body area. However, blurred areas are caused by warps as well as wefts, and where the two overlap the degree of blur is higher than for the weft alone, which raises the overall gray value of the blurred area. The pixel point therefore also needs to be adjusted, within its row, by the position information reflected in the gray information. In addition, considering that the extra warp and weft blur caused by different degrees of vibration exerts a certain influence, an adjustment according to the actual gradient direction is also needed. The method is as follows:
for any target pixel point:
obtaining the gradient direction θ of the target pixel point according to its eight-neighborhood pixel points (the calculation of the gradient direction is well known and is not described in detail here); θ denotes the acute angle between the gradient direction and the horizontal line, θ ∈ (0°, 90°). Then a second weight value corresponding to the target pixel point is obtained according to its gradient direction, calculated according to the following formula:
wherein w2 is the second weight value corresponding to the target pixel point, θ is the gradient direction of the target pixel point, the reference direction is the opposite of the transport direction of the conveyor belt, and cos is the cosine similarity calculation function. That is, the larger the blur direction angle produced by the motion (the direction angle being the included angle with the x-axis of the image coordinate system), the larger w2; w2 can also be regarded as the weight value corresponding to the gradient direction between the pixel point and its neighborhood. The larger w2 is, the closer the blurred area containing the pixel point is to a weft blur area; the range of the second weight value is 0 to 1.
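Since the formula for w2 likewise appears only as an image, the sketch below is a hedged reconstruction: it takes the absolute cosine similarity between the local gradient and the conveyor transport axis, assumed vertical in image coordinates as the text implies, so that a vertical gradient (weft blur) yields w2 = 1 and a horizontal one (warp blur) yields w2 = 0. The actual patented expression may differ in form.

```python
import math

def second_weight(gx, gy):
    """Plausible reconstruction of the second weight w2 (an assumption):
    absolute cosine similarity between the local gradient (gx, gy) and the
    conveyor transport axis, taken here as the unit vector (0, 1)."""
    norm = math.hypot(gx, gy)
    if norm == 0.0:
        return 0.0  # flat neighborhood: no blur direction to measure
    return abs(gy) / norm
```

With θ measured from the horizontal as in the text, this is equivalent to sin(θ), which is monotonically increasing on (0°, 90°) and bounded in [0, 1] as required.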
Step S005, obtaining the weight of each target pixel point in the structural element according to the first weight value and the second weight value corresponding to each target pixel point; according to the weight of the target pixel point in the structural element, obtaining a gray image of the target flame-retardant glass fiber mesh cloth; and obtaining a defect area according to the gray level image of the target flame-retardant glass fiber mesh cloth.
The degree of gray-level change in the blurred region produced by the warp edges is higher than in that produced by the wefts. At the same time, the blur produced by the wefts is very small in the transverse direction, its gray change appearing in the vertical direction; thus the influence of a warp blur region on the gradient direction of the central pixel point is mainly reflected transversely, and that of a weft blur region mainly vertically. For the second weight, therefore, the closer the gradient direction is to that of the weft region, the larger the probability of weft blur and the larger the corresponding weight value.
In this embodiment, the product of the first weight value and the second weight value corresponding to each target pixel point is used as the weight of that target pixel point in the structural element, i.e. the weight value in the gray-morphology structural element. In this way, each pixel point in the blurred area obtains its weight value in the structural element from its own position information and from its relation with its neighborhood pixel points.
Each target pixel point thus obtains its weight information in the structural element from the position information of the blur range where it lies and from the degree of difference of its actual blur direction. The product of the gray value of each target pixel point and its weight in the structural element is recorded as the adjustment gray value of that target pixel point. The image formed on the gray image of the flame-retardant glass fiber mesh cloth by the gray values of the pixel points other than the target pixel points, together with the adjustment gray values of the target pixel points, is recorded as the reconstructed image. A gray morphological erosion operation is then performed on the reconstructed image, and the image after the erosion operation is recorded as the target flame-retardant glass fiber mesh cloth gray image.
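The reconstruction-and-erosion step can be sketched as follows. The flat structuring element and 3x3 window are illustrative assumptions (the patent describes a weighted structural element without fixing its size), and a production system would more likely call an optimised routine such as scipy.ndimage.grey_erosion:

```python
import numpy as np

def erode_gray(img, size=3):
    """Plain gray-morphological erosion with a flat size x size structuring
    element (a minimum filter), with edge-replicated borders."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + size, j:j + size].min()
    return out

def reconstruct_and_erode(gray, weight, target_mask, size=3):
    """Sketch of step S005: scale each target (blurred) pixel by its
    structural-element weight, leave all other pixels unchanged, then
    apply gray erosion to the reconstructed image."""
    recon = gray.astype(float).copy()
    recon[target_mask] *= weight[target_mask]
    return erode_gray(recon, size)
```

Down-weighting only the pixels inside the blurred range before eroding is what keeps the weft main body from being wiped out by a uniform structural element, which is the problem the text attributes to conventional morphology.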
At this point the blurred area in the image has been removed, and the main areas of the warps and wefts have been restored to a great extent. Defect detection is then performed on the target flame-retardant glass fiber mesh cloth gray image by conventional means, using edge detection to obtain the defect region. This embodiment uses gray morphological operations to handle the visual defect of image blur introduced into the acquired image by high-speed movement. By analysing the distribution characteristics of the pixel gray information in the image and combining them with the arrangement structure of the warps and wefts of the flame-retardant glass fiber mesh cloth, the blurred area is eliminated, which solves the problem that the visual characteristics of defects such as broken wefts and skipped wefts are disturbed by blur. The internal weights of the gray-morphology structural elements are adjusted adaptively, and by dividing the area range of the blurred image the method further avoids the problem that a single structural element, as used in conventional morphology, would eliminate severely blurred weft areas. Since the adaptive adjustment of the structural-element weights required by gray morphology is performed by combining the conveyor transport direction acquired earlier with the gray information of the blurred image, the required equipment configuration is low and the amount of computation is small.
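The "conventional defect detection means" is left unspecified in the text, so the fragment below stands in with a plain Sobel gradient magnitude and a fixed threshold; the threshold value is an arbitrary illustration, not a parameter from the patent:

```python
import numpy as np

def sobel_edges(img, thresh=50.0):
    """Minimal edge detection: Sobel gradient magnitude thresholded to a
    boolean edge map, with edge-replicated borders."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy) > thresh
```

Applied to the deblurred target image, the True pixels of the returned map delimit candidate defect regions such as broken or skipped wefts.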
Firstly, acquiring gray images of flame-retardant glass fiber mesh cloth on a production line of the flame-retardant glass fiber mesh cloth; dividing gray values on a gray histogram according to peaks on the gray histogram corresponding to the flame-retardant glass fiber mesh cloth gray image to obtain a target gray value range; obtaining windows corresponding to all pixel points in the target gray scale range according to a preset sliding window; screening each pixel point in the target gray scale range according to the gray value of each pixel point in a window corresponding to each pixel point in the target gray scale range to obtain each target pixel point; according to each target pixel point, a first weight value corresponding to each target pixel point is obtained; obtaining a second weight value corresponding to each target pixel point according to the gradient direction of each target pixel point; according to the first weight value and the second weight value corresponding to each target pixel point, the weight of each target pixel point in the structural element is obtained; according to the weight of the target pixel point in the structural element, obtaining a gray image of the target flame-retardant glass fiber mesh cloth; and obtaining a defect area according to the gray level image of the target flame-retardant glass fiber mesh cloth. The method and the device can accurately obtain the defect area on the flame-retardant glass fiber mesh cloth.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the scope of the embodiments of the present application, and are intended to be included within the scope of the present application.

Claims (9)

1. The defect detection method of the flame-retardant glass fiber mesh cloth is characterized by comprising the following steps of:
acquiring a gray image of the flame-retardant glass fiber mesh cloth on a production line of the flame-retardant glass fiber mesh cloth according to a camera attached with a visible light source;
dividing gray values on a gray histogram according to peaks on the gray histogram corresponding to the flame-retardant glass fiber mesh cloth gray image to obtain a target gray value range;
obtaining windows corresponding to all pixel points in the target gray scale range according to a preset sliding window; screening each pixel point in the target gray scale range according to the gray value of each pixel point in a window corresponding to each pixel point in the target gray scale range to obtain each target pixel point;
obtaining a first weight value corresponding to each target pixel point according to the pixel value of each target pixel point; obtaining a second weight value corresponding to each target pixel point according to the gradient direction of each target pixel point;
according to the first weight value and the second weight value corresponding to each target pixel point, the weight of each target pixel point in the structural element is obtained; according to the weight of the target pixel point in the structural element, obtaining a gray image of the target flame-retardant glass fiber mesh cloth; and obtaining a defect area according to the gray level image of the target flame-retardant glass fiber mesh cloth.
2. The method for detecting defects of flame retardant glass fiber mesh cloth according to claim 1, wherein the method for dividing gray values on a gray histogram according to peaks on the gray histogram corresponding to the gray image of the flame retardant glass fiber mesh cloth to obtain a target gray value range comprises the following steps:
according to the gray level histogram, acquiring the gray values corresponding to the two wave crest positions and the gray value corresponding to the one wave trough position on the gray level histogram; the gray values of the two wave crest positions are respectively marked as a1 and a3, the gray value of the wave trough position is marked as a2, and a1 > a2 > a3;
Fitting the gray level histogram to obtain a fitting curve;
calculating a slope value corresponding to each gray value on the fitting curve;
in the range 0 to a3, selecting the gray value with the largest slope value and marking it as a4; in the range between a1 and a2, selecting the gray value with the largest slope value and marking it as a5;
according to a1, a2, a3, a4 and a5, dividing the gray values on the gray histogram to obtain a first gray value range b1, a second gray value range b2 and a third gray value range b3, where b1 = (0, a4], b2 = (a4, a5], b3 = (a5, 255];
and recording the second gray value range b2 as the target gray range.
3. The method for detecting defects of flame-retardant glass fiber mesh cloth according to claim 1, wherein the step of screening each pixel point in a target gray scale range according to the gray value of each pixel point in a window corresponding to each pixel point in the target gray scale range to obtain each target pixel point comprises the steps of:
for any pixel point in the target gray scale range:
judging whether the gray value of each pixel point in the window corresponding to the pixel point is in the target gray range, if so, marking the pixel point as a target pixel point.
4. The method for detecting defects of a fire-retardant glass fiber mesh fabric according to claim 1, wherein the method for obtaining the first weight value corresponding to each target pixel point according to each target pixel point comprises the following steps:
for any target pixel point:
acquiring a target pixel point with the largest ordinate value and a target pixel point with the smallest ordinate value on the column where the target pixel point is located;
obtaining a target pixel point set on the row of the target pixel point according to all the target pixel points on the row of the target pixel point;
calculating the gray average value of eight neighborhood pixel points corresponding to each target pixel point in the target pixel point set on the row where the target pixel point is located;
selecting the maximum gray average value and the minimum gray average value in the gray average values of eight adjacent pixel points corresponding to each target pixel point in the target pixel point set on the row of the target pixel point;
and obtaining a first weight value corresponding to the target pixel according to the target pixel with the largest ordinate value and the target pixel with the smallest ordinate value on the column of the target pixel and the maximum gray average value and the minimum gray average value in the gray average values of eight neighborhood pixel corresponding to each target pixel in the target pixel set on the row of the target pixel.
5. The method for detecting defects of a fire-retardant glass fiber mesh fabric according to claim 4, wherein the first weight value corresponding to the target pixel point is calculated according to the following formula:
wherein w1 is the first weight value corresponding to the target pixel point, Y_eu is the ordinate value corresponding to the target pixel point with the largest ordinate value on the column of the target pixel point, Y_ed is the ordinate value corresponding to the target pixel point with the smallest ordinate value on the column of the target pixel point, and Y_e is the ordinate value corresponding to the target pixel point; the remaining terms are the maximum gray average value and the minimum gray average value among the gray average values of the eight-neighborhood pixel points corresponding to each target pixel point in the target pixel point set on the row of the target pixel point, and the gray average value of the eight-neighborhood pixel points corresponding to the target pixel point.
6. The method for detecting defects of a fire-retardant glass fiber mesh fabric according to claim 1, wherein for any target pixel, the second weight value corresponding to the target pixel is calculated according to the following formula:
wherein w2 is the second weight value corresponding to the target pixel point, θ is the gradient direction of the target pixel point, the reference direction is the opposite direction of the transport direction of the conveyor belt, and cos is the cosine similarity calculation function.
7. The method for detecting defects of flame-retardant glass fiber mesh cloth according to claim 1, wherein the method for obtaining the weight of each target pixel point in the structural element according to the first weight value and the second weight value corresponding to each target pixel point comprises the following steps:
taking the product of the first weight value corresponding to each target pixel point and the second weight value corresponding to each target pixel point as the weight of each target pixel point in the structural element.
8. The method for detecting defects of flame-retardant glass fiber mesh cloth according to claim 1, wherein the method for obtaining the gray level image of the target flame-retardant glass fiber mesh cloth according to the weight of the target pixel point in the structural element comprises the following steps:
marking the product of the gray value of each target pixel point and the weight of the corresponding target pixel point in the structural element as the adjustment gray value of each target pixel point;
the gray value of each pixel point except the target pixel point on the gray image of the flame-retardant glass fiber mesh cloth and the image formed by the adjustment gray value of each target pixel point on the gray image of the flame-retardant glass fiber mesh cloth are recorded as a reconstruction image;
and carrying out gray morphological corrosion operation on the reconstructed image, and marking the image after the corrosion operation as a gray image of the target flame-retardant glass fiber mesh cloth.
9. The method for detecting defects of a flame retardant glass fiber mesh fabric according to claim 1, wherein the method for obtaining the defective area according to the gray level image of the target flame retardant glass fiber mesh fabric comprises the steps of:
and carrying out edge detection on the gray level image of the target flame-retardant glass fiber mesh cloth to obtain a defect area.
CN202311444552.7A 2023-11-02 2023-11-02 Defect detection method for flame-retardant glass fiber mesh cloth Active CN117495798B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311444552.7A CN117495798B (en) 2023-11-02 2023-11-02 Defect detection method for flame-retardant glass fiber mesh cloth

Publications (2)

Publication Number Publication Date
CN117495798A true CN117495798A (en) 2024-02-02
CN117495798B CN117495798B (en) 2024-05-03

Family

ID=89673836

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114445387A (en) * 2022-01-29 2022-05-06 泗阳富艺木业股份有限公司 Fiberboard quality classification method based on machine vision
US20220198634A1 (en) * 2020-12-22 2022-06-23 Hon Hai Precision Industry Co., Ltd. Method for selecting a light source for illuminating defects, electronic device, and non-transitory storage medium
CN115100201A (en) * 2022-08-25 2022-09-23 淄博齐华制衣有限公司 Blending defect detection method of flame-retardant fiber material
CN115222741A (en) * 2022-09-20 2022-10-21 江苏昱恒电气有限公司 Cable surface defect detection method
CN115409833A (en) * 2022-10-28 2022-11-29 一道新能源科技(衢州)有限公司 Hot spot defect detection method of photovoltaic panel based on unsharp mask algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant