CN113516628A - Sizing yarn reed collision detection method based on machine vision - Google Patents


Info

Publication number
CN113516628A
CN113516628A (application CN202110523477.8A)
Authority
CN
China
Prior art keywords
image
reed
yarn
collision
sizing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110523477.8A
Other languages
Chinese (zh)
Inventor
潘如如
夏旭文
周建
高卫东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangnan University
Original Assignee
Jiangnan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangnan University filed Critical Jiangnan University
Priority to CN202110523477.8A priority Critical patent/CN113516628A/en
Publication of CN113516628A publication Critical patent/CN113516628A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/0002 — Inspection of images, e.g. flaw detection
    • G06T 7/0004 — Industrial image inspection
    • G06T 7/10 — Segmentation; Edge detection
    • G06T 7/12 — Edge-based segmentation
    • G06T 7/136 — Segmentation; Edge detection involving thresholding
    • G06T 7/60 — Analysis of geometric attributes
    • G06T 7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10004 — Still image; Photographic image
    • G06T 2207/30108 — Industrial image inspection
    • G06T 2207/30124 — Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Treatment Of Fiber Materials (AREA)

Abstract

The invention belongs to the technical field of machine vision detection and relates to a machine-vision-based method for detecting reed collision of sized yarns. The method detects and tracks warp-yarn reed collision during the sizing process with high accuracy, strong robustness and wide adaptability across yarn varieties. A self-built image acquisition system, adaptable to slashers of different types, captures complete images of the warp yarns passing through the reed dents. The initial image is first cropped to the detection area and then smoothed by Gaussian filtering; an improved inter-frame difference method, extended with an enlarged frame interval, a Canny edge algorithm and mathematical morphology operations, then detects and tracks warp-yarn reed collision and yarn jamming. The proposed algorithm can also be applied to detection and recognition problems in other fields.

Description

Sizing yarn reed collision detection method based on machine vision
Technical Field
The invention belongs to the technical field of machine vision detection, and relates to a sizing yarn reed collision detection method based on machine vision.
Background
Slashing is an important preparatory process before weaving in the textile industry: it gives warp yarns the ability to withstand complex external mechanical forces, improves their weavability and keeps the weaving process running smoothly. Warp yarns colliding with the expansion reed and jamming into yarn clusters is a common problem during sizing. If reed collision and yarn jamming are not found in time, the stoppage rate of the sizing machine rises, warp-yarn losses increase, and cloth-surface defects appear in subsequent weaving, reducing quality. At present, enterprises generally deal with these problems by manual patrol, but manual inspection increases the workers' workload, lowers production efficiency and hurts profitability. Textile enterprises therefore urgently need an accurate and efficient method for detecting reed collision of sized yarns.
China is accelerating the integration of new-generation information technology with the textile and garment industry, pushing the industry toward green, low-carbon, intelligent, digital and flexible development. Digital image technology, one of the advanced intelligent technologies, has grown out of the multidisciplinary fusion of information technology and computer technology over the last two decades. With the development of computer, communication and internet technologies, information can now be acquired through many different means, and data volumes keep growing. With advances in image acquisition and pattern recognition, quality-control inspection based on machine vision has become a trend for enterprises pursuing timely response strategies, large-scale production and quality control. Machine vision inspection is non-contact, fast and highly resistant to on-site interference, providing a new method and means for automatic detection. Introducing machine vision into slashing reed-collision detection can well satisfy the sensitivity and reliability requirements of this task.
Disclosure of Invention
To overcome the problems in the prior art, the invention aims to provide a machine-vision-based slashing reed-collision detection method that detects warp-yarn reed collision in real time with high accuracy, strong robustness and wide adaptability across yarn varieties. The proposed algorithm can identify, track and locate the reed-collision position, and can also be applied to detection and recognition problems in other fields, such as moving-target recognition and vehicle detection.
The technical scheme of the invention is as follows:
a slashing reed collision detection method based on machine vision comprises the following steps:
step 1, collecting images at the reed-dent position of the full-width sized warp yarns, the collected images including images of the yarns passing through the dents normally and images in which reed collision occurs;
step 2, cutting the collected image to obtain an image of slashing passing through dents;
step 3, carrying out image smoothing treatment on the image cut in the step 2, and reducing the detail level of the image;
step 4, carrying out difference processing on the image in the step 3 by using an improved frame difference method to obtain a difference image;
step 5, performing binary processing on the difference image obtained in the step 4, and performing Canny edge algorithm and expanded mathematical morphology operation processing on the binary image;
step 6, setting an area threshold, taking the area threshold as a sizing yarn reed collision judgment condition, judging that the sizing yarn reed collision occurs if the contour area is larger than the set area threshold in the step 5, and otherwise, normally passing the sizing yarn through the reed dent without generating the reed collision phenomenon;
and 7, performing square frame calibration on the part where sizing yarn reed collision occurs, and realizing identification, tracking and detection of sizing yarn reed collision.
The specific steps of collecting the slashing image in the step 1 are as follows:
the method is characterized in that the image types of slashes passing through reed dents are collected as widely as possible under different slasher speeds, a common digital or fixed-focus camera is used for image collection, and the width as large as possible can be detected while the image quality is ensured during collection. The camera is kept at a fixed distance from the slashing while the spatial resolution of the acquisition is recorded in order to deduce the actual distance characteristics.
The image cutting in the step 2 comprises the following specific steps:
Because reed collision and yarn jamming occur only at the reed dents of the sizing machine, the collected image is cropped to keep only the dent area, eliminating interference from irrelevant regions and reducing program run time. The cropped image removes interference caused by yarn shaking and gives a complete view of the sized yarns passing through the reed dents.
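As a minimal illustration of this cropping step, assuming the dent region occupies a fixed band of the frame (the row/column bounds below are hypothetical placeholders, fixed once per camera position), the crop is a simple array slice:

```python
import numpy as np

def crop_dent_region(frame, top, bottom, left, right):
    """Keep only the reed-dent band of a captured frame.

    The bounds are illustrative; in practice they would be calibrated
    once for a given camera position and slasher model.
    """
    return frame[top:bottom, left:right]

# A dummy 1080x1920 grayscale frame, cropped to an assumed dent band.
frame = np.zeros((1080, 1920), dtype=np.uint8)
roi = crop_dent_region(frame, 400, 700, 0, 1920)
```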
The image smoothing in the step 3 uses a Gaussian blur filter, which is specifically characterized in that:
the convolution matrix formed by pixels with non-zero distribution is transformed with the original image, and the value of each pixel is the weighted average of the values of the surrounding adjacent pixels. The original pixel has the largest gaussian distribution value and therefore has the largest weight, and the adjacent pixels have smaller and smaller weights as they are farther from the original pixel. This blurring preserves the edge effect better than other equalized blurring filters. The concrete formula is as follows:
f(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))   (1)
wherein: f (x, y) is the Gaussian filter template coefficient, (x, y) is the point coordinate, and sigma is the standard deviation of the normal distribution.
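The Gaussian template of formula (1) can be sketched as follows; `gaussian_kernel` is an illustrative helper, not part of the patent (in practice a library routine such as OpenCV's `GaussianBlur` would typically do the smoothing):

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Build the Gaussian template of formula (1) and normalize it."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # f(x, y) = 1/(2*pi*sigma^2) * exp(-(x^2 + y^2) / (2*sigma^2))
    k = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    # Normalize so that smoothing preserves overall image brightness.
    return k / k.sum()

k = gaussian_kernel(5, 1.0)
```

The centre coefficient is the largest, and the weights decay with distance from the centre, as described above.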
The specific characteristics of the improved frame difference method in the steps 4 and 5 are as follows:
the algorithm can extract a slashing reed collision target, and detect and track the slashing reed collision in real time; the interframe difference method is to perform difference operation on two continuous frames of images in time, subtract pixels corresponding to different frames, judge the absolute value of gray difference and obtain a difference image. The concrete formula is as follows:
D_n(x, y) = |f_n(x, y) − f_{n−1}(x, y)|   (2)
In formula (2), D_n is the difference image; f_n(x, y) and f_{n−1}(x, y) are the n-th and (n−1)-th frame images, respectively.
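A minimal sketch of the plain inter-frame difference of formula (2), assuming 8-bit grayscale frames (the cast to a signed type avoids unsigned wrap-around):

```python
import numpy as np

def frame_difference(f_curr, f_prev):
    """D_n(x, y) = |f_n(x, y) - f_{n-1}(x, y)| for 8-bit frames.

    Frames are widened to int16 first so the subtraction cannot
    wrap around in unsigned arithmetic.
    """
    diff = np.abs(f_curr.astype(np.int16) - f_prev.astype(np.int16))
    return diff.astype(np.uint8)
```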
Steps 4 and 5 improve on the traditional inter-frame difference method in the following ways: 1) for slowly changing targets, which the traditional method easily misses, the frame interval of the difference is enlarged; 2) for discontinuous target edges in target detection, a Canny edge algorithm is added so that the yarn-cluster edge is detected clearly and completely, and to strengthen the adaptability of the algorithm, the maximum between-class variance method is used when selecting the double thresholds of the edge operator; 3) warp yarns that collide and bunch together overlap and create holes, which are filled by the morphological dilation operation. The concrete formula is as follows:
D′_n(x, y) = |f_n(x, y) − f_{n−t}(x, y)| − (λ / N_A) · Σ_{(x, y)∈A} |f_n(x, y) − f_{n−t}(x, y)|   (3)
In formula (3), t is the enlarged frame interval, f_{n−t}(x, y) is the (n−t)-th frame image, D′_n is the difference image after the enlarged frame difference, N_A is the total number of pixels in the region to be detected, λ is the illumination suppression coefficient, and A is the entire frame image to be detected.
The term added in step 5, (λ / N_A) · Σ_{(x, y)∈A} |f_n(x, y) − f_{n−t}(x, y)|, represents the variation of the overall illumination in the image. When the illumination change in the image is large, the value of this term grows and offsets, to a certain extent, the sudden change in the image difference caused by the lighting environment, effectively suppressing the influence of abrupt illumination changes on the detection result.
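A hedged sketch of the enlarged frame difference with the illumination-suppression term of formula (3) as reconstructed here; the function name and the use of the mean absolute difference as the subtracted term are this sketch's assumptions:

```python
import numpy as np

def expanded_frame_difference(f_n, f_nt, lam=1.0):
    """Enlarged frame difference with global illumination suppression.

    lam is the illumination suppression coefficient; the mean absolute
    difference over the whole detection region A (i.e. the sum divided
    by N_A) approximates the overall illumination change and is
    subtracted out, per formula (3) as reconstructed in this sketch.
    """
    d = np.abs(f_n.astype(np.float64) - f_nt.astype(np.float64))
    illumination = lam * d.mean()  # (lambda / N_A) * sum over A
    return np.clip(d - illumination, 0.0, 255.0)
```

A uniform brightness jump across the whole frame is largely cancelled by the subtracted term, while a genuine local change survives.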
The main characteristics of sizing yarn reed collision detection and judgment in the step 6 are as follows:
when the warp yarns are knocked, the yarn agglomeration phenomenon can occur. Therefore, after traversing the contour, the contour area threshold δ is used as the determination condition for the yarn jamming caused by the warp yarn collision.
R_n(x, y) = 1, if the contour area > δ;  R_n(x, y) = 0, if the contour area ≤ δ   (4)
In formula (4), R_n is the final detected target image: when the contour area is greater than δ, warp-yarn reed collision and yarn jamming are judged to have occurred; when the contour area is less than or equal to δ, no reed collision or yarn jamming has occurred.
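The area-threshold judgment of formula (4) can be illustrated with a simple 4-connected region-area measure standing in for the contour area (a pure-NumPy stand-in for what OpenCV's `findContours`/`contourArea` would do in practice; `delta` is the threshold δ):

```python
import numpy as np
from collections import deque

def largest_region_area(mask):
    """Pixel count of the largest 4-connected foreground region."""
    mask = mask.astype(bool)
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    best = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                # Breadth-first flood fill of one region.
                area, q = 0, deque([(i, j)])
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                best = max(best, area)
    return best

def reed_collision(mask, delta):
    # Formula (4): collision iff the (largest) contour area exceeds delta.
    return largest_region_area(mask) > delta
```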
The box marking of the reed-collision position in step 7 is characterized as follows:
For the judgment result of step 6, the circumscribed rectangle of the target in the binarized foreground image is taken as the marking box of the target image, with two adjacent sides of the rectangle parallel to the upper and lower boundaries of the image respectively.
The invention has the beneficial effects that:
the invention greatly reduces the labor cost and is beneficial to long-term maintenance; the improved frame difference method used by the invention can be applied to slashing reed collision detection and tracking of each slasher; the warp loss and the cloth cover defects in the post-weaving production are reduced, and the working efficiency and the product quality of the sizing machine are improved; the improved frame difference method based on machine vision can be applied to the problems of target detection and tracking in other fields.
Drawings
FIG. 1 is a flow chart of a sizing reed collision detection method based on machine vision;
FIG. 2 is a schematic view of an image capture device;
FIG. 3 is a schematic diagram of sizing bumping;
FIG. 4 is a flow chart of an algorithm for improving the frame difference method;
FIG. 5 is a difference diagram of sizing bumping and denting;
FIG. 6 is a sizing bumping reed morphological operation diagram;
FIG. 7 is a diagram showing the effect of detecting the sizing yarn crashing.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the examples or the description in the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
The embodiment of the invention provides a sizing yarn reed collision detection method based on machine vision, wherein an identification flow chart is shown in figure 1, and in order to explain the specific implementation mode of the invention in detail, the method comprises the following steps:
step 1, collecting images of the full-width sized warps at the reed dents; the acquisition device is shown in FIG. 2, and the collected images include images of the warps passing through the dents normally and images in which reed collision occurs;
in this step, 1840 size images of the slashing from the laboratory are collected as image data, and a part of the slashing images can refer to fig. 3, and the size of the images is 1920 × 1080 pixels;
step 2, cutting the collected image;
Because reed collision and yarn jamming occur only at the reed dents of the sizing machine, the image collected in step 1 is cropped to keep only the dent area, eliminating interference from irrelevant regions and reducing program run time. The cropped image removes interference caused by yarn shaking and gives a complete view of the sized yarns passing through the reed dents.
Step 3, smoothing the image cropped in step 2 to reduce its level of detail. In this step a Gaussian blur filter is applied to the cropped picture: yarns moving during detection interfere with automatic reed-collision detection, and Gaussian smoothing of the cropped image reduces this detail. The central pixel has the largest Gaussian value and therefore the largest weight, and neighbouring pixels receive smaller and smaller weights the farther they are from the centre. This blurring preserves edges better than other uniform blur filters. The concrete formula is as follows:
f(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))   (1)
wherein f(x, y) is the Gaussian filter template coefficient, (x, y) is the point coordinate, and σ is the standard deviation of the normal distribution.
Steps 4 and 5, establishing the target detection algorithm of the improved frame difference method, whose flow chart is shown in FIG. 4. The algorithm extracts the reed-collision target and detects and tracks reed collision in real time. The inter-frame difference method performs a difference operation on two temporally consecutive frames: corresponding pixels are subtracted and the absolute value of the grey-level difference is taken, yielding a difference image. The concrete formula is as follows:
D_n(x, y) = |f_n(x, y) − f_{n−1}(x, y)|   (2)
In formula (2), D_n is the difference image; f_n(x, y) and f_{n−1}(x, y) are the n-th and (n−1)-th frame images, respectively. In this step, the following improvements are made to the traditional inter-frame difference method: 1) for slowly changing targets, which are easily missed, the frame interval of the difference is enlarged; 2) for discontinuous target edges in target detection, a Canny edge algorithm is added so that the yarn-cluster edge is detected clearly and completely, with the maximum between-class variance method used to select the double thresholds of the edge operator; 3) holes created where colliding, bunched warp yarns overlap are filled by the morphological dilation operation. The concrete formula is as follows:
D′_n(x, y) = |f_n(x, y) − f_{n−t}(x, y)| − (λ / N_A) · Σ_{(x, y)∈A} |f_n(x, y) − f_{n−t}(x, y)|   (3)
In formula (3), t is the enlarged frame interval, f_{n−t}(x, y) is the (n−t)-th frame image, D′_n is the difference image after the enlarged frame difference, N_A is the total number of pixels in the region to be detected, λ is the illumination suppression coefficient, and A is the entire frame image to be detected.
The term added in step 5, (λ / N_A) · Σ_{(x, y)∈A} |f_n(x, y) − f_{n−t}(x, y)|, represents the variation of the overall illumination in the image. When the illumination change in the image is large, the value of this term grows and offsets, to a certain extent, the sudden change in the image difference caused by the lighting environment, effectively suppressing the influence of abrupt illumination changes on the detection result.
Step 6, reed-collision judgment is carried out on the result processed in step 5.
In this step, reed collision of the sized yarns is judged; when warp yarns hit the reed, yarn clusters form.
Therefore, after traversing the contours, the contour-area threshold δ is used as the judgment condition for reed collision and yarn jamming:
R_n(x, y) = 1, if the contour area > δ;  R_n(x, y) = 0, if the contour area ≤ δ   (4)
In formula (4), R_n is the final detected target image: when the contour area is greater than δ, warp-yarn reed collision and yarn jamming are judged to have occurred; when the contour area is less than or equal to δ, no reed collision or yarn jamming has occurred.
And 7, performing square frame calibration on the part where sizing yarn reed collision occurs, and realizing identification, tracking and detection of sizing yarn reed collision.
In this step, for the judgment result of step 6, the circumscribed rectangle is taken as the marking box of the target image and the reed-collision position is marked with this box; two adjacent sides of the circumscribed rectangle are parallel to the upper and lower boundaries of the image respectively.
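Steps 4 through 7 above can be condensed into one illustrative end-to-end sketch (the thresholds are hypothetical; cropping, Gaussian smoothing, Canny edge detection and dilation are omitted for brevity, and total foreground area stands in for the contour area):

```python
import numpy as np

def detect_reed_collision(f_n, f_nt, gray_thr=25, lam=1.0, delta=50):
    """Sketch of steps 4-7: enlarged frame difference with illumination
    suppression, binarisation, area check, and box marking.

    Returns (collision_detected, bounding_box); the box is
    (top, left, bottom, right) or None.
    """
    # Step 4: enlarged frame difference with the suppression term (3).
    d = np.abs(f_n.astype(np.float64) - f_nt.astype(np.float64))
    d = np.clip(d - lam * d.mean(), 0.0, 255.0)
    # Step 5 (simplified): binarise the difference image.
    mask = d > gray_thr
    # Step 6: area-threshold judgment, formula (4).
    area = int(mask.sum())
    if area <= delta:
        return False, None
    # Step 7: circumscribed rectangle of the detected cluster.
    ys, xs = np.nonzero(mask)
    box = (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max()))
    return True, box
```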
For a detailed description of the specific effects of the present invention, reference is made to fig. 5, 6, and 7 for a representative difference image, morphological operation image, and box mark image.
Those of ordinary skill in the art will understand that the invention is not limited to the specific embodiments described above; all modifications, changes and equivalents that come within the spirit and scope of the invention are intended to be embraced by it.

Claims (3)

1. A slashing reed collision detection method based on machine vision is characterized by comprising the following steps:
step 1, collecting images of sizing warps at dents, wherein the collected images comprise images of sizing passing through the dents normally and images of occurrence of reed collision;
step 2, cutting the collected image to obtain an image of slashing passing through dents;
the image cutting in the step 2 comprises the following specific steps: cutting the collected image and only keeping a dent area; the cut image can obtain a complete image of slashing passing through the reed dent while eliminating interference caused by yarn shaking;
step 3, carrying out image smoothing treatment on the image cut in the step 2, and reducing the detail level of the image;
the image smoothing in step 3 uses a gaussian blur filter as follows:
a convolution kernel with a non-zero (Gaussian) weight distribution is convolved with the original image, and the value of each pixel is the weighted average of the values of its neighbouring pixels; the central pixel has the largest Gaussian value and therefore the largest weight, and neighbouring pixels receive smaller and smaller weights the farther they are from the centre; this blurring thus preserves edges better than other uniform blur filters; the concrete formula is as follows:
f(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))   (1)
wherein: f (x, y) is a Gaussian filter template coefficient, (x, y) is a point coordinate, and sigma is a standard deviation of normal distribution;
step 4, carrying out difference processing on the image in the step 3 by using an improved frame difference method to obtain a difference image;
step 5, performing binary processing on the difference image obtained in the step 4, and performing Canny edge algorithm and expanded mathematical morphology operation processing on the binary image;
the improved frame difference method in the steps 4 and 5 is characterized in that:
aiming at the problems of the traditional interframe difference method, the following improvements are adopted:
1) expanding frame difference;
2) a Canny edge algorithm is added for processing, so that the edge of the yarn cluster can be clearly and completely detected; in order to enhance the self-adaptability of the algorithm, a maximum inter-class variance method is adopted when double thresholds of an edge operator are selected;
3) the warp yarns are collided and yarn-gathered to be overlapped to generate a cavity, and the cavity is filled by adopting expanded mathematical morphology operation; the concrete formula is as follows:
D′_n(x, y) = |f_n(x, y) − f_{n−t}(x, y)| − (λ / N_A) · Σ_{(x, y)∈A} |f_n(x, y) − f_{n−t}(x, y)|   (3)
in formula (3), t is the enlarged frame interval, f_{n−t}(x, y) is the (n−t)-th frame image, D′_n is the difference image after the enlarged frame difference, N_A is the total number of pixels in the region to be detected, λ is the illumination suppression coefficient, and A is the entire frame image to be detected;
step 6, setting an area threshold value, and taking the area threshold value as a sizing reed collision judgment condition;
R_n(x, y) = 1, if the contour area > δ;  R_n(x, y) = 0, if the contour area ≤ δ   (4)
in formula (4), R_n is the final detected target image: when the contour area is greater than δ, warp-yarn reed collision and yarn jamming are judged to have occurred; when the contour area is less than or equal to δ, no reed collision or yarn jamming has occurred;
and 7, performing square frame calibration on the part where sizing yarn reed collision occurs, and realizing identification, tracking and detection of sizing yarn reed collision.
2. The machine-vision-based slashing reed-collision detection method as claimed in claim 1, wherein the term added in step 5 to suppress overall illumination and shadow changes is characterized in that:
the addition term
Figure FDA0003064967520000023
Showing the change of the whole illumination in the image; when the illumination change in the image is large, the term value is increased, and the sudden change of the image difference caused by the illumination environment can be counteracted to a certain extent, so that the influence of the sudden change of the illumination environment on the detection result is effectively inhibited.
3. The machine vision-based sizing yarn reed collision detection method as claimed in claim 1 or 2, wherein the specific steps of collecting the image in the step 1 are as follows: collecting different image types of slashes passing through the reed dent at different slasher speeds, and collecting images by using a common digital camera or a fixed-focus camera; the camera is kept at a fixed distance from the slashing while the spatial resolution of the acquisition is recorded in order to deduce the actual distance characteristics.
CN202110523477.8A 2021-05-13 2021-05-13 Sizing yarn reed collision detection method based on machine vision Pending CN113516628A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110523477.8A CN113516628A (en) 2021-05-13 2021-05-13 Sizing yarn reed collision detection method based on machine vision


Publications (1)

Publication Number Publication Date
CN113516628A 2021-10-19

Family

ID=78064504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110523477.8A Pending CN113516628A (en) 2021-05-13 2021-05-13 Sizing yarn reed collision detection method based on machine vision

Country Status (1)

Country Link
CN (1) CN113516628A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114092510A (en) * 2021-12-01 2022-02-25 常州市宏发纵横新材料科技股份有限公司 Normal distribution based segmentation method, computer equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination