CN115049669A - Metal defect identification method - Google Patents

Metal defect identification method

Info

Publication number
CN115049669A
Authority
CN
China
Prior art keywords
edge
target
obtaining
point
defect
Prior art date
Legal status
Pending
Application number
CN202210978629.8A
Other languages
Chinese (zh)
Inventor
汤琴
Current Assignee
Rugao Fumeilong Metal Products Co ltd
Original Assignee
Rugao Fumeilong Metal Products Co ltd
Priority date
Filing date
Publication date
Application filed by Rugao Fumeilong Metal Products Co ltd filed Critical Rugao Fumeilong Metal Products Co ltd
Priority to CN202210978629.8A
Publication of CN115049669A
Legal status: Pending

Classifications

    • G06T Image data processing or generation, in general (G Physics; G06 Computing, calculating or counting)
    • G06T7/0004 Image analysis; Inspection of images, e.g. flaw detection; Industrial image inspection
    • G06T7/13 Image analysis; Segmentation; Edge detection
    • G06T7/187 Image analysis; Segmentation or edge detection involving region growing, region merging or connected component labelling
    • G06T2207/10004 Indexing scheme: Image acquisition modality; Still image; Photographic image
    • G06T2207/20081 Indexing scheme: Special algorithmic details; Training; Learning
    • G06T2207/30108 Indexing scheme: Subject of image; Industrial image inspection
    • G06T2207/30136 Indexing scheme: Subject of image; Metal

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to the technical field of data processing, in particular to a metal defect identification method. The method comprises: obtaining a gray image of a metal surface image, performing edge detection on the gray image, and obtaining an edge detection effect; screening out real edges when the edge detection effect is lower than a preset threshold; taking each real edge as a target edge, obtaining the connection edge of the target edge through the connection-edge obtaining step, and judging whether the connection edge is an actual edge line; forming a combined edge from the connection edge that is an actual edge line, the target edge and the adjacent edge, then obtaining the connection edge of the combined edge to form a new combined edge, until the new combined edge forms a closed region; and obtaining the confidence that the closed region is a defect region, the corresponding closed region being a defect region when the confidence is greater than a preset confidence threshold. The invention can compensate for the shortcomings of edge detection, extract unidentified edges, and improve the identification accuracy of defect regions.

Description

Metal defect identification method
Technical Field
The invention relates to the technical field of data processing, in particular to a metal defect identification method.
Background
During metal production and assembly, defects may appear on the metal surface. Minor defects affect the appearance of the metal, while more severe defects reduce material strength, shorten workpiece life and increase safety risks; minor defects may also develop into serious defects over time. Defect detection on the surface of metal products is therefore an essential part of quality inspection.
With the development of industrial technology, defect detection on metal surfaces has shifted from manual inspection to intelligent machine detection. However, when defect identification is performed with traditional machine vision, factors such as uneven illumination and reflection from the metal surface make it difficult to extract an accurate defect region, so the identification accuracy is low.
Disclosure of Invention
The invention provides a metal defect identification method, which is used for solving the problem of low detection precision of metal surface defects, and adopts the following technical scheme:
one embodiment of the invention provides a metal defect identification method, which comprises the following steps:
collecting a metal surface image, obtaining a gray image of the surface image, carrying out edge detection on the gray image to obtain edge pixel points and background pixel points, and obtaining an edge detection effect based on the information entropy difference of the edge pixel points and the background pixel points;
when the edge detection effect is lower than a preset threshold value, acquiring unsealed edge lines, and screening out real edges by performing closed analysis on each unsealed edge line;
taking each real edge as a target edge, obtaining a connection edge of the target edge through the step of obtaining the connection edge, and judging whether the connection edge is an actual edge line based on the pixel values on the two sides of the connection edge; forming a combined edge from the connection edge that is an actual edge line, the target edge and the adjacent edge, obtaining the connection edge of the combined edge through the same step, and forming a new combined edge, until the new combined edge forms a closed region;
obtaining the confidence coefficient that the closed region is a defect region based on the gray gradient of each pixel point in the closed region, and determining that the corresponding closed region is a defect region when the confidence coefficient is greater than a preset confidence coefficient threshold;
the step of obtaining the connection edge is as follows: acquiring the characteristic distance between each target edge and every other real edge, taking the other real edge corresponding to the shortest characteristic distance as the adjacent edge of the target edge, and performing edge growing with the two end points corresponding to the characteristic distance between the target edge and the adjacent edge as the starting point and the end point respectively.
Preferably, the method for obtaining the edge detection effect includes:
calculating a first information entropy of the edge pixel point and a second information entropy of the background pixel point, taking a difference absolute value of the first information entropy and the second information entropy as a numerator, taking a maximum value of the two information entropies as a denominator, and obtaining a ratio which is the edge detection effect.
Preferably, the process of the closure analysis is as follows:
acquiring the two end points of each unclosed edge line, obtaining the maximum number of growth steps based on the straight-line distance between the two end points, and growing from one end point with a region growing algorithm along the direction of minimum gray gradient excluding the unclosed edge line; if the other end point has not been reached when the number of growth steps reaches the maximum, the edge line is an interference edge; otherwise, it is a real edge.
Preferably, the method for acquiring the characteristic distance comprises the following steps:
and acquiring two target end points of the target edge and two edge end points of each other real edge, acquiring an end point distance between each target end point and each edge end point, and taking the shortest end point distance as a characteristic distance between the target edge and the corresponding other real edge.
Preferably, the edge growing with two end points corresponding to the feature distance between the target edge and the adjacent edge as a start point and an end point respectively includes:
and acquiring a linear distance between the starting point and the end point, acquiring the maximum growth times based on the linear distance, growing from the starting point along the direction with the minimum gray gradient except the target edge by using a region growth algorithm, and growing the edge within the maximum growth times to reach the end point.
Preferably, the determining whether the connection edge is an actual edge line based on the pixel values on the two sides of the connection edge includes:
obtaining the central point of the connection edge, connecting the starting point and the end point into a straight line, and drawing the perpendicular to this straight line through the central point; obtaining, for each pixel point on the connection edge, its adjacent pixel points in the two extension directions of the perpendicular as its associated pixel points; calculating, for each pixel point, the ratio of the gray difference between its two associated pixel points to the gray value of the pixel point, the average of these ratios over all pixel points on the connection edge being the actual edge probability; when the actual edge probability is greater than a preset probability threshold, the corresponding connection edge is an actual edge line, otherwise it is not.
Preferably, the confidence coefficient obtaining method includes:
and taking each pixel point in the closed region as a neighborhood center to construct a neighborhood region, acquiring the gray gradient of the neighborhood center in each direction in the neighborhood region, acquiring the difference of the gray gradients of all the pixel points in the same direction, and calculating the sum of the differences in all the directions, namely the confidence coefficient.
Preferably, the method further comprises the steps of:
and classifying the defects of each defect area by using the trained target detection network.
Preferably, the training process of the target detection network is as follows:
taking the images of the defect regions as the network input, annotating the defect bounding boxes on the metal surface with their classes as the network labels, and training the target detection network with a mean square error loss function until the loss function converges, at which point the training of the target detection network is complete.
The embodiment of the invention at least has the following beneficial effects:
the edge detection effect is obtained through the information entropy difference between the edge pixel points and the background pixel points, when the information entropy difference is too small, the edge pixel points and the background pixel points are not distinguished obviously, which indicates that the edge detection effect is not good, and the subsequent steps are required at the moment, for gray level images with poor edge detection effect, unclosed edges are obtained, the unclosed edges may be interference edges caused by uneven illumination or reflection, or only partial edges of defect areas are identified due to poor edge detection effect, so that real edges are screened out by performing closed analysis on the unclosed edges, and then, connecting the adjacent real edges through the connecting edges to form a closed region, forming the part of the edges, which are probably detected by the defect region and interrupted by the detected defect region due to poor edge detection effect, of the adjacent real edges of the closed region, and identifying the defect region by obtaining the confidence coefficient of the closed edge as the defect region. The invention can make up the defect of edge detection, extract the unidentified edge and improve the identification precision of the defect area.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative effort.
Fig. 1 is a flowchart illustrating steps of a metal defect identification method according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve the intended objects and their effects, a metal defect identification method according to the present invention, together with its specific implementation, structure, features and effects, is described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the metal defect identification method provided by the invention in detail with reference to the accompanying drawings.
Referring to fig. 1, a flow chart of steps of a metal defect identification method according to an embodiment of the present invention is shown, the method including the following steps:
and S001, collecting the metal surface image, acquiring a gray level image of the surface image, carrying out edge detection on the gray level image to obtain edge pixel points and background pixel points, and acquiring an edge detection effect based on the information entropy difference between the edge pixel points and the background pixel points.
The method comprises the following specific steps:
1. and collecting a metal surface image and acquiring a gray image of the surface image.
The method comprises the steps of collecting a metal surface image by an industrial camera under a fixed light source, wherein the collected image is an RGB image, and carrying out gray processing on the RGB image to obtain a gray image of the surface image.
The graying is a conventional technique, and the effect of graying can be achieved by various methods, and as an example, in the embodiment of the present invention, the graying is performed by a weighted graying method.
2. And carrying out edge detection on the gray level image to obtain edge pixel points and background pixel points.
Canny edge detection is carried out on the gray image of the metal surface to obtain the edge pixel points; that is, the pixel points in the image are divided into two classes, edge pixel points and background pixel points. When the number of edge pixel points is too small, the metal surface is considered to have no defect.
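For illustration, the following Python sketch shows how this step could be implemented with OpenCV; the function name, the Canny thresholds and the use of cv2.Canny for the canny operator are assumptions of this sketch rather than details fixed by the patent.

```python
import cv2

def detect_edges(bgr_image, low_thresh=50, high_thresh=150):
    """Weighted graying followed by Canny edge detection (illustrative sketch).

    The Canny thresholds are illustrative assumptions; the patent does not
    specify them. Returns the gray image, a boolean edge mask and a boolean
    background mask.
    """
    # cv2.cvtColor performs the standard weighted graying
    # (0.299 R + 0.587 G + 0.114 B)
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low_thresh, high_thresh)
    edge_mask = edges > 0          # edge pixel points
    background_mask = ~edge_mask   # background pixel points
    return gray, edge_mask, background_mask
```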
3. And acquiring an edge detection effect.
Calculating a first information entropy of the edge pixel point and a second information entropy of the background pixel point, taking a difference absolute value of the first information entropy and the second information entropy as a numerator, taking a maximum value of the two information entropies as a denominator, and obtaining a ratio which is an edge detection effect.
First, the gray values of all pixel points are compressed to 16 gray levels, and the number of pixel points at each gray level is counted separately for the two classes. The number of pixel points of class L at the i-th gray level is denoted $n_i^L$, where L = 1 indicates that the class is edge pixel points and L = 2 indicates that the class is background pixel points.
Calculating the first information entropy of the edge pixel points: the proportion of each gray level among all edge pixel points is taken as the occurrence probability of that gray level, and the occurrence probabilities of all gray levels are substituted into the information entropy formula to obtain the first information entropy of the edge pixel points:
$H_1 = -\sum_{i=1}^{16} p_i^1 \log p_i^1$
where $p_i^1$ is the proportion of the number of pixel points at the i-th gray level among all edge pixel points.
Similarly, calculating the second information entropy of the background pixel points: the proportion of each gray level among all background pixel points is taken as the occurrence probability of that gray level, and the occurrence probabilities of all gray levels are substituted into the information entropy formula to obtain the second information entropy of the background pixel points:
$H_2 = -\sum_{i=1}^{16} p_i^2 \log p_i^2$
where $p_i^2$ is the proportion of the number of pixel points at the i-th gray level among all background pixel points.
The edge detection effect is obtained based on the information entropy difference between the edge pixel points and the background pixel points:
$R = \dfrac{\left| H_1 - H_2 \right|}{\max(H_1, H_2)}$
where R denotes the edge detection effect and $\max(H_1, H_2)$ denotes the larger of the two information entropies $H_1$ and $H_2$.
The larger the information entropy difference, the more clearly the edge pixel points are distinguished from the background pixel points, and the better the edge detection effect.
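A minimal NumPy sketch of this evaluation is given below; the helper name edge_detection_effect, the use of the natural logarithm and the 16-level compression by integer division are assumptions of the sketch (the patent does not state the logarithm base).

```python
import numpy as np

def edge_detection_effect(gray, edge_mask, levels=16):
    """Edge detection effect R = |H1 - H2| / max(H1, H2) (illustrative sketch)."""
    # compress gray values 0..255 to `levels` gray levels
    compressed = (gray.astype(np.int32) * levels) // 256

    def entropy(values):
        counts = np.bincount(values.ravel(), minlength=levels)
        p = counts / max(counts.sum(), 1)
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())   # natural log assumed

    h1 = entropy(compressed[edge_mask])        # first information entropy (edge pixels)
    h2 = entropy(compressed[~edge_mask])       # second information entropy (background)
    return abs(h1 - h2) / max(h1, h2, 1e-12)
```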
The Canny edge detection operator is often used for target region segmentation, but it still has shortcomings in practical applications. For metal surface defect regions, uneven illumination and interference from the reflective characteristics of the surface make the Canny operator less accurate in detecting defect edges, and the detected edges may be broken.
And step S002, when the edge detection effect is lower than a preset threshold value, acquiring unsealed edge lines, and screening out real edges by performing closed analysis on each unsealed edge line.
The method comprises the following specific steps:
1. and evaluating the edge detection effect.
When the edge detection effect is lower than the preset threshold, the detection effect of the Canny operator is poor: pixel points may be misclassified and some edges may not be detected, so the edges of the defect region are interrupted.
As an example, the preset threshold value is 0.5 in the embodiment of the present invention.
2. And (4) screening out real edges by performing closed analysis on each unsealed edge line.
Adjacent edge pixel points are connected to obtain the edge lines. Since a metal defect region is usually a closed region, all unclosed edge lines are obtained and each unclosed edge line is analyzed.
The two end points of each unclosed edge line are acquired, the maximum number of growth steps is obtained based on the straight-line distance between the two end points, and growth starts from one end point with a region growing algorithm along the direction of minimum gray gradient excluding the unclosed edge line itself; if the other end point has not been reached when the number of growth steps reaches the maximum, the edge line is an interference edge; otherwise, it is a real edge.
Any unclosed edge line is analyzed; take the unclosed edge line Q as an example, and denote its two end points $q_1$ and $q_2$. Between $q_1$ and $q_2$, a region growing algorithm is used with $q_1$ as the starting point, where the growing direction is the direction in which the gray gradient of the central pixel point is minimum within its 3 x 3 neighborhood, excluding the unclosed edge line Q itself. The number of pixel points contained in the straight line connecting $q_1$ and $q_2$ is denoted $d$, and the maximum number of growth steps, denoted $d_{max}$, is set based on $d$. If the number of growth steps exceeds $d_{max}$ without reaching $q_2$, there is no unrecognized edge line between $q_1$ and $q_2$ other than Q itself; the unclosed edge line Q is then an interference edge rather than a defect edge and is deleted.
By growing at the two ends of the unclosed edge line Q in directions other than along Q itself, if another edge Q' can be grown between the two end points within the maximum number of growth steps, the unclosed edge line Q is part of a closed region, and the grown edge Q' is the part of the edge of that closed region that was not detected. The grown edge Q' may also partially coincide with another unclosed edge Q'', which indicates that the unclosed edge line Q and the unclosed edge Q'' together form a closed edge, but because the edge detection was not accurate enough the edge between them was not detected, breaking the closed region into two unclosed edges.
An unclosed edge from whose one end point the growth can reach the other end point within the maximum number of growth steps is therefore a real edge.
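The closure analysis can be sketched as follows in Python/NumPy; the greedy single-pixel growth, the function names and the choice of the growth limit as a fixed multiple of the straight-line pixel distance are assumptions of this sketch, since the patent only states that the maximum number of growth steps is derived from that distance.

```python
import numpy as np

def grow_min_gradient_path(gray, start, goal, forbidden, max_steps):
    """Greedy region growing along the direction of minimum gray gradient.

    From `start`, repeatedly step to the 8-neighbour with the smallest absolute
    gray difference from the current pixel, skipping pixels in `forbidden`
    (the edge line itself) and already visited pixels.
    Returns (reached_goal, grown_path).
    """
    h, w = gray.shape
    path, current = [start], start
    visited = set(forbidden) | {start}
    for _ in range(max_steps):
        if current == goal:
            return True, path
        cy, cx = current
        candidates = []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = cy + dy, cx + dx
                if (dy, dx) != (0, 0) and 0 <= ny < h and 0 <= nx < w \
                        and (ny, nx) not in visited:
                    grad = abs(int(gray[ny, nx]) - int(gray[cy, cx]))
                    candidates.append((grad, (ny, nx)))
        if not candidates:
            return False, path
        _, current = min(candidates)
        visited.add(current)
        path.append(current)
    return current == goal, path

def is_real_edge(gray, edge_points, growth_factor=2):
    """Closure analysis for one unclosed edge line (list of (y, x) points).

    The growth limit is assumed to be `growth_factor` times the number of
    pixels on the straight line between the two end points.
    """
    p1, p2 = edge_points[0], edge_points[-1]
    straight = int(np.hypot(p1[0] - p2[0], p1[1] - p2[1])) + 1
    reached, _ = grow_min_gradient_path(gray, p1, p2,
                                        set(edge_points[1:-1]),
                                        growth_factor * straight)
    return reached
```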
Step S003, each real edge is taken as a target edge, the connection edge of the target edge is obtained through the step of obtaining the connection edge, and whether the connection edge is an actual edge line is judged based on the pixel values on the two sides of the connection edge; the connection edge that is an actual edge line, the target edge and the adjacent edge form a combined edge, and the connection edge of the combined edge is obtained through the same step to form a new combined edge, until the new combined edge forms a closed region.
The method comprises the following specific steps:
1. and acquiring a connecting edge of the target edge.
And taking each real edge as a target edge, and obtaining a connecting edge of the target edge through a step of obtaining the connecting edge, wherein the step of obtaining the connecting edge is as follows: and acquiring the characteristic distance between each target edge and other real edges, taking the other real edges corresponding to the shortest characteristic distance as adjacent edges of the target edge, and taking two end points corresponding to the characteristic distances of the target edge and the adjacent edges as a starting point and an end point respectively to carry out edge growth.
The process of obtaining the characteristic distance between each target edge and other real edges is as follows: and acquiring two target end points of the target edge and two edge end points of each other real edge, acquiring an end point distance between each target end point and each edge end point, and taking the shortest end point distance as a characteristic distance between the target edge and the corresponding other real edge.
Take the case where the unclosed edge Q is a real edge as an example: the real edge Q is taken as the target edge, with target end points $q_1$ and $q_2$; another real edge W is taken as one of the other real edges, with edge end points $w_1$ and $w_2$. The Euclidean distance between each target end point and each edge end point is calculated as an end point distance, giving four end point distances, and the minimum end point distance is taken as the characteristic distance between the real edge Q and the real edge W, denoted D.
The characteristic distance between the target edge and each of the other real edges is obtained, and the other real edge corresponding to the shortest characteristic distance $D_{min}$ is taken as the adjacent edge of the target edge. The two end points corresponding to the characteristic distance between the target edge and the adjacent edge are taken as the starting point and the end point respectively; the straight-line distance between the starting point and the end point is acquired, the maximum number of growth steps is obtained based on this distance, and a region growing algorithm grows from the starting point along the direction of minimum gray gradient excluding the target edge, with the edge growth reaching the end point within the maximum number of growth steps.
Suppose the real edge W is the adjacent edge of the target edge Q and the end points corresponding to the characteristic distance are $q_1$ and $w_1$. The straight-line distance between $q_1$ and $w_1$, i.e. the number of pixel points contained in the straight line connecting $q_1$ and $w_1$, is denoted $m$, and the maximum number of growth steps, denoted $m_{max}$, is set based on $m$. With $q_1$ as the starting point and $w_1$ as the end point, the region growing algorithm grows from the starting point along the direction of minimum gray gradient excluding the target edge, which gives the connection edge of the target edge.
If the target edge and the adjacent edge are two partial edges on the same closed area, and the connection edge of the target edge is an actual edge, the connection edge is an undetected partial edge between the two edges.
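A sketch of the characteristic distance and the connection-edge growth is given below; it reuses grow_min_gradient_path from the previous sketch, and the function names and the growth_factor multiple are assumptions of this sketch.

```python
import numpy as np

def characteristic_distance(edge_a, edge_b):
    """Shortest end-point distance between two real edges (the characteristic
    distance D) and the pair of end points that realises it."""
    best = None
    for pa in (edge_a[0], edge_a[-1]):
        for pb in (edge_b[0], edge_b[-1]):
            dist = float(np.hypot(pa[0] - pb[0], pa[1] - pb[1]))
            if best is None or dist < best[0]:
                best = (dist, pa, pb)
    return best   # (D, target end point, adjacent-edge end point)

def connection_edge(gray, target_edge, other_edges, growth_factor=2):
    """Grow the connection edge of a target edge (illustrative sketch)."""
    # adjacent edge: the other real edge with the shortest characteristic distance
    scored = [(characteristic_distance(target_edge, e), e) for e in other_edges]
    (dist, start, end), adjacent = min(scored, key=lambda item: item[0][0])
    max_steps = growth_factor * (int(dist) + 1)      # assumed multiple of the distance
    reached, path = grow_min_gradient_path(gray, start, end,
                                           set(target_edge[1:-1]), max_steps)
    return adjacent, (path if reached else None), (start, end)
```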
2. And judging whether the connection edge is an actual edge line.
The central point of the connection edge is obtained, the starting point and the end point are connected into a straight line, and the perpendicular to this straight line is drawn through the central point. For each pixel point on the connection edge, its adjacent pixel points in the two extension directions of the perpendicular are taken as its associated pixel points; the ratio of the gray difference between the two associated pixel points of each pixel point to the gray value of that pixel point is calculated, and the average of these ratios over all pixel points on the connection edge is the actual edge probability.
Connecting $q_1$ and $w_1$ gives a straight line S. The central point e of the connection edge is obtained, and the perpendicular K to the straight line S is drawn through the point e; the intersection of K and S is the point s, and the two extension directions of the perpendicular are the directions of the vectors $\overrightarrow{se}$ and $\overrightarrow{es}$ respectively. For each pixel point on the connection edge, its adjacent pixel point along the $\overrightarrow{se}$ direction and its adjacent pixel point along the $\overrightarrow{es}$ direction are taken as its associated pixel points; that is, each pixel point on the connection edge has two associated pixel points in opposite directions.
If the connection edge is an actual edge line, the gray values of the pixel points on its two sides differ greatly, so whether it is an actual edge line is judged from the gray difference of the associated pixel points. The specific calculation formula is:
$P = \dfrac{1}{n} \sum_{j=1}^{n} \dfrac{\left| g_j^{a} - g_j^{b} \right|}{g_j}$
where P denotes the actual edge probability, n denotes the number of pixel points on the connection edge, $g_j$ denotes the gray value of the j-th pixel point on the connection edge, $g_j^{a}$ denotes the gray value of the associated pixel point of the j-th pixel point in the $\overrightarrow{se}$ direction, and $g_j^{b}$ denotes the gray value of its associated pixel point in the $\overrightarrow{es}$ direction.
The larger the gray difference between the pixel points on the two sides of the connection edge, the larger the corresponding ratio $\left| g_j^{a} - g_j^{b} \right| / g_j$, and the greater the probability that the connection edge is an actual edge line.
When the actual edge probability is greater than the preset probability threshold, the corresponding connection edge is an actual edge line; otherwise it is not an actual edge line.
As an example, the probability threshold is 0.95; that is, a connection edge whose actual edge probability P is greater than 0.95 is an actual edge line that was not detected only because the edge detection accuracy was not high.
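The actual edge probability can be sketched as follows; the perpendicular direction is discretised by rounding the unit normal of the start-end line, which, like the function name, is an assumption of this sketch.

```python
import numpy as np

def actual_edge_probability(gray, connection_path, start, end):
    """Mean of |g_a - g_b| / g over the connection edge (illustrative sketch)."""
    h, w = gray.shape
    line = np.array([end[0] - start[0], end[1] - start[1]], dtype=float)
    normal = np.array([-line[1], line[0]])
    normal /= (np.linalg.norm(normal) + 1e-12)
    step = np.round(normal).astype(int)        # discrete offset toward one side
    ratios = []
    for (y, x) in connection_path:
        ya, xa = y + step[0], x + step[1]      # associated pixel on one side
        yb, xb = y - step[0], x - step[1]      # associated pixel on the other side
        if 0 <= ya < h and 0 <= xa < w and 0 <= yb < h and 0 <= xb < w:
            diff = abs(float(gray[ya, xa]) - float(gray[yb, xb]))
            ratios.append(diff / max(float(gray[y, x]), 1.0))
    return float(np.mean(ratios)) if ratios else 0.0
```

A connection edge whose value from this function exceeds the 0.95 threshold of the embodiment would then be treated as an actual edge line.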
3. And forming a combined edge by the connecting edge as the actual edge line, the target edge and the adjacent edge, and obtaining the connecting edge of the combined edge through the step to form a new combined edge until the new combined edge forms a closed area.
When the actual edge probability of the connection edge is not greater than 0.95, the current adjacent edge of the target edge is excluded, and the real edge with the next-shortest characteristic distance is taken as the new adjacent edge of the target edge, until an adjacent edge whose connection edge has an actual edge probability greater than 0.95 is obtained; the connection edge that is an actual edge line, the target edge and the adjacent edge then form a combined edge.
If no adjacent edge whose connection edge has an actual edge probability greater than 0.95 exists, the target edge is taken as part of a closed region whose remaining part was not detected; in this case, the other edge grown between the two end points of the target edge within the maximum number of growth steps is obtained, and this other edge and the target edge form the closed region.
If the new combined edge forms a closed region, the previously undetected partial edges have been identified by the above method and a complete closed region is obtained; otherwise, new combined edges continue to be obtained until a new combined edge forms a closed region.
When no other real edge exists outside the new combined edge and the new combined edge is still unclosed, edge growing is performed between the two end points of the new combined edge to obtain a closed region.
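The overall merging loop of step S003 can be sketched at a high level as follows, reusing the helpers from the previous sketches; the closure test and the bookkeeping of end-point order when concatenating edges are deliberately simplified assumptions of this sketch.

```python
def forms_closed_area(points):
    """Simplified closure test: the two ends of the combined edge coincide or
    are 8-adjacent (an assumption of this sketch)."""
    (y1, x1), (y2, x2) = points[0], points[-1]
    return abs(y1 - y2) <= 1 and abs(x1 - x2) <= 1

def build_closed_region(gray, target_edge, other_edges, prob_threshold=0.95):
    """Merge the target edge, qualifying connection edges and adjacent edges
    into a combined edge until it forms a closed region (high-level sketch)."""
    combined = list(target_edge)
    remaining = [list(e) for e in other_edges]
    while not forms_closed_area(combined) and remaining:
        adjacent, path, (start, end) = connection_edge(gray, combined, remaining)
        if path is not None and \
                actual_edge_probability(gray, path, start, end) > prob_threshold:
            combined = combined + path + adjacent   # new combined edge
        # in either case the tried adjacent edge is excluded from further search
        remaining.remove(adjacent)
    if not forms_closed_area(combined):
        # no qualifying adjacent edge: close the combined edge by growing
        # between its own two end points
        _, closing = grow_min_gradient_path(gray, combined[-1], combined[0],
                                            set(combined[1:-1]),
                                            2 * len(combined))
        combined = combined + closing
    return combined
```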
And step S004, obtaining the confidence coefficient of the closed region as the defect region based on the gray gradient of each pixel point in the closed region, and when the confidence coefficient is greater than a preset confidence coefficient threshold value, taking the corresponding closed region as the defect region.
The method comprises the following specific steps:
1. and obtaining the confidence that the closed region is the defect region.
And taking each pixel point in the closed region as a neighborhood center to construct a neighborhood region, acquiring the gray gradient of the neighborhood center in each direction in the neighborhood region, acquiring the difference of the gray gradients of all the pixel points in the same direction, and calculating the sum of the differences in all the directions, namely the confidence coefficient.
A 3 x 3 neighborhood region is constructed with each pixel point in the closed region as the neighborhood center, so that each neighborhood center has neighborhood pixel points in 8 directions. When the o-th pixel point is the neighborhood center, its gray gradient in the v-th direction is:
$G_{o,v} = \left| g_{o,v} - g_o \right|$
where $g_o$ denotes the gray value of the o-th pixel point and $g_{o,v}$ denotes the gray value of the neighborhood pixel point of the o-th pixel point in the v-th direction.
Each pixel point thus has corresponding gradient values in 8 directions, and the average of all the gradient values, denoted $\bar{G}$, is obtained. The difference of the gray gradients of all the pixel points in the v-th direction is then calculated:
$\sigma_v = \dfrac{1}{N} \sum_{t=1}^{N} \left| G_{t,v} - \bar{G} \right|$
where $G_{t,v}$ denotes the gray gradient of the t-th pixel point in the v-th direction and N denotes the number of pixel points in the closed region.
The sum of the differences in all directions, $C = \sum_{v=1}^{8} \sigma_v$, is the confidence that the closed region is a defect region. The change of the gray gradient in different directions characterizes this confidence: if the closed region is a defect region, its internal structure is usually irregular and the corresponding gradient changes are large; if the closed region is a reflective region, its center is brightest and it darkens gradually toward the edge line, so the gradient changes are uniform and not very large, and reflective regions are thereby excluded through the change of the gray gradient. The larger the difference between a gray gradient and the average gradient value $\bar{G}$, the larger the difference in the corresponding direction and the larger the sum of the differences in all directions, which indicates that the pixel points in the closed region differ more from one another, the structure is more irregular, and the region is more likely to be a defect region.
2. And when the confidence coefficient is greater than a preset confidence coefficient threshold value, the corresponding closed region is a defect region.
The higher the confidence, the more likely the region is a defect region; when the confidence is greater than a certain level, the corresponding closed region is a defect region. The confidence threshold is preset according to the actual situation; as an example, the confidence threshold is 0.9 in the embodiment of the present invention.
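A NumPy sketch of the confidence computation is given below; the per-direction difference is implemented as the mean absolute deviation of the gradients from the overall mean gradient, which, together with the function name, is an assumption of this sketch.

```python
import numpy as np

def defect_confidence(gray, region_mask):
    """Sum over 8 directions of the deviation of gray gradients from the mean
    gradient inside the closed region (illustrative sketch)."""
    ys, xs = np.nonzero(region_mask)
    h, w = gray.shape
    gray_f = gray.astype(float)
    directions = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                  (0, 1), (1, -1), (1, 0), (1, 1)]
    grads = np.zeros((len(ys), 8), dtype=float)
    for k, (dy, dx) in enumerate(directions):
        ny = np.clip(ys + dy, 0, h - 1)
        nx = np.clip(xs + dx, 0, w - 1)
        grads[:, k] = np.abs(gray_f[ny, nx] - gray_f[ys, xs])
    mean_grad = grads.mean()                                # average of all gradient values
    per_direction = np.abs(grads - mean_grad).mean(axis=0)  # difference in each direction
    return float(per_direction.sum())                       # confidence of the closed region

# a closed region is reported as a defect region when its confidence exceeds
# the preset confidence threshold (0.9 in the embodiment described above)
```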
Further, in another embodiment, the present invention further comprises the steps of: and classifying the defects of each defect area by using the trained target detection network.
The image of the defect region is taken as the network input, the defect bounding boxes on the metal surface are annotated with their classes as the network labels, and the target detection network is trained with a mean square error loss function until the loss function converges, at which point the training of the target detection network is complete.
An existing target detection network is trained; the network structure is Encoder-Decoder-FC, and the network training process is as follows:
First, the image label data are prepared: the defect bounding boxes on the metal surface are annotated with the category, the coordinates of the center point, and the width and height of the bounding box, i.e. the label is (class, x, y, w, h), where class denotes the defect category, (x, y) are the coordinates of the center of the bounding box, w is the width of the bounding box and h is the height of the bounding box. The values x, y, w and h in the label need to be normalized.
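A small sketch of building such a normalized label from a pixel-coordinate bounding box is shown below; the helper name and the normalization by image width and height are assumptions of this sketch (the patent only states that x, y, w and h are normalized).

```python
def make_label(class_id, box, image_width, image_height):
    """Build a (class, x, y, w, h) label with normalized center and size."""
    x_min, y_min, x_max, y_max = box
    x = (x_min + x_max) / 2.0 / image_width    # normalized center x
    y = (y_min + y_max) / 2.0 / image_height   # normalized center y
    w = (x_max - x_min) / image_width          # normalized width
    h = (y_max - y_min) / image_height         # normalized height
    return (class_id, x, y, w, h)
```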
The network is trained with the image data and the bounding box label data. The target detection Encoder extracts the features of the image: its input is the normalized defect region image and its output is a feature map. The target detection Decoder up-samples the intermediate features to generate the bounding boxes of the metal defects.
The loss function of the network training adopts a mean square error loss function.
The image of a defect region is input into the trained target detection network, which outputs the category to which the defect belongs; the possible output categories are those defined when the labels were annotated.
In summary, in the embodiment of the present invention, a metal surface image is collected, a gray image of the surface image is obtained, edge detection is performed on the gray image to obtain edge pixel points and background pixel points, and the edge detection effect is obtained based on the information entropy difference between the edge pixel points and the background pixel points; when the edge detection effect is lower than a preset threshold, unclosed edge lines are acquired, and real edges are screened out by performing closure analysis on each unclosed edge line; each real edge is taken as a target edge, the connection edge of the target edge is obtained through the step of obtaining the connection edge, and whether the connection edge is an actual edge line is judged based on the pixel values on the two sides of the connection edge; the connection edge that is an actual edge line, the target edge and the adjacent edge form a combined edge, the connection edge of the combined edge is obtained through the same step, and a new combined edge is formed, until the new combined edge forms a closed region; the confidence that the closed region is a defect region is obtained based on the gray gradient of each pixel point in the closed region, and when the confidence is greater than a preset confidence threshold, the corresponding closed region is a defect region. The embodiment of the invention can compensate for the shortcomings of edge detection, extract unidentified edges, and improve the identification accuracy of defect regions.
It should be noted that the order of the above embodiments of the present invention is only for description and does not represent their relative merits; specific embodiments have been described above. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or a sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or advantageous.
The embodiments in the present specification are described in a progressive manner, and the same or similar parts in the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them; modifications of the technical solutions described in the foregoing embodiments, or equivalent replacements of some of their technical features, do not depart from the essence of the technical solutions of the embodiments of the present application and are all included in the scope of protection of the present application.

Claims (9)

1. A method for identifying metal defects, the method comprising the steps of:
collecting a metal surface image, obtaining a gray image of the surface image, carrying out edge detection on the gray image to obtain edge pixel points and background pixel points, and obtaining an edge detection effect based on the information entropy difference of the edge pixel points and the background pixel points;
when the edge detection effect is lower than a preset threshold value, acquiring unsealed edge lines, and screening out real edges by performing closed analysis on each unsealed edge line;
taking each real edge as a target edge, obtaining a connection edge of the target edge through the step of obtaining the connection edge, and judging whether the connection edge is an actual edge line based on the pixel values on the two sides of the connection edge; forming a combined edge from the connection edge that is an actual edge line, the target edge and the adjacent edge, obtaining the connection edge of the combined edge through the same step, and forming a new combined edge, until the new combined edge forms a closed region;
obtaining the confidence coefficient that the closed region is a defect region based on the gray gradient of each pixel point in the closed region, and determining that the corresponding closed region is a defect region when the confidence coefficient is greater than a preset confidence coefficient threshold;
the step of obtaining the connection edge is as follows: acquiring the characteristic distance between each target edge and every other real edge, taking the other real edge corresponding to the shortest characteristic distance as the adjacent edge of the target edge, and performing edge growing with the two end points corresponding to the characteristic distance between the target edge and the adjacent edge as the starting point and the end point respectively.
2. The method for identifying metal defects according to claim 1, wherein the method for obtaining the edge detection effect comprises:
calculating a first information entropy of the edge pixel point and a second information entropy of the background pixel point, taking a difference absolute value of the first information entropy and the second information entropy as a numerator, taking a maximum value of the two information entropies as a denominator, and obtaining a ratio which is the edge detection effect.
3. The metal defect identification method according to claim 1, wherein the process of the closed analysis comprises:
acquiring the two end points of each unclosed edge line, obtaining the maximum number of growth steps based on the straight-line distance between the two end points, and growing from one end point with a region growing algorithm along the direction of minimum gray gradient excluding the unclosed edge line; if the other end point has not been reached when the number of growth steps reaches the maximum, the edge line is an interference edge; otherwise, it is a real edge.
4. The metal defect identification method according to claim 1, wherein the characteristic distance obtaining method comprises:
and acquiring two target end points of the target edge and two edge end points of each other real edge, acquiring an end point distance between each target end point and each edge end point, and taking the shortest end point distance as a characteristic distance between the target edge and the corresponding other real edge.
5. The method for identifying the metal defects according to claim 1, wherein the edge growing is performed by taking two end points corresponding to the feature distance between the target edge and the adjacent edge as a starting point and an end point respectively, and comprises the following steps:
and acquiring a linear distance between the starting point and the end point, acquiring the maximum growth times based on the linear distance, growing from the starting point along the direction with the minimum gray gradient except the target edge by using a region growth algorithm, and performing edge growth within the maximum growth times to reach the end point.
6. The method of claim 1, wherein the determining whether the connection edge is an actual edge line based on pixel values at two sides of the connection edge comprises:
obtaining the central point of the connection edge, connecting the starting point and the end point into a straight line, and drawing the perpendicular to this straight line through the central point; obtaining, for each pixel point on the connection edge, its adjacent pixel points in the two extension directions of the perpendicular as its associated pixel points; calculating, for each pixel point, the ratio of the gray difference between its two associated pixel points to the gray value of the pixel point, the average of these ratios over all pixel points on the connection edge being the actual edge probability; when the actual edge probability is greater than a preset probability threshold, the corresponding connection edge is an actual edge line, otherwise it is not.
7. The metal defect identification method according to claim 1, wherein the confidence coefficient is obtained by:
and taking each pixel point in the closed region as a neighborhood center to construct a neighborhood region, acquiring the gray gradient of the neighborhood center in each direction in the neighborhood region, acquiring the difference of the gray gradients of all the pixel points in the same direction, and calculating the sum of the differences in all the directions, namely the confidence coefficient.
8. The metal defect identification method according to claim 1, further comprising the step of:
and classifying the defects of each defect area by using the trained target detection network.
9. The metal defect identification method of claim 8, wherein the training process of the target detection network comprises:
taking the images of the defect regions as the network input, annotating the defect bounding boxes on the metal surface with their classes as the network labels, and training the target detection network with a mean square error loss function until the loss function converges, at which point the training of the target detection network is complete.
CN202210978629.8A 2022-08-16 2022-08-16 Metal defect identification method Pending CN115049669A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210978629.8A CN115049669A (en) 2022-08-16 2022-08-16 Metal defect identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210978629.8A CN115049669A (en) 2022-08-16 2022-08-16 Metal defect identification method

Publications (1)

Publication Number Publication Date
CN115049669A true CN115049669A (en) 2022-09-13

Family

ID=83167582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210978629.8A Pending CN115049669A (en) 2022-08-16 2022-08-16 Metal defect identification method

Country Status (1)

Country Link
CN (1) CN115049669A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10475179B1 (en) * 2018-10-12 2019-11-12 Velocity Image Processing LLC Compensating for reference misalignment during inspection of parts
CN112712512A (en) * 2021-01-05 2021-04-27 余波 Hot-rolled strip steel scab defect detection method and system based on artificial intelligence
CN113160192A (en) * 2021-04-28 2021-07-23 北京科技大学 Visual sense-based snow pressing vehicle appearance defect detection method and device under complex background
CN114882215A (en) * 2022-04-11 2022-08-09 安徽理工大学 Shape selection identification method for particle aggregate region of photoelectric coal gangue sorting image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
蔡聪艺: "Research and Implementation of a Machine Vision Detection Method for Metal Surface Defects", Journal of Chengdu University (Natural Science Edition) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115272946A (en) * 2022-09-30 2022-11-01 江苏三通科技有限公司 Method for identifying damage of common rail fuel injector by using electronic equipment
CN115633259A (en) * 2022-11-15 2023-01-20 深圳市泰迅数码有限公司 Automatic regulation and control method and system for intelligent camera based on artificial intelligence
CN115641329A (en) * 2022-11-15 2023-01-24 武汉惠强新能源材料科技有限公司 Lithium battery diaphragm defect detection method and system
CN115633259B (en) * 2022-11-15 2023-03-10 深圳市泰迅数码有限公司 Automatic regulation and control method and system for intelligent camera based on artificial intelligence
CN115861987A (en) * 2023-02-27 2023-03-28 江苏天南电力股份有限公司 Intelligent electric power fitting defect identification method for on-line monitoring of power transmission line
CN116071357A (en) * 2023-03-07 2023-05-05 飞杨电源技术(深圳)有限公司 High-power charger surface defect detection method
CN116152255A (en) * 2023-04-21 2023-05-23 高唐县红发塑业有限公司 Modified plastic production defect judging method
CN116934740A (en) * 2023-09-11 2023-10-24 深圳市伟利达精密塑胶模具有限公司 Plastic mold surface defect analysis and detection method based on image processing
CN116934740B (en) * 2023-09-11 2023-12-08 深圳市伟利达精密塑胶模具有限公司 Plastic mold surface defect analysis and detection method based on image processing
CN117576416A (en) * 2024-01-15 2024-02-20 北京阿丘机器人科技有限公司 Workpiece edge area detection method, device and storage medium
CN117576416B (en) * 2024-01-15 2024-05-14 北京阿丘机器人科技有限公司 Workpiece edge area detection method, device and storage medium
CN117953434A (en) * 2024-03-27 2024-04-30 广州煜能电气有限公司 Intelligent gateway-based method and system for monitoring external damage of power transmission line
CN117953434B (en) * 2024-03-27 2024-06-18 广州煜能电气有限公司 Intelligent gateway-based method and system for monitoring external damage of power transmission line

Similar Documents

Publication Publication Date Title
CN115049669A (en) Metal defect identification method
CN116721106B (en) Profile flaw visual detection method based on image processing
CN113362326B (en) Method and device for detecting defects of welding spots of battery
CN113469177B (en) Deep learning-based drainage pipeline defect detection method and system
CN113160192B (en) Visual sense-based snow pressing vehicle appearance defect detection method and device under complex background
CN111223088B (en) Casting surface defect identification method based on deep convolutional neural network
CN113362306B (en) Packaged chip defect detection method based on deep learning
CN109685760B (en) MATLAB-based SLM powder bed powder laying image convex hull depression defect detection method
CN106338520A (en) Recognition method of surface defects of multilayer solid wood composite floor with surface board being jointed board
CN113963042B (en) Metal part defect degree evaluation method based on image processing
CN112446871B (en) Tunnel crack identification method based on deep learning and OpenCV
CN114897896B (en) Building wood defect detection method based on gray level transformation
US20100040276A1 (en) Method and apparatus for determining a cell contour of a cell
CN111652213A (en) Ship water gauge reading identification method based on deep learning
CN115115644A (en) Vehicle welding defect detection method based on artificial intelligence
CN115797354B (en) Method for detecting appearance defects of laser welding seam
CN115578374A (en) Mechanical part casting quality evaluation method and system
CN115719332A (en) Welding quality detection method
CN116188468B (en) HDMI cable transmission letter sorting intelligent control system
CN114119603A (en) Image processing-based snack box short shot defect detection method
CN113870202A (en) Far-end chip defect detection system based on deep learning technology
CN113298809A (en) Composite material ultrasonic image defect detection method based on deep learning and superpixel segmentation
CN114332534A (en) Hyperspectral image small sample classification method
CN112750113B (en) Glass bottle defect detection method and device based on deep learning and linear detection
CN113591850A (en) Two-stage trademark detection method based on computer vision robustness target detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination