CN117808796B - Gear surface damage detection method based on computer vision - Google Patents


Info

Publication number
CN117808796B
CN117808796B (application CN202410199476.6A)
Authority
CN
China
Prior art keywords
pixel point
gear
pixel
gray
extension
Prior art date
Legal status
Active
Application number
CN202410199476.6A
Other languages
Chinese (zh)
Other versions
CN117808796A (en)
Inventor
刘自强
张震
黄復隆
Current Assignee
Shaanxi Changkong Gear Co ltd
Original Assignee
Shaanxi Changkong Gear Co ltd
Priority date
Filing date
Publication date
Application filed by Shaanxi Changkong Gear Co ltd filed Critical Shaanxi Changkong Gear Co ltd
Priority to CN202410199476.6A
Publication of CN117808796A
Application granted
Publication of CN117808796B
Status: Active

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image data processing, and in particular to a gear surface damage detection method based on computer vision, comprising the following steps: acquiring a gear gray image and determining the extension pixel points in the eight-neighborhood directions of each gear pixel point in the image; taking the region formed by connecting a gear pixel point with its extension pixel points in the eight-neighborhood directions as a connected node; determining the clustering distance between connected nodes, constructing a connected graph from those distances, and thereby obtaining the optimal clusters; and judging, from the image features of the connected graph in each optimal cluster, whether the surface of the gear to be detected has damage defects. By analyzing the gray feature information of the image to determine each region and using those regions as node data in the clustering process, the method effectively improves the accuracy of the clustering result, and thereby the reliability of gear surface damage detection.

Description

Gear surface damage detection method based on computer vision
Technical Field
The invention relates to the technical field of image data processing, in particular to a gear surface damage detection method based on computer vision.
Background
As a high-precision mechanical component, a gear's quality directly influences the service life of the whole piece of mechanical equipment, and damage to the gear surface degrades both its service life and its performance. It is therefore necessary to detect gear surface damage, which typically takes the form of scratches, pitting, abrasion, fatigue cracks, corrosion, and the like. When detecting such damage, the damage condition of the gear surface is judged by analyzing captured images of that surface.
The acquired image information can be cluster-classified with a connected-graph dynamic-splitting clustering algorithm: pixels with the same distribution characteristics are grouped into the same cluster, and the damage condition of the gear surface is judged from the distribution of pixels within each cluster. Constructing the connected graph for this algorithm requires choosing node data. Existing algorithms use the image's pixel points directly as nodes, which cannot reflect the structural distribution of a damaged region; normal pixel points therefore end up in clusters of damage-defect character, and the damage detection built on those clusters is biased. Moreover, because the positions of pixel points in the image are fixed, the information in the constructed connected graph is disordered, which further degrades the accuracy of cluster classification and introduces errors into the subsequent detection of gear surface damage.
Disclosure of Invention
In order to solve the technical problem that, when cluster-classifying the acquired image with a connected-graph dynamic-splitting clustering algorithm, using pixel points as the nodes of the connected graph yields inaccurate cluster classification results and thus erroneous gear surface damage detection, the invention provides a gear surface damage detection method based on computer vision. The adopted technical scheme is as follows:
One embodiment of the invention provides a gear surface damage detection method based on computer vision, which comprises the following steps:
acquiring gear gray level images of the surface of a gear to be detected, and further acquiring downsampled images of a preset number of scale levels corresponding to the gear gray level images;
According to the gray value of each pixel point in the image to be analyzed, analyzing the gray distribution similarity condition between the preset window area of each pixel point and the preset window area of the pixel point in the eight neighborhood direction, and determining the extension degree of each pixel point in the eight neighborhood direction of each pixel point; the image to be analyzed is a gear gray image or each downsampled image;
Taking any pixel point in the gear gray level image as a gear pixel point, and determining a mapping pixel point of the gear pixel point in each downsampled image; analyzing extension stability according to extension degrees of all the gear pixel points and all the pixel points in the eight-neighborhood direction of each corresponding mapping pixel point, and determining all the extension pixel points in the eight-neighborhood direction of the gear pixel points;
taking the area formed by connecting a gear pixel point with each of its extension pixel points in the eight-neighborhood directions as a connected node, thereby obtaining all connected nodes; determining the clustering distance between connected nodes, constructing a connected graph according to those distances, and thereby obtaining the optimal clusters;
Judging whether the surface of the gear to be detected has damage defects or not according to the image characteristics of the connected graph in each optimal cluster.
Further, according to the gray value of each pixel in the image to be analyzed, analyzing the gray distribution similarity between the preset window area of each pixel and the preset window area of the pixel in the eight neighborhood direction, and determining the extension degree of each pixel in the eight neighborhood direction of each pixel, including:
constructing a window area with a preset size by taking each pixel point in the image to be analyzed as a center point, and taking the window area with the preset size as a preset window area to obtain a preset window area of each pixel point in the image to be analyzed;
for any pixel point in an image to be analyzed, determining each gray level change amplitude of the pixel point according to the pixel point and the gray level value of each neighborhood pixel point in a preset window area of the pixel point; the neighborhood pixel points are other pixel points except the center point in the preset window area;
taking any one direction of the eight neighborhood directions as a target direction, and taking the pixel point in the target direction of the pixel point as an initial pixel point to be extended; calculating the difference between any gray scale variation amplitude value of the pixel point and each gray scale variation amplitude value of any initial pixel point to be expanded, determining the minimum difference, and taking the sum of the minimum differences corresponding to the pixel point as the expansion degree of the pixel point in the target direction of the pixel point.
Further, the determining each gray scale variation amplitude of the pixel according to the pixel and the gray scale value of each neighboring pixel in the preset window area of the pixel includes:
For any one neighborhood pixel point in a preset window area, calculating the gray value difference between the pixel point and the neighborhood pixel point; determining an included angle between a connecting line between the pixel point and the neighborhood pixel point and the horizontal direction, and further calculating a cosine value of the included angle; taking the product of the gray value difference and the cosine value of the included angle as the gray change amplitude of the pixel point.
Further, analyzing the extension stability according to the extension degree of each pixel point in the eight-neighborhood direction of the gear pixel point and each corresponding mapping pixel point, and determining each extension pixel point in the eight-neighborhood direction of the gear pixel point includes:
Respectively counting the number of the initial pixel points to be extended in the target direction of the gear pixel points and the number of the initial pixel points to be extended in the target direction of each mapping pixel point corresponding to the gear pixel points, and determining the minimum number of the initial pixel points to be extended as an extension analysis number; selecting an initial pixel point to be expanded closest to the pixel point to be analyzed as a target pixel point to be expanded in the target direction of the pixel point to be analyzed, wherein the number of the target pixel points to be expanded is the expansion analysis number; the pixel points to be analyzed are gear pixel points or mapping pixel points corresponding to the gear pixel points;
calculating the extension evaluation index of each target pixel point to be extended in the target direction of the gear pixel point; then, in order of increasing distance from the gear pixel point, judging whether each target pixel point's extension evaluation index is larger than a preset extension threshold, stopping as soon as a target pixel point whose index is not larger than the threshold appears; and taking every target pixel point whose extension evaluation index is larger than the threshold as an extension pixel point in the target direction of the gear pixel point.
Further, the calculating the extension evaluation index of each target pixel to be extended in the target direction of the gear pixel includes:
for any one target pixel point to be extended, calculating the accumulated sum of differences between the extension degree of the target pixel point to be extended in the target direction of the gear pixel point and the extension degree of the target pixel point to be extended in the target direction of each mapping pixel point corresponding to the gear pixel point; and carrying out inverse proportion normalization processing on the accumulated sum of the differences to obtain an extension evaluation index of the target pixel point to be extended in the target direction of the gear pixel point.
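The inverse-proportion normalization in the step above is not pinned down in the text; `exp(-x)` is one common choice, and this hedged sketch assumes it (the function name and signature are illustrative only):

```python
import math

def extension_evaluation_index(degree_gear: float, degrees_mapped: list) -> float:
    """Extension evaluation index of one target pixel to be extended:
    accumulate the absolute difference between the gear pixel's extension
    degree and that of each corresponding mapped pixel, then apply an
    inverse-proportion normalization (exp(-x) assumed here; the patent
    does not fix the exact normalization)."""
    total = sum(abs(degree_gear - d) for d in degrees_mapped)
    return math.exp(-total)
```

Under this choice, identical extension degrees across all scale levels give an index of 1, and larger cross-scale differences push the index toward 0, matching the idea that unstable extensions are less effective.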
Further, determining the clustering distance between connected nodes includes:
for any two connected nodes, determining their gray variation degree according to the gray variance of each pixel point over its eight-neighborhood within the two nodes;
determining the distance between the center points of the two connected nodes from the positions of those center points, determining the larger of the two node areas, and taking the ratio of the center-point distance to that maximum area value as the regular distance between the two connected nodes;
counting the number of connected nodes crossed by the line between the two center points, fitting a curve through the two center points and the center points of the crossed nodes, calculating the variance of the slope over all pixel points of the fitted curve, and taking the product of the number of crossed nodes and the slope variance as the connectivity degree between the two connected nodes;
and combining the gray variation degree, the regular distance and the connectivity degree to obtain the clustering distance between the two connected nodes.
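The final combination step is stated only qualitatively. A minimal sketch follows: the regular-distance computation tracks the text, while the product-style combination in `cluster_distance` is an assumption, not the patent's formula.

```python
import numpy as np

def regular_distance(center_a, center_b, area_a: float, area_b: float) -> float:
    """Regular distance between two connected nodes: the distance between
    their center points divided by the larger of the two node areas."""
    d = float(np.hypot(center_a[0] - center_b[0], center_a[1] - center_b[1]))
    return d / max(area_a, area_b)

def cluster_distance(gray_change_a: float, gray_change_b: float,
                     reg_dist: float, connectivity_degree: float) -> float:
    """Clustering distance between two connected nodes. The patent only says
    the three measures are 'combined'; a simple product with the summed
    gray variation degrees is assumed here as one plausible combination."""
    return (gray_change_a + gray_change_b) * reg_dist * (1.0 + connectivity_degree)
```

With this combination, node pairs that are far apart, internally noisy, or separated by many intervening nodes receive a larger clustering distance, which is the qualitative behavior the text describes.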
Further, determining the gray variation degree of the two connected nodes according to the gray variance of each pixel point over its eight neighborhood pixel points includes:
for either connected node, calculating the difference between the gray variance of any one pixel point over its eight neighborhood pixel points and the mean of the gray variances of all pixel points in the node over their eight neighborhood pixel points, and recording it as the gray variance difference; and taking the mean of the gray variance differences of all pixel points in the connected node as the gray variation degree of that connected node.
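The gray variation degree can be sketched directly from the definition above. The per-pixel eight-neighborhood gray variances are assumed to be precomputed, and the "difference" is read here as an absolute difference:

```python
import numpy as np

def gray_variation_degree(variances) -> float:
    """Gray variation degree of one connected node, given the gray variance
    of each of its pixels over that pixel's eight-neighborhood: the mean
    absolute deviation of those variances from their average (the patent's
    'gray variance difference', interpreted as an absolute difference)."""
    variances = np.asarray(variances, dtype=np.float64)
    return float(np.mean(np.abs(variances - variances.mean())))
```

A node whose pixels all share the same local variance scores 0; scattered local variances, as in a damaged region, score higher.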
Further, the judging whether the surface of the gear to be detected has a damage defect according to the image features of the connected graph in each optimal cluster includes:
Determining the damage degree of each optimal cluster according to the gray variance, perimeter and area of the connected graph in each optimal cluster; if the damage degree of any one of the optimal clusters is larger than a preset damage threshold, judging that the surface of the gear to be detected has damage defects, otherwise, judging that the surface of the gear to be detected has no damage defects.
Further, the determining the damage degree of each optimal cluster according to the gray variance, perimeter and area of the connected graph in each optimal cluster includes:
For any one optimal cluster, calculating the ratio of the perimeter to the area of the connected graph in the cluster, then calculating the product of this ratio and the gray variance, recorded as the first product; and normalizing the first product of the connected graph in the optimal cluster, taking the normalized value as the damage degree of that cluster.
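A minimal sketch of the damage-degree computation and the threshold check; the patent fixes neither the normalization function nor the threshold value, so `1 - exp(-x)` and the default threshold below are assumptions:

```python
import math

def damage_degree(perimeter: float, area: float, gray_variance: float) -> float:
    """Damage degree of one optimal cluster: (perimeter / area) times the
    gray variance of its connected graph, normalized into [0, 1).
    The normalization 1 - exp(-x) is an assumed choice, so elongated,
    high-variance connected graphs score closer to 1."""
    first_product = (perimeter / area) * gray_variance
    return 1.0 - math.exp(-first_product)

def has_damage(clusters, threshold: float = 0.5) -> bool:
    """Flag the gear surface as damaged if any cluster's damage degree
    exceeds the (assumed) preset damage threshold.
    `clusters` holds (perimeter, area, gray_variance) triples."""
    return any(damage_degree(p, a, v) > threshold for p, a, v in clusters)
```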
Further, the obtaining downsampled images of a preset number of scale levels corresponding to the gear gray image includes:
Setting a preset number of scale levels, and performing downsampling processing on the gear gray image with a Gaussian pyramid to obtain the downsampled images at the preset number of scale levels corresponding to the gear gray image.
The invention has the following beneficial effects:
The invention provides a gear surface damage detection method based on computer vision. When clustering the acquired image with a connected-graph dynamic-splitting clustering algorithm, the extension degree of each pixel in the eight-neighborhood directions of every pixel point is determined by analyzing the consistency of the gray distribution around each pixel of the gear gray image across different scale levels, which improves the numerical accuracy of the extension degree and provides data support for the subsequent screening of extension pixel points. To judge extension effectiveness, extension stability is analyzed by combining the extension degrees of each gear pixel point and its corresponding mapped pixel points, and the extension pixel points are determined accordingly, preventing unstable regions from being merged into one region and degrading the subsequent accuracy of gear damage detection. Using the extended region as a connected node helps expose the structural distribution characteristics of damaged regions, makes the gray information in the subsequently constructed connected graph more regular, and avoids errors in the subsequent gear damage detection. The clustering distance determined between connected nodes better matches the structural distribution of the whole image, so the cluster classification is clearer and the subsequent damage detection is more convenient.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for detecting damage to a gear surface based on computer vision according to an embodiment of the present invention;
Fig. 2 is an exemplary diagram of each preset window area in the same direction in the embodiment of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the present invention to achieve the preset purpose, the following detailed description is given below of the specific implementation, structure, features and effects of the technical solution according to the present invention with reference to the accompanying drawings and preferred embodiments. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The embodiment of the invention carries out cluster classification on collected images of the surface of a gear to be detected through a connected graph dynamic splitting clustering algorithm, judges the damage condition of the gear to be detected according to the image characteristics of the connected graph in the cluster, and particularly provides a gear surface damage detection method based on computer vision, which is shown in fig. 1 and comprises the following steps:
s1, acquiring gear gray level images of the surface of a gear to be detected, and further acquiring downsampled images of a preset number of scale levels corresponding to the gear gray level images.
And firstly, acquiring a gear gray image of the surface of the gear to be detected.
In this embodiment, an image of the surface of the gear to be detected is captured by an industrial camera; the capture angle may be a front view. After the image is obtained, and in order to facilitate the subsequent analysis of gray feature information for damage detection, the image is converted to grayscale, yielding the gear gray image. The grayscale conversion itself is prior art, is not within the scope of the invention, and is not described in detail here.
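As an illustration of the graying step (the patent treats it as prior art and does not fix the method; the ITU-R BT.601 luminance weights below are an assumption):

```python
import numpy as np

def to_gray(rgb_image: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image to an H x W grayscale image using
    the BT.601 luminance weights (one common, assumed choice)."""
    weights = np.array([0.299, 0.587, 0.114])
    gray = rgb_image[..., :3].astype(np.float64) @ weights
    return np.clip(np.rint(gray), 0, 255).astype(np.uint8)
```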
And secondly, obtaining downsampled images of a preset number of scale levels corresponding to the gear gray level images.
When the gear surface is undamaged, the gray distribution of the gear image is uniform overall, and stable regions can be obtained from the gray-distribution uniformity of the pixel points; when the surface is damaged, that uniformity is destroyed, so the similarity of the gray-change distribution can be analyzed during the subsequent region division, with gray-change similarity as the criterion for the division.
When dividing regions, the division can be corrected by checking the consistency of the image information across different scales; this avoids unstable divisions caused by noise points or tiny gray differences in the metal, which would otherwise affect the accuracy of gear damage detection. Before the division can be corrected in this way, images at different scales must be acquired.
In this embodiment, a preset number of scale levels is set and the gear gray image is downsampled with a Gaussian pyramid, giving the downsampled images at the preset number of scale levels corresponding to the gear gray image. The number of scale levels may be set to 4, and the implementer may set it according to the specific actual implementation. The Gaussian-pyramid downsampling process is prior art, is not within the scope of the invention, and is not described in detail.
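The Gaussian-pyramid downsampling could be sketched as follows, using the classic 5-tap binomial kernel and edge padding (both the kernel and the border handling are assumptions; the embodiment's 4 scale levels are kept as the default):

```python
import numpy as np

def gaussian_downsample(img: np.ndarray) -> np.ndarray:
    """One Gaussian-pyramid level: 5-tap binomial blur (separable), then
    drop every other row and column (the classic REDUCE step)."""
    k = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    padded = np.pad(img.astype(np.float64), 2, mode="edge")
    # separable convolution: rows first, then columns
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, blurred)
    return blurred[::2, ::2]

def build_pyramid(gray: np.ndarray, levels: int = 4) -> list:
    """Gear gray image plus `levels` successively downsampled images."""
    pyramid = [gray.astype(np.float64)]
    for _ in range(levels):
        pyramid.append(gaussian_downsample(pyramid[-1]))
    return pyramid
```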
It should be noted that, in order to ensure that each pixel point in the gear gray image has its corresponding mapping pixel point in the image under different scales, so as to facilitate the subsequent analysis of gray change consistency between the pixel point and each corresponding mapping pixel point, downsampling processing needs to be performed on the gear gray image.
Thus far, the present embodiment obtains a gear gray image and each downsampled image of the gear surface to be detected.
S2, determining the extension degree of each pixel point in the eight neighborhood direction of each pixel point according to the gear gray level image and the gray level value of each pixel point in each downsampled image.
It should be noted that the subsequent region division requires analyzing the gray extension of pixel points in different directions. Based on the gray changes between each pixel point and the different pixels around it, and by analyzing the gray-distribution similarity between the preset window region of each pixel point and the preset window regions of the pixels in its eight-neighborhood directions, the possibility that each pixel in the eight-neighborhood directions is extended (the extension degree) can be determined.
In this embodiment, the extension degree of each pixel point in the eight-neighborhood directions of every pixel point must be determined for both the gear gray image and each downsampled image; the calculation is identical for all of these images. Taking any one of them, referred to as the image to be analyzed, determining the extension degree of each pixel point in the eight-neighborhood directions of every pixel point in the image to be analyzed may include:
The first step, a window area with a preset size is constructed by taking each pixel point in the image to be analyzed as a center point, and the window area with the preset size is used as the preset window area, so that the preset window area of each pixel point in the image to be analyzed is obtained.
In order to improve the accuracy of the gray-change similarity analysis, a local area around each pixel point is taken as the analysis object. In this embodiment, the preset size of the window area may be set to a fixed small value; of course, the practitioner may set the size of the window area according to the specific practical situation.
It should be noted that, for the pixel points near the image boundary, the preset window area of the pixel point may be determined through the zero padding operation. In addition, when the gear surface image is acquired, the pixels near the image boundary are generally pixels in the background area, which will not adversely affect the accuracy of the final gear damage detection result, so that the gray information of the pixels in the part need not be considered excessively.
And secondly, for any pixel point in the image to be analyzed, determining each gray level change amplitude value of the pixel point according to the pixel point and gray level values of each neighborhood pixel point in a preset window area of the pixel point.
In this embodiment, for any one of the neighboring pixel points in the preset window area, the gray value difference between the pixel point and the neighboring pixel point is calculated; determining an included angle between a connecting line between the pixel point and the neighborhood pixel point and the horizontal direction, and further calculating a cosine value of the included angle; taking the product of the gray value difference and the cosine value of the included angle as the gray change amplitude of the pixel point.
The neighborhood pixel points are the pixel points other than the center point within the preset window area; the gray value difference may be the absolute value of the difference between the gray values of the two pixel points. The gray change amplitude characterizes the degree of gray change between the pixel point and a pixel in a neighborhood direction: the larger the amplitude, the larger the degree of change. Computing these amplitudes provides data support for the subsequent analysis of distribution consistency between two window areas in the same direction and for quantifying the extension degree.
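A sketch of the gray change amplitudes of one pixel, assuming a 3x3 preset window (the window size is not fixed in this excerpt) and the zero padding at the image border described above:

```python
import numpy as np

def gray_change_amplitudes(img: np.ndarray, y: int, x: int) -> np.ndarray:
    """Gray change amplitudes of pixel (y, x) against the 8 neighbours of
    its (assumed) 3x3 preset window: |gray difference| times the cosine of
    the angle between the connecting line and the horizontal direction."""
    padded = np.pad(img.astype(np.float64), 1)  # zero padding near borders
    cy, cx = y + 1, x + 1
    amps = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue  # skip the center point itself
            diff = abs(padded[cy, cx] - padded[cy + dy, cx + dx])
            cos_angle = abs(dx) / np.hypot(dx, dy)
            amps.append(diff * cos_angle)
    return np.array(amps)
```

Note that, under this reading, purely vertical neighbours contribute zero amplitude (cosine of 90 degrees), which follows directly from the formula as stated.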
Thirdly, taking any one direction of the eight neighborhood directions as a target direction, and taking the pixel point in the target direction of the pixel point as an initial pixel point to be extended; calculating the difference between any gray scale variation amplitude value of the pixel point and each gray scale variation amplitude value of any initial pixel point to be expanded, determining the minimum difference, and taking the sum of the minimum differences corresponding to the pixel point as the expansion degree of the pixel point in the target direction of the pixel point.
In this embodiment, the eight neighborhood directions may be 0, 45, 90, 135, 180, 225, 270 and 315 degrees. For example, with 45 degrees as the target direction, all pixel points in the 45-degree direction of a pixel point in the image to be analyzed are taken as its initial pixel points to be extended. To measure the gray-change similarity between the window area of the pixel point and the window area of an initial pixel point to be extended, each gray change amplitude obtained by the pixel point with a single neighborhood pixel in its window area is compared against the gray change amplitudes obtained by the initial pixel point to be extended with each neighborhood pixel in its own window area, and from these differences the extension degree of the initial pixel point to be extended in the target direction is determined. The minimum difference is taken so that the gray-change similarity of the two same-direction window areas is analyzed under their most similar pairing, in order to decide the extension relation between the center points of the two windows.
As an example, the extension degree of the s-th initial pixel to be extended in the target direction of the i-th pixel in the image to be analyzed may be calculated as:

$$E_{i,s} = \sum_{a=1}^{N} \min_{1 \le b \le M} \Bigl| \cos\theta_{i,a} \, \bigl| g_i - g_{i,a} \bigr| - \cos\theta_{s,b} \, \bigl| g_s - g_{s,b} \bigr| \Bigr|$$

where $E_{i,s}$ is the extension degree of the s-th initial pixel to be extended in the target direction of the i-th pixel in the image to be analyzed; $a$ indexes the neighborhood pixels in the preset window area of the i-th pixel and $N$ is their number ($N$ may equal 8); $b$ indexes the neighborhood pixels in the preset window area of the s-th initial pixel to be extended and $M$ is their number; $\min$ is the minimum function and $|\cdot|$ the absolute value; $\cos\theta_{i,a}$ is the cosine of the angle between the horizontal direction and the line joining the i-th pixel to its a-th neighborhood pixel, and $|g_i - g_{i,a}|$ is the gray difference between them, so $\cos\theta_{i,a} \, |g_i - g_{i,a}|$ is a gray change amplitude of the i-th pixel; $\cos\theta_{s,b}$ and $|g_s - g_{s,b}|$ are defined analogously for the s-th initial pixel to be extended, giving its gray change amplitudes.
In this formula, the inner term $\min_b \bigl| \cos\theta_{i,a}|g_i - g_{i,a}| - \cos\theta_{s,b}|g_s - g_{s,b}| \bigr|$ captures the case in which the gray-change difference in the target direction is smallest, and is used to assess the gray-change similarity of the two preset window areas. An example diagram of preset window areas in the same direction is shown in Fig. 2, where one box represents a pixel point and the black boxes mark the pixel point and two initial pixels to be extended in the 45-degree direction. The accumulated minimum differences characterize the gray-distribution consistency of the two preset window areas: the smaller the accumulated value, the higher the consistency.
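Given the amplitude vectors of a pixel and of one initial pixel to be extended, the extension degree defined above reduces to a sum of per-amplitude minimum differences:

```python
import numpy as np

def extension_degree(amps_i: np.ndarray, amps_s: np.ndarray) -> float:
    """Extension degree of candidate pixel s in a given direction of pixel i:
    for each gray change amplitude of i, take the smallest absolute
    difference to any amplitude of s, and sum those minima
    (smaller = more similar gray-change patterns)."""
    diffs = np.abs(amps_i[:, None] - amps_s[None, :])  # N x M difference table
    return float(diffs.min(axis=1).sum())
```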
It should be noted that, referring to the calculation process of the extension degree of an initial pixel point to be extended of any one pixel point in the image to be analyzed in any one direction, the extension degree of each pixel point in the eight neighborhood directions of every pixel point in the image to be analyzed can be obtained; the description is not repeated here.
Thus, this embodiment obtains the extension degree of each pixel point in the eight neighborhood directions of every pixel point in the gear gray image and in each downsampled image; determining the extension degree helps to screen out extension pixel points that meet the conditions.
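The per-pixel computation of S2 can be sketched as follows. This is a minimal illustration in Python with NumPy, not the patent's implementation: the 3x3 window size, the border handling (out-of-range neighbors are skipped) and all function names are assumptions.

```python
import numpy as np

def gray_amplitudes(img, y, x, win=3):
    """Gray scale variation amplitudes of pixel (y, x): for each neighborhood
    pixel in the preset window, the gray difference multiplied by the cosine
    of the angle between the connecting line and the horizontal direction."""
    r = win // 2
    amps = []
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue                      # center point is not a neighbor
            ny, nx = y + dy, x + dx
            if not (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]):
                continue                      # skip out-of-image neighbors
            diff = float(img[y, x]) - float(img[ny, nx])
            cos_a = dx / np.hypot(dx, dy)     # cosine w.r.t. the horizontal
            amps.append(cos_a * diff)
    return amps

def extension_degree(img, pixel, candidate, win=3):
    """Extension degree: for each amplitude of the pixel, the minimum absolute
    difference to any amplitude of the initial pixel to be extended, summed."""
    a = gray_amplitudes(img, *pixel, win)
    b = gray_amplitudes(img, *candidate, win)
    return sum(min(abs(u - v) for v in b) for u in a)
```

On a region of uniform gray the amplitudes all vanish, so the extension degree between two such pixels is zero, matching the intended reading that a small value means highly consistent gray distributions.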
S3, determining each extension pixel point in the eight neighborhood direction of each pixel point in the gear gray level image according to the extension degree of each pixel point in the gear gray level image and each pixel point in the eight neighborhood direction of the mapping pixel point in each downsampled image.
It should be noted that, by analyzing the stability of the extension degree of each pixel point in the gear gray image across the downsampled images, an extension evaluation index of each pixel point in the eight neighborhood directions of every pixel point in the gear gray image is quantified. The extension evaluation index characterizes the extension effectiveness of a pixel point: the greater the extension difference between a pixel point in the gear gray image and its corresponding mapping pixel points, the worse the extension uniformity and the lower the extension effectiveness. Pixel points with low extension effectiveness should be excluded as extension pixel points, so as to avoid dividing unstable regions into the same region and affecting the subsequent gear damage detection result.
Firstly, each pixel point in the gear gray image is mapped and matched with the pixel points in the different downsampled images through the mapping relation of the Gaussian pyramid, so that the mapping pixel point of each pixel point of the gear gray image in each downsampled image is obtained; each pixel point in the gear gray image has a unique corresponding mapping pixel point in each downsampled image, so that each mapping pixel point corresponding to each pixel point in the gear gray image is obtained. The implementation process of determining the mapping pixel points is the prior art and will not be described in detail here.
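A minimal sketch of the pyramid mapping follows (Python/NumPy). It is illustrative only: the 2x2 box average stands in for a full Gaussian smoothing kernel, and the factor-of-2 coordinate mapping is an assumption consistent with a standard Gaussian pyramid, not a detail fixed by the patent.

```python
import numpy as np

def downsample(img):
    """One pyramid level: simple 2x2 box smoothing followed by taking every
    other pixel (a stand-in for the Gaussian kernel of a real pyramid)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w].astype(float)
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def mapping_pixel(coord, level):
    """Unique mapping pixel of a gear-gray-image coordinate at a given
    pyramid level: coordinates halve once per level."""
    y, x = coord
    return (y // 2 ** level, x // 2 ** level)
```

Each pixel of the full-resolution image thus lands on exactly one coordinate per downsampled image, which is the uniqueness property the text relies on.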
Secondly, for convenience of description and analysis, any pixel point in the gear gray image is called a gear pixel point, and the mapping pixel point of the gear pixel point in each downsampled image is determined. Taking each extension pixel point in the target direction of the gear pixel point as an example, the specific implementation steps may include:
In the first step, the number of initial pixel points to be extended in the target direction of the gear pixel point, and the number of initial pixel points to be extended in the target direction of each mapping pixel point corresponding to the gear pixel point, are counted respectively, and the minimum of these numbers is determined as the extension analysis number; in the target direction of each pixel point to be analyzed, the initial pixel points to be extended closest to the pixel point to be analyzed are selected as the target pixel points to be extended, the number of target pixel points to be extended being the extension analysis number.
In this embodiment, the determination of the target pixel points to be extended proceeds identically for the gear pixel point and for each of its corresponding mapping pixel points; to reduce unnecessary description, the gear pixel point or any one of its corresponding mapping pixel points is referred to as a pixel point to be analyzed. The extension analysis number is determined to handle the situation in which the number of initial pixel points to be extended of the gear pixel point differs from that of a mapping pixel point; determining the target pixel points to be extended applies a preliminary screening to the initial pixel points to be extended in the eight neighborhood directions of the gear pixel point, which reduces the calculation amount of the subsequent extension stability analysis to a certain extent. The initial pixel points to be extended closest to the pixel point to be analyzed are selected because, when the extension of a gear pixel point is realized, the pixel point is generally extended with the adjacent, connected pixel points around it.
In the second step, the extension evaluation index of each target pixel point to be extended in the target direction of the gear pixel point is calculated; then, in order of increasing distance from the gear pixel point, whether the extension evaluation index of each target pixel point to be extended is larger than a preset extension threshold is judged, until a target pixel point to be extended whose extension evaluation index is not larger than the preset extension threshold appears; every target pixel point to be extended whose extension evaluation index is larger than the preset extension threshold is taken as an extension pixel point in the target direction of the gear pixel point.
In this embodiment, it is first judged whether the target pixel point to be extended closest to the gear pixel point in the target direction is an extension pixel point: if its extension evaluation index is greater than the preset extension threshold, it is taken as an extension pixel point in the target direction of the gear pixel point. The same judgment is then made for the second-closest target pixel point to be extended, and so on, until a target pixel point to be extended whose extension evaluation index is not greater than the preset extension threshold appears, at which point the judging stops and each extension pixel point in the target direction of the gear pixel point is obtained. The preset extension threshold may be set to 0.68; the practitioner can set its magnitude according to the specific practical situation and historical experience values.
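The sequential screening in this step can be sketched as follows (Python). The data layout — a distance-sorted candidate list and a dict of evaluation indexes — is a hypothetical choice for illustration.

```python
def select_extension_pixels(candidates, eval_index, threshold=0.68):
    """Walk the target pixels to be extended in order of increasing distance
    from the gear pixel; keep each one whose extension evaluation index
    exceeds the threshold, and stop at the first one that does not."""
    selected = []
    for pixel in candidates:          # assumed pre-sorted by distance
        if eval_index[pixel] > threshold:
            selected.append(pixel)
        else:
            break                     # judging stops at the first failure
    return selected
```

Note that the loop stops at the first failing candidate rather than skipping it, so a far-away candidate with a high index is never accepted across a gap — matching the "until ... appears" wording.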
It should be noted that, referring to the determination process of each extended pixel point in the target direction of the gear pixel point, each extended pixel point in the eight neighborhood direction of the gear pixel point may be obtained. After each extension pixel point in the eight-neighborhood direction of the gear pixel point is obtained, determining the next gear pixel point in the gear gray level image, further determining each extension pixel point in the eight-neighborhood direction of the next gear pixel point, and continuously iterating until all the pixel points in the gear gray level image are traversed.
It should be noted that the next gear pixel point cannot be the already determined extension pixel point and the gear pixel point for which the extension pixel point has been determined.
The step of calculating the extension evaluation index of each target pixel to be extended in the target direction of the gear pixel may include:
For any one target pixel point to be extended, calculating the accumulated sum of differences between the extension degree of the target pixel point to be extended in the target direction of the gear pixel point and the extension degree of the target pixel point to be extended in the target direction of each mapping pixel point corresponding to the gear pixel point; and carrying out inverse proportion normalization processing on the accumulated sum of the differences to obtain an extension evaluation index of the target pixel point to be extended in the target direction of the gear pixel point.
As an example, the calculation formula of the extension evaluation index of the x-th target pixel point to be extended in the target direction of the gear pixel point is:

$$Y_x=\exp\left(-\sum_{z=1}^{Z}\left|K_x-K_x^{z}\right|\right)$$

where $Y_x$ is the extension evaluation index of the $x$-th target pixel point to be extended in the target direction of the gear pixel point; $\exp$ is the exponential function with the natural constant as base; $x$ is the serial number of the target pixel point to be extended; $z$ is the serial number of a mapping pixel point corresponding to the gear pixel point, and $Z$ is the number of mapping pixel points corresponding to the gear pixel point; $K_x$ is the extension degree of the $x$-th target pixel point to be extended in the target direction of the gear pixel point; $K_x^{z}$ is the extension degree of the $x$-th target pixel point to be extended in the target direction of the $z$-th mapping pixel point of the gear pixel point; $|\cdot|$ is the absolute value function.
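Under the exp-based inverse-proportion normalization of the accumulated difference, the index computation might be sketched as follows (Python; function and argument names are illustrative, not from the patent):

```python
import math

def extension_evaluation_index(degree_gear, degrees_mapped):
    """Inverse-proportion normalization of the accumulated extension-degree
    difference between the gear pixel and its mapping pixels: exp(-sum|..|).
    Identical degrees across all scales give the maximum index of 1."""
    diff_sum = sum(abs(degree_gear - d) for d in degrees_mapped)
    return math.exp(-diff_sum)
```

The index lies in (0, 1]; the more the extension degree drifts across the downsampled images, the closer it falls to 0, which is what makes the 0.68 threshold a stability test.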
Thus, each extension pixel point in the eight neighborhood directions of every gear pixel point in the gear gray image is obtained. Determining the extension pixel points benefits the subsequent region division of the gear gray image into regions with similar, stable gray changes, which helps to improve the accuracy of the clustering result of region-based image clustering.
S4, taking an area formed by connecting the gear pixel points with each extension pixel point in the eight neighborhood direction as a connected node, and further obtaining each connected node; determining the clustering distance between each connected node, and constructing a connected graph according to the clustering distance to further obtain each optimal cluster.
In the first step, the area formed by connecting the gear pixel points with all the extension pixel points in the eight neighborhood direction is taken as a connected node, so that each connected node is obtained.
In this embodiment, the connected nodes refer to the extension-completed areas in the gear gray image, each of which is obtained by analyzing the consistency of the gray distribution of the neighborhood around the pixel points under different downsampled images. Taking each extension-completed area as node data for the subsequent connected graph construction can reflect the structural change of the connected graph to a certain extent, and avoids the situation in which chaotic connected graph information makes the subsequent cluster classification unclear and affects the subsequent gear damage detection.
And secondly, determining the clustering distance between each connected node, constructing a connected graph according to the clustering distance, and further obtaining each optimal cluster.
A first sub-step of determining a cluster distance between each connected node.
When the connected graph is constructed with areas as nodes, each area has a certain extent, and the size of an area affects how faithfully the clustering distance reflects connectivity. For example, the Euclidean distance between two adjacent areas with larger extents may exceed the Euclidean distance between two non-adjacent areas with smaller extents; the two adjacent larger areas have stronger connectivity but a larger Euclidean distance, so they might not be connected when the connected graph is constructed. Therefore, when determining the clustering distance between two connected nodes, the gray distribution similarity between the two nodes and the overall positional relation of the two nodes are analyzed, and the clustering distance is quantified so that it better conforms to the structural distribution characteristics of the whole image, making the cluster classification clearer and facilitating the subsequent gear surface damage detection.
In this embodiment, for any two connected nodes, determining a clustering distance between two connected nodes may include:
firstly, according to the gray variance of each pixel point in two connected nodes and eight neighborhood pixel points, determining the gray variation degree of the two connected nodes.
In this embodiment, for any one connected node, the gray variance of each pixel point is calculated over the pixel point and its eight neighborhood pixel points; the absolute difference between this gray variance and the average of the gray variances of all pixel points in the connected node is recorded as the gray variance difference of the pixel point. The average of the gray variance differences of all pixel points in the connected node is taken as the gray variation degree of the connected node, which characterizes the gray change of the connected node as a whole.
As an example, the calculation formula of the gray variation degree of the v-th connected node is:

$$H_v=\frac{1}{F_v}\sum_{c=1}^{F_v}\left|\sigma_{v,c}-\bar\sigma_v\right|$$

where $H_v$ is the gray variation degree of the $v$-th connected node; $F_v$ is the number of pixel points in the $v$-th connected node and $c$ is the serial number of a pixel point in the $v$-th connected node; $\sigma_{v,c}$ is the gray variance of the $c$-th pixel point and its eight neighborhood pixel points in the $v$-th connected node; $\bar\sigma_v$ is the average of the gray variances of all pixel points in the $v$-th connected node and their eight neighborhood pixel points; $\left|\sigma_{v,c}-\bar\sigma_v\right|$ is the gray variance difference of the $c$-th pixel point in the $v$-th connected node; $|\cdot|$ is the absolute value function.
In the calculation formula of the gray variation degree, the gray variance of a pixel point characterizes the stability of the gray change over the eight neighborhood pixel points around it: the smaller the gray variance, the more stable the gray change of the region formed by the pixel point and its eight surrounding neighborhood pixel points. The larger the gray variance difference, the greater the degree of gray change within the connected node, that is, the greater the gray variation degree of the extension area.
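A hedged sketch of the gray variation degree of one connected node (Python/NumPy). Clipping the 3x3 patch at image borders is an assumption; the patent does not specify border handling.

```python
import numpy as np

def gray_variation_degree(img, pixels):
    """Gray variation degree of a connected node: mean absolute deviation of
    each pixel's 8-neighborhood gray variance from the node-wide average."""
    variances = []
    for y, x in pixels:
        # variance over the pixel and its eight neighbors (clipped at borders)
        patch = img[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].astype(float)
        variances.append(patch.var())
    variances = np.array(variances)
    return float(np.mean(np.abs(variances - variances.mean())))
```

On a node of uniform gray every patch variance is zero, so the degree is zero — the stable extreme of the quantity described above.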
In the second step, the distance between the center points of the two connected nodes is determined according to the positions of the center points, the maximum area value corresponding to the two connected nodes is determined, and the ratio of the distance between the center points to the maximum area value is taken as the regular distance between the two connected nodes.
In this embodiment, the center points of the two connected nodes are determined first, and then the Euclidean distance between the two center points is calculated from their positions in the gear gray image. The maximum area value corresponding to the two connected nodes is determined to eliminate the influence of area on the clustering distance: the Euclidean distance is corrected by the maximum area value, and the larger the maximum area value, the greater the correction of the Euclidean distance and the smaller the regular distance. The area of a connected node may be the number of pixel points in the connected node.
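The regular distance can be sketched as follows (Python; `math.dist` computes the Euclidean distance between the center points, and the area is taken as the pixel count as suggested above — function and argument names are illustrative):

```python
import math

def regular_distance(center_a, center_b, area_a, area_b):
    """Euclidean distance between node center points, corrected by the larger
    of the two node areas (area = pixel count), to offset the bias that large
    adjacent regions have large center-to-center distances."""
    dist = math.dist(center_a, center_b)
    return dist / max(area_a, area_b)
```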
Then, the number of connected nodes through which the line between the center points of the two connected nodes passes is counted; curve fitting is performed on the center points of the two connected nodes and the center points of the connected nodes crossed by that line to obtain a fitted curve; the slope variance of the pixel points on the fitted curve is calculated, and the product of the number of connected nodes and the slope variance is taken as the communication degree between the two connected nodes.
In this embodiment, the least square method is used to perform curve fitting on the center points of the two connected nodes and the center points of the connected nodes through which the line between them passes; the implementation of the least square method is the prior art, is not within the protection scope of the present invention, and will not be described in detail. For each pixel point on the fitted curve, the horizontal axis gives its abscissa value and the vertical axis its ordinate value.
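A sketch of the communication degree (Python/NumPy). The quadratic `np.polyfit` model and the 50 sample points on the fitted curve are assumptions — the patent only specifies a least-squares fit and the slope variance of points on the curve.

```python
import numpy as np

def communication_degree(centers_on_line):
    """Communication degree between two nodes: number of intermediate
    connected nodes on the center line, times the slope variance of a
    least-squares curve fitted through all the center points."""
    pts = np.asarray(centers_on_line, dtype=float)  # endpoints + intermediates
    n_intermediate = len(pts) - 2
    if n_intermediate <= 0:
        return 0.0                    # no node crossed: strongest connectivity
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], deg=2)   # assumed quadratic fit
    xs = np.linspace(pts[:, 0].min(), pts[:, 0].max(), 50)
    slopes = np.polyval(np.polyder(coeffs), xs)        # slope at sampled points
    return n_intermediate * float(slopes.var())
```

When the crossed center points lie on a straight line the fitted slope is constant, so the slope variance and the communication degree vanish, matching the reading that a scattered distribution of center points signals unreliable connectivity.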
Finally, the gray variation degree, the regular distance and the communication degree of the two connected nodes are combined to obtain the clustering distance between the two connected nodes.
In this embodiment, the calculation formula of the clustering distance between the $v$-th connected node and the $\bar v$-th connected node may be:

$$D_{v,\bar v}=\operatorname{norm}\!\left(\left|H_v-H_{\bar v}\right|+\frac{d_{v,\bar v}}{\max\left(S_v,S_{\bar v}\right)}+n_{v,\bar v}\cdot\delta_{v,\bar v}\right)$$

where $D_{v,\bar v}$ is the clustering distance between the $v$-th and $\bar v$-th connected nodes; $\operatorname{norm}$ is a linear normalization function; $H_v$ and $H_{\bar v}$ are the gray variation degrees of the $v$-th and $\bar v$-th connected nodes; $|\cdot|$ is the absolute value function; $d_{v,\bar v}$ is the Euclidean distance between the center points of the $v$-th and $\bar v$-th connected nodes; $S_v$ and $S_{\bar v}$ are the areas of the $v$-th and $\bar v$-th connected nodes, so that $\max(S_v,S_{\bar v})$ is the maximum area value corresponding to the two connected nodes and $d_{v,\bar v}/\max(S_v,S_{\bar v})$ is the regular distance between them; $n_{v,\bar v}$ is the number of connected nodes through which the line between the center points of the two connected nodes passes; and $\delta_{v,\bar v}$ is the slope variance of the pixel points on the curve fitted through the center points of the two connected nodes and the center points of the connected nodes crossed by the line between them.
In the calculation formula of the clustering distance, the larger the difference between the gray variation degrees of the two connected nodes, the larger the difference of their gray distributions and the larger the clustering distance between them. The regular distance characterizes the proportion of the Euclidean distance between the two center points relative to the maximum area value, and is used to overcome the deviation in the clustering distance caused by over-large areas: the larger the maximum area value, the smaller the regular distance. The product of the number of crossed connected nodes and the slope variance characterizes the communication degree between the two connected nodes: the larger it is, the worse the connectivity, the larger the clustering distance and the smaller the probability that the two connected nodes belong to the same cluster. The slope variance characterizes the discreteness of the distribution of the center points of the other connected nodes on the fitted curve; the larger the discreteness, the worse the connection reliability between the two connected nodes and the larger the clustering distance.
It should be noted that, referring to the calculation process of the clustering distance between any two connected nodes, the clustering distance between all the connected nodes combined in pairs may be obtained. The clustering distance obtained through multiple angle analysis and calculation can overcome the defect that the clustering distance calculation has deviation when the area is node data, and the accuracy of dividing each optimal clustering cluster obtained later is improved. Of course, other ways of determining the cluster distance between two connected nodes may be used.
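Before the linear normalization over all node pairs, the combination of the three quantities might be sketched as follows (Python). The additive combination and the dictionary node layout are assumptions for illustration; the patent only says the three quantities are combined.

```python
import math

def clustering_distance_raw(node_a, node_b, communication):
    """Un-normalized clustering distance between two connected nodes:
    gray-variation-degree difference + area-regularized center distance +
    communication degree. Linear normalization over all node pairs is
    applied afterwards to obtain the final clustering distance."""
    gray_term = abs(node_a["gray_var_degree"] - node_b["gray_var_degree"])
    dist = math.dist(node_a["center"], node_b["center"])
    regular = dist / max(node_a["area"], node_b["area"])
    return gray_term + regular + communication
```

An additive rather than multiplicative combination is chosen here so that a communication degree of zero (no node crossed by the center line) does not collapse the whole distance to zero.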
And a second sub-step, constructing a connected graph according to the clustering distance, and further obtaining each optimal cluster.
In this embodiment, a connected graph can be constructed from the clustering distances of all pairs of connected nodes in the gear gray image; the connected graph is then processed with a connected-graph dynamic splitting clustering algorithm to obtain the cluster classification result of the gear gray image, and this result is taken as the optimal result, that is, each optimal cluster is obtained. The implementation process of the connected-graph dynamic splitting clustering algorithm is the prior art, is not within the protection scope of the present invention, and will not be described in detail here.
So far, the embodiment obtains each optimal cluster corresponding to the gear gray level image.
And S5, judging whether the surface of the gear to be detected has damage defects or not according to the image characteristics of the connected graph in each optimal cluster.
In this embodiment, the damage degree of each optimal cluster is quantified by analyzing the distribution condition and the gray level change chaotic degree of the connected graph in each optimal cluster, so as to determine whether the surface of the gear to be detected has damage defects.
And firstly, determining the damage degree of each optimal cluster according to the gray variance, perimeter and area of the connected graph in each optimal cluster.
For any one optimal cluster, the ratio of the perimeter to the area of the connected graph in the optimal cluster is calculated, and the product of this ratio and the gray variance is taken as a first product; the first product of the connected graph in the optimal cluster is normalized, and the normalized value is taken as the damage degree of the optimal cluster.
In this embodiment, referring to the calculation process of the damage degree of any one optimal cluster, the damage degree of each optimal cluster can be obtained. The damage degree characterizes the probability that the optimal cluster corresponds to gear wear: the greater the damage degree, the greater that probability. The larger the ratio of perimeter to area, the more complex the pixel distribution of the connected graph in the optimal cluster and the more likely the gear is damaged; the larger the gray variance, the more chaotic the gray change of the connected graph in the optimal cluster and the greater the possibility of a damage defect on the surface of the gear to be detected.
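A hedged sketch of the damage degree (Python). The `1 - exp(-x)` squashing is an assumed choice of normalization — the patent only states that the first product is normalized to a value comparable against the 0.7 threshold.

```python
import math

def damage_degree(perimeter, area, gray_variance):
    """Damage degree of an optimal cluster: (perimeter / area) * gray variance,
    squashed into [0, 1) with 1 - exp(-x) as an assumed normalization."""
    first_product = (perimeter / area) * gray_variance
    return 1.0 - math.exp(-first_product)
```

A compact, uniform cluster (small perimeter-to-area ratio, small gray variance) thus scores near 0, while a ragged, high-variance cluster approaches 1 and is flagged against the preset damage threshold.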
And secondly, if the damage degree of any one of the optimal clusters is larger than a preset damage threshold, judging that the surface of the gear to be detected has damage defects, otherwise, judging that the surface of the gear to be detected has no damage defects.
In this embodiment, the preset damage threshold may be set to 0.7, and the practitioner can set it according to the specific practical situation; it is not specifically limited here. Of course, when analyzing gear images of the surface of the gear to be detected taken at other shooting angles, the whole damage detection process described above for the front-view image is equally applicable.
This embodiment ends.
The invention provides a gear surface damage detection method based on computer vision. When node data are acquired, the image is divided into different areas according to the consistency of the pixel point distribution and the stability of the divided areas at different scales; these areas are then used as node data for the subsequent cluster division, making the division clearer and facilitating the subsequent detection of damage conditions.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention and are intended to be included within the scope of the invention.

Claims (7)

1. The method for detecting the damage of the gear surface based on the computer vision is characterized by comprising the following steps of:
acquiring gear gray level images of the surface of a gear to be detected, and further acquiring downsampled images of a preset number of scale levels corresponding to the gear gray level images;
According to the gray value of each pixel point in the image to be analyzed, analyzing the gray distribution similarity condition between the preset window area of each pixel point and the preset window area of the pixel point in the eight neighborhood direction, and determining the extension degree of each pixel point in the eight neighborhood direction of each pixel point; the image to be analyzed is a gear gray image or each downsampled image;
Taking any pixel point in the gear gray level image as a gear pixel point, and determining a mapping pixel point of the gear pixel point in each downsampled image; analyzing extension stability according to extension degrees of all the gear pixel points and all the pixel points in the eight-neighborhood direction of each corresponding mapping pixel point, and determining all the extension pixel points in the eight-neighborhood direction of the gear pixel points;
taking an area formed by connecting the gear pixel points with each extension pixel point in the eight neighborhood direction as a connected node, and further obtaining each connected node; determining the clustering distance between each connected node, constructing a connected graph according to the clustering distance, and further obtaining each optimal cluster;
Judging whether the surface of the gear to be detected has damage defects or not according to the image characteristics of the connected graph in each optimal cluster;
According to the gray value of each pixel in the image to be analyzed, the gray distribution similarity condition between the preset window area of each pixel and the preset window area of the pixel in the eight neighborhood direction is analyzed, and the extension degree of each pixel in the eight neighborhood direction of each pixel is determined, which comprises the following steps:
constructing a window area with a preset size by taking each pixel point in the image to be analyzed as a center point, and taking the window area with the preset size as a preset window area to obtain a preset window area of each pixel point in the image to be analyzed;
for any pixel point in an image to be analyzed, determining each gray level change amplitude of the pixel point according to the pixel point and the gray level value of each neighborhood pixel point in a preset window area of the pixel point; the neighborhood pixel points are other pixel points except the center point in the preset window area;
Taking any one direction of the eight neighborhood directions as a target direction, and taking the pixel point in the target direction of the pixel point as an initial pixel point to be extended; calculating the difference between any gray scale variation amplitude value of the pixel point and each gray scale variation amplitude value of any initial pixel point to be extended, determining the minimum difference, and taking the sum of the minimum differences corresponding to the pixel point as the extension degree of the pixel point in the target direction of the pixel point;
According to the extension degree of each pixel point in the eight-neighborhood direction of the gear pixel point and each corresponding mapping pixel point, analyzing extension stability, and determining each extension pixel point in the eight-neighborhood direction of the gear pixel point, wherein the method comprises the following steps:
Respectively counting the number of the initial pixel points to be extended in the target direction of the gear pixel points and the number of the initial pixel points to be extended in the target direction of each mapping pixel point corresponding to the gear pixel points, and determining the minimum number of the initial pixel points to be extended as an extension analysis number; selecting an initial pixel point to be expanded closest to the pixel point to be analyzed as a target pixel point to be expanded in the target direction of the pixel point to be analyzed, wherein the number of the target pixel points to be expanded is the expansion analysis number; the pixel points to be analyzed are gear pixel points or mapping pixel points corresponding to the gear pixel points;
Calculating extension evaluation indexes of all target pixel points to be extended in the target direction of the gear pixel points; according to the distance between the target pixel to be extended and the gear pixel, judging whether the extension evaluation index of the target pixel to be extended is larger than a preset extension threshold value in sequence from small to large until the target pixel to be extended with the extension evaluation index not larger than the preset extension threshold value appears, and taking the target pixel to be extended with the current extension evaluation index larger than the preset extension threshold value as the extension pixel in the target direction of the gear pixel;
calculating the extension evaluation index of each target pixel point to be extended in the target direction of the gear pixel point, including:
for any one target pixel point to be extended, calculating the accumulated sum of differences between the extension degree of the target pixel point to be extended in the target direction of the gear pixel point and the extension degree of the target pixel point to be extended in the target direction of each mapping pixel point corresponding to the gear pixel point; and carrying out inverse proportion normalization processing on the accumulated sum of the differences to obtain an extension evaluation index of the target pixel point to be extended in the target direction of the gear pixel point.
2. The method for detecting gear surface damage based on computer vision according to claim 1, wherein the determining the respective gray scale variation magnitudes of the pixel points according to the pixel points and the gray scale values of the respective neighboring pixel points in the preset window area of the pixel points comprises:
For any one neighborhood pixel point in a preset window area, calculating the gray value difference between the pixel point and the neighborhood pixel point; determining an included angle between a connecting line between the pixel point and the neighborhood pixel point and the horizontal direction, and further calculating a cosine value of the included angle; taking the product of the gray value difference and the cosine value of the included angle as the gray change amplitude of the pixel point.
3. The method for detecting gear surface damage based on computer vision according to claim 1, wherein determining a clustering distance between each connected node comprises:
for any two connected nodes, determining the gray variation degree of the two connected nodes according to the gray variance of each pixel point in the two connected nodes together with its eight-neighborhood pixel points;
determining the distance between the center points of the two connected nodes according to the positions of the center points, determining the maximum of the area values of the two connected nodes, and taking the ratio of the center-point distance to the maximum area value as the regular distance between the two connected nodes;
counting the number of connected nodes crossed by the line connecting the center points of the two connected nodes, performing curve fitting on the center points of the two connected nodes and the center points of the crossed connected nodes to obtain a fitted curve, calculating the slope variance over all pixel points on the fitted curve, and taking the product of the number of crossed connected nodes and the slope variance as the connectivity degree between the two connected nodes;
and combining the gray variation degree, the regular distance and the connectivity degree of the two connected nodes to obtain the clustering distance between the two connected nodes.
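Claim 3 leaves the exact combination rule open; a hypothetical sketch that multiplies the three quantities (offsetting the connectivity degree by 1 so that zero connectivity does not zero the distance, and averaging the two nodes' gray variation degrees) might look like:

```python
import numpy as np

def clustering_distance(gray_var_deg_a, gray_var_deg_b,
                        center_a, center_b, area_a, area_b,
                        n_crossed, slopes_on_fit):
    """Clustering distance between two connected nodes (claim 3 sketch).

    The product-style combination and the averaging of the two nodes'
    gray variation degrees are assumptions; the claim only states that
    the three quantities are combined.
    """
    # regular distance: center-point distance over the larger node area
    center_dist = float(np.hypot(center_a[0] - center_b[0],
                                 center_a[1] - center_b[1]))
    regular_dist = center_dist / max(area_a, area_b)
    # connectivity degree: crossed-node count times the slope variance
    # of the curve fitted through the center points
    connectivity = n_crossed * float(np.var(slopes_on_fit))
    # pairwise gray variation degree (assumed: mean of the two nodes)
    gray_variation = 0.5 * (gray_var_deg_a + gray_var_deg_b)
    return gray_variation * regular_dist * (1.0 + connectivity)
```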
4. The method for detecting gear surface damage based on computer vision according to claim 3, wherein determining the gray variation degree of the two connected nodes according to the gray variance of each pixel point in the two connected nodes together with its eight-neighborhood pixel points comprises:
for any one of the connected nodes, calculating the difference between the gray variance of any one pixel point together with its eight-neighborhood pixel points and the average value of the gray variances of all pixel points together with their eight-neighborhood pixel points, and recording this difference as the gray variance difference; and taking the average value of the gray variance differences of the pixel points in the connected node as the gray variation degree of the connected node.
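A minimal sketch of the gray variation degree of claim 4; taking the absolute value of each gray variance difference is an assumption, so that positive and negative deviations do not cancel in the average:

```python
import numpy as np

def gray_variation_degree(pixel_variances):
    """Gray variation degree of one connected node (claim 4 sketch).

    pixel_variances[i] is the gray variance of pixel i together with its
    eight-neighborhood pixel points. The degree is the mean (absolute,
    assumed) deviation of these variances from their mean.
    """
    v = np.asarray(pixel_variances, dtype=float)
    return float(np.mean(np.abs(v - v.mean())))
```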
5. The method for detecting gear surface damage based on computer vision according to claim 1, wherein judging whether a damage defect exists on the surface of the gear to be detected according to the image features of the connected graph in each optimal cluster comprises:
determining the damage degree of each optimal cluster according to the gray variance, perimeter and area of the connected graph in each optimal cluster; if the damage degree of any one optimal cluster is larger than a preset damage threshold, judging that a damage defect exists on the surface of the gear to be detected; otherwise, judging that no damage defect exists on the surface of the gear to be detected.
6. The method for detecting gear surface damage based on computer vision according to claim 5, wherein determining the damage degree of each optimal cluster according to the gray variance, perimeter and area of the connected graph in each optimal cluster comprises:
for any one optimal cluster, calculating the ratio of the perimeter to the area of the connected graph in the optimal cluster, and taking the product of this ratio and the gray variance as a first product; and normalizing the first product of the connected graph in the optimal cluster, and taking the normalized value as the damage degree of the optimal cluster.
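The damage degree of claim 6 can be sketched as follows; the claim does not specify the normalization function, so 1 - exp(-x) is assumed here to squash the first product into [0, 1):

```python
import math

def damage_degree(perimeter, area, gray_variance):
    """Damage degree of one optimal cluster (claim 6 sketch).

    First product: (perimeter / area) * gray variance. An irregular,
    high-contrast connected graph (large perimeter-to-area ratio, large
    variance) thus yields a degree close to 1. The 1 - exp(-x)
    normalization is an assumed choice.
    """
    first_product = (perimeter / area) * gray_variance
    return 1.0 - math.exp(-first_product)
```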
7. The method for detecting gear surface damage based on computer vision according to claim 1, wherein the obtaining downsampled images of a preset number of scale levels corresponding to the gear gray level image comprises:
setting a preset number of scale levels, and performing downsampling processing on the gear gray level image using a Gaussian pyramid to obtain downsampled images of the preset number of scale levels corresponding to the gear gray level image.
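The Gaussian-pyramid downsampling of claim 7 can be sketched with plain NumPy (in practice a library routine such as OpenCV's pyrDown performs the same operation); the 5-tap kernel [1, 4, 6, 4, 1]/16 is the standard Burt-Adelson choice:

```python
import numpy as np

# standard 5-tap Gaussian pyramid kernel (Burt-Adelson)
_K = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def _blur(img):
    """Separable 5x5 Gaussian blur with edge replication."""
    pad = np.pad(img.astype(float), 2, mode="edge")
    # horizontal pass, then vertical pass
    tmp = sum(_K[i] * pad[:, i:i + img.shape[1]] for i in range(5))
    return sum(_K[i] * tmp[i:i + img.shape[0], :] for i in range(5))

def gaussian_pyramid(gray, n_levels):
    """Downsampled images of a preset number of scale levels (claim 7
    sketch): at each level, blur the current image and keep every
    second row and column."""
    images, cur = [], gray
    for _ in range(n_levels):
        cur = _blur(cur)[::2, ::2]
        images.append(cur)
    return images
```

Each level halves both image dimensions, so coarse levels suppress fine texture while preserving the large-scale gear structure.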
CN202410199476.6A 2024-02-23 2024-02-23 Gear surface damage detection method based on computer vision Active CN117808796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410199476.6A CN117808796B (en) 2024-02-23 2024-02-23 Gear surface damage detection method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410199476.6A CN117808796B (en) 2024-02-23 2024-02-23 Gear surface damage detection method based on computer vision

Publications (2)

Publication Number Publication Date
CN117808796A CN117808796A (en) 2024-04-02
CN117808796B true CN117808796B (en) 2024-05-28

Family

ID=90434750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410199476.6A Active CN117808796B (en) 2024-02-23 2024-02-23 Gear surface damage detection method based on computer vision

Country Status (1)

Country Link
CN (1) CN117808796B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118037726B (en) * 2024-04-12 2024-06-04 陕西中铁华博实业发展有限公司 Railway accessory defect detection method and system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017072517A (en) * 2015-10-08 2017-04-13 西日本高速道路エンジニアリング四国株式会社 Detection method of surface crack of concrete
CN112862760A (en) * 2021-01-19 2021-05-28 浙江大学 Bearing outer ring surface defect area detection method
CN114581510A (en) * 2022-02-28 2022-06-03 扬州宝祥节能科技有限公司 Rolling window resistance point positioning method based on mode identification and artificial intelligence system
CN115272304A (en) * 2022-09-26 2022-11-01 山东滨州安惠绳网集团有限责任公司 Cloth defect detection method and system based on image processing
CN115601362A (en) * 2022-12-14 2023-01-13 临沂农业科技职业学院(筹)(Cn) Welding quality evaluation method based on image processing
CN115829903A (en) * 2021-09-16 2023-03-21 中国科学院微电子研究所 Mask defect detection method and device, computer equipment and storage medium
CN116503403A (en) * 2023-06-27 2023-07-28 无锡斯达新能源科技股份有限公司 Defect detection method of metal cutting tool bit based on image processing
CN116664559A (en) * 2023-07-28 2023-08-29 深圳市金胜电子科技有限公司 Machine vision-based memory bank damage rapid detection method
CN116934748A (en) * 2023-09-15 2023-10-24 山东重交路桥工程有限公司 Pavement crack detection system based on emulsified high-viscosity asphalt
CN116934744A (en) * 2023-09-14 2023-10-24 深圳市冠禹半导体有限公司 MOSFET etching defect detection method based on machine vision
CN117351008A (en) * 2023-12-04 2024-01-05 深圳市阿龙电子有限公司 Smart phone panel surface defect detection method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Visual Inspection of PCB Surface Defects and Solder Paste Welding Defect Recognition Technology; Chen Zaozao; China Master's Theses Full-text Database, Information Science and Technology Series; 2023-01-15 (No. 1); pp. 1-86 *


Similar Documents

Publication Publication Date Title
CN115829883B (en) Surface image denoising method for special-shaped metal structural member
CN116168026B (en) Water quality detection method and system based on computer vision
CN117808796B (en) Gear surface damage detection method based on computer vision
CN116385450B (en) PS sheet wear resistance detection method based on image processing
CN115330767B (en) Method for identifying production abnormity of corrosion foil
CN116309565B (en) High-strength conveyor belt deviation detection method based on computer vision
CN116977358B (en) Visual auxiliary detection method for corrugated paper production quality
CN115311267B (en) Method for detecting abnormity of check fabric
CN116611748A (en) Titanium alloy furniture production quality monitoring system
CN116137036B (en) Gene detection data intelligent processing system based on machine learning
CN114782329A (en) Bearing defect damage degree evaluation method and system based on image processing
CN117557820B (en) Quantum dot optical film damage detection method and system based on machine vision
CN116091504A (en) Connecting pipe connector quality detection method based on image processing
CN116883408B (en) Integrating instrument shell defect detection method based on artificial intelligence
CN116452589B (en) Intelligent detection method for surface defects of artificial board based on image processing
CN117094916B (en) Visual inspection method for municipal bridge support
CN114881960A (en) Feature enhancement-based cloth linear defect detection method and system
CN116703251A (en) Rubber ring production quality detection method based on artificial intelligence
CN115018835A (en) Automobile starter gear detection method
CN117635609A (en) Visual inspection method for production quality of plastic products
CN117876382A (en) System and method for detecting tread pattern defects of automobile tire
CN117237350A (en) Real-time detection method for quality of steel castings
CN115423807B (en) Cloth defect detection method based on outlier detection
CN114742849B (en) Leveling instrument distance measuring method based on image enhancement
CN115311287A (en) Method for detecting production abnormity of common rail oil injector

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant