CN118096728B - Machine vision-based part spraying quality detection method - Google Patents


Publication number
CN118096728B
CN118096728B (application CN202410466141.6A)
Authority
CN
China
Prior art keywords
gray level
reference gray
difference
pixel point
defect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410466141.6A
Other languages
Chinese (zh)
Other versions
CN118096728A (en)
Inventor
陈松
荣震
徐世霖
霍建仁
滕鹏
Current Assignee
Camc Surface Technology Jiangsu Co ltd
Original Assignee
Camc Surface Technology Jiangsu Co ltd
Priority date
Filing date
Publication date
Application filed by Camc Surface Technology Jiangsu Co ltd filed Critical Camc Surface Technology Jiangsu Co ltd
Priority to CN202410466141.6A priority Critical patent/CN118096728B/en
Publication of CN118096728A publication Critical patent/CN118096728A/en
Application granted granted Critical
Publication of CN118096728B publication Critical patent/CN118096728B/en

Classifications

    • Y - General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02 - Technologies or applications for mitigation or adaptation against climate change
    • Y02P - Climate change mitigation technologies in the production or processing of goods
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to the field of image processing, and in particular to a machine vision-based part spraying quality detection method, comprising the following steps: obtaining a defect contrast degree for each reference gray level from the difference between the change rates of the pixel counts of each reference gray level and its adjacent reference gray levels in the reference gray level sequence, and from the difference between the gray value of each reference gray level and the gray value of the reference gray level with the largest pixel count; obtaining a defect probability for the pixels of each reference gray level from the defect contrast degrees of the reference gray levels of the marked pixels of all pixels at that level, the gradient magnitudes of those pixels, and the level's own defect contrast degree, and thereby obtaining an enhanced part image; and performing spray quality detection on the enhanced image. The invention enhances image contrast and improves the accuracy of part spray quality detection.

Description

Machine vision-based part spraying quality detection method
Technical Field
The invention relates to the technical field of image processing, in particular to a machine vision-based part spraying quality detection method.
Background
Spray quality detection of parts is an important link in manufacturing, especially in the automotive, aerospace, electronics and consumer goods industries. A sprayed coating is not only cosmetic: it provides a protective layer against corrosion and wear, so coating quality is vital to the final performance and service life of the product. Flow marks can occur during part spraying and degrade the appearance and quality of the product, so detecting flow mark defects during spraying is particularly important.
Spray quality defects can be detected by feeding part images into a neural network. Because of noise interference, the acquired part images must first be enhanced. When the images are enhanced by ordinary histogram equalization, some details are strengthened, but the distinction between the normal region and the defect region is not brought out, which reduces the accuracy of spray quality detection.
Disclosure of Invention
The invention provides a machine vision-based part spraying quality detection method, which aims to solve the existing problems.
The machine vision-based part spraying quality detection method provided by the invention adopts the following technical scheme:
the invention provides a machine vision-based part spraying quality detection method, which comprises the following steps of:
Capturing a part image;
Gray levels with a nonzero pixel count in the gray histogram of the part image are recorded as reference gray levels, and all reference gray levels are sorted to obtain a reference gray level sequence. The defect contrast degree of each reference gray level is obtained from the difference between the change rates of the pixel counts of each reference gray level and its adjacent reference gray levels in the sequence, and from the difference between the gray value of each reference gray level and the gray value of the reference gray level with the largest pixel count;
A marked pixel in the neighborhood of each pixel is obtained from the pixel's gradient direction. The defect probability of the pixels of each reference gray level is obtained from the defect contrast degrees of the reference gray levels of the marked pixels of all pixels at that level, the gradient magnitudes of those pixels, and the level's own defect contrast degree. The equalized gray level of each reference gray level, and hence the enhanced part image, is obtained from the ratio of the level's pixel count to the total pixel count over all reference gray levels and from the defect probability of the level's pixels;
and performing spray quality detection on the enhanced part image.
Further, sorting all the reference gray levels to obtain a reference gray level sequence comprises the following specific steps:
All reference gray levels are sorted in ascending order of gray value, and the resulting sequence is recorded as the reference gray level sequence.
Further, obtaining the defect contrast degree of each reference gray level from the difference between the change rates of the pixel counts of each reference gray level and its adjacent reference gray levels in the sequence, and from the difference between the level's gray value and the gray value of the level with the largest pixel count, comprises the following specific steps:
A pixel count difference feature is obtained for each reference gray level from the difference between the change rates of the pixel counts of that level and its adjacent levels in the sequence; the defect contrast degree of each level is then obtained from this feature and from the difference between the level's gray value and the gray value of the level with the largest pixel count.
Further, the method for obtaining the pixel point quantity difference characteristic of each reference gray level according to the difference of the change rate of the pixel point quantity of each reference gray level and the adjacent reference gray level in the reference gray level sequence comprises the following specific steps:
The reference gray level adjacent on the left of each reference gray level in the sequence is recorded as the first reference gray level, and the level adjacent on the left of the first reference gray level as the second reference gray level; the level adjacent on the right of each reference gray level is recorded as the third reference gray level, and the level adjacent on the right of the third reference gray level as the fourth reference gray level. The absolute difference between the pixel counts of each reference gray level and the first reference gray level is recorded as the first absolute difference; between the first and second reference gray levels as the second absolute difference; between each reference gray level and the third reference gray level as the third absolute difference; and between the third and fourth reference gray levels as the fourth absolute difference. The ratio of the first absolute difference to the pixel count of each reference gray level is recorded as the first ratio; of the second absolute difference to the pixel count of the first reference gray level as the second ratio; of the third absolute difference to the pixel count of each reference gray level as the third ratio; and of the fourth absolute difference to the pixel count of the third reference gray level as the fourth ratio;
the absolute difference between the first and second ratios is recorded as the first difference, and between the third and fourth ratios as the second difference. The mean of the first and second differences is recorded as the first value of each reference gray level; the first values of all reference gray levels are linearly normalized, and the normalized first value of each level is taken as its pixel count difference feature.
Further, obtaining the defect contrast degree of each reference gray level from its pixel count difference feature and from the difference between its gray value and the gray value of the reference gray level with the largest pixel count comprises the following specific steps:
The absolute difference $\left|g_i-g_b\right|$ between the gray value $g_i$ of the $i$-th reference gray level and the gray value $g_b$ of the reference gray level with the largest pixel count is recorded as the second value of each reference gray level; $\exp\!\left(-\left|g_i-g_b\right|\right)$ is recorded as the third value of each reference gray level; and the product of the third value and the pixel count difference feature $G_i$ is taken as the defect contrast degree of each reference gray level:
$$C_i = G_i \cdot e^{-\left|g_i-g_b\right|}$$
where $e^{(\cdot)}$ denotes the exponential function with base the natural constant.
Further, the step of obtaining the labeled pixel point in the neighborhood range of each pixel point according to the gradient direction of each pixel point includes the following specific steps:
The pixel in the eight-neighborhood of each pixel that its gradient direction points to is recorded as the marked pixel of that pixel.
Further, obtaining the defect probability of the pixels of each reference gray level from the defect contrast degrees of the reference gray levels of the marked pixels of all pixels at that level, the gradient magnitudes of those pixels, and the level's own defect contrast degree comprises the following specific steps:
A gradient defect contrast degree is obtained for each reference gray level from the defect contrast degrees of the reference gray levels of the marked pixels of all pixels at that level, the gradient magnitudes of those pixels, and the level's own defect contrast degree;
the mean of the defect contrast degree and the gradient defect contrast degree of each reference gray level is taken as the defect probability of the pixels of that level.
Further, obtaining the gradient defect contrast degree of each reference gray level comprises the following specific steps:
For each pixel of each reference gray level, the product of the defect contrast degree of the reference gray level of its marked pixel and the pixel's gradient magnitude is recorded as the first product value of that pixel; the mean of the first product values over all pixels of the level is recorded as the second product value of the level; the exponential of the negated second product value is recorded as the third product value; and the product of the third product value and the defect contrast degree of the level is taken as its gradient defect contrast degree:
$$Z_i = C_i \cdot \exp\!\left(-\frac{1}{M_i}\sum_{j=1}^{M_i} C\!\left(s_{i,j}\right) A_{i,j}\right)$$
where $M_i$ is the number of pixels of the $i$-th reference gray level, $A_{i,j}$ the gradient magnitude of its $j$-th pixel, $C(s_{i,j})$ the defect contrast degree of the reference gray level of that pixel's marked pixel, and $\exp(\cdot)$ the exponential function with base the natural constant.
Further, the method for obtaining the gray level corresponding to each reference gray level after equalization and obtaining the enhanced part image according to the ratio of the number of the pixels of the reference gray level to the total number of the pixels of all the reference gray levels and the defect probability of the pixels corresponding to each reference gray level, comprises the following specific steps:
The ratio of the pixel count of each reference gray level to the total pixel count over all reference gray levels is recorded as the first feature of that level; the ratio of the defect probability of the level's pixels to the sum of the defect probabilities over all reference gray levels is recorded as the second feature of that level; the number of reference gray levels minus 1 is recorded as the third feature; the product of the first, second and third features is recorded as the fourth feature of each reference gray level; and the fourth feature is rounded to the nearest integer to obtain the equalized gray level of each reference gray level;
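As a minimal sketch of this remapping, read literally from the description (a cumulative form, as in standard histogram equalization, may be intended by the original, but the formula below follows the text as written; the input values are illustrative):

```python
import numpy as np

def equalized_levels(counts, P):
    """New gray level for each reference gray level: round of
    (n_i / N) * (P_i / sum(P)) * (L - 1), with L the number of
    reference gray levels (literal reading of the description)."""
    counts = np.asarray(counts, dtype=float)
    P = np.asarray(P, dtype=float)
    L = len(counts)
    fourth = (counts / counts.sum()) * (P / P.sum()) * (L - 1)
    return np.rint(fourth).astype(int)

# Toy example: two reference gray levels, most pixels and most
# defect probability on the first one.
new_levels = equalized_levels([9, 1], [0.9, 0.1])
print(new_levels.tolist())  # [1, 0]
```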
And obtaining the enhanced part image through the corresponding gray level after each reference gray level is equalized.
Further, performing spray quality detection on the enhanced part image comprises the following specific steps:
A large number of part images are acquired; U% of them are used to train the DNN neural network and the remaining (100-U)% for validation. The loss function of the neural network is the cross-entropy loss. Spray quality is then detected by feeding the enhanced part image to the trained DNN: an output of 0 means no flow mark defect occurred during spraying of the part, and an output of 1 means a flow mark defect is present;
Wherein U is a preset parameter.
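The data split described above can be sketched as follows; the helper name and the value U = 80 are illustrative assumptions, since the patent only states that U is a preset parameter:

```python
import random

def split_dataset(images, u_percent=80, seed=0):
    """Shuffle and split image identifiers into a training set
    (u_percent of the data) and a validation set (the rest).
    u_percent plays the role of the patent's preset parameter U."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    items = list(images)
    rng.shuffle(items)
    cut = len(items) * u_percent // 100
    return items[:cut], items[cut:]

train, val = split_dataset([f"part_{i}.png" for i in range(10)], u_percent=80)
print(len(train), len(val))  # 8 2
```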
The technical scheme of the invention has the following beneficial effects. The defect contrast degree of each reference gray level is obtained from the difference between the change rates of the pixel counts of each reference gray level and its adjacent levels in the reference gray level sequence, and from the difference between the level's gray value and the gray value of the level with the largest pixel count, which improves the accuracy of the contrast analysis of each reference gray level. The defect probability of the pixels of each reference gray level is obtained from the defect contrast degrees of the reference gray levels of the marked pixels of all pixels at that level, the gradient magnitudes of those pixels, and the level's own defect contrast degree. The equalized gray level of each reference gray level is obtained from the ratio of the level's pixel count to the total pixel count over all reference gray levels and from the defect probability of the level's pixels, yielding the enhanced part image. Spray quality detection is then performed on the enhanced image, which enhances image contrast and improves the accuracy of part spray quality detection.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of steps of a machine vision-based component spray quality detection method of the present invention;
Fig. 2 is a flow chart of the component spray quality inspection.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve its intended purpose, the specific implementation, structure, features and effects of the machine vision-based part spraying quality detection method provided by the invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different instances of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, particular features, structures or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the machine vision-based component spraying quality detection method provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating a method for detecting spraying quality of a component based on machine vision according to an embodiment of the invention is shown, the method includes the following steps:
Step S001: a part image is acquired.
In order to analyze defects occurring during part spraying, images of the parts after spraying must be collected, and whether flow mark defects are present must be analyzed from these images.
Specifically, a high-resolution CCD camera is used to capture an image of the sprayed part, and graying and non-local means filtering are applied as preprocessing to obtain the preprocessed part image.
Graying and non-local means filtering are well-known preprocessing techniques and are not described in detail here.
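As a minimal sketch of this preprocessing stage (the BT.601 luminosity weights are a common choice, not specified by the patent; the non-local means step is only indicated in a comment since its parameters are likewise unspecified):

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Luminosity grayscale conversion with ITU-R BT.601 weights;
    truncates to uint8 like a typical 8-bit pipeline."""
    return (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1]
            + 0.114 * rgb[..., 2]).astype(np.uint8)

# In practice the patent's preprocessing would continue with
# non-local means denoising, e.g. cv2.fastNlMeansDenoising(gray, h=10)
# in OpenCV (the parameter value is illustrative, not from the patent).
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 1] = 255  # pure green test patch
gray = to_gray(img)
print(int(gray[0, 0]))  # 0.587 * 255 truncated -> 149
```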
At this point, the part image has been acquired.
Step S002: and marking the gray level with the number of the pixel points in the gray level histogram of the part image as a reference gray level, sequencing all the reference gray levels to obtain a reference gray level sequence, and obtaining the defect comparison degree of each reference gray level according to the difference of the change rate of the pixel points of each reference gray level and the adjacent reference gray level in the reference gray level sequence and the difference between the gray level value corresponding to each reference gray level and the gray level value corresponding to the reference gray level with the maximum number of the pixel points.
During part spraying, if the coating is uniform everywhere on the part surface, the gray values of the pixels in the captured part image are concentrated and the gray histogram of the image follows a Gaussian distribution. If the coating is not uniform, the gray values are more dispersed and the histogram may deviate from a Gaussian distribution. The degree of defectiveness of the whole image can therefore be analyzed from the gray histogram of the part image.
Further, in the gray histogram of the part image, the larger the change rate between the pixel counts of a gray level and its adjacent gray levels, the more likely the pixels at that level in the part image are defect pixels; the smaller this change rate, the less likely they are. Since under normal conditions the gray values of a well-sprayed part image are concentrated, the likelihood that the pixels of each gray level are defective can also be analyzed from the difference between that gray level and the gray level with the largest pixel count in the histogram.
Specifically, the gray histogram of the part image is obtained; gray levels with a nonzero pixel count are recorded as reference gray levels, and all reference gray levels are sorted in ascending order of gray value; the sorted sequence is recorded as the reference gray level sequence. The gray levels in the histogram correspond one-to-one with the gray values in the part image, i.e. a gray level is a gray value.
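This step can be sketched directly with a histogram over the 8-bit range; the nonzero bins are the reference gray levels and come out already sorted:

```python
import numpy as np

def reference_gray_levels(gray: np.ndarray):
    """Histogram of an 8-bit image; the 'reference gray levels' are
    the bins with a nonzero pixel count, in ascending gray value."""
    hist = np.bincount(gray.ravel(), minlength=256)
    levels = np.flatnonzero(hist)   # ascending gray values with pixels
    counts = hist[levels]
    return levels, counts

gray = np.array([[10, 10, 200], [10, 200, 90]], dtype=np.uint8)
levels, counts = reference_gray_levels(gray)
print(levels.tolist(), counts.tolist())  # [10, 90, 200] [3, 1, 2]
```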
The pixel count difference feature of each reference gray level is obtained from the difference between the change rates of the pixel counts of that level and its adjacent levels. As one embodiment, it is computed as:
$$G_i=\operatorname{Norm}\!\left(\frac{1}{2}\left(\left|\frac{\left|n_i-n_{i-1}\right|}{n_i}-\frac{\left|n_{i-1}-n_{i-2}\right|}{n_{i-1}}\right|+\left|\frac{\left|n_i-n_{i+1}\right|}{n_i}-\frac{\left|n_{i+1}-n_{i+2}\right|}{n_{i+1}}\right|\right)\right)$$
where $n_i$ is the pixel count of the $i$-th reference gray level in the reference gray level sequence, $\left|\cdot\right|$ is the absolute value, $G_i$ is the pixel count difference feature of the $i$-th reference gray level, and $\operatorname{Norm}(\cdot)$ is a linear normalization over all reference gray levels.
Each inner fraction, such as $\left|n_i-n_{i-1}\right|/n_i$, is the change rate of the pixel counts of two adjacent reference gray levels relative to the count of the current level, and each outer absolute difference compares the change rates on the two sides. The larger this difference, the larger the pixel count difference feature and the more likely the pixels of the reference gray level belong to a defect region; the smaller it is, the less likely they do.
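A vectorized sketch of this feature follows; the edge-replication padding for the first and last two levels is an assumption, since the patent does not specify boundary handling (reference gray levels have nonzero counts by definition, so the divisions are safe):

```python
import numpy as np

def count_difference_feature(counts: np.ndarray) -> np.ndarray:
    """Per-level feature G_i from the change rates of neighboring
    pixel counts in the reference gray level sequence. Boundary
    levels are handled by edge replication (an assumption)."""
    n = np.pad(counts.astype(float), 2, mode="edge")
    i = np.arange(2, len(n) - 2)
    left = np.abs(np.abs(n[i] - n[i - 1]) / n[i]
                  - np.abs(n[i - 1] - n[i - 2]) / n[i - 1])
    right = np.abs(np.abs(n[i] - n[i + 1]) / n[i]
                   - np.abs(n[i + 1] - n[i + 2]) / n[i + 1])
    first = 0.5 * (left + right)          # "first value" of each level
    span = first.max() - first.min()
    return (first - first.min()) / span if span > 0 else np.zeros_like(first)

counts = np.array([5, 50, 6, 7, 8])       # a spike at the second level
G = count_difference_feature(counts)
print(G.min(), G.max())                   # linearly normalized to [0, 1]
```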
In this embodiment, to further enhance the defect region and bring out more of its details, the contrast between the gray value of each reference gray level and the gray value of the reference gray level with the largest pixel count is enhanced.
The defect contrast degree of each reference gray level is obtained from the difference between its gray value and the gray value of the reference gray level with the largest pixel count, together with its pixel count difference feature. As one embodiment, it is computed as:
$$C_i = G_i \cdot e^{-\left|g_i-g_b\right|}$$
where $G_i$ is the pixel count difference feature of the $i$-th reference gray level, $g_i$ is the gray value of the $i$-th reference gray level, $g_b$ is the gray value of the reference gray level with the largest pixel count, $e^{(\cdot)}$ is the exponential function with base the natural constant, and $C_i$ is the defect contrast degree of the $i$-th reference gray level.
Here $\left|g_i-g_b\right|$ measures the difference between the gray value of each reference gray level and that of the most populated level: when it is large, the texture already has clear contrast between high and low gray values and needs less enhancement; when it is small, the contrast is unclear and more enhancement is needed. Likewise, the larger the pixel count difference feature of a level, the more enhancement is needed; the smaller it is, the less.
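The formula above, as reconstructed here, can be sketched in a few lines (the input levels, counts and feature values are toy data for illustration):

```python
import numpy as np

def defect_contrast(levels, counts, G):
    """C_i = G_i * exp(-|g_i - g_b|), with g_b the gray value of the
    reference gray level holding the largest pixel count."""
    g_b = levels[np.argmax(counts)]
    return G * np.exp(-np.abs(levels.astype(float) - g_b))

levels = np.array([100, 102, 105])   # gray values of the reference levels
counts = np.array([5, 2, 1])         # level 100 is the most populated
G = np.array([0.0, 1.0, 0.5])        # pixel count difference features
C = defect_contrast(levels, counts, G)
print(C.round(4).tolist())  # [0.0, 0.1353, 0.0034]
```

Note that the exponential decays quickly with the gray distance, so levels far from the dominant gray value receive very small contrast degrees.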
So far, the defect contrast degree of each reference gray level is obtained.
Step S003: obtaining a marked pixel point in a neighborhood range of each pixel point according to the gradient direction of each pixel point, obtaining a defect probability of the pixel point corresponding to each reference gray level according to the defect contrast degree of the reference gray level of the marked pixel point of all the pixel points corresponding to each reference gray level, the gradient amplitude of all the pixel points corresponding to each reference gray level and the defect contrast degree of each reference gray level, and obtaining a gray level corresponding to each reference gray level after equalization and obtaining the enhanced part image according to the ratio of the number of the pixel points of the reference gray level to the total number of all the pixel points of all the reference gray levels and the defect probability of the pixel point corresponding to each reference gray level.
It should be noted that when a flow mark occurs during spraying, the gray values of adjacent pixels at the flow mark differ noticeably, so the gradient of each pixel can be used to select, within its eight-neighborhood, the pixel whose reference gray level's defect contrast degree is analyzed further.
Specifically, the gradient direction and gradient magnitude of each pixel in the part image are obtained with the Sobel operator, and the pixel in the eight-neighborhood that the gradient direction points to is recorded as the marked pixel of that pixel; the Sobel operator is a well-known technique and is not described in detail here.
Specifically, 360° is divided evenly into eight left-closed, right-open intervals: [0°, 45°), [45°, 90°), [90°, 135°), [135°, 180°), [180°, 225°), [225°, 270°), [270°, 315°) and [315°, 360°). When the gradient direction of a pixel falls in [0°, 45°), the right pixel of its eight-neighborhood is recorded as the marked pixel; in [45°, 90°), the upper-right pixel; in [90°, 135°), the pixel vertically above; in [135°, 180°), the upper-left pixel; in [180°, 225°), the left pixel; in [225°, 270°), the lower-left pixel; in [270°, 315°), the pixel vertically below; and in [315°, 360°), the lower-right pixel.
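The angle-to-neighbor mapping reduces to an integer division by 45°; a sketch follows (the (dy, dx) offsets assume image rows increase downward, a convention the patent does not state):

```python
# Gradient direction in degrees, [0, 360) -> offset (dy, dx) of the
# marked pixel in the 8-neighborhood. Rows increase downward, so
# "vertically above" is dy = -1. Intervals are left-closed, right-open.
OFFSETS = [(0, 1),    # [0, 45):    right
           (-1, 1),   # [45, 90):   upper right
           (-1, 0),   # [90, 135):  vertically above
           (-1, -1),  # [135, 180): upper left
           (0, -1),   # [180, 225): left
           (1, -1),   # [225, 270): lower left
           (1, 0),    # [270, 315): vertically below
           (1, 1)]    # [315, 360): lower right

def marked_offset(theta_deg: float):
    """Map a gradient angle to its 45-degree octant's neighbor offset."""
    return OFFSETS[int(theta_deg % 360) // 45]

print(marked_offset(0))    # (0, 1)  right neighbor
print(marked_offset(100))  # (-1, 0) vertically above
print(marked_offset(315))  # (1, 1)  lower right
```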
The gradient defect contrast degree of each reference gray level is obtained from the defect contrast degree of the reference gray level of the marked pixel points of all pixel points corresponding to each reference gray level, the gradient amplitude of all pixel points corresponding to each reference gray level, and the defect contrast degree of each reference gray level. As one embodiment, the specific calculation method is:
$$T_z = Q_z \cdot \exp\!\left(-\frac{1}{N_z}\sum_{i=1}^{N_z} Q_{m_i}\, G_i\right)$$

In the formula, $Q_{m_i}$ represents the defect contrast degree of the reference gray level of the marked pixel point of the $i$-th pixel point corresponding to the $z$-th reference gray level, $G_i$ represents the gradient amplitude of the $i$-th pixel point corresponding to the $z$-th reference gray level, $N_z$ represents the number of pixel points corresponding to each reference gray level, $\exp(\cdot)$ represents an exponential function with the natural constant as its base, $Q_z$ represents the defect contrast degree of the $z$-th reference gray level, and $T_z$ represents the gradient defect contrast degree of the $z$-th reference gray level.
Here, the averaged product of the marked-pixel defect contrast degree and the gradient amplitude reflects the gray-scale difference between each pixel point and the pixel points in its eight-neighborhood: when the difference is larger, a clear texture already exists between the high and low gray values, so a lower degree of enhancement is required; when the difference is smaller, the texture between the high and low gray values is unclear, so a higher degree of enhancement is required.
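A minimal sketch of this step, assuming the form $T_z = Q_z \cdot \exp(-\text{mean}(Q_{m}\,G))$ described above; the function and argument names are illustrative only.

```python
import numpy as np

def gradient_defect_contrast(mark_contrast, grad_mag, level_contrast):
    """Gradient defect contrast degree of one reference gray level.

    mark_contrast  : defect contrast degree of the reference gray level of
                     each pixel's marked pixel (one entry per pixel of the level)
    grad_mag       : gradient amplitude of those same pixels
    level_contrast : defect contrast degree Q_z of the level itself
    """
    mark_contrast = np.asarray(mark_contrast, dtype=float)
    grad_mag = np.asarray(grad_mag, dtype=float)
    # exp(-mean) keeps the factor in (0, 1]: large local gray differences
    # (clear texture) suppress the enhancement weight, small ones raise it.
    return level_contrast * np.exp(-np.mean(mark_contrast * grad_mag))
```

With a zero averaged product the factor is 1 and the gradient defect contrast equals the level's own defect contrast; any positive product strictly lowers it.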
The defect probability of the pixel points corresponding to each reference gray level is obtained from the defect contrast degree of each reference gray level and the gradient defect contrast degree of each reference gray level. As one embodiment, the specific calculation method is:
$$P_z = \frac{Q_z + T_z}{2}$$

In the formula, $Q_z$ represents the defect contrast degree of the $z$-th reference gray level, $T_z$ represents the gradient defect contrast degree of the $z$-th reference gray level, and $P_z$ represents the defect probability of the pixel points corresponding to the $z$-th reference gray level.
The larger the defect contrast degree and the gradient defect contrast degree of a reference gray level, the larger the defect probability of its corresponding pixel points; conversely, the smaller they are, the smaller the defect probability.
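As a one-line sketch (names assumed), the defect probability is the mean of the two contrast measures:

```python
import numpy as np

def defect_probability(contrast, grad_contrast):
    """Defect probability of the pixel points of each reference gray level:
    the mean of the defect contrast degree and the gradient defect contrast
    degree of that level (both arrays indexed by reference gray level)."""
    return 0.5 * (np.asarray(contrast, float) + np.asarray(grad_contrast, float))
```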
When a part image is enhanced by conventional histogram equalization, the mapping between gray levels before and after equalization is determined only by the ratio of the number of pixel points of each gray level to the total number of pixel points in the image, multiplied by the number of gray levels minus 1. Since this cannot adequately compare the contrast between pixel points, the above analysis instead compares gray levels through the defect probability of the pixel points corresponding to each gray level, which improves the enhancement of the part image.
Specifically, the equalized gray level corresponding to each reference gray level is obtained from the ratio of the number of pixel points of each reference gray level to the total number of pixel points of all reference gray levels, the number of all reference gray levels, and the ratio of the defect probability of the pixel points corresponding to each reference gray level to the sum of the defect probabilities of the pixel points corresponding to all reference gray levels. As one embodiment, the specific calculation method is:
$$g_z = \operatorname{round}\!\left(\frac{n_z}{N}\cdot\frac{P_z}{\sum_{j=1}^{L} P_j}\cdot (L-1)\right)$$

In the formula, $P_z$ represents the defect probability of the pixel points corresponding to the $z$-th reference gray level, $\sum_{j=1}^{L} P_j$ represents the sum of the defect probabilities of the pixel points corresponding to all reference gray levels, $n_z$ represents the number of pixel points corresponding to the $z$-th reference gray level, $N$ represents the total number of pixel points corresponding to all reference gray levels, namely the total number of pixel points in the part image, $L$ represents the number of all reference gray levels, $g_z$ represents the corresponding gray level after equalization of the $z$-th reference gray level, and $\operatorname{round}(\cdot)$ represents rounding to the nearest integer.
Here, the larger the proportion of the defect probability of the pixel points corresponding to a reference gray level in the sum of the defect probabilities of the pixel points corresponding to all reference gray levels, the greater the degree to which that reference gray level needs to be adjusted; the smaller the proportion, the smaller the adjustment.
So far, the corresponding gray level after each reference gray level is equalized is obtained.
And obtaining the enhanced part image through the corresponding gray level after each reference gray level is equalized.
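The equalization mapping and its application to the image can be sketched as follows, assuming the product form round((n_z/N)·(P_z/ΣP)·(L−1)) described above; all names are illustrative.

```python
import numpy as np

def equalized_levels(counts, defect_prob):
    """Equalized gray level for each reference gray level.

    counts      : number of pixel points of each reference gray level (n_z)
    defect_prob : defect probability of the pixel points of each level (P_z)
    """
    counts = np.asarray(counts, dtype=float)
    p = np.asarray(defect_prob, dtype=float)
    L = len(counts)  # number of reference gray levels
    mapping = (counts / counts.sum()) * (p / p.sum()) * (L - 1)
    return np.rint(mapping).astype(int)  # round to the nearest integer

def enhance(image, ref_levels, counts, defect_prob):
    """Replace each reference gray level in the image by its equalized level."""
    lut = dict(zip(ref_levels, equalized_levels(counts, defect_prob)))
    out = image.copy()
    for g, g_new in lut.items():
        out[image == g] = g_new
    return out
```

With two reference gray levels holding 25% and 75% of the pixels and proportional defect probabilities, the mapping sends the sparse level to 0 and the dense level to 1.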
Step S004: and performing spray quality detection on the enhanced part image.
A parameter U is preset; this embodiment is described with U = 70 as an example and imposes no specific limitation, and U may be determined according to the specific implementation.
A large number of part images are acquired; U% of them are used to train a DNN (deep neural network), and the remaining (100−U)% are used for verification. The loss function of the neural network in this embodiment is the cross-entropy loss function. Spray quality is then detected from the enhanced part image with the trained DNN neural network: when the network outputs 0, it is judged that no flow mark defect exists in the spraying process of the part, and when the network outputs 1, it is judged that a flow mark defect exists in the spraying process of the part. The flow of the spray quality detection of the parts is shown in figure 2. The DNN neural network is a known technique and is not described in detail here.
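The split and training loop can be sketched as below with a stand-in binary classifier trained under the cross-entropy loss. The actual DNN architecture, input features, and every name here are assumptions: the patent only specifies the U%/(100−U)% split, the loss function, and the 0/1 output.

```python
import numpy as np

def train_val_split(images, labels, u=70, seed=0):
    """Split the samples: U% for training, (100-U)% for verification."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(images))
    cut = int(len(images) * u / 100)
    return (images[idx[:cut]], labels[idx[:cut]],
            images[idx[cut:]], labels[idx[cut:]])

class TinyMLP:
    """Minimal two-layer network with sigmoid output, standing in for the
    DNN of the embodiment (the architecture here is an assumption)."""

    def __init__(self, n_in, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, x):
        self.h = np.tanh(x @ self.w1 + self.b1)
        z = self.h @ self.w2 + self.b2
        return 1.0 / (1.0 + np.exp(-z))  # defect probability in (0, 1)

    def fit(self, x, y, lr=0.5, epochs=500):
        y = y.reshape(-1, 1)
        for _ in range(epochs):
            p = self.forward(x)
            # gradient of the cross-entropy loss w.r.t. the logit is (p - y)
            dz = (p - y) / len(x)
            dw2 = self.h.T @ dz
            db2 = dz.sum(0)
            dh = dz @ self.w2.T * (1 - self.h ** 2)   # tanh derivative
            dw1 = x.T @ dh
            db1 = dh.sum(0)
            self.w1 -= lr * dw1; self.b1 -= lr * db1
            self.w2 -= lr * dw2; self.b2 -= lr * db2

    def predict(self, x):
        return (self.forward(x) > 0.5).astype(int).ravel()  # 1 = flow mark
```

In practice the features would be extracted from the enhanced part images; any framework DNN with a cross-entropy loss fills the same role.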
This embodiment is completed.
The $e^{-x}$ model used above only serves to represent a negative correlation and to constrain the model output to the interval $(0,1]$; other models serving the same purpose may be substituted in a specific implementation. This embodiment takes the $e^{-x}$ model only as an example and imposes no specific limitation, where $x$ refers to the input of the model.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the invention, but any modifications, equivalent substitutions, improvements, etc. within the principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. The machine vision-based part spraying quality detection method is characterized by comprising the following steps of:
Capturing a part image;
The gray levels whose number of pixel points in the gray level histogram of the part image is not zero are recorded as reference gray levels, all the reference gray levels are sequenced to obtain a reference gray level sequence, and the defect contrast degree of each reference gray level is obtained according to the difference of the change rate of the pixel point number of each reference gray level and the adjacent reference gray levels in the reference gray level sequence and the difference between the gray value corresponding to each reference gray level and the gray value corresponding to the reference gray level with the largest number of pixel points;
Obtaining a marked pixel point in a neighborhood range of each pixel point according to the gradient direction of each pixel point, obtaining a defect probability of the pixel point corresponding to each reference gray level according to the defect contrast degree of the reference gray level of the marked pixel point of all the pixel points corresponding to each reference gray level, the gradient amplitude of all the pixel points corresponding to each reference gray level and the defect contrast degree of each reference gray level, and obtaining a gray level corresponding to each reference gray level after equalization and an enhanced part image according to the ratio of the number of the pixel points of the reference gray level to the total number of all the pixel points of all the reference gray levels and the defect probability of the pixel point corresponding to each reference gray level;
and performing spray quality detection on the enhanced part image.
2. The machine vision-based component spraying quality detection method according to claim 1, wherein the step of sorting all the reference gray levels to obtain a reference gray level sequence comprises the following specific steps:
And sequencing all the reference gray levels according to the gray level from small to large to obtain a group of sequences, and recording the sequenced sequences as reference gray level sequences.
3. The machine vision-based part spraying quality detection method according to claim 1, wherein the obtaining the defect contrast degree of each reference gray level according to the difference of the change rate of the pixel point number of each reference gray level and the adjacent reference gray levels in the reference gray level sequence and the difference between the gray value corresponding to each reference gray level and the gray value corresponding to the reference gray level with the largest number of pixel points comprises the following specific steps:
Obtaining the pixel point quantity difference characteristic of each reference gray level according to the difference of the change rate of the pixel point number of each reference gray level and the adjacent reference gray levels in the reference gray level sequence, and obtaining the defect contrast degree of each reference gray level according to the pixel point quantity difference characteristic of each reference gray level and the difference between the gray value corresponding to each reference gray level and the gray value corresponding to the reference gray level with the largest number of pixel points.
4. The machine vision-based part spraying quality detection method according to claim 3, wherein the obtaining the pixel point quantity difference characteristic of each reference gray level according to the difference of the change rate of the pixel point number of each reference gray level and the adjacent reference gray levels in the reference gray level sequence comprises the following specific steps:
The reference gray level adjacent to the left of each reference gray level in the reference gray level sequence is recorded as a first reference gray level, and the reference gray level adjacent to the left of the first reference gray level is recorded as a second reference gray level; the reference gray level adjacent to the right of each reference gray level in the reference gray level sequence is recorded as a third reference gray level, and the reference gray level adjacent to the right of the third reference gray level is recorded as a fourth reference gray level; the absolute value of the difference between the number of pixel points of each reference gray level and that of the first reference gray level is recorded as a first difference absolute value, the absolute value of the difference between the number of pixel points of the first reference gray level and that of the second reference gray level is recorded as a second difference absolute value, the absolute value of the difference between the number of pixel points of each reference gray level and that of the third reference gray level is recorded as a third difference absolute value, and the absolute value of the difference between the number of pixel points of the third reference gray level and that of the fourth reference gray level is recorded as a fourth difference absolute value; the ratio of the first difference absolute value to the number of pixel points of each reference gray level is recorded as a first ratio, the ratio of the second difference absolute value to the number of pixel points of the first reference gray level is recorded as a second ratio, the ratio of the third difference absolute value to the number of pixel points of each reference gray level is recorded as a third ratio, and the ratio of the fourth difference absolute value to the number of pixel points of the third reference gray level is recorded as a fourth ratio;
the absolute value of the difference between the first ratio and the second ratio is recorded as a first difference, the absolute value of the difference between the third ratio and the fourth ratio is recorded as a second difference, the average value of the first difference and the second difference is recorded as a first value of each reference gray level, the first values of all the reference gray levels are subjected to linear normalization to obtain normalized first values of each reference gray level, and the normalized first values of each reference gray level are used as pixel point number difference characteristics of each reference gray level.
5. The machine vision-based part spraying quality detection method according to claim 3, wherein the obtaining the defect contrast degree of each reference gray level according to the pixel point quantity difference characteristic of each reference gray level and the difference between the gray value corresponding to each reference gray level and the gray value corresponding to the reference gray level with the largest number of pixel points comprises the following specific steps:
recording the absolute value of the difference between the gray value corresponding to each reference gray level and the gray value corresponding to the reference gray level with the largest number of pixel points as the second value $S$ of each reference gray level, recording $e^{-S}$ as the third value of each reference gray level, and taking the product of the third value of each reference gray level and the pixel point quantity difference characteristic of each reference gray level as the defect contrast degree of each reference gray level;
Wherein $e$ denotes the natural constant.
6. The machine vision-based component spraying quality detection method according to claim 1, wherein the step of obtaining the marked pixel point in each pixel point neighborhood range according to the gradient direction of each pixel point comprises the following specific steps:
And marking the pixel points in the eight adjacent areas pointed by the gradient direction of each pixel point as marked pixel points of each pixel point.
7. The machine vision-based component spraying quality detection method according to claim 1, wherein the obtaining the defect probability of the pixel corresponding to each reference gray level according to the defect contrast degree of the reference gray level of the marked pixel of all the pixels corresponding to each reference gray level, the gradient amplitude of all the pixels corresponding to each reference gray level, and the defect contrast degree of each reference gray level comprises the following specific steps:
Obtaining the gradient defect contrast degree of each reference gray level according to the defect contrast degree of the reference gray level of the marked pixel points of all pixel points corresponding to each reference gray level, the gradient amplitude of all pixel points corresponding to each reference gray level, and the defect contrast degree of each reference gray level;
and taking the average value of the defect contrast degree of each reference gray level and the gradient defect contrast degree of each reference gray level as the defect probability of the pixel point corresponding to each reference gray level.
8. The machine vision-based component spraying quality detection method according to claim 7, wherein the step of obtaining the gradient defect contrast degree of each reference gray level according to the defect contrast degree of the reference gray level of the marking pixel point of all the pixel points corresponding to each reference gray level, the gradient amplitude of all the pixel points corresponding to each reference gray level, and the defect contrast degree of each reference gray level comprises the following specific steps:
Recording the result of the product between the defect contrast degree of the reference gray level of the marked pixel point of each pixel point corresponding to each reference gray level and the gradient amplitude of the marked pixel point of each pixel point corresponding to each reference gray level as a first product value of the marked pixel point of each pixel point corresponding to each reference gray level, recording the average value of the first product values of the marked pixel points of all pixel points corresponding to each reference gray level as the second product value $S$ of each reference gray level, recording $e^{-S}$ as the third product value of each reference gray level, and taking the product of the third product value of each reference gray level and the defect contrast degree of each reference gray level as the gradient defect contrast degree of each reference gray level;
Wherein $e$ denotes the natural constant.
9. The machine vision-based component spraying quality detection method according to claim 1, wherein the specific steps of obtaining the corresponding gray level after equalization of each reference gray level and obtaining the enhanced component image according to the ratio of the number of pixels of the reference gray level to the total number of pixels of all the reference gray levels and the defect probability of the pixels corresponding to each reference gray level are as follows:
The ratio of the number of pixel points of each reference gray level to the total number of pixel points of all reference gray levels is recorded as a first characteristic of each reference gray level, the ratio between the defect probability of the pixel points corresponding to each reference gray level and the sum of the defect probabilities of the pixel points corresponding to all reference gray levels is recorded as a second characteristic of each reference gray level, the result of subtracting 1 from the number of all reference gray levels is recorded as a third characteristic, the product of the first characteristic of each reference gray level, the second characteristic of each reference gray level and the third characteristic is recorded as a fourth characteristic of each reference gray level, and the fourth characteristic of each reference gray level is rounded to the nearest integer to obtain the corresponding gray level after equalization of each reference gray level;
And obtaining the enhanced part image through the corresponding gray level after each reference gray level is equalized.
10. The machine vision-based part spray quality detection method according to claim 1, wherein the spray quality detection of the enhanced part image comprises the following specific steps:
Acquiring a large number of part images, using U% of the part images for training the DNN neural network and the remaining (100−U)% for verification; the loss function of the neural network is the cross-entropy loss function; performing quality detection of part spraying according to the enhanced part image and the trained DNN neural network: when the output of the neural network is 0, it is judged that no flow mark defect exists in the spraying process of the part, and when the output of the neural network is 1, it is judged that a flow mark defect exists in the spraying process of the part;
Wherein U is a preset parameter.
CN202410466141.6A 2024-04-18 2024-04-18 Machine vision-based part spraying quality detection method Active CN118096728B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410466141.6A CN118096728B (en) 2024-04-18 2024-04-18 Machine vision-based part spraying quality detection method

Publications (2)

Publication Number Publication Date
CN118096728A CN118096728A (en) 2024-05-28
CN118096728B true CN118096728B (en) 2024-06-25

Family

ID=91153510

Country Status (1)

Country Link
CN (1) CN118096728B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115311303A (en) * 2022-10-12 2022-11-08 南通富兰妮纺织品有限公司 Textile warp and weft defect detection method
CN116433663A (en) * 2023-06-13 2023-07-14 肥城恒丰塑业有限公司 Intelligent geotechnical cell quality detection method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437238B (en) * 2023-12-22 2024-03-29 深圳宝铭微电子有限公司 Visual inspection method for surface defects of packaged IC
CN117522870B (en) * 2024-01-04 2024-03-19 陕西凯迈航空航天机电设备有限公司 Intelligent defect detection method for aeroengine parts based on machine vision


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant