CN115082479A - Machine part fatigue crack identification method based on saliency characteristics - Google Patents

Machine part fatigue crack identification method based on saliency characteristics Download PDF

Info

Publication number
CN115082479A
CN115082479A CN202211009269.7A CN202211009269A CN115082479A CN 115082479 A CN115082479 A CN 115082479A CN 202211009269 A CN202211009269 A CN 202211009269A CN 115082479 A CN115082479 A CN 115082479A
Authority
CN
China
Prior art keywords
machine part
image
area
identified
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211009269.7A
Other languages
Chinese (zh)
Inventor
董安丽
朱霞英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qidong Kaishun Machinery Manufacturing Co ltd
Original Assignee
Qidong Kaishun Machinery Manufacturing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qidong Kaishun Machinery Manufacturing Co ltd filed Critical Qidong Kaishun Machinery Manufacturing Co ltd
Priority to CN202211009269.7A priority Critical patent/CN115082479A/en
Publication of CN115082479A publication Critical patent/CN115082479A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/54Extraction of image or video features relating to texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to the technical field of material testing and analysis, in particular to a machine part fatigue crack identification method based on significance characteristics. The method comprises the steps of acquiring a visible light image by an optical means, specifically by a visible light means, and carrying out material analysis and test according to the visible light image; the method can compare the significance value corresponding to each area of the machine part with a set threshold value, if the significance value corresponding to a certain area is larger than the set threshold value, the machine part to be identified is judged to have the fatigue crack, and the area with the significance value larger than the set threshold value is judged to have the fatigue crack. The fatigue crack of the machine part is automatically identified by judging whether the fatigue crack exists in the machine part by an optical means, particularly a visible light means.

Description

Machine part fatigue crack identification method based on saliency characteristics
Technical Field
The invention relates to the technical field of material testing and analysis, in particular to a machine part fatigue crack identification method based on significance characteristics.
Background
The fatigue damage accidents of various machine parts emerge endlessly with the continuous improvement of the modern manufacturing industry level. Therefore, the detection of fatigue cracks in machine parts is also becoming increasingly important. An important index for judging the performance of the machine part is the generation time and position of cracks and the process of crack development, the detection work of fatigue cracks of the machine part is mainly realized by manual regular inspection at present, the whole detection process has certain intermittence, and after workers experience long-time detection work, the situations of missed detection and error detection are inevitable. It is necessary to realize automatic identification of fatigue cracks of machine parts.
Disclosure of Invention
In order to realize automatic identification of the fatigue cracks of the machine part, the invention aims to provide a machine part fatigue crack identification method based on the saliency characteristics.
The invention provides a machine part fatigue crack identification method based on salient features, which comprises the following steps:
acquiring a target image, and performing preprocessing operation on the target image, wherein the preprocessing operation comprises denoising processing and sharpening processing; the target image comprises a machine part image to be identified, a machine part speckle image to be identified, a normal machine part image and a normal machine part speckle image;
uniformly dividing the target image after the preprocessing operation into a plurality of areas, and calculating the texture roughness difference between any two areas on the machine part image to be recognized according to the machine part image to be recognized after the preprocessing operation and the normal machine part image; calculating the region brightness distribution difference between any two regions on the machine part image to be recognized according to the machine part image to be recognized after the preprocessing operation; calculating the regional deformation degree difference between any two regions on the speckle images of the machine parts to be identified according to the speckle images of the machine parts to be identified and the speckle images of the normal machine parts after the preprocessing operation;
calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformation degree difference between any two regions, and calculating the significance of each region according to the difference between any two regions;
and comparing the significance value corresponding to each region with a set threshold value, if the significance value corresponding to a certain region is greater than the set threshold value, judging that the fatigue crack exists on the machine part to be identified, and judging that the region with the significance value greater than the set threshold value is the region with the fatigue crack.
Further, the calculating the texture roughness difference between any two regions on the image of the machine part to be recognized according to the image of the machine part to be recognized after the preprocessing operation and the image of the normal machine part includes:
calculating the number of boxes covering the area i in the normal machine part image, the number of boxes covering the area j in the normal machine part image, the number of boxes covering the area i in the machine part image to be identified and the number of boxes covering the area j in the machine part image to be identified;
calculating the fractal dimension of the area i in the normal machine part image according to the number of boxes covering the area i in the normal machine part image; calculating the fractal dimension of the region j in the normal machine part image according to the number of boxes covering the region j in the normal machine part image; calculating the fractal dimension of the area i in the machine part image to be identified according to the number of boxes covering the area i in the machine part image to be identified; calculating the fractal dimension of the region j in the machine part image to be identified according to the number of boxes covering the region j in the machine part image to be identified;
calculating the difference between the fractal dimension of the area i in the image of the machine part to be identified and the fractal dimension of the area i in the image of the normal machine part according to the fractal dimension of the area i in the image of the normal machine part and the fractal dimension of the area i in the image of the machine part to be identified; calculating the difference between the fractal dimension of the region j in the image of the machine part to be identified and the fractal dimension of the region j in the image of the normal machine part according to the fractal dimension of the region j in the image of the normal machine part and the fractal dimension of the region j in the image of the machine part to be identified;
and calculating the texture roughness difference of the region i and the region j on the machine part image to be identified according to the difference of the fractal dimensions of the region i and the difference of the fractal dimensions of the region j.
Further, the calculating the number of boxes covering the area i in the normal machine part image comprises:
establishing a three-dimensional coordinate system for a region i in the normal machine part image, wherein the length and the width of the region i in the normal machine part image respectively represent an x axis and a y axis in the three-dimensional coordinate system, and the gray value of each pixel in the region i in the normal machine part image represents a z axis; the height h of the box is calculated using the following formula:
Figure 987713DEST_PATH_IMAGE001
wherein G represents the total gray level of the normal machine part image, M is the side length of the normal machine part image, and k is the side length of each region;
calculating to obtain the height of the box, stacking the box in each area, and recording the number of boxes corresponding to the maximum gray level value in the area i in the normal machine part image as T; the box number corresponding to the minimum gray level in the region i in the normal machine part image is recorded as
Figure 576957DEST_PATH_IMAGE002
(ii) a Then the number of boxes covering area i in the normal machine part image is:
Figure 969411DEST_PATH_IMAGE003
wherein the content of the first and second substances,
Figure 974407DEST_PATH_IMAGE004
is the number of boxes covering area i in the image of the normal machine part.
Further, the calculating the fractal dimension of the area i in the normal machine part image according to the box number covering the area i in the normal machine part image includes:
the formula for calculating the fractal dimension of the region i in the normal machine part image is as follows:
Figure 195304DEST_PATH_IMAGE005
wherein the content of the first and second substances,
Figure 765570DEST_PATH_IMAGE004
to cover the number of boxes in area i in the normal machine part image,
Figure 580073DEST_PATH_IMAGE006
is the fractal dimension of a region i in the normal machine part image, and k is the side length of each region.
Further, calculating the area brightness distribution difference between any two areas on the image of the machine part to be recognized according to the image of the machine part to be recognized after the preprocessing operation, which comprises the following steps:
converting the machine part image to be identified from an RGB space into an HSV space;
removing the suspected area brightness abnormal values in the area i and the area j in the machine part image to be identified according to the box line graph; then, respectively calculating the average value of the area brightness distribution of the area i and the area j after the abnormal value elimination in the machine part image to be identified, and calculating the area brightness distribution difference of the area i and the area j in the machine part image to be identified by using the following formula:
Figure 310132DEST_PATH_IMAGE007
wherein the content of the first and second substances,
Figure 651114DEST_PATH_IMAGE008
the difference of the brightness distribution of the areas i and j in the image of the machine part to be identified is shown,
Figure 988686DEST_PATH_IMAGE009
representing the area brightness distribution average value of an area i in the machine part image to be identified;
Figure 898609DEST_PATH_IMAGE010
and representing the area brightness distribution average value of the area j in the machine part image to be identified.
Further, the calculating the difference of the area deformation degree between any two areas on the speckle image of the machine part to be recognized according to the speckle image of the machine part to be recognized after the preprocessing operation and the speckle image of the normal machine part includes:
calculating a Fourier descriptor of an area i in the normal machine part speckle image, a box number Fourier descriptor of an area j in the normal machine part speckle image, a Fourier descriptor of the area i in the machine part speckle image to be identified and a Fourier descriptor of the area j in the machine part image to be identified;
calculating the difference between the speckle image of the machine part to be identified and the Fourier descriptor of the area i in the speckle image of the normal machine part according to the Fourier descriptor of the area i in the speckle image of the normal machine part and the Fourier descriptor of the area i in the speckle image of the machine part to be identified; calculating the difference between the speckle image of the machine part to be identified and the Fourier descriptor of the region j in the speckle image of the normal machine part according to the Fourier descriptor of the region j in the speckle image of the normal machine part and the Fourier descriptor of the region j in the speckle image of the machine part to be identified;
and calculating the difference of the area deformability between the area i and the area j according to the deformation edge length difference value between the area i and the area j in the machine part speckle image to be identified, the difference of the Fourier descriptor of the area i and the difference of the Fourier descriptor of the area j.
Further, the calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformability difference between any two regions includes:
calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformation degree difference between any two regions, wherein the formula is as follows:
Figure 183091DEST_PATH_IMAGE011
wherein the content of the first and second substances,
Figure 503214DEST_PATH_IMAGE012
for the difference value of the area i and the area j in the machine part image to be identified,
Figure 605162DEST_PATH_IMAGE013
representing the difference of the area brightness distribution of the area i and the area j in the image of the machine part to be identified
Figure 128678DEST_PATH_IMAGE008
The weight of (c);
Figure 200540DEST_PATH_IMAGE014
representing the difference of the roughness of the region textures of the region i and the region j in the image of the machine part to be identified
Figure 513185DEST_PATH_IMAGE015
The weight of (c);
Figure 395821DEST_PATH_IMAGE016
representing the area deformability of the area i and the area j in the speckle image of the machine part to be identified
Figure 921480DEST_PATH_IMAGE017
The weight of (c);
Figure 406820DEST_PATH_IMAGE018
is the space Euclidean distance between the area i and the area j in the machine part image to be identified.
Has the advantages that: the method judges whether the machine part has the fatigue cracks by an optical means, particularly a visible light means, and specifically obtains the fatigue cracks, realizes the judgment of whether the fatigue cracks exist on the machine part to be recognized according to the machine part image to be recognized, the machine part speckle image to be recognized, the normal machine part image and the normal machine part speckle image, and can recognize the areas where the fatigue cracks exist under the condition that the fatigue cracks exist; the invention discloses a fatigue crack identification method, belongs to an automatic identification method, and solves the problems of large occupation amount of human resources, low efficiency and the like when fatigue crack identification is performed manually in the prior art.
Drawings
FIG. 1 is a flow chart of a machine part fatigue crack identification method based on salient features of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention.
In order to realize automatic identification of the fatigue cracks of the machine part, the embodiment provides a machine part fatigue crack identification method based on the saliency characteristics, as shown in fig. 1, comprising the following steps:
(1) acquiring a target image, and performing preprocessing operation on the target image, wherein the preprocessing operation comprises denoising processing and sharpening processing; the target image comprises a machine part image to be identified, a machine part speckle image to be identified, a normal machine part image and a normal machine part speckle image;
in order to judge whether the machine part to be identified has the fatigue crack, the embodiment uses the CCD camera to acquire the RGB space images of the machine part to be identified, including the machine part image to be identified and the machine part speckle image to be identified. The machine part to be identified of the present embodiment refers to a machine part to be fatigue crack identified, on which there is a possibility that a fatigue crack exists or a possibility that a fatigue crack does not exist. In addition, in order to realize the judgment of whether the fatigue crack exists on the machine part to be identified, the CCD camera is also used in the embodiment to acquire RGB space images of the normal machine part, including the normal machine part image and the normal machine part speckle image.
The method comprises the steps of carrying out preprocessing operation on the obtained images (including machine part images to be recognized, machine part speckle images to be recognized, normal machine part images and normal machine part speckle images, wherein the images are all visible light images and are collectively called as target images), wherein the preprocessing operation comprises denoising and sharpening, and the preprocessing operation on the images is carried out in order to achieve the purposes of eliminating the influence caused by noise and partial external interference, enhancing the contrast and highlighting the edge contour characteristics and improving the accuracy of subsequent fatigue crack recognition.
Common image denoising methods include mean filtering, median filtering, gaussian filtering and the like, and the purpose of denoising an image is finally achieved by reconstructing original image information by using Contourlet transformation. In this embodiment, any one of the denoising methods may be selected to denoise the image, and the process of denoising and sharpening the image in this embodiment is the prior art and is not described herein again.
(2) Uniformly dividing the target image after the preprocessing operation into a plurality of areas, and calculating the texture roughness difference between any two areas on the machine part image to be recognized according to the machine part image to be recognized after the preprocessing operation and the normal machine part image; calculating the region brightness distribution difference between any two regions on the machine part image to be recognized according to the machine part image to be recognized after the preprocessing operation; calculating the regional deformation degree difference between any two regions on the speckle images of the machine parts to be identified according to the speckle images of the machine parts to be identified and the speckle images of the normal machine parts after the preprocessing operation;
if the machine part to be identified has fatigue cracks, the regional brightness distribution, the regional texture roughness and the regional deformation degree of the fatigue cracks of the machine part are changed compared with those of a normal machine part; the change is specifically as follows: the brightness distribution of the area at the fatigue crack of the machine part in the image of the machine part to be identified is obviously lower than that of the area of the intact part around the fatigue crack; the texture roughness of the area caused by the texture at the fatigue crack of the machine part in the image of the machine part to be identified is obviously higher than the texture roughness of the intact part area around the fatigue crack; the regional deformability of the fatigue crack of the machine part obtained through the speckle image of the machine part to be identified is greatly different from that of the intact part around the fatigue crack. In view of this, the present embodiment identifies fatigue cracks on the machine part to be identified according to the characteristics of regional brightness distribution, regional texture roughness, and regional deformability of the fatigue cracks compared to the intact positions of the machine part; next, a specific process will be explained.
The machine part image to be identified, the machine part speckle image to be identified, the normal machine part image and the normal machine part after the preprocessing operationThe machine part speckle images are uniformly divided into k × k regions, where k is the side length of each region, in this embodiment, the four images are all M × M, M is the side length of each image,
Figure 452267DEST_PATH_IMAGE019
and the number of the areas obtained after each image is divided is K. The areas in the four images are in a one-to-one correspondence relationship, namely the ith area in the machine part image to be identified, the ith area in the machine part speckle image to be identified, the ith area in the normal machine part image and the ith area in the normal machine part speckle image refer to the same area of the machine part, and i =1, 2, …, K. K =64 in the present embodiment.
Calculating the difference of the region texture roughness of a region i and a region j in a machine part image to be identified;
for the machine part image to be identified, the area texture roughness caused by the texture at the fatigue crack of the machine part is obviously higher than the area texture roughness caused by the texture of the intact part around the fatigue crack, so the embodiment calculates the difference of the area texture roughness of the area i and the area j in the machine part image to be identified so as to obtain the image with the different area texture roughness values
Figure 755072DEST_PATH_IMAGE015
To indicate that the user is not in a normal position,
Figure 380745DEST_PATH_IMAGE015
the larger the difference between the texture roughness of the regions i and j. In this embodiment, the change of the roughness of the texture in the region where the fatigue crack is located is measured by calculating the difference of fractal dimensions of different regions, and the formula is as follows:
Figure 810720DEST_PATH_IMAGE020
Figure 694362DEST_PATH_IMAGE021
Figure 43435DEST_PATH_IMAGE022
wherein the content of the first and second substances,
Figure 28840DEST_PATH_IMAGE023
Figure 246195DEST_PATH_IMAGE024
respectively is the fractal dimension of the areas i and j in the machine part image to be identified,
Figure 856780DEST_PATH_IMAGE006
Figure 517700DEST_PATH_IMAGE025
respectively the fractal dimensions of the areas i and j in the normal machine part image,
Figure 849455DEST_PATH_IMAGE026
the difference of fractal dimension of the region i in the machine part image to be identified and the normal machine part image,
Figure 870501DEST_PATH_IMAGE027
is the difference of fractal dimension of the region j in the image of the machine part to be identified and the image of the normal machine part.
The formula for calculating the fractal dimension of the regions i and j in the normal machine part image in the embodiment is as follows:
Figure 338522DEST_PATH_IMAGE005
Figure 895975DEST_PATH_IMAGE028
wherein the content of the first and second substances,
Figure 105239DEST_PATH_IMAGE004
to cover the number of boxes in area i in the normal machine part image,
Figure 539763DEST_PATH_IMAGE029
is the number of boxes covering area j in the normal machine part image.
The formula for calculating the fractal dimensions of the areas i and j in the image of the machine part to be identified in the embodiment is as follows:
Figure 3236DEST_PATH_IMAGE030
Figure 599434DEST_PATH_IMAGE031
wherein the content of the first and second substances,
Figure 295994DEST_PATH_IMAGE032
in order to cover the number of boxes of the area i in the image of the machine part to be identified,
Figure 265700DEST_PATH_IMAGE033
the number of boxes covering the area j in the image of the machine part to be identified.
The calculation process of the number of boxes covering the area i in the normal machine part image in this embodiment is as follows:
establishing a three-dimensional coordinate system for a region i in the normal machine part image, wherein the length and the width of the region i in the normal machine part image respectively represent an x axis and a y axis in the three-dimensional coordinate system, and the gray value of each pixel in the region i in the normal machine part image represents a z axis; the height h of the box is calculated using the following formula:
Figure 849259DEST_PATH_IMAGE034
where G represents the total gray level of the normal machine part image.
The height of the box is obtained through calculation, the box is stacked in each area, the number of the boxes corresponding to the maximum gray level value in the area i in the normal machine part image is recorded as T, namely the area in the normal machine part image is divided into areasThe maximum value of the gray scale in the field i falls in the Tth box; the box number corresponding to the minimum gray level in the region i in the normal machine part image is recorded as
Figure 616358DEST_PATH_IMAGE002
I.e. the minimum value of the gray level in the region i in the image of the normal machine part falls on
Figure 534635DEST_PATH_IMAGE002
In one box; then the number of boxes covering area i in the normal machine part image is:
Figure 451907DEST_PATH_IMAGE035
wherein the content of the first and second substances,
Figure 751957DEST_PATH_IMAGE004
is the number of boxes covering area i in the image of the normal machine part.
According to the method for calculating the number of boxes covering the area i in the normal machine part image, the number of boxes covering the area i in the machine part image to be identified can be calculated, and the formula is as follows:
Figure 345749DEST_PATH_IMAGE036
wherein the content of the first and second substances,
Figure 361110DEST_PATH_IMAGE037
the number of boxes corresponding to the maximum gray value in the area i in the machine part image to be identified is set, namely the maximum gray value in the area i in the machine part image to be identified falls at the first
Figure 816493DEST_PATH_IMAGE037
In one box;
Figure 233699DEST_PATH_IMAGE038
the number of boxes corresponding to the minimum value of the gray scale in the area i in the image of the machine part to be identified, i.e. the number of boxes to be identifiedIdentifying a minimum gray level in region i in an image of a machine part falling on
Figure 467234DEST_PATH_IMAGE038
In one box.
Similarly, according to the method for calculating the number of boxes covering the area i in the normal machine part image, the number of boxes covering the area j in the normal machine part image can be calculated
Figure 232540DEST_PATH_IMAGE029
And the number of boxes covering the area j in the image of the machine part to be identified
Figure 491615DEST_PATH_IMAGE033
. This implementation does not describe the calculation process again.
Calculating the difference of the area brightness distribution of the area i and the area j in the machine part image to be identified;
for the machine part image to be identified, the regional brightness distribution at the fatigue crack of the machine part is obviously lower than that of the intact part around the fatigue crack, so the embodiment performs the regional brightness distribution calculation by converting the machine part image to be identified from the RGB space to the HSV space. The values of R, G, B are first normalized so that the regional brightness distribution values in HSV space can be obtained, namely:
Figure 887961DEST_PATH_IMAGE039
wherein, V is the lightness, max is for getting the maximum value, R is the color value of red passageway, G is the color value of green passageway, B is the color value of blue passageway.
The method comprises the following steps of removing the suspected area brightness abnormal values in an area i and an area j in an image of a machine part to be identified according to a box line graph, wherein the process of removing the abnormal values according to the box line graph is the prior art and is not repeated here; then, respectively calculating the average value of the area brightness distribution of the area i and the area j after the abnormal value elimination in the image of the machine part to be identified, and calculating the difference of the area brightness distribution of the area i and the area j in the image of the machine part to be identified by using the following formula:
Figure 698922DEST_PATH_IMAGE040
wherein the content of the first and second substances,
Figure 829820DEST_PATH_IMAGE041
the difference of the area brightness distribution of the area i and the area j in the image of the machine part to be identified is represented, the larger the value is, the larger the difference of the area brightness distribution of the area i and the area j is,
Figure 876274DEST_PATH_IMAGE009
and
Figure 282720DEST_PATH_IMAGE010
and respectively representing the area brightness distribution average values of the area i and the area j in the image of the machine part to be identified.
And thirdly, calculating the difference of the area deformability of the area i and the area j in the speckle image of the machine part to be identified.
For the speckle image of the machine part to be identified, the deformation degree of the area at the fatigue crack is obviously different from the deformation degree of the area of the intact part around the fatigue crack, so that the fatigue crack can be measured through the deformation size. Firstly, converting a gray scale image of a machine part speckle image to be identified into a binary image by a Floyd-Steinberg method, secondly, connecting deformation parts in K divided areas in the machine part speckle image to be identified (the machine part speckle image to be identified is converted into the binary image) in a point form, wherein each point is uniformly distributed in an interval, the number of the points in each area is 10 in the embodiment, then, the number of the corresponding points in the machine part speckle image to be identified is 10K =640, and coordinates of each point in the machine part speckle image to be identified are calculated according to the point form
Figure 139949DEST_PATH_IMAGE042
Is expressed in terms of, i.e.
Figure 7411DEST_PATH_IMAGE043
Figure 467342DEST_PATH_IMAGE044
Is the coordinate pair corresponding to the nth point,
Figure 589013DEST_PATH_IMAGE045
is the abscissa corresponding to the nth point,
Figure 473268DEST_PATH_IMAGE046
is the ordinate corresponding to the nth point.
Each coordinate pair can be treated as a complex number, that is:
Figure 562447DEST_PATH_IMAGE047
wherein the content of the first and second substances,
Figure 560490DEST_PATH_IMAGE048
in units of imaginary numbers.
For an area i in the speckle image of the machine part to be identified, calculating a corresponding Fourier descriptor according to a coordinate pair corresponding to 10 points included in the area i, wherein the formula is as follows:
Figure 536667DEST_PATH_IMAGE049
wherein the content of the first and second substances,
Figure 984966DEST_PATH_IMAGE050
and (4) performing Fourier description on an area i in the speckle image of the machine part to be identified.
Similarly, the fourier descriptor of the area i in the speckle image of the normal machine part can be calculated, and then the difference between the fourier descriptor of the area i in the speckle image of the normal machine part and the fourier descriptor of the area i in the speckle image of the machine part to be identified
Figure 702386DEST_PATH_IMAGE051
Comprises the following steps:
Figure 647995DEST_PATH_IMAGE052
wherein the content of the first and second substances,
Figure 993526DEST_PATH_IMAGE053
a fourier plot of region i in the speckle image of a normal machine part.
According to the method, the difference between the Fourier descriptor of the area j in the speckle image of the normal machine part and the Fourier descriptor of the area j in the speckle image of the machine part to be identified can be calculated
Figure 488092DEST_PATH_IMAGE054
In this embodiment, details of the specific solving process are not described again.
Because all there are 10 evenly distributed coordinate points in interval i and interval j in the machine part speckle image of waiting to discern, consequently can calculate respectively waiting to discern the distance between the most apart from coordinate point in regional i in the machine part speckle image of waiting to discern to and wait to discern the distance between the most apart from coordinate point in regional j in the machine part speckle image, thereby treat that the regional deformability of discerning in machine part speckle image region i and region j measures, specific calculation formula is as follows:
Figure 37016DEST_PATH_IMAGE055
Figure 32654DEST_PATH_IMAGE056
Figure 576899DEST_PATH_IMAGE057
wherein the content of the first and second substances,
Figure 239437DEST_PATH_IMAGE058
the maximum euclidean distance of 10 coordinate points in region i of the speckle image of a normal machine part,
Figure 541237DEST_PATH_IMAGE059
the maximum Euclidean distance of 10 coordinate points in the area i of the speckle image of the machine part to be identified,
Figure 809407DEST_PATH_IMAGE060
the maximum euclidean distance of 10 coordinate points in region j of the speckle image of a normal machine part,
Figure 4896DEST_PATH_IMAGE061
the maximum Euclidean distance of 10 coordinate points in an area j of the speckle image of the machine part to be identified;
Figure 716631DEST_PATH_IMAGE062
and the difference value of the deformed edge length between the area i and the area j in the speckle image of the machine part to be identified is obtained.
Calculating the difference of the area deformability of the interval i and the interval j in the speckle image of the machine part to be identified by using the following formula:
Figure 489415DEST_PATH_IMAGE063
wherein the content of the first and second substances,
Figure 427854DEST_PATH_IMAGE017
the difference value of the area deformability of the interval i and the interval j in the speckle image of the machine part to be identified,
Figure 618795DEST_PATH_IMAGE017
the larger the deformation difference between the area i and the area j in the speckle image of the machine part to be identified is.
(3) Calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformability difference between any two regions, and calculating the significance of each region according to the difference between any two regions;
through the step (2), the texture roughness difference and the area brightness distribution difference between any two areas on the machine part image to be identified and the area deformation degree difference between any two areas on the machine part speckle image to be identified are obtained; then, the difference between any two regions is calculated according to the texture roughness difference, the region brightness distribution difference and the region deformability difference between any two regions, and the formula is as follows:
Figure 750699DEST_PATH_IMAGE064
wherein the content of the first and second substances,
Figure 886145DEST_PATH_IMAGE012
for the difference value of the area i and the area j in the machine part image to be identified,
Figure 512429DEST_PATH_IMAGE012
the larger the difference between the area i and the area j in the image of the machine part to be identified is;
Figure 807145DEST_PATH_IMAGE013
representing the weight of the regional brightness distribution difference of the region i and the region j in the machine part image to be identified;
Figure 716807DEST_PATH_IMAGE014
representing the weight of the difference of the region texture roughness of the region i and the region j in the machine part image to be identified;
Figure 214916DEST_PATH_IMAGE016
the weight of the area deformation degree of the area i and the area j in the speckle image of the machine part to be identified is set in the embodiment
Figure 503946DEST_PATH_IMAGE013
Figure 653167DEST_PATH_IMAGE014
And
Figure 2240DEST_PATH_IMAGE016
all of the values of (a) and (b) are 1/3, and may be set by itself at the time of application as another embodiment.
Figure 990575DEST_PATH_IMAGE018
The Euclidean distance of the region i and the region j in the image of the machine part to be identified is used, and the Euclidean distance of the region j in the image of the machine part to be identified is specifically used as the Euclidean distance of the region j in the image of the machine part to be identified.
After the difference between any two regions is obtained through calculation, the significance value of each region can be calculated, and the formula is as follows:
Figure 755399DEST_PATH_IMAGE065
wherein the content of the first and second substances,
Figure 900073DEST_PATH_IMAGE066
for the saliency value of the area i in the image of the machine part to be identified,
Figure 279102DEST_PATH_IMAGE066
the larger the value of (b), the more likely the region is to be a fatigue crack region.
(4) And comparing the significance value corresponding to each region with a set threshold value, if the significance value corresponding to a certain region is greater than the set threshold value, judging that the fatigue crack exists on the machine part to be identified, and judging that the region with the significance value greater than the set threshold value is the region with the fatigue crack.
In this embodiment, the significance value corresponding to each region is obtained and compared with a set threshold, where the threshold is 0.75 in this embodiment, and if the significance value corresponding to a certain region is greater than 0.75, it is determined that a fatigue crack exists in the region; on the contrary, it is judged that no fatigue crack is present in the region. If the significance values of all the areas are not more than 0.75, judging that the machine part to be identified does not have fatigue cracks; and if the significance value of the region is more than 0.75, judging that the machine part to be identified has the fatigue crack.
The embodiment realizes the judgment of whether the fatigue crack exists on the machine part to be identified according to the machine part image to be identified, the machine part speckle image to be identified, the normal machine part image and the normal machine part speckle image, and can identify the area where the fatigue crack exists under the condition that the fatigue crack exists; the fatigue crack identification method belongs to an automatic identification method, and solves the problems of large occupation amount of human resources, low efficiency and the like when the existing fatigue crack identification is carried out manually.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; the modifications or substitutions do not make the essence of the corresponding technical solutions deviate from the technical solutions of the embodiments of the present application, and are included in the protection scope of the present application.

Claims (7)

1. A machine part fatigue crack identification method based on salient features is characterized by comprising the following steps:
acquiring a target image, and performing preprocessing operation on the target image, wherein the preprocessing operation comprises denoising processing and sharpening processing; the target image comprises a machine part image to be identified, a machine part speckle image to be identified, a normal machine part image and a normal machine part speckle image;
uniformly dividing the target image after the preprocessing operation into a plurality of areas, and calculating the texture roughness difference between any two areas on the machine part image to be recognized according to the machine part image to be recognized after the preprocessing operation and the normal machine part image; calculating the region brightness distribution difference between any two regions on the machine part image to be recognized according to the machine part image to be recognized after the preprocessing operation; calculating the regional deformation degree difference between any two regions on the speckle images of the machine parts to be identified according to the speckle images of the machine parts to be identified and the speckle images of the normal machine parts after the preprocessing operation;
calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformability difference between any two regions, and calculating the significance of each region according to the difference between any two regions;
and comparing the significance value corresponding to each region with a set threshold value, if the significance value corresponding to a certain region is greater than the set threshold value, judging that the fatigue crack exists on the machine part to be identified, and judging that the region with the significance value greater than the set threshold value is the region with the fatigue crack.
2. The machine part fatigue crack identification method based on the saliency feature of claim 1, wherein the calculating the texture roughness difference between any two areas on the machine part image to be identified according to the machine part image to be identified after the preprocessing operation and the normal machine part image comprises:
calculating the number of boxes covering the area i in the normal machine part image, the number of boxes covering the area j in the normal machine part image, the number of boxes covering the area i in the machine part image to be identified and the number of boxes covering the area j in the machine part image to be identified;
calculating the fractal dimension of the area i in the normal machine part image according to the number of boxes covering the area i in the normal machine part image; calculating the fractal dimension of the region j in the normal machine part image according to the number of boxes covering the region j in the normal machine part image; calculating the fractal dimension of the area i in the machine part image to be identified according to the number of boxes covering the area i in the machine part image to be identified; calculating the fractal dimension of the region j in the machine part image to be identified according to the number of boxes covering the region j in the machine part image to be identified;
calculating the difference between the fractal dimension of the area i in the image of the machine part to be identified and the fractal dimension of the area i in the image of the normal machine part according to the fractal dimension of the area i in the image of the normal machine part and the fractal dimension of the area i in the image of the machine part to be identified; calculating the difference between the fractal dimension of the area j in the image of the machine part to be identified and the fractal dimension of the area j in the image of the normal machine part according to the fractal dimension of the area j in the image of the machine part to be identified and the fractal dimension of the area j in the image of the machine part to be identified;
and calculating the texture roughness difference of the region i and the region j on the machine part image to be identified according to the difference of the fractal dimensions of the region i and the difference of the fractal dimensions of the region j.
3. The method for identifying fatigue cracks of machine parts based on salient features according to claim 2, wherein the step of calculating the number of boxes covering the area i in the normal machine part image comprises the following steps:
establishing a three-dimensional coordinate system for a region i in the normal machine part image, wherein the length and the width of the region i in the normal machine part image respectively represent an x axis and a y axis in the three-dimensional coordinate system, and the gray value of each pixel in the region i in the normal machine part image represents a z axis; the height h of the box is calculated using the following formula:
Figure DEST_PATH_IMAGE001
wherein G represents the total gray level of the normal machine part image, M is the side length of the normal machine part image, and k is the side length of each region;
calculating to obtain the height of the box, stacking the box in each area, and recording the number of boxes corresponding to the maximum gray level value in the area i in the normal machine part image as T; the box number corresponding to the minimum gray level in the region i in the normal machine part image is recorded as
Figure 341160DEST_PATH_IMAGE002
(ii) a Then the number of boxes covering area i in the normal machine part image is:
Figure 118623DEST_PATH_IMAGE003
wherein the content of the first and second substances,
Figure 301080DEST_PATH_IMAGE004
is the number of boxes covering area i in the image of the normal machine part.
4. The machine part fatigue crack identification method based on the saliency feature of claim 2, wherein the calculating the fractal dimension of the region i in the normal machine part image according to the box number covering the region i in the normal machine part image comprises:
the formula for calculating the fractal dimension of the region i in the normal machine part image is as follows:
Figure 796784DEST_PATH_IMAGE005
wherein the content of the first and second substances,
Figure 965466DEST_PATH_IMAGE004
to cover the number of boxes in area i in the normal machine part image,
Figure 863014DEST_PATH_IMAGE006
is the fractal dimension of a region i in the normal machine part image, and k is the side length of each region.
5. The machine part fatigue crack identification method based on the saliency feature of claim 1, wherein the calculation of the regional brightness distribution difference between any two regions on the machine part image to be identified according to the machine part image to be identified after the preprocessing operation comprises:
converting the machine part image to be identified from an RGB space into an HSV space;
removing the suspected area brightness abnormal values in the area i and the area j in the machine part image to be identified according to the box line graph; then, respectively calculating the average value of the area brightness distribution of the area i and the area j after the abnormal value elimination in the machine part image to be identified, and calculating the area brightness distribution difference of the area i and the area j in the machine part image to be identified by using the following formula:
Figure DEST_PATH_IMAGE007
wherein the content of the first and second substances,
Figure 389942DEST_PATH_IMAGE008
the difference of the brightness distribution of the areas i and j in the image of the machine part to be identified is shown,
Figure 904100DEST_PATH_IMAGE009
representing the area brightness distribution average value of an area i in the machine part image to be identified;
Figure 729668DEST_PATH_IMAGE010
and representing the area brightness distribution average value of the area j in the machine part image to be identified.
6. The machine part fatigue crack identification method based on the saliency feature of claim 1, wherein said calculating the difference of the regional deformability between any two regions on the speckle image of the machine part to be identified according to the speckle image of the machine part to be identified after the preprocessing operation and the speckle image of the normal machine part comprises:
calculating a Fourier descriptor of an area i in the normal machine part speckle image, a box number Fourier descriptor of an area j in the normal machine part speckle image, a Fourier descriptor of the area i in the machine part speckle image to be identified and a Fourier descriptor of the area j in the machine part image to be identified;
calculating the difference between the speckle image of the machine part to be identified and the Fourier descriptor of the area i in the speckle image of the normal machine part according to the Fourier descriptor of the area i in the speckle image of the normal machine part and the Fourier descriptor of the area i in the speckle image of the machine part to be identified; calculating the difference between the speckle image of the machine part to be identified and the Fourier descriptor of the region j in the speckle image of the normal machine part according to the Fourier descriptor of the region j in the speckle image of the normal machine part and the Fourier descriptor of the region j in the speckle image of the machine part to be identified;
and calculating the difference of the area deformability between the area i and the area j according to the deformation edge length difference value between the area i and the area j in the machine part speckle image to be identified, the difference of the Fourier descriptor of the area i and the difference of the Fourier descriptor of the area j.
7. The method for identifying fatigue cracks of machine parts based on salient features according to claim 1, wherein the step of calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformation degree difference between any two regions comprises the following steps:
calculating the difference between any two regions according to the texture roughness difference, the region brightness distribution difference and the region deformation degree difference between any two regions, wherein the formula is as follows:
Figure 216144DEST_PATH_IMAGE011
wherein the content of the first and second substances,
Figure 976290DEST_PATH_IMAGE012
for the difference value of the area i and the area j in the machine part image to be identified,
Figure 977744DEST_PATH_IMAGE013
representing the difference of the area brightness distribution of the area i and the area j in the image of the machine part to be identified
Figure 612862DEST_PATH_IMAGE008
The weight of (c);
Figure 953845DEST_PATH_IMAGE014
representing the difference of the roughness of the region textures of the region i and the region j in the image of the machine part to be identified
Figure DEST_PATH_IMAGE015
The weight of (c);
Figure 619312DEST_PATH_IMAGE016
representing the area deformability of the area i and the area j in the speckle image of the machine part to be identified
Figure 311325DEST_PATH_IMAGE017
The weight of (c);
Figure 750134DEST_PATH_IMAGE018
is the space Euclidean distance between the area i and the area j in the machine part image to be identified.
CN202211009269.7A 2022-08-23 2022-08-23 Machine part fatigue crack identification method based on saliency characteristics Pending CN115082479A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211009269.7A CN115082479A (en) 2022-08-23 2022-08-23 Machine part fatigue crack identification method based on saliency characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211009269.7A CN115082479A (en) 2022-08-23 2022-08-23 Machine part fatigue crack identification method based on saliency characteristics

Publications (1)

Publication Number Publication Date
CN115082479A true CN115082479A (en) 2022-09-20

Family

ID=83244953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211009269.7A Pending CN115082479A (en) 2022-08-23 2022-08-23 Machine part fatigue crack identification method based on saliency characteristics

Country Status (1)

Country Link
CN (1) CN115082479A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256547A (en) * 2017-05-26 2017-10-17 浙江工业大学 A kind of face crack recognition methods detected based on conspicuousness
CN113888462A (en) * 2021-08-27 2022-01-04 中国电力科学研究院有限公司 Crack identification method, system, readable medium and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256547A (en) * 2017-05-26 2017-10-17 浙江工业大学 A kind of face crack recognition methods detected based on conspicuousness
CN113888462A (en) * 2021-08-27 2022-01-04 中国电力科学研究院有限公司 Crack identification method, system, readable medium and storage medium

Similar Documents

Publication Publication Date Title
CN114723701B (en) Gear defect detection method and system based on computer vision
CN112950508B (en) Drainage pipeline video data restoration method based on computer vision
CN110349126B (en) Convolutional neural network-based marked steel plate surface defect detection method
CN116758061B (en) Casting surface defect detection method based on computer vision
CN109596634B (en) Cable defect detection method and device, storage medium and processor
CN109523529B (en) Power transmission line defect identification method based on SURF algorithm
CN107490582B (en) Assembly line workpiece detection system
CN115020267B (en) Semiconductor surface defect detection method
CN112330628A (en) Metal workpiece surface defect image detection method
CN108256521B (en) Effective area positioning method for vehicle body color identification
CN115619793B (en) Power adapter appearance quality detection method based on computer vision
CN115063430B (en) Electric pipeline crack detection method based on image processing
CN113793337B (en) Locomotive accessory surface abnormal degree evaluation method based on artificial intelligence
CN116883408B (en) Integrating instrument shell defect detection method based on artificial intelligence
CN115953398B (en) Defect identification method for strip steel surface
CN115131359A (en) Method for detecting pitting defects on surface of metal workpiece
CN115272336A (en) Metal part defect accurate detection method based on gradient vector
CN113252103A (en) Method for calculating volume and mass of material pile based on MATLAB image recognition technology
CN115272350A (en) Method for detecting production quality of computer PCB mainboard
CN115018785A (en) Hoisting steel wire rope tension detection method based on visual vibration frequency identification
CN114359251A (en) Automatic identification method for concrete surface damage
CN113269758A (en) Cigarette appearance detection method and test device based on machine vision
CN108269264B (en) Denoising and fractal method of bean kernel image
CN115830027B (en) Machine vision-based automobile wire harness cladding defect detection method
CN115082479A (en) Machine part fatigue crack identification method based on saliency characteristics

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination