CN108305259A - Automatic wear-particle type identification method based on multi-texture feature fusion - Google Patents

Automatic wear-particle type identification method based on multi-texture feature fusion

Info

Publication number
CN108305259A
CN108305259A (application CN201810118514.5A)
Authority
CN
China
Prior art keywords
image
abrasive grain
formula
debris
abrasive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810118514.5A
Other languages
Chinese (zh)
Other versions
CN108305259B (en
Inventor
武通海
邵涛
王硕
陈峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201810118514.5A priority Critical patent/CN108305259B/en
Publication of CN108305259A publication Critical patent/CN108305259A/en
Application granted granted Critical
Publication of CN108305259B publication Critical patent/CN108305259B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • G06T7/41Analysis of texture based on statistical description of texture
    • G06T7/45Analysis of texture based on statistical description of texture using co-occurrence matrix computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image


Abstract

An automatic wear-particle type identification method based on multi-texture feature fusion. For the dark, bright, and mixed-color particles that appear in ferrograph debris images, the method first applies color-space coordinate conversion and compares the particle images in the RGB and HSV color spaces. Adaptive thresholding, an eight-connected-region algorithm, and morphological processing are combined to segment the particles from the background; the target particle is extracted, the segmentation results under the different color-space components are compared, and the image with the best segmentation is selected as the target debris image for further study. Texture feature parameters of two different characterization styles are then extracted and fused by principal component analysis, and the fused texture feature parameters are used as the input vector of a neural network model to realize automatic identification of severe-sliding particles and fatigue particles. The invention addresses two difficult problems in wear debris analysis: segmenting particles of multiple colors, and the low recognition accuracy for severe-sliding versus fatigue particles.

Description

Automatic wear-particle type identification method based on multi-texture feature fusion
Technical field
The invention belongs to ferrographic analysis technology in the field of machinery fault diagnosis, in particular to digitized automatic identification for off-line ferrograph images, and more particularly to an automatic wear-particle type identification method based on multi-texture feature fusion.
Background technology
Mechanical part failures are caused mainly by wear, fatigue, and corrosion, and wear is the leading cause of mechanical breakdown: more than 80% of equipment failures are caused by wear. Wear particles (debris) are the direct product of surface wear, generated as the contact surfaces of a friction pair move relative to each other during the wear process and interact through the medium between the interfaces; they carry rich wear information. By studying the particles in the lubricating fluid, their quantity, morphology, concentration, and size-distribution information can be analyzed quantitatively or qualitatively; from this the wear state of the equipment can be assessed, the factors inducing wear failure identified, and the development of the machine's health predicted, improving operational reliability, safety, and maintainability. Wear debris analysis (WDA) has therefore become one of the most effective methods for equipment condition monitoring and fault diagnosis.
Among the many WDA techniques, ferrography based on debris-image analysis stands out for its high analysis precision and good accuracy: it can accurately reveal the wear mechanism and wear type of each particle and thus provide deep insight into the cause of wear, and it has become the main technical basis for wear analysis of mechanical systems. The common wear types during equipment operation are normal wear, adhesive wear, abrasive wear, fatigue wear, and corrosive wear; the corresponding particle types in ferrographic analysis are mainly normal particles, severe-sliding particles, cutting particles, fatigue particles, and spherical particles. Different particle types have different morphological features. The core of off-line ferrographic analysis is human interpretation of particle morphology, texture, color, size, and similar features, relying mainly on domain experts or highly experienced analysts; operators therefore need a high level of professional knowledge, which makes the technique hard to popularize and apply. Because the result of such qualitative analysis often depends heavily on the analyst's subjective judgment and experience, large errors are possible, and manual analysis is inefficient, which to a certain extent wastes human resources. Automatic particle identification techniques have consequently emerged for off-line ferrograph images: Shanghai Maritime University proposed a method for ferrographic particle texture feature extraction and pattern recognition (publication No. CN104484675A), and a PLA General Staff aviation research institute proposed an interactive wear-particle image annotation method (publication No. CN102768730A). The continuing development of computer technology and pattern recognition methods keeps pushing off-line ferrographic debris-image analysis toward automation and digitization.
At present, the most widely used approach for particle segmentation in ferrograph images is gray-level thresholding, which is simple and practical. However, particles fall broadly into three classes: dark, bright, and mixed-color, and a single ferrograph image may simultaneously contain bright particles whose intensity is above the background gray level and dark particles whose intensity is below it; segmenting with a single fixed threshold is then incomplete and degrades the subsequent particle feature extraction and recognition. Researchers at Tsinghua University, in "A debris-image segmentation method based on background color recognition," first use gray-level thresholding twice to extract the dark and the bright particles from a color debris image, subtract each extraction result from the original image to obtain a preprocessed background image, then extract the mean and standard deviation of the red, green, and blue gray-level channels of the background and, combined with per-channel error ranges obtained from repeated experiments, remove the background to obtain the debris image. This method requires multiple thresholding passes and is inefficient, and its empirical values are relatively fixed: for different research objects and different ferroscopes, repeated experiments are needed to re-derive the empirical error values. Guo et al. of the Naval Aeronautical Engineering Institute, in "Debris image segmentation based on fractal features," combine the fractal dimension, multifractal dimension, and gray-level information of the debris image with a self-organizing feature map neural network to segment debris images. This method must extract a large number of fractal features from debris images to train the neural network, which is time-consuming and inefficient, and is prone to defects such as holes. In addition, fatigue particles and severe-sliding particles differ little in size and shape, and their surface textures have a certain similarity; choosing suitable texture features is therefore the key issue for improving the recognition accuracy of fatigue and severe-sliding particles. Wu Zhenfeng et al. of Nanjing University of Aeronautics and Astronautics, in "A quantitative description system for the microscopic morphology of wear particles," studied more than 100 particle parameters, including shape parameters, color (RGB) parameters, texture parameters, and fractal parameters. In practical application, however, texture parameters based on the gray-level co-occurrence matrix alone discriminate poorly between severe-sliding and fatigue particles, so the automatic recognition accuracy for these two types remains low.
In summary, automatic particle identification faces difficulties such as target particle extraction and particle feature extraction, which keep the recognition accuracy too low to meet practical industrial requirements. Combining modern computer technology with ferrographic diagnosis to realize a fast, effective quantitative method for automatic particle identification has therefore become very urgent.
Invention content
To overcome the technical bottlenecks of existing automatic debris analysis, the purpose of the present invention is to provide an automatic wear-particle type identification method based on multi-texture feature fusion. For the dark, bright, and mixed-color particles that appear in debris images, several image-processing techniques are combined to segment the particles, separate them from the background, and extract the target particle. To handle the similarity between fatigue particles and severe-sliding particles, the particle texture is obtained by fusing gray-level co-occurrence matrix features with Tamura texture feature parameters, and machine learning is then used to classify and identify the particles, providing a reference for studying the wear state and wear mechanism of critical equipment.
To achieve the above object, the present invention adopts the following technical scheme that:
An automatic wear-particle type identification method based on multi-texture feature fusion comprises the following steps:
Step 1: Select from a standard wear-particle image library an equal number of particle images of each typical class as training samples;
Step 2: Read the RGB values of each debris image, convert the RGB image to grayscale, and also convert the original RGB image to the debris images corresponding to the S and V components of the HSV color space;
Step 3: Apply adaptive threshold segmentation and morphological processing to the grayscale version of the RGB image and to the S- and V-component debris images; use a region-marking connectivity algorithm to find the particle area in each of the three debris images, and select the debris image with the largest particle area;
The concrete steps are as follows:
S1. Threshold each of the three grayscale particle images:
1) Let T be the segmentation threshold between the foreground and background of the debris image, with value range 0 to 255; evaluate T at every value from 0 to 255;
2) For the current threshold T, divide the image into foreground and background; let w0 be the proportion of foreground pixels and u0 their mean gray level, w1 the proportion of background pixels and u1 their mean gray level, and u the overall mean gray level of the image;
Compute the between-class variance of foreground and background according to formula (4):
Formula (4): δ²(T) = w0 × (u0 − u)² + w1 × (u1 − u)²
3) Compute the variance δ²(T) for every T from 0 to 255, compare the 256 variance values, and take the T with the largest between-class variance as the segmentation threshold of the debris image;
4) Binarize the grayscale image according to formula (5) to obtain a mask in which the particle is black and the background white:
Formula (5): g(x, y) = 0 when f(x, y) ≤ T, and g(x, y) = 255 when f(x, y) > T
where f(x, y) denotes the gray level of the debris image at (x, y);
S2. Apply morphological processing to the resulting binary image: an "inversion" operation, a "closing" operation, and a "hole filling" operation. The inversion prepares the image for closing; the closing operation connects and smooths broken points along the particle edges; hole filling fills the holes that appear inside the closed particle regions, ensuring the completeness of the segmented debris image;
S3. A debris image may contain several particles. The "eight-connected region algorithm" is used to keep the particle with the largest area as the object of study and reject the other, smaller particles. The scanning operator is a 3×3 mask whose center is the current pixel. The concrete steps are as follows:
1) Examine the left, upper-left, upper, and upper-right neighbors of the current pixel; if none of them is labeled, the pixel starts a new region;
2) If both the left and the upper-right neighbors are labeled, give the current pixel the smaller of the two labels and change the larger label to it;
3) If both the upper-left and the upper-right neighbors are labeled, give the current pixel the smaller of the two labels and change the larger label to it;
4) Otherwise, in the order left, upper-left, upper, upper-right, give the current pixel the first label found among the four neighbors;
5) After all particles in the image are labeled, count the area of each particle and retain the largest particle in the image;
S4. Compare the three resulting debris images, each retaining its largest particle, and select the one whose segmented particle area is largest as the object of further debris analysis;
Step 4: Extract from the selected debris image the gray-level co-occurrence matrix texture features, which are based on probability statistics, obtaining four parameters closely related to particle texture: energy, inertia, correlation, and entropy;
Step 5: Extract from the selected debris image the Tamura texture features, which are based on human visual perception, obtaining three parameters closely related to particle texture: coarseness, contrast, and directionality;
Step 6: Fuse the gray-level co-occurrence matrix texture features and the Tamura texture features using principal component analysis (PCA); use the fused texture feature parameters as the input features to train a neural network, building the particle neural network classifier;
Step 7: For a debris image to be identified, repeat steps 2 to 5 and feed the extracted particle feature vector to the trained neural network classifier, realizing automatic classification and identification of the particle.
The concrete steps of step 2 are as follows:
S1. Convert the RGB values of each pixel (x, y) of the debris image to a gray level according to formula (1) to obtain the grayscale image:
Formula (1): f(x, y) = 0.299 R(x, y) + 0.587 G(x, y) + 0.114 B(x, y)
where R(x, y), G(x, y), B(x, y) denote the R, G, B component values of the pixel at coordinate (x, y) in the image; R(x, y), G(x, y), B(x, y) have the same meaning in the formulas of the following steps;
S2. Convert the RGB values of each pixel (x, y) of the debris image to color-space coordinates according to formula (2) to obtain the V-component debris image:
Formula (2): V(x, y) = max(R(x, y), G(x, y), B(x, y))
S3. Convert the RGB values of each pixel (x, y) of the debris image to color-space coordinates according to formula (3) to obtain the S-component debris image:
Formula (3): S(x, y) = [max(R, G, B) − min(R, G, B)] / max(R, G, B), with S(x, y) = 0 when max(R, G, B) = 0
The concrete steps of step 6 are as follows:
S1. From the selected particle texture feature parameters, build the texture feature parameter matrix according to formula (19):
Formula (19): X = [x_ij], i = 1, 2, …, n; j = 1, 2, …, p
where x_ij is a texture feature parameter value, i indexes the sample particles, and j indexes the texture feature parameters;
S2. Compute the correlation matrix of the texture feature parameters according to formula (20):
Formula (20): R = [r_jk], r_jk = Σᵢ (x_ij − x̄_j)(x_ik − x̄_k) / sqrt(Σᵢ (x_ij − x̄_j)² · Σᵢ (x_ik − x̄_k)²)
S3. Compute the eigenvalues and eigenvectors of the texture feature correlation matrix according to formula (21):
Formula (21): (λI − R)U = 0
where I is the identity matrix, the eigenvalues λ_i (i = 1, 2, …, n) satisfy λ1 ≥ λ2 ≥ … ≥ λn ≥ 0, and U_i = [u_i1 u_i2 … u_in] is the eigenvector belonging to eigenvalue λ_i;
S4. Compute the contribution rate of each surface-texture component according to formula (22):
Formula (22): η_i = λ_i / Σₖ λ_k, k = 1, 2, …, n
S5. Compute the comprehensive surface-texture parameters of the particle according to formula (23):
Formula (23): Z_{i×k} = X Uᵀ, i = 1, 2, …, n; k = 1, 2, …, n
where Z_{i×k} is the principal-component feature matrix corresponding to X_{i×j}, and Z = [z_i1 z_i2 … z_ik] is the final fused texture feature vector of the particle;
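Steps S1-S5 amount to standard PCA on the correlation matrix of the seven texture parameters. A minimal NumPy sketch (function and variable names are illustrative; standardizing the columns before projection is an assumption implied by the use of the correlation rather than the covariance matrix):

```python
import numpy as np

def pca_fuse(X, k):
    """Fuse texture parameters (rows = sample particles, columns = the
    7 texture parameters) into k principal-component features:
    correlation matrix R (formula 20), eigen-decomposition (formula 21),
    contribution rates (formula 22), projection (formula 23)."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize columns
    R = np.corrcoef(X, rowvar=False)                   # formula (20)
    vals, vecs = np.linalg.eigh(R)                     # formula (21)
    order = np.argsort(vals)[::-1]                     # eigenvalues descending
    vals, vecs = vals[order], vecs[:, order]
    contrib = vals / vals.sum()                        # formula (22)
    Z = Xs @ vecs[:, :k]                               # formula (23): fused features
    return Z, contrib
```

In practice k would be chosen so that the cumulative contribution rate exceeds a target (e.g. 85-95%), keeping most of the texture information while shrinking the network input.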
S6. Build the particle neural network classifier from the obtained fused texture feature parameters. The input layer receives Z = [z_i1 z_i2 … z_ik]; before training, each texture feature parameter in Z is normalized to [−1, 1]. The output layer distinguishes severe-sliding particles from fatigue particles. The input and hidden layers use the tanh activation function, and training uses momentum gradient descent with momentum 0.9, learning rate 0.01, learning-rate increase ratio 1.05, and learning-rate decrease ratio 0.7; training stops when the error between the predicted value and the desired value falls below 0.001.
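A minimal NumPy sketch of the classifier in S6: one hidden layer with tanh activations, trained by momentum gradient descent (momentum 0.9 as in the patent; the learning rate, layer size, epoch count, and fixed-epoch stopping rule are illustrative simplifications — the patent's adaptive learning-rate schedule and error-based stopping are omitted):

```python
import numpy as np

def train_classifier(Z, y, hidden=4, epochs=4000, lr=0.05, momentum=0.9, seed=0):
    """Two-layer tanh network; y is 0 for fatigue grains, 1 for
    severe-sliding grains. Inputs Z are assumed already in [-1, 1]."""
    rng = np.random.default_rng(seed)
    n, d = Z.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
    vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
    t = y.reshape(-1, 1).astype(float)
    for _ in range(epochs):
        h = np.tanh(Z @ W1 + b1)
        a2 = h @ W2 + b2
        out = 0.5 * (np.tanh(a2) + 1)            # squash output to (0, 1)
        d_out = (out - t) * 0.5 * (1 - np.tanh(a2) ** 2)
        gW2 = h.T @ d_out / n; gb2 = d_out.mean(axis=0)
        d_h = (d_out @ W2.T) * (1 - h ** 2)      # backprop through tanh hidden layer
        gW1 = Z.T @ d_h / n; gb1 = d_h.mean(axis=0)
        vW2 = momentum * vW2 - lr * gW2; W2 += vW2    # momentum updates
        vb2 = momentum * vb2 - lr * gb2; b2 += vb2
        vW1 = momentum * vW1 - lr * gW1; W1 += vW1
        vb1 = momentum * vb1 - lr * gb1; b1 += vb1
    def predict(Znew):
        h = np.tanh(Znew @ W1 + b1)
        return (0.5 * (np.tanh(h @ W2 + b2) + 1) >= 0.5).astype(int).ravel()
    return predict
```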
Advantageous effects:
Applied to the field of machinery wear-state monitoring, the present invention has the following advantages:
1. The invention combines ferrographic analysis with image processing. A ferroscope acquires ferrogram images containing multiple particles; the original RGB image is converted to three single-component grayscale images (grayscale, and the S and V components of the HSV color space), each is segmented, and the segmentation with the best effect is selected automatically. This realizes debris-image segmentation under varied conditions and suits the segmentation of all debris images in ferrographic analysis;
2. For the surface-texture characteristics of fatigue particles and severe-sliding particles, the invention extracts suitable texture feature parameters based on image texture features, providing accurate characterization parameters for automatic particle identification;
3. Based on machine learning, the fused surface-texture features obtained with principal component analysis (PCA) are classified with a neural network model, improving the recognition accuracy of fatigue particles and severe-sliding particles.
Description of the drawings
Fig. 1 is the overall flow chart of automatic particle-type identification based on off-line ferrograph image segmentation and multi-texture feature fusion.
Fig. 2 shows part of the training-sample particle images.
Fig. 3(a) is an image containing both dark and bright particles.
Fig. 3(b) is a segmentation in which only the dark particles are extracted.
Fig. 3(c) is a segmentation in which both the dark and the bright particles are extracted.
Fig. 3(d) is a segmentation in which only the bright particles are extracted.
Figs. 4(a)-(d) are box plots of the correlation, inertia, entropy, and energy parameters extracted for severe-sliding and fatigue particles.
Figs. 4(e)-(g) are box plots of the directionality, contrast, and coarseness parameters extracted for severe-sliding and fatigue particles.
Figs. 5(a)-(b) show the 3×3 image masks.
Fig. 6(a) is a schematic of the classification result for a test particle identified as a fatigue particle.
Fig. 6(b) is a schematic of the classification result for a test particle identified as a severe-sliding particle.
Specific implementation mode
The method is described below with reference to the accompanying drawings.
Referring to Fig. 1, an automatic wear-particle type identification method based on multi-texture feature fusion comprises the following steps:
Step 1: Select from a standard wear-particle image library an equal number of particle images of each typical class (severe-sliding particles and fatigue particles) as training samples, and record the magnification of each particle image; part of the samples are shown in Fig. 2.
Step 2: Since debris images contain dark, bright, and mixed-color particles, and in order to improve the segmentation of all kinds of debris images, read the RGB values of the debris image, convert the RGB image to grayscale, and also convert the original RGB image to the debris images corresponding to the S and V components of the HSV color space, turning one color debris image into three grayscale debris images. The concrete steps are as follows:
S1. Convert the RGB values of each pixel (x, y) of the debris image to a gray level according to formula (1) to obtain the grayscale image:
Formula (1): f(x, y) = 0.299 R(x, y) + 0.587 G(x, y) + 0.114 B(x, y)
where R(x, y), G(x, y), B(x, y) denote the R, G, B component values of the pixel at coordinate (x, y) in the image; R(x, y), G(x, y), B(x, y) have the same meaning in the formulas of the following steps;
S2. Convert the RGB values of each pixel (x, y) of the debris image to color-space coordinates according to formula (2) to obtain the V-component debris image:
Formula (2): V(x, y) = max(R(x, y), G(x, y), B(x, y))
S3. Convert the RGB values of each pixel (x, y) of the debris image to color-space coordinates according to formula (3) to obtain the S-component debris image:
Formula (3): S(x, y) = [max(R, G, B) − min(R, G, B)] / max(R, G, B), with S(x, y) = 0 when max(R, G, B) = 0
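The conversions in formulas (1)-(3) can be sketched in Python with NumPy. The V and S expressions follow the standard RGB-to-HSV conversion, which the patent's unrendered formulas are assumed to match; the function name is illustrative:

```python
import numpy as np

def rgb_to_gray_s_v(rgb):
    """Convert an RGB image (H, W, 3, values 0-255) into the three
    grayscale planes the method compares: luminance (formula 1),
    HSV saturation S (formula 3), and HSV value V (formula 2)."""
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    gray = 0.299 * r + 0.587 * g + 0.114 * b            # formula (1)
    v = rgb.max(axis=-1)                                # V = max(R, G, B)
    mn = rgb.min(axis=-1)
    # S = (max - min) / max, defined as 0 where max is 0
    s = np.where(v > 0, (v - mn) / np.maximum(v, 1e-12), 0.0)
    return gray, s, v
```

Comparing segmentations on all three planes lets a dark particle found in the V plane and a bright or colored particle found in the S plane both survive, which is the point of step 2.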
Step 3: The S- and V-component debris images obtained by color-space conversion, together with the grayscale version of the RGB image, are each segmented with an adaptive threshold determined by the maximum between-class variance method (Otsu). A region-marking connectivity algorithm then finds the particle area in each of the three segmented debris images, and the image with the largest particle area is selected as the research object; the segmentation effect is shown in Fig. 3. The concrete steps are as follows:
S1. Threshold each of the three grayscale particle images:
1) Let T be the segmentation threshold between the foreground and background of the debris image, with value range 0 to 255; evaluate T at every value from 0 to 255.
2) For the current threshold T, divide the image into foreground and background; let w0 be the proportion of foreground pixels and u0 their mean gray level, w1 the proportion of background pixels and u1 their mean gray level, and u the overall mean gray level of the image.
3) Compute the between-class variance of foreground and background according to formula (4):
Formula (4): δ²(T) = w0 × (u0 − u)² + w1 × (u1 − u)²
4) Compute the variance δ²(T) for every T from 0 to 255, compare the 256 variance values, and take the T with the largest between-class variance as the segmentation threshold of the debris image.
5) Binarize the grayscale image according to formula (5) to obtain a mask in which the particle is black and the background white:
Formula (5): g(x, y) = 0 when f(x, y) ≤ T, and g(x, y) = 255 when f(x, y) > T
where f(x, y) denotes the gray level of the debris image at (x, y).
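Steps 1)-5) are the classic exhaustive Otsu search followed by binarization. A minimal NumPy sketch (function names are illustrative; the threshold maximizes the between-class variance of formula (4)):

```python
import numpy as np

def otsu_threshold(gray):
    """Try every T in 0..255 and return the one maximizing the
    between-class variance w0*(u0-u)^2 + w1*(u1-u)^2 (formula 4)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    u = (np.arange(256) * hist).sum() / total       # overall mean gray level
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 = hist[:t + 1].sum() / total             # foreground proportion
        w1 = 1.0 - w0
        if w0 == 0 or w1 == 0:
            continue
        u0 = (np.arange(t + 1) * hist[:t + 1]).sum() / (w0 * total)
        u1 = (np.arange(t + 1, 256) * hist[t + 1:]).sum() / (w1 * total)
        var = w0 * (u0 - u) ** 2 + w1 * (u1 - u) ** 2   # formula (4)
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize(gray, t):
    # formula (5): particle black (0), background white (255)
    return np.where(gray <= t, 0, 255).astype(np.uint8)
```

A histogram-based implementation like this visits each pixel once per image rather than once per candidate threshold, which is why the exhaustive 256-value search stays cheap.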
S2. Apply morphological processing to the resulting binary image: an "inversion" operation, a "closing" operation, and a "hole filling" operation. The inversion prepares the image for closing; the closing operation connects and smooths broken points along the particle edges; hole filling fills the holes that appear inside the closed particle regions, ensuring the completeness of the segmented debris image.
S3. A debris image may contain several particles. The "eight-connected region algorithm" is used to keep the particle with the largest area as the object of study and reject the other, smaller particles. The scanning operator is a 3×3 mask whose center is the current pixel, as shown in Fig. 5. The concrete steps are as follows:
1) Examine the left, upper-left, upper, and upper-right neighbors of the current pixel; if none of them is labeled, the pixel starts a new region.
2) If both the left and the upper-right neighbors are labeled, give the current pixel the smaller of the two labels and change the larger label to it.
3) If both the upper-left and the upper-right neighbors are labeled, give the current pixel the smaller of the two labels and change the larger label to it.
4) Otherwise, in the order left, upper-left, upper, upper-right, give the current pixel the first label found among the four neighbors.
5) After all particles in the image are labeled, count the area of each particle and retain the largest particle in the image.
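The labeling pass can also be implemented as a flood fill over the 8-neighborhood, which yields the same largest-area selection as steps 1)-5) without the label-merging bookkeeping. A sketch under the black-particle/white-background mask convention above (names are illustrative):

```python
import numpy as np
from collections import deque

def largest_grain_mask(binary):
    """Label 8-connected foreground (black, value 0) regions by BFS
    flood fill and keep only the largest-area region."""
    fg = (binary == 0)
    labels = np.zeros(binary.shape, dtype=int)
    sizes = {}
    current = 0
    h, w = binary.shape
    for y in range(h):
        for x in range(w):
            if fg[y, x] and labels[y, x] == 0:
                current += 1                      # start a new region
                q = deque([(y, x)])
                labels[y, x] = current
                n = 0
                while q:
                    cy, cx = q.popleft()
                    n += 1
                    for dy in (-1, 0, 1):         # all 8 neighbors
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and fg[ny, nx] and labels[ny, nx] == 0):
                                labels[ny, nx] = current
                                q.append((ny, nx))
                sizes[current] = n
    if not sizes:
        return np.full_like(binary, 255)
    keep = max(sizes, key=sizes.get)              # largest-area particle
    return np.where(labels == keep, 0, 255).astype(binary.dtype)
```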
S4. Compare the three resulting debris images, each retaining its largest particle, and select the one whose segmented particle area is largest as the object of further debris analysis.
Step 4: Severe-sliding particles and fatigue particles are close in shape and size parameters; the main difference lies in their surface texture. Gray-level co-occurrence matrix texture features based on probability statistics are therefore extracted from the selected debris image, giving four parameters closely related to particle texture: energy, inertia, correlation, and entropy. The extracted parameter values are shown in Figs. 4(a)-(d); the parameters are computed by formulas (6)-(9):
Formula (6): ASM = Σi Σj p(i,j)^2
where p(i,j,d,θ) is the gray-level co-occurrence matrix element value; (i,j) is the element position in the matrix; d is the pixel distance; θ is the texture direction, generally 0°, 45°, 90° or 135°.
Formula (7): CON = Σi Σj (i−j)^2 × p(i,j)
Formula (8): COR = [Σi Σj i×j×p(i,j) − μ1μ2] / (σ1σ2)
where μ1 = Σi i Σj p(i,j), μ2 = Σj j Σi p(i,j), σ1^2 = Σi (i−μ1)^2 Σj p(i,j), σ2^2 = Σj (j−μ2)^2 Σi p(i,j).
Formula (9): ENT = −Σi Σj p(i,j) × log p(i,j)
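A minimal numpy sketch of the four GLCM features just named, assuming an integer-quantized gray image; the quantization level count and function names are illustrative additions.

```python
import numpy as np

def glcm(img, d=1, theta=0, levels=8):
    """Normalised gray-level co-occurrence matrix p(i, j, d, theta).

    img: 2-D array of integer gray levels in [0, levels).
    theta: one of 0, 45, 90, 135 (degrees), as in the text.
    """
    dy, dx = {0: (0, d), 45: (-d, d), 90: (-d, 0), 135: (-d, -d)}[theta]
    h, w = img.shape
    P = np.zeros((levels, levels))
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                P[img[y, x], img[ny, nx]] += 1
    return P / P.sum()

def glcm_features(P):
    """Energy, moment of inertia, correlation and entropy of a GLCM P."""
    i, j = np.indices(P.shape)
    energy = (P ** 2).sum()                            # formula (6)
    inertia = ((i - j) ** 2 * P).sum()                 # formula (7)
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    s_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    s_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    corr = ((i * j * P).sum() - mu_i * mu_j) / (s_i * s_j)   # formula (8)
    entropy = -(P[P > 0] * np.log(P[P > 0])).sum()     # formula (9)
    return energy, inertia, corr, entropy
```

In practice a library implementation (e.g. scikit-image's `graycomatrix`/`graycoprops`) covers the same computation.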
Step 5:Tamura texture features take human subjective psychological measurement as their standard and correspond well to human visual perception, so the surface texture of a particle can be expressed intuitively in the form of characteristic parameters, which helps distinguish the surface textures of fatigue particles from those of severe sliding particles. The coarseness, contrast, and directionality of the particle surface are extracted; the parameter values extracted from the training samples are shown in Fig. 4(e)~(g). The specific parameters are computed as follows:
S1, coarseness: for texture patterns with different structures, the larger the texture primitive or the smaller its repetition count, the coarser the texture appears. Its computation method is as follows:
1) For each pixel, compute the mean luminance inside an active window of size 2^k × 2^k centered on it, as shown in formula (10), where f(i,j) is the gray value at point (i,j) in the selected region and k determines the window size, e.g. 1×1, 2×2, 4×4, …, 32×32.
Formula (10): A_k(x,y) = Σ(i = x−2^(k−1) to x+2^(k−1)−1) Σ(j = y−2^(k−1) to y+2^(k−1)−1) f(i,j) / 2^(2k)
2) For each pixel, compute the mean intensity differences between pairs of non-overlapping active windows on opposite sides of the pixel in the horizontal and vertical directions, as shown in formulas (11), (12):
Formula (11):Ek,h=| Ak(x+2k-1,y)-Ak(x-2k-1,y)|
Formula (12):Ek,v=| Ak(x,y+2k-1)-Ak(x,y-2k-1)|
where E_k,h is the horizontal difference at the pixel and E_k,v is the vertical difference. For each pixel, find the optimum size S_best that maximizes E, as shown in formula (13), where k is the value that maximizes E in either direction, as shown in formula (14).
Formula (13):Sbest(x, y)=2k
Formula (14):Ek=Emax=max (E1,E2,...,Eh)
3) The coarseness F_crs is obtained by averaging S_best over the entire image, where m and n are the image width and height, as shown in formula (15):
Formula (15): F_crs = (1/(m×n)) × Σ(i=1 to m) Σ(j=1 to n) S_best(i,j)
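The three coarseness steps can be sketched as follows. Border pixels whose comparison windows would leave the image simply keep the smallest scale here, a simplification not specified in the text; function name and `kmax` are illustrative.

```python
import numpy as np

def tamura_coarseness(img, kmax=5):
    """Tamura coarseness F_crs, following steps 1)-3) and formulas (10)-(15).

    img: 2-D float array of gray values; kmax=5 gives window sizes up to 32.
    """
    h, w = img.shape
    # A[k] holds the 2^k x 2^k window mean around every pixel, formula (10)
    A = np.zeros((kmax + 1, h, w))
    for k in range(1, kmax + 1):
        half = 2 ** (k - 1)
        for y in range(h):
            for x in range(w):
                y0, y1 = max(0, y - half), min(h, y + half)
                x0, x1 = max(0, x - half), min(w, x + half)
                A[k, y, x] = img[y0:y1, x0:x1].mean()
    sbest = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            best_e, best_k = -1.0, 1
            for k in range(1, kmax + 1):
                half = 2 ** (k - 1)
                if x - half < 0 or x + half >= w or y - half < 0 or y + half >= h:
                    break  # window pair would leave the image: stop at this scale
                e_h = abs(A[k, y, x + half] - A[k, y, x - half])  # formula (11)
                e_v = abs(A[k, y + half, x] - A[k, y - half, x])  # formula (12)
                e = max(e_h, e_v)
                if e > best_e:                                    # formula (14)
                    best_e, best_k = e, k
            sbest[y, x] = 2 ** best_k                             # formula (13)
    return sbest.mean()                                           # formula (15)
```

The nested loops keep the sketch close to the formulas; an integral-image formulation would be used for speed on real images.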
S2, contrast refers to the luminance range between the brightest white and the darkest black of the light and dark regions in an image; the larger the dynamic range of the values, the greater the contrast, and vice versa. The mean, variance, kurtosis, and similar statistics are computed over the pixel values to measure the global contrast of the whole image or region. It is calculated as shown in formula (16):
Formula (16): F_con = σ / (α4)^(1/4)
where σ is the standard deviation of the image gray levels and α4 is the kurtosis of the gray levels, defined by α4 = μ4/σ^4; μ4 is the fourth-order central moment and σ^2 is the variance of the image gray values.
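A direct numpy sketch of formula (16), computing contrast from the global standard deviation and kurtosis of the gray values; mapping flat images to zero is a convention added here, not from the patent.

```python
import numpy as np

def tamura_contrast(img):
    """Tamura contrast F_con = sigma / (alpha4)^(1/4), formula (16),
    where alpha4 = mu4 / sigma^4 is the kurtosis of the gray values."""
    g = img.astype(float).ravel()
    sigma2 = g.var()                      # variance of the gray values
    if sigma2 == 0:
        return 0.0                        # flat image: no contrast
    mu4 = ((g - g.mean()) ** 4).mean()    # fourth-order central moment
    alpha4 = mu4 / sigma2 ** 2            # kurtosis
    return np.sqrt(sigma2) / alpha4 ** 0.25
```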
S3, directionality describes how the texture spreads or concentrates along certain directions; it is related to the shape of the texture primitives and their rule of arrangement. Its computation method is as follows:
1) Compute the magnitude and local edge direction of the gradient vector at each pixel, calculated as shown in formulas (17), (18):
Formula (17): |ΔG| = (|ΔH| + |ΔV|)/2
Formula (18): θ = tan⁻¹(ΔV/ΔH) + π/2
These two quantities are obtained by applying the two 3×3 masks (a) and (b) of Fig. 5 over the 3×3 rectangular neighborhood of each image pixel.
2) Divide the 0–π range into 16 equal sections and discretize the θ values over this range. For each section, count the number of pixels n_p whose corresponding |ΔG| exceeds a given threshold (the threshold is set to 12), and construct the histogram H_D from the gradient vectors of all pixels; φ_p is the position of a peak in this histogram. Finally, the overall directionality of the image is measured by the sharpness of the peaks of the histogram, as shown in formula (19):
Formula (19): F_dir = Σ_p Σ(φ ∈ w_p) (φ − φ_p)^2 × H_D(φ)
where p denotes a peak and w_p is the range of sections between the valleys on either side of that peak. This feature reflects the consistency of the directional arrangement of the whole image.
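A sketch of the directionality computation. The 3×3 masks are taken as Prewitt-style operators (the masks of Fig. 5 are not reproduced here), and the peak-sharpness sum is evaluated around the single dominant peak, a common simplification of formula (19); smaller values mean a more strongly directional texture.

```python
import numpy as np

def tamura_directionality(img, n_bins=16, threshold=12):
    """Tamura directionality, following steps 1)-2) and formulas (17)-(19).

    Assumes Prewitt-style 3x3 gradient masks; the angle convention
    (arctan2 folded into [0, pi)) differs from the text's +pi/2 offset
    only by a fixed rotation of the histogram.
    """
    f = img.astype(float)
    # horizontal mask: column difference summed over three rows
    dh = (f[1:-1, 2:] - f[1:-1, :-2]
          + f[:-2, 2:] - f[:-2, :-2]
          + f[2:, 2:] - f[2:, :-2])
    # vertical mask: row difference summed over three columns
    dv = (f[2:, 1:-1] - f[:-2, 1:-1]
          + f[2:, 2:] - f[:-2, 2:]
          + f[2:, :-2] - f[:-2, :-2])
    mag = (np.abs(dh) + np.abs(dv)) / 2.0            # formula (17)
    theta = np.arctan2(dv, dh) % np.pi               # formula (18), in [0, pi)
    sel = mag > threshold                            # keep strong edges only
    hist, edges = np.histogram(theta[sel], bins=n_bins, range=(0, np.pi))
    if hist.sum() == 0:
        return 0.0
    hd = hist / hist.sum()                           # histogram H_D
    centres = (edges[:-1] + edges[1:]) / 2
    peak = centres[hd.argmax()]                      # dominant peak phi_p
    return float((hd * (centres - peak) ** 2).sum()) # formula (19), one peak
```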
Step 6:Principal component analysis (PCA) is applied to the selected gray-level co-occurrence matrix texture features and Tamura texture features to obtain fused texture characteristic parameters, which are used as the input features for training a neural network; the neural network classifier of abrasive particles is built with the following specific steps:
S1, build the texture characteristic parameter matrix from the selected particle texture parameters, as shown in formula (20):
Formula (20): X = [x_i1 x_i2 … x_ij], (i = 1,2,…,n; j = 1,2,…,p)
where x_ij is a texture characteristic parameter, i is the sample particle number, and j is the number of texture characteristic parameters;
S2, compute the correlation coefficient matrix of the texture characteristic parameters, calculated as shown in formula (21):
Formula (21): R = (r_jk)p×p, r_jk = Σ(i=1 to n)(x_ij − x̄_j)(x_ik − x̄_k) / sqrt[Σ(i=1 to n)(x_ij − x̄_j)^2 × Σ(i=1 to n)(x_ik − x̄_k)^2]
S3, compute the eigenvalues and eigenvectors of the texture feature matrix, as shown in formula (22):
Formula (22): |λI − R|U = 0
where I is the identity matrix, λ_i (i = 1,2,…,n) are the eigenvalues with λ1 ≥ λ2 ≥ … ≥ λn ≥ 0, and U = [u_i1 u_i2 … u_in] is the eigenvector corresponding to λ_i.
S4, compute the contribution rate of each surface texture characteristic parameter, as shown in formula (23):
Formula (23): η_i = λ_i / Σ(k=1 to n) λ_k
S5, compute the comprehensive surface texture parameters of the particle, as shown in formula (24):
Formula (24): Zi×k = XU^T, (i = 1,2,…,n; k = 1,2,…,n)
where Zi×k is the principal component feature matrix corresponding to Xi×j, and Z = [z_i1 z_i2 … z_ik] is the finally obtained fused texture feature vector of the particle.
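Steps S1-S5 can be condensed with numpy's eigendecomposition. This sketch standardizes X and projects onto eigenvector columns (the patent's Z = XU^T uses its own row convention for U); the `var_kept` cutoff for choosing the number of components is an illustrative addition.

```python
import numpy as np

def pca_fuse(X, k=None, var_kept=0.95):
    """Reduce the texture-parameter matrix X (n samples x p features) by PCA.

    Follows S1-S5: correlation-matrix eigendecomposition, contribution
    rates, and projection of the standardized data.  If k is None, enough
    components are kept to cover `var_kept` of the total contribution.
    Returns (Z, contrib): fused features and per-component contribution rates.
    """
    Xs = (X - X.mean(0)) / X.std(0)        # standardise so R is a correlation
    R = np.corrcoef(X, rowvar=False)       # correlation coefficient matrix
    lam, U = np.linalg.eigh(R)             # eigenvalues/vectors of R
    order = np.argsort(lam)[::-1]          # sort lam1 >= lam2 >= ... >= 0
    lam, U = lam[order], U[:, order]
    contrib = lam / lam.sum()              # contribution rate of each component
    if k is None:
        k = int(np.searchsorted(np.cumsum(contrib), var_kept)) + 1
    Z = Xs @ U[:, :k]                      # fused texture feature vectors
    return Z, contrib
```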
S6, build the neural network classifier of abrasive particles from the obtained fused texture parameters. The input layer takes Z = [z_i1 z_i2 … z_ik]; before training, each texture parameter in Z is normalized to [−1, 1]. The output layer distinguishes severe sliding particles from fatigue particles. The input and hidden layers use the tanh activation function, and training uses the momentum gradient descent method, with momentum, learning rate, learning-rate increase ratio, and learning-rate decrease ratio of 0.9, 0.01, 1.05, and 0.7 respectively; the training termination condition is that the error between the predicted value and the desired value is less than 0.001.
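A self-contained numpy sketch of the classifier just described: features scaled to [−1, 1], one tanh hidden layer, and momentum gradient descent. The adaptive learning-rate schedule (increase/decrease ratios) and the 0.001-error stopping rule are omitted for brevity, and the sigmoid output with a cross-entropy gradient is an implementation choice, not from the patent.

```python
import numpy as np

def train_classifier(Z, y, hidden=8, epochs=2000, lr=0.01, momentum=0.9):
    """Train a tiny MLP: fused features Z (n, k), labels y (n,)
    with 0 = fatigue particle, 1 = severe sliding particle.
    Returns a predict() closure applying the same normalisation."""
    lo, hi = Z.min(0), Z.max(0)
    X = 2 * (Z - lo) / (hi - lo) - 1                 # normalise to [-1, 1]
    Y = y.reshape(-1, 1).astype(float)
    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    vels = [0.0] * 4                                 # momentum buffers
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                     # tanh hidden layer
        out = 1 / (1 + np.exp(-(H @ W2 + b2)))       # sigmoid output
        d2 = (out - Y) / len(X)                      # cross-entropy gradient
        d1 = (d2 @ W2.T) * (1 - H ** 2)              # backprop through tanh
        grads = [X.T @ d1, d1.sum(0), H.T @ d2, d2.sum(0)]
        for i, (p, g) in enumerate(zip([W1, b1, W2, b2], grads)):
            vels[i] = momentum * vels[i] - lr * g    # momentum update
            p += vels[i]
    def predict(Znew):
        Xn = 2 * (Znew - lo) / (hi - lo) - 1
        H = np.tanh(Xn @ W1 + b1)
        return (1 / (1 + np.exp(-(H @ W2 + b2))) > 0.5).astype(int).ravel()
    return predict
```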
Step 7:Select the wear-particle image to be identified and repeat step 2 to step 5; the extracted features of the particle to be identified are used as the input feature vector of the trained neural network classifier, realizing automatic classification and identification of the particle. A schematic diagram of the experimental classification results is shown in Fig. 6.

Claims (3)

1. A multi-texture-feature-fusion method for automatic identification of abrasive particle type, characterized by comprising the following steps:
Step 1:Select equal numbers of typical abrasive particle pictures of each type from a standard wear-particle image library as training samples;
Step 2:Process each wear-particle image to obtain its RGB values, perform graying on the RGB image, and convert the original RGB image into the wear-particle images corresponding to the S and V components of the HSV color space;
Step 3:Perform adaptive threshold segmentation and morphological processing on the grayed RGB image and on the wear-particle images corresponding to the S and V components; use the region-marking connectivity algorithm to find the particle area in each of the three images (grayed RGB image, S-component image, V-component image) and select the image with the largest particle area; the specific steps are as follows:
S1, threshold-segment the three particle grayscale images as follows:
1) let T be the segmentation threshold between foreground and background of the wear-particle image, with value range 0~255; evaluate each value of T from 0 to 255 in turn;
2) divide the image into foreground and background according to the current threshold T; find the proportion w0 of foreground points in the image and their average gray level u0, the proportion w1 of background points and their average gray level u1, and the overall average gray level u of the image;
the between-class variance of the foreground and background images is calculated according to formula (4):
Formula (4):δ2(T)=w0×(u0-u)2+w1×(u1-u)2
3) find the variance δ2(T) corresponding to each T from 0 to 255, compare the 256 variance values, and select the T corresponding to the maximum variance as the segmentation threshold of the wear-particle image;
4) binarize the grayscale image according to formula (5) to obtain a mask in which the particle is black and the background is white:
Formula (5): g(x,y) = 0 when f(x,y) ≤ T; g(x,y) = 255 when f(x,y) > T
where f(x,y) in formula (5) denotes the grayscale image of the wear-particle image;
S2, apply morphological processing to the obtained grayscale image, namely an "inversion" operation, a "closing" operation, and a "hole filling" operation; the "inversion" prepares the image for the "closing" operation, the "closing" operation connects and smooths the edge pixels of the particle image, and "hole filling" fills the holes that appear inside the closed particle region, ensuring the integrity of the segmented particle image;
S3, a wear-particle image may contain multiple particles; the "eight-connected-region algorithm" is used to find the particle with the largest area and keep it as the object of study, rejecting the other smaller particles; the operator is as shown, with the center point being the current pixel; the specific steps are as follows:
1) examine the left, upper-left, upper, and upper-right neighbors of the current pixel in its eight-neighborhood; if none of them is labeled, the current pixel starts a new region;
2) if both the left and the upper-right neighbors are labeled, mark the current pixel with the smaller of the two labels, and change the larger label to the smaller one;
3) if both the upper-left and the upper-right neighbors are labeled, likewise mark the current pixel with the smaller of the two labels, and change the larger label to the smaller one;
4) otherwise, mark the current pixel with the label of its first labeled neighbor among the left, upper-left, upper, and upper-right, checked in that order;
5) label all particles in the image, count the area of each, and retain the largest particle in the image;
S4, compare the three wear-particle images processed above and select the one whose segmented particle has the largest area as the object of further wear-particle analysis;
Step 4:Perform gray-level co-occurrence matrix texture feature extraction based on probability statistics on the selected wear-particle image, obtaining four characteristic parameters closely related to the particle texture: energy, moment of inertia, correlation, and entropy;
Step 5:Perform Tamura texture feature extraction based on human visual perception on the selected wear-particle image, obtaining three characteristic parameters closely related to the particle texture: coarseness, contrast, and directionality;
Step 6:Apply principal component analysis (PCA) to the selected gray-level co-occurrence matrix texture features and Tamura texture features to obtain fused texture characteristic parameters, use them as the input features to train a neural network, and build the neural network classifier of abrasive particles;
Step 7:Select the wear-particle image to be identified and repeat step 2 to step 5; use the extracted features of the particle to be identified as the input feature vector of the trained neural network classifier, realizing automatic classification and identification of the particle.
2. The multi-texture-feature-fusion abrasive-particle-type automatic identification method according to claim 1, characterized in that said step 2 is as follows:
S1, gray the RGB values of each pixel (x, y) of the wear-particle image according to formula (1) to obtain a grayscale image,
Formula (1):f(x,y)=0.299R(x,y)+0.587G(x,y)+0.114B(x,y)
where R(x,y), G(x,y), B(x,y) in formula (1) represent the R, G, B component values of the pixel at coordinate (x,y) in the image; R(x,y), G(x,y), B(x,y) in the following formulas have the same meaning as in formula (1);
S2, convert the RGB values of each pixel (x, y) of the wear-particle image into color space coordinates according to formula (2) to obtain the wear-particle image under the V component,
Formula (2): V(x,y) = max(R(x,y), G(x,y), B(x,y)) / 255
S3, convert the RGB values of each pixel (x, y) of the wear-particle image into color space coordinates according to formula (3) to obtain the wear-particle image under the S component,
Formula (3): S(x,y) = [max(R(x,y), G(x,y), B(x,y)) − min(R(x,y), G(x,y), B(x,y))] / max(R(x,y), G(x,y), B(x,y)), with S(x,y) = 0 when max(R,G,B) = 0.
3. The multi-texture-feature-fusion abrasive-particle-type automatic identification method according to claim 1, characterized in that said step 6 is as follows:
S1, build the texture characteristic parameter matrix from the selected particle texture parameters, as shown in formula (20):
Formula (20): X = [x_i1 x_i2 … x_ij], (i = 1,2,…,n; j = 1,2,…,p)
where x_ij is a texture characteristic parameter, i is the sample particle number, and j is the number of texture characteristic parameters;
S2, compute the correlation coefficient matrix of the texture characteristic parameters, calculated as shown in formula (21):
Formula (21): R = (r_jk)p×p, r_jk = Σ(i=1 to n)(x_ij − x̄_j)(x_ik − x̄_k) / sqrt[Σ(i=1 to n)(x_ij − x̄_j)^2 × Σ(i=1 to n)(x_ik − x̄_k)^2];
S3, compute the eigenvalues and eigenvectors of the texture feature matrix, as shown in formula (22):
Formula (22): |λI − R|U = 0
where I is the identity matrix, λ_i (i = 1,2,…,n) are the eigenvalues with λ1 ≥ λ2 ≥ … ≥ λn ≥ 0, and U = [u_i1 u_i2 … u_in] is the eigenvector corresponding to λ_i;
S4, compute the contribution rate of each surface texture characteristic parameter, as shown in formula (23):
Formula (23): η_i = λ_i / Σ(k=1 to n) λ_k;
S5, compute the comprehensive surface texture parameters of the particle, as shown in formula (24):
Formula (24): Zi×k = XU^T, (i = 1,2,…,n; k = 1,2,…,n)
where Zi×k is the principal component feature matrix corresponding to Xi×j, and Z = [z_i1 z_i2 … z_ik] is the finally obtained fused texture feature vector of the particle;
S6, build the neural network classifier of abrasive particles from the obtained fused texture parameters; the input layer takes Z = [z_i1 z_i2 … z_ik], and before training each texture parameter in Z is normalized to [−1, 1]; the output layer distinguishes severe sliding particles from fatigue particles; the input and hidden layers use the tanh activation function, and training uses the momentum gradient descent method, with momentum, learning rate, learning-rate increase ratio, and learning-rate decrease ratio of 0.9, 0.01, 1.05, and 0.7 respectively; the training termination condition is that the error between the predicted value and the desired value is less than 0.001.
CN201810118514.5A 2018-02-06 2018-02-06 Multi-texture feature fusion type automatic abrasive particle type identification method Active CN108305259B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810118514.5A CN108305259B (en) 2018-02-06 2018-02-06 Multi-texture feature fusion type automatic abrasive particle type identification method


Publications (2)

Publication Number Publication Date
CN108305259A true CN108305259A (en) 2018-07-20
CN108305259B CN108305259B (en) 2020-03-24

Family

ID=62864420

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810118514.5A Active CN108305259B (en) 2018-02-06 2018-02-06 Multi-texture feature fusion type automatic abrasive particle type identification method

Country Status (1)

Country Link
CN (1) CN108305259B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102768730A (en) * 2012-06-25 2012-11-07 中国人民解放军总参谋部陆航研究所 Interactive wear particle image annotation method
CN103886579A (en) * 2013-12-11 2014-06-25 西安交通大学 Abrasive particle chain self-adaptive segmentation method orienting online ferrographic image automatic identification
CN104484675A (en) * 2014-12-15 2015-04-01 上海海事大学 Method for extraction of texture features and pattern recognition of ferrographic wear particles
CN105631481A (en) * 2016-01-07 2016-06-01 西安交通大学 Ferrograph abrasive particle composite characteristic construction method based on heredity programming
CA2969232A1 (en) * 2014-12-30 2016-07-07 Halliburton Energy Services, Inc. Downhole tool surfaces configured to reduce drag forces and erosion during exposure to fluid flow


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FENG YUN: "Research on Digitalization of Oil Monitoring Based on Ferrography Technology", China Master's Theses Full-text Database, Engineering Science and Technology II *
LUO BINGHAI et al.: "Automatic wear particle identification based on a PCA-BP neural network", Lubrication Engineering (润滑与密封) *
XIE YONGHUA: "Research on the Application of Digital Image Processing in Wood Surface Defect Detection", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111402215A (en) * 2020-03-07 2020-07-10 西南交通大学 Contact net insulator state detection method based on robust principal component analysis method
CN111402215B (en) * 2020-03-07 2022-04-29 西南交通大学 Contact net insulator state detection method based on robust principal component analysis method
CN112381140A (en) * 2020-11-13 2021-02-19 国家能源集团泰州发电有限公司 Abrasive particle image machine learning identification method based on new characteristic parameters
CN112381140B (en) * 2020-11-13 2024-02-06 国家能源集团泰州发电有限公司 Abrasive particle image machine learning identification method based on new characteristic parameters
CN113223022A (en) * 2021-05-31 2021-08-06 湖南科技大学 Multivariate image segmentation method based on multivariate texture image analysis algorithm
CN113223022B (en) * 2021-05-31 2022-04-12 湖南科技大学 Multivariate image segmentation method based on multivariate texture image analysis algorithm
CN115018845A (en) * 2022-08-09 2022-09-06 聊城市泓润能源科技有限公司 Method for detecting quality of lubricating oil abrasive particles
CN115018845B (en) * 2022-08-09 2022-10-25 聊城市泓润能源科技有限公司 Method for detecting quality of lubricating oil abrasive particles
CN115937755A (en) * 2023-02-21 2023-04-07 山东双力现代农业装备有限公司 Visual detection method for vertical milling blade of tractor gearbox
CN117934461A (en) * 2024-03-21 2024-04-26 广州航海学院 Method, system and equipment for analyzing polishing surface roughness of side polishing optical fiber
CN117934461B (en) * 2024-03-21 2024-06-07 广州航海学院 Method, system and equipment for analyzing polishing surface roughness of side polishing optical fiber

Also Published As

Publication number Publication date
CN108305259B (en) 2020-03-24

Similar Documents

Publication Publication Date Title
CN108305259A (en) A kind of abrasive type automatic identifying method of multi-texturing Fusion Features
CN106529429B (en) A kind of skin of face analysis system based on image recognition
US10860879B2 (en) Deep convolutional neural networks for crack detection from image data
Jagadev et al. Detection of leukemia and its types using image processing and machine learning
CN104794491B (en) Based on the fuzzy clustering Surface Defects in Steel Plate detection method presorted
Agaian et al. Automated screening system for acute myelogenous leukemia detection in blood microscopic images
CN108319966B (en) The method for identifying and classifying of equipment in a kind of substation's complex background infrared image
CN105893925A (en) Human hand detection method based on complexion and device
CN106650770A (en) Mura defect detection method based on sample learning and human visual characteristics
Dadwal et al. Color image segmentation for fruit ripeness detection: a review
CN106250801A (en) Based on Face datection and the fatigue detection method of human eye state identification
CN107025652A (en) A kind of flame detecting method based on kinetic characteristic and color space time information
CN110070526A (en) Defect inspection method based on the prediction of deep neural network temperature figure
CN107730515A (en) Panoramic picture conspicuousness detection method with eye movement model is increased based on region
Wuest et al. Region based segmentation of QuickBird multispectral imagery through band ratios and fuzzy comparison
CN107154044B (en) Chinese food image segmentation method
CN107677216A (en) A kind of multiple abrasive particle three-dimensional appearance synchronous obtaining methods based on photometric stereo vision
CN108062508A (en) The extracting method of equipment in substation's complex background infrared image
CN110473199A (en) A kind of detection of color spot acne and health assessment method based on the segmentation of deep learning example
Liu et al. A novel color-texture descriptor based on local histograms for image segmentation
CN106056078B (en) Crowd density estimation method based on multi-feature regression type ensemble learning
Sharma et al. Automatically detection of skin cancer by classification of neural network
CN103185731A (en) Device for detecting beef tenderness based on color image textural features and method thereof
CN104123569B (en) Video person number information statistics method based on supervised learning
Castelo-Quispe et al. Optimization of brazil-nuts classification process through automation using colour spaces in computer vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant