CN116721391B - Method for detecting separation effect of raw oil based on computer vision

Info

Publication number: CN116721391B
Authority: CN (China)
Prior art keywords: point, growth, initial seed, pixel, value
Legal status: Active
Application number: CN202311008220.4A
Other languages: Chinese (zh)
Other versions: CN116721391A
Inventors: 司相芳, 霍保芝, 徐燕, 胡伟涛, 孔庆龙, 赵蒙, 范雪梅
Current and original assignee: Shandong Hengxin Technology Development Co ltd
Events: application filed by Shandong Hengxin Technology Development Co ltd; priority to CN202311008220.4A; publication of CN116721391A; application granted; publication of CN116721391B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761Proximity, similarity or dissimilarity measures
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A20/00Water conservation; Efficient water supply; Efficient water use
    • Y02A20/20Controlling water pollution; Waste water treatment
    • Y02A20/204Keeping clear the surface of open water from oil spills


Abstract

The invention relates to the technical field of image data processing and provides a method for detecting the separation effect of raw oil based on computer vision, comprising the following steps: obtaining a separation gray scale map; acquiring a denoised image from the separation gray scale map, acquiring initial seed points, acquiring growth pixel points based on the initial seed points, and obtaining the gradient dispersion of the initial seed points and the growth pixel points; acquiring a gray matrix, obtaining contrast from element values and element positions, obtaining edge roughness by combining gray value differences, and obtaining the relative roughness of the initial seed points and the growth pixel points from the edge roughness; obtaining similar windows, obtaining the edge continuity of the initial seed points and the growth pixel points, obtaining a growth value from the gradient dispersion, edge continuity, and relative roughness, obtaining a growth edge from the growth value, and completing effect detection. Through this multi-level analysis, the invention obtains the separation result more accurately.

Description

Method for detecting separation effect of raw oil based on computer vision
Technical Field
The invention relates to the technical field of image data processing, in particular to a raw oil separation effect detection method based on computer vision.
Background
The quality of the various oil products is the primary target in raw oil separation effect detection, which is an important link before raw oil products enter the market. In the detection process, the final result is affected by many factors, so its accuracy may fail to meet standard requirements; negligence by technicians, non-standard equipment operation, and similar issues can all introduce errors into the detection result. Traditional methods for detecting the raw oil separation effect mostly rely on experimental analysis, which suffers from high cost, long duration, and strong dependence on human factors, and is therefore unsuitable for real-time control and analysis. An improved method for detecting the raw oil separation effect is thus urgently needed.
Disclosure of Invention
The invention provides a method for detecting the separation effect of raw oil based on computer vision, which aims to solve the problem of erroneous detection results, and adopts the following specific technical scheme:
the embodiment of the invention provides a method for detecting the separation effect of raw oil based on computer vision, which comprises the following steps:
collecting an original image, preprocessing the original image, and obtaining a separation gray scale image;
acquiring a denoised image from the separation gray scale image; acquiring two ROI areas from the denoised image, one in the water area and one in the oil area; acquiring initial seed points in the ROI areas; acquiring growth pixel points based on the initial seed points; and obtaining the gradient dispersion of the initial seed points and the growth pixel points from their gradient values;
respectively acquiring a matrix window for the initial seed point and the growth pixel point, acquiring a corresponding gray matrix according to each matrix window, and acquiring the contrast of the matrix window according to the element value and the element position after the normalization of the gray matrix; obtaining gray scale similar entropy of the matrix window according to gray scale value differences of different pixel points and initial seed points in the matrix window; obtaining edge roughness according to contrast and gray scale similarity entropy of the matrix window, and obtaining relative roughness of the initial seed point and the growing pixel point according to difference of the edge roughness of the initial seed point and the growing pixel point;
respectively acquiring a similar window for the initial seed point and the growth pixel point, acquiring edge continuity of the initial seed point and the growth pixel point according to gray values of the pixel points in the similar window, acquiring a growth value according to gradient dispersion, edge continuity and relative roughness between the initial seed point and the growth pixel point, and performing iteration according to the growth value to acquire a growth edge;
and finishing effect detection according to the distance between the growing edges of all the ROI areas.
Preferably, the method for acquiring the initial seed point in the ROI area comprises the following steps:
for a preset number of pixel points with the largest gradient values, calculating the standard deviation of the gray values of each such pixel point and all pixel points in its eight-neighborhood, adding each pixel point's gradient value to its standard deviation, and taking the pixel point with the largest sum as the initial seed point.
Preferably, the method for obtaining the growing pixel point based on the initial seed point comprises the following steps:
for an initial seed point, acquiring a gradient direction of the initial seed point, and making a vertical line of the gradient direction, wherein the direction pointed by the vertical line is a growth direction, and the two growth directions are respectively marked as a first growth direction and a second growth direction;
the first pixel point encountered from the initial seed point along either growth direction is recorded as a growth possible point; a line perpendicular to the growth direction is drawn through the growth possible point; the two pixel points closest to the growth possible point are taken on each side of it along this perpendicular line; and the growth possible point together with the four pixel points closest to it are recorded as growth pixel points.
Preferably, the method for obtaining the gradient dispersion of the initial seed point and the growth pixel point according to the gradient values of the initial seed point and the growth pixel point comprises the following steps:
In the formula, Gx0 denotes the horizontal gradient value of the initial seed point, Gy0 denotes the vertical gradient value of the initial seed point, Gxn denotes the horizontal gradient value of the n-th growth pixel point corresponding to the initial seed point, Gyn denotes the vertical gradient value of the n-th growth pixel point corresponding to the initial seed point, Gmax denotes the maximum gradient value in the denoised image, and Dn denotes the gradient dispersion of the initial seed point and its corresponding n-th growth pixel point.
Preferably, the method for obtaining the corresponding gray matrix according to each matrix window includes:
each matrix window acquires a plurality of gray level co-occurrence matrixes in preset directions; each gray level co-occurrence matrix has only 4 gray levels and a step pitch of 1; and the gray matrix of each matrix window is obtained by weighting the preset number of gray level co-occurrence matrixes.
Preferably, the method for obtaining the contrast ratio of the matrix window according to the element value and the element position after the normalization of the gray matrix comprises the following steps:
In the formula, I denotes the number of columns of the normalized gray matrix, J denotes the number of rows of the normalized gray matrix, P(j,i) denotes the element value in the j-th row and i-th column of the normalized gray matrix, and C denotes the contrast of the matrix window.
Preferably, the method for obtaining the gray scale similarity entropy of the matrix window according to the gray scale value difference between different pixel points and the initial seed points in the matrix window comprises the following steps:
the gray values of all the pixel points are differenced with the gray values of the initial seed points in the matrix window, all the pixel points are classified according to the comparison of the difference value and a preset threshold value, the number of the pixel points in each type is obtained and is recorded as a first number, and the ratio of the first number to the number of the pixel points in the matrix window is recorded as the occurrence frequency of each type of pixel points;
and calculating the image entropy from the occurrence frequency of each class of pixel points to obtain the gray scale similarity entropy of the matrix window.
Preferably, the method for obtaining the relative roughness of the initial seed point and the growing pixel point according to the difference of the edge roughness of the initial seed point and the growing pixel point comprises the following steps:
taking the absolute value of the difference between the edge roughness of the initial seed point and the edge roughness of the growing pixel point as the relative roughness of the initial seed point and the growing pixel point.
Preferably, the method for obtaining the edge continuity of the initial seed point and the growing pixel point according to the gray value of the pixel point in the similar window comprises the following steps:
obtaining the gray value average value of pixel points in a similar window, and calculating an edge coherence factor:
In the formula, s_v denotes the gray value of the v-th pixel point in the similar window corresponding to the initial seed point, g_v denotes the gray value of the v-th pixel point in the similar window corresponding to the growth pixel point, μ_s denotes the mean of the gray values of the pixel points in the similar window corresponding to the initial seed point, μ_g denotes the mean of the gray values of the pixel points in the similar window corresponding to the growth pixel point, V denotes the number of pixel points in a similar window, and r denotes the edge coherence factor of the initial seed point and the growth pixel point;
and multiplying the edge coherence factor by the gray value difference corresponding to the initial seed point and the growing pixel point to obtain the edge coherence of the initial seed point and the growing pixel point.
Preferably, the method for obtaining the growth edge by iteration according to the growth value includes:
carrying out a weighted summation of the gradient dispersion, relative roughness, and edge continuity of the initial seed point and each growth pixel point to obtain a growth value; taking the growth pixel point with the smallest growth value as the next seed point and treating it as a new initial seed point; repeating until no new pixel point exists in the growth direction; and when no new pixel point exists in either the first or the second growth direction, combining all obtained seed points into the growth edge.
The beneficial effects of the invention are as follows: the invention applies an image region growing algorithm from the field of computer vision to detect the raw oil separation effect, automatically identifying key characteristics such as the oil-water interface position and the oil film thickness in order to evaluate the separation effect.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings can be obtained according to these drawings without inventive faculty for a person skilled in the art.
Fig. 1 is a schematic flow chart of a method for detecting a separation effect of raw oil based on computer vision according to an embodiment of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a flowchart of a method for detecting a separation effect of raw oil based on computer vision according to an embodiment of the invention is shown, and the method includes the following steps:
step S001, collecting an original image, preprocessing the original image, and obtaining a separation gray scale image.
A camera based on CCD (charge-coupled device) technology is used to photograph the raw oil separation result, shooting parallel to the separation liquid level to obtain an original image. The original image is displayed in the RGB color mode (an RGB image), where R is the red channel, G the green channel, and B the blue channel. Weighted average graying is applied to the acquired RGB image to obtain a raw oil gray scale map. Semantic segmentation is then applied to this gray scale map: the container region holding the raw oil is labeled 1 and the background 0, the pixel points labeled 1 are segmented and extracted, and the loss function used is the mean square error loss. The raw oil gray scale map is input to obtain a gray scale map containing only the region where the raw oil is located, which is recorded as the separation gray scale map.
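To make the graying step concrete, the sketch below applies weighted average graying to an RGB array. The patent does not state the channel weights, so the common ITU-R BT.601 luma weights (0.299, 0.587, 0.114) are assumed here.

```python
import numpy as np

def to_gray(rgb):
    """Weighted average graying; the BT.601 luma weights are an assumption."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# demo: a 2x2 RGB image (pure red, pure green, pure blue, white)
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=float)
gray = to_gray(img)
```

A real pipeline would follow this with the semantic-segmentation mask described above; that network is not reproduced here.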
Step S002, obtaining a denoising image according to the separation gray level image, obtaining two interested areas (Region of Interest, ROI) according to the denoising image, obtaining initial seed points in the water area and the oil area respectively, obtaining growth pixel points based on the initial seed points, and obtaining gradient dispersion of the initial seed points and the growth pixel points according to gradient values of the initial seed points and the growth pixel points.
Since noise is inevitably introduced while capturing the image, Gaussian filtering is used to denoise the separation gray scale image, yielding the denoised image.
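A minimal pure-NumPy sketch of the Gaussian filtering step, using separable 1-D convolutions with edge padding; the kernel size and sigma below are assumptions, since the embodiment does not state them.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """1-D Gaussian kernel normalised to sum to 1."""
    ax = np.arange(size) - size // 2
    k = np.exp(-ax**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, size=5, sigma=1.0):
    """Separable Gaussian filtering with replicate (edge) padding."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    out = np.pad(img, pad, mode='edge').astype(float)
    # horizontal then vertical 1-D convolution
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, out)
    return out

flat = np.full((9, 9), 100.0)
smoothed = gaussian_blur(flat)
```

A uniform image passes through unchanged because the kernel is normalized to sum to 1.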
For the edge region growing algorithm, an initial seed point must first be selected. For the raw oil separation boundaries in the denoised image, ROI areas are obtained: an oil film intermediate layer can appear between the oil and the water, and a separation boundary also exists between the water area and the intermediate layer, so one ROI area is obtained for each of the two separation boundaries. The Sobel operator is used to obtain the gradient value of every pixel point of the denoised image, and the U pixel points with the largest gradient values are obtained within the ROI area; in this embodiment, U is 50. For each of these U pixel points, the standard deviation of the gray values of the pixel point and all pixel points in its eight-neighborhood is calculated; this standard deviation serves as a measure of local image contrast and is larger for pixel points close to the separation boundary. The gradient value and the standard deviation of each of the U pixel points are added, and the pixel point with the largest sum is taken as the initial seed point.
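The seed selection just described (Sobel gradients, top-U candidates, then gradient value plus eight-neighborhood standard deviation) can be sketched as follows. The replicate padding and the tie-breaking among equal scores are assumptions.

```python
import numpy as np

SX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # Sobel x
SY = SX.T                                                         # Sobel y

def conv3(img, k):
    """3x3 convolution with replicate padding (output same size as img)."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

def initial_seed(img, roi_mask, U=50):
    """Among the U ROI pixels with the largest Sobel gradient magnitude,
    return the one maximising gradient + std of its 8-neighbourhood."""
    grad = np.hypot(conv3(img, SX), conv3(img, SY))
    grad_roi = np.where(roi_mask, grad, -np.inf)
    top = np.argsort(grad_roi, axis=None)[::-1][:U]     # top-U gradients
    p = np.pad(img, 1, mode='edge')
    best, best_score = None, -np.inf
    for idx in top:
        y, x = np.unravel_index(idx, img.shape)
        patch = p[y:y + 3, x:x + 3]                     # pixel + 8 neighbours
        score = grad[y, x] + patch.std()
        if score > best_score:
            best, best_score = (y, x), score
    return best

# demo: a vertical step edge between columns 4 and 5
img = np.zeros((10, 10)); img[:, 5:] = 200.0
seed = initial_seed(img, np.ones_like(img, dtype=bool), U=10)
```

On the step image the selected seed lands on the boundary columns, as expected.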
After the initial seed point is acquired, a growth criterion for the region growing algorithm is required. Observation of the denoised image shows that uneven interfaces such as bubbles and oil drops at the raw oil separation boundary cause the image edge line to blur, so the growth criterion is refined according to the degree of similarity between the selected initial seed point and its neighborhood pixel points.
Specifically, within the ROI area, the separation boundary of the raw oil separation is the position with the largest gradient. For the initial seed point, the gradient direction is obtained and a line perpendicular to it is drawn; the directions this perpendicular points in are the growth directions, along which the initial seed point grows. There are two growth directions, recorded as the first growth direction and the second growth direction, and a growth sequence is built for each, recorded as the first growth sequence and the second growth sequence. The first pixel point encountered from the initial seed point along the first growth direction is recorded as a growth possible point; a line perpendicular to the growth direction is drawn through the growth possible point; the two pixel points closest to the growth possible point are taken on each side of it along this perpendicular; and the growth possible point together with the four pixel points closest to it are recorded as growth pixel points.
The horizontal and vertical gradient values of each pixel point can be obtained with the Sobel operator, and the gradient dispersion of two pixel points is obtained from the horizontal and vertical gradient values of the initial seed point and the growth pixel point, with the formula as follows:
In the formula, Gx0 denotes the horizontal gradient value of the initial seed point, Gy0 denotes the vertical gradient value of the initial seed point, Gxn denotes the horizontal gradient value of the n-th growth pixel point corresponding to the initial seed point, Gyn denotes the vertical gradient value of the n-th growth pixel point corresponding to the initial seed point, Gmax denotes the maximum gradient value in the denoised image, and Dn denotes the gradient dispersion of the initial seed point and its corresponding n-th growth pixel point.
So far, the gradient dispersion of the initial seed point and each growing pixel point is obtained.
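Since the formula itself appears only as an image in the original publication, the sketch below implements one plausible reading of the symbol legend: gradient dispersion as the normalized absolute difference of the horizontal and vertical gradient components. The exact combination is an assumption.

```python
def gradient_dispersion(g_seed, g_grow, g_max):
    """Hypothetical gradient dispersion: sum of absolute component
    differences, normalised by the image's maximum gradient value.
    (The patent gives only the symbol legend; this form is an assumption.)"""
    gx0, gy0 = g_seed    # horizontal / vertical gradient of the seed
    gxn, gyn = g_grow    # horizontal / vertical gradient of the candidate
    return (abs(gx0 - gxn) + abs(gy0 - gyn)) / g_max

d_same = gradient_dispersion((10.0, 5.0), (10.0, 5.0), 100.0)   # identical gradients
d_diff = gradient_dispersion((10.0, 5.0), (40.0, 25.0), 100.0)  # dissimilar gradients
```

Identical gradients give zero dispersion; the more the candidate's gradient differs from the seed's, the larger the value.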
Step S003, respectively obtaining a matrix window for the initial seed point and the growth pixel point, obtaining a corresponding gray matrix according to each matrix window, and obtaining the contrast of the matrix window according to the element value and the element position after the normalization of the gray matrix; obtaining gray scale similar entropy of the matrix window according to gray scale value differences of different pixel points and initial seed points in the matrix window; and obtaining the edge roughness according to the contrast ratio and the gray scale similarity entropy of the matrix window, and obtaining the relative roughness of the initial seed point and the growing pixel point according to the difference of the edge roughness of the initial seed point and the growing pixel point.
In the denoised image, the pixel points above and below the raw oil separation boundary differ considerably in brightness; based on this characteristic, the brightness of the edge pixel points can be analyzed.
Specifically, matrix windows of size l×l are obtained, taking the initial seed point and each corresponding growth pixel point as center points; in this embodiment, l is set to 5. A gray level co-occurrence matrix is acquired in each matrix window with the following parameters:
First the gray values are divided into 4 gray levels: the gray value of each pixel point in the matrix window is divided by 64 and rounded down. Four directions of 0°, 45°, 90°, and 135° are selected within the matrix window, with a step pitch of 1, i.e., pixel pairs are constructed from adjacent pixel points. One gray level co-occurrence matrix is obtained in each direction based on these pixel pairs, and the four gray level co-occurrence matrices are combined into a final gray level co-occurrence matrix. Since the raw oil separation interface runs along the horizontal axis, the 90° direction receives a higher weight and the 0° direction a smaller one; in this embodiment, the 0° direction weight is set to 0.1, the 45° and 135° direction weights are each set to 0.2, and the 90° direction weight is set to 0.5. The weighted combination is recorded as the gray matrix.
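The weighted co-occurrence construction above can be sketched as follows (4 gray levels, step 1, direction weights 0.1 / 0.2 / 0.5 / 0.2 from the embodiment); counting each pixel pair once per direction, rather than symmetrically, is an assumption.

```python
import numpy as np

DIR_OFFSETS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}
DIR_WEIGHTS = {0: 0.1, 45: 0.2, 90: 0.5, 135: 0.2}   # weights from the embodiment

def glcm(window, offset, levels=4):
    """Gray level co-occurrence matrix for one direction, step pitch 1."""
    q = (window // 64).astype(int)       # quantise 0..255 into 4 gray levels
    m = np.zeros((levels, levels))
    dy, dx = offset
    h, w = q.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[q[y, x], q[y2, x2]] += 1
    return m

def gray_matrix(window):
    """Direction-weighted combination of the four co-occurrence matrices."""
    return sum(DIR_WEIGHTS[a] * glcm(window, DIR_OFFSETS[a]) for a in DIR_OFFSETS)

win = np.array([[  0,  64, 128],
                [ 64, 128, 192],
                [128, 192, 255]], dtype=float)
gm = gray_matrix(win)
```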
For the gray matrix, the element values are normalized using the maximum and minimum values: the difference between each element value and the minimum value of the gray matrix is recorded as the first difference, the difference between the maximum and minimum values of the gray matrix as the second difference, and the ratio of the first difference to the second difference is the normalized value of each element, yielding the normalized gray matrix.
Obtaining the contrast of a matrix window according to the element value of each element in the normalized gray matrix and the position of the element value, wherein the formula is as follows:
In the formula, I denotes the number of columns of the normalized gray matrix, J denotes the number of rows of the normalized gray matrix, P(j,i) denotes the element value in the j-th row and i-th column of the normalized gray matrix, and C denotes the contrast of the matrix window. C measures the degree of brightness non-uniformity at the boundary of the denoised image; the larger the value of C, the more likely the pixel point is a point on the boundary.
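The contrast can be sketched with the standard GLCM contrast form, a position-weighted sum over the normalized matrix. Note the sketch normalizes by the matrix sum rather than the min-max scheme of the embodiment; that substitution is an assumption made for simplicity.

```python
import numpy as np

def glcm_contrast(p):
    """Standard GLCM contrast: sum over i, j of (i - j)^2 * p[i, j]."""
    n = p.sum()
    p = p / n if n else p
    idx = np.arange(p.shape[0])
    i, j = np.meshgrid(idx, idx, indexing='ij')
    return float(((i - j) ** 2 * p).sum())

uniform = np.eye(4)                  # co-occurrences only on the diagonal: flat region
edge = np.zeros((4, 4))
edge[0, 3] = edge[3, 0] = 1.0        # co-occurrences of extreme levels: sharp boundary
c_flat, c_edge = glcm_contrast(uniform), glcm_contrast(edge)
```

A flat region scores 0, while co-occurrences of distant gray levels score high, matching the "brightness non-uniformity" interpretation above.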
Within the matrix window, the gray value of each pixel point is differenced with the gray value of the initial seed point. Pixel points whose absolute difference is smaller than 1 are recorded as the first class, those with a difference in [1, 2) as the second class, those in [2, 3) as the third class, those in [3, 4) as the fourth class, and those in [4, 5) as the fifth class. The number of pixel points in each class is counted, and the ratio of each class count to the number of pixel points in the matrix window is the occurrence frequency of that class. The gray scale similarity entropy of each matrix window is obtained from the occurrence frequencies of its pixel point classes, with the formula as follows:
In the formula, K denotes the total number of pixel point classes acquired, p_k denotes the occurrence frequency of the k-th class of pixel points, and H denotes the gray scale similarity entropy of the matrix window.
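A sketch of the gray scale similarity entropy: classify each pixel by its absolute gray difference from the seed, then take the Shannon entropy of the class frequencies. Binning the differences into the half-open intervals [0,1), [1,2), ... (with everything at or above 4 in the last class) is one reading of the embodiment's thresholds.

```python
import numpy as np

def gray_similarity_entropy(window, seed_value, n_classes=5):
    """Shannon entropy over the frequency of |gray - seed| difference classes."""
    diff = np.abs(window.astype(float).ravel() - seed_value)
    classes = np.minimum(diff.astype(int), n_classes - 1)   # bin into classes
    freq = np.bincount(classes, minlength=n_classes) / classes.size
    freq = freq[freq > 0]                                   # 0 * log 0 := 0
    return float(-(freq * np.log(freq)).sum())

flat = gray_similarity_entropy(np.full((5, 5), 100.0), 100.0)
mixed = gray_similarity_entropy(np.arange(25).reshape(5, 5) % 5 + 100.0, 100.0)
```

A window identical to the seed gives zero entropy; a window spread evenly over all five classes gives the maximum, ln 5.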
Obtaining the edge roughness of each matrix window according to the gray scale similarity entropy and contrast of each matrix window, wherein the formula is as follows:
In the formula, H denotes the gray scale similarity entropy of the matrix window, C denotes the contrast of the matrix window, and R denotes the edge roughness of the matrix window.
Each matrix window corresponds to one initial seed point or growth pixel point, so that the edge roughness of the initial seed point or growth pixel point is obtained.
Obtaining the relative roughness of the initial seed point and the growing pixel point according to the edge roughness of the initial seed point and the edge roughness of the growing pixel point, wherein the formula is as follows:
In the formula, R_0 denotes the edge roughness of the initial seed point, R_n denotes the edge roughness of the growth pixel point, norm(·) denotes a linear normalization function, and ΔR_n denotes the relative roughness of the initial seed point and the growth pixel point.
Thus, the relative roughness of the initial seed point and each grown pixel point is obtained.
Step S004, respectively obtaining a similar window for the initial seed point and the growth pixel point, obtaining edge continuity of the initial seed point and the growth pixel point according to gray values of the pixel points in the similar window, obtaining a growth value according to gradient dispersion, edge continuity and relative roughness between the initial seed point and the growth pixel point, and carrying out iteration according to the growth value to obtain a growth edge.
From observation and analysis of the denoised image, the pixels surrounding the initial seed point and those surrounding the next seed pixel point are similar, and the gray information of the two has consistency. Therefore, taking the initial seed point and the growth pixel point respectively as center points, a window of size m×m is acquired for each and recorded as a similar window, where m is a preset size. Within each similar window, the mean of the gray values of all pixel points is obtained, and the edge coherence factor is obtained from the difference between the gray value of each pixel point in the similar windows of the initial seed point and of the growth pixel point and the corresponding mean, with the formula as follows:
In the formula, s_v denotes the gray value of the v-th pixel point in the similar window corresponding to the initial seed point, g_v denotes the gray value of the v-th pixel point in the similar window corresponding to the growth pixel point, μ_s denotes the mean of the gray values of the pixel points in the similar window corresponding to the initial seed point, μ_g denotes the mean of the gray values of the pixel points in the similar window corresponding to the growth pixel point, V denotes the number of pixel points in a similar window, and r denotes the edge coherence factor of the initial seed point and the growth pixel point.
Edge continuity is then obtained from the gray values of the initial seed point and the growth pixel point together with the edge coherence factor, with the formula as follows:
In the formula, μ_s denotes the mean of the gray values of the pixel points in the similar window corresponding to the initial seed point, μ_g denotes the mean of the gray values of the pixel points in the similar window corresponding to the growth pixel point, s_0 denotes the gray value of the initial seed point, g_n denotes the gray value of the growth pixel point, r denotes the edge coherence factor of the initial seed point and the growth pixel point, norm(·) denotes a linear normalization function, and E_n denotes the edge continuity between the initial seed point and the growth pixel point. The stronger the edge coherence of the two pixel points, the closer they come to forming a continuous edge.
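Because the coherence-factor formula survives only as a symbol legend, the sketch below assumes a Pearson-correlation-style form over the two similar windows: products of mean-centered gray values, normalized by their magnitudes. Identical windows then score 1 and contrast-inverted windows score -1.

```python
import numpy as np

def coherence_factor(w_seed, w_grow):
    """Assumed correlation-style edge coherence factor between the similar
    windows of the seed and of the growth candidate (exact formula is an
    image in the original publication, so this form is a guess from the legend)."""
    a = w_seed.ravel() - w_seed.mean()
    b = w_grow.ravel() - w_grow.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

w1 = np.arange(25, dtype=float).reshape(5, 5)
r_same = coherence_factor(w1, w1)      # identical windows
r_anti = coherence_factor(w1, -w1)     # contrast-inverted windows
```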
A growth value is obtained from the gradient dispersion, edge continuity and relative roughness between the initial seed point and the growing pixel point, with the formula:

$$S_n=\alpha T_n+\beta R_n+\gamma E_n$$
where $T_n$ denotes the gradient dispersion of the initial seed point and its corresponding $n$-th growing pixel point, $R_n$ denotes their relative roughness, $E_n$ denotes their edge continuity, $\alpha$, $\beta$ and $\gamma$ denote the respective weights, and $S_n$ denotes the growth value corresponding to the $n$-th growing pixel point. In this embodiment the three weights are all set to $1/3$.
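The weighted scoring and minimum selection can be sketched as follows, assuming equal weights of 1/3 and pre-computed per-candidate terms (function and parameter names are illustrative):

```python
import numpy as np

def pick_next_seed(grad_dispersion, rel_roughness, edge_continuity,
                   weights=(1 / 3, 1 / 3, 1 / 3)):
    """Growth value S_n = a*T_n + b*R_n + c*E_n for every candidate
    growing point; the candidate with the minimum growth value becomes
    the next seed point (weights of 1/3 each, as in the embodiment)."""
    S = (weights[0] * np.asarray(grad_dispersion, dtype=float)
         + weights[1] * np.asarray(rel_roughness, dtype=float)
         + weights[2] * np.asarray(edge_continuity, dtype=float))
    return int(np.argmin(S)), S
```

Returning the full score vector alongside the index keeps the per-candidate growth values available for inspection.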
Among the growing pixel points of the current seed, the one with the minimum growth value is taken as the next seed point and placed into the first growth sequence. That seed point is then treated as the initial seed point, and the above procedure is repeated to extend the first growth sequence until no new pixel point exists in the first growth direction. Applying the same procedure along the second growth direction yields the second growth sequence. The two growth sequences are merged and recorded as a growth edge.
Thus, a growth edge is obtained.
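The bidirectional growth procedure can be sketched structurally; here `candidates_for` and `growth_value` stand in for the candidate-selection and scoring steps described above and are assumptions, not patent API:

```python
def trace_growth_edge(seed, candidates_for, growth_value):
    """Bidirectional region-growing edge trace.
    candidates_for(point, direction) -> list of candidate growing points
        (empty list when no new pixel exists in that direction);
    growth_value(seed_point, candidate) -> score, lower is better.
    Returns the merged first and second growth sequences (the growth edge)."""
    edge = [seed]
    for direction in (0, 1):                 # first / second growth direction
        current = seed
        while True:
            candidates = candidates_for(current, direction)
            if not candidates:               # no new pixel point: stop growing
                break
            current = min(candidates, key=lambda c: growth_value(current, c))
            if direction == 0:
                edge.append(current)         # extend first growth sequence
            else:
                edge.insert(0, current)      # prepend second growth sequence
    return edge
```

On a toy 1-D "image" where candidates are neighbouring coordinates, the trace walks outward from the seed in both directions and returns one ordered edge.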
Step S005, effect detection is completed according to the distance between the growing edges of all the ROI areas.
A growth edge is acquired for each ROI region, giving two growth edges, one in the water region and one in the oil region. The distance between the two growth edges is obtained using the DTW distance. Since the two growth edges mark the water-oil separation boundaries of the raw oil, this distance is the oil film thickness; comparing the oil film thickness with the existing standard completes the detection of the raw oil separation effect.
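A minimal sketch of the DTW distance used in step S005, treating each growth edge as a 1-D sequence of coordinates (representing an edge as a coordinate sequence is an assumption made here for illustration):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance between
    two 1-D sequences, e.g. the row coordinates of the two growth edges
    sampled column by column; smaller values mean the edges stay closer."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[len(a), len(b)])
```

The resulting distance, interpreted as the oil film thickness, is then compared against the existing standard.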
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (5)

1. The method for detecting the separation effect of the raw oil based on computer vision is characterized by comprising the following steps of:
collecting an original image, preprocessing the original image, and obtaining a separation gray scale image;
acquiring a denoising image according to the separation gray level image, acquiring two ROI areas according to the denoising image, wherein the two ROI areas are respectively in a water area and an oil area, acquiring initial seed points in the ROI areas, acquiring growth pixel points based on the initial seed points, and acquiring gradient dispersion of the initial seed points and the growth pixel points according to gradient values of the initial seed points and the growth pixel points;
respectively acquiring a matrix window for the initial seed point and the growth pixel point, acquiring a corresponding gray matrix according to each matrix window, and acquiring the contrast of the matrix window according to the element value and the element position after the normalization of the gray matrix; obtaining gray scale similar entropy of the matrix window according to gray scale value differences of different pixel points and initial seed points in the matrix window; obtaining edge roughness according to contrast and gray scale similarity entropy of the matrix window, and obtaining relative roughness of the initial seed point and the growing pixel point according to difference of the edge roughness of the initial seed point and the growing pixel point;
respectively acquiring a similar window for the initial seed point and the growth pixel point, acquiring edge continuity of the initial seed point and the growth pixel point according to gray values of the pixel points in the similar window, acquiring a growth value according to gradient dispersion, edge continuity and relative roughness between the initial seed point and the growth pixel point, and performing iteration according to the growth value to acquire a growth edge;
finishing effect detection according to the distance between the growing edges of all the ROI areas;
the method for acquiring the gradient dispersion of the initial seed point and the growth pixel point according to the gradient values of the initial seed point and the growth pixel point comprises the following steps:
$$T_n=\frac{\sqrt{\left(g_x^{c}-g_x^{n}\right)^{2}+\left(g_y^{c}-g_y^{n}\right)^{2}}}{g_{\max}}$$

where $g_x^{c}$ denotes the horizontal gradient value of the initial seed point, $g_y^{c}$ denotes the vertical gradient value of the initial seed point, $g_x^{n}$ denotes the horizontal gradient value of the $n$-th growing pixel point corresponding to the initial seed point, $g_y^{n}$ denotes the vertical gradient value of the $n$-th growing pixel point corresponding to the initial seed point, $g_{\max}$ denotes the maximum gradient value in the denoised image, and $T_n$ denotes the gradient dispersion of the initial seed point and the corresponding $n$-th growing pixel point;
the method for obtaining the contrast of the matrix window according to the element value and the element position after the normalization of the gray matrix comprises the following steps:
$$C=\sum_{j=1}^{J}\sum_{i=1}^{I}(i-j)^{2}\,P(j,i)$$

where $I$ denotes the number of columns of the normalized gray matrix, $J$ denotes the number of rows of the normalized gray matrix, $P(j,i)$ denotes the element value in the $j$-th row and $i$-th column of the normalized gray matrix, and $C$ denotes the contrast of the matrix window;
the method for obtaining the gray scale similarity entropy of the matrix window according to the gray scale value difference between different pixel points and initial seed points in the matrix window comprises the following steps:
the gray values of all the pixel points are differenced with the gray values of the initial seed points in the matrix window, all the pixel points are classified according to the comparison of the difference value and a preset threshold value, the number of the pixel points in each type is obtained and is recorded as a first number, and the ratio of the first number to the number of the pixel points in the matrix window is recorded as the occurrence frequency of each type of pixel points; calculating the occurrence frequency of each type of pixel point to obtain the gray scale similarity entropy of the matrix window;
the method for acquiring the edge roughness comprises the following steps:
obtaining the edge roughness of each matrix window according to the gray scale similarity entropy and contrast of each matrix window, wherein the formula is as follows:
where $H$ denotes the gray scale similarity entropy of the matrix window, $C$ denotes the contrast of the matrix window, and $F$ denotes the edge roughness of the matrix window;
the method for acquiring the relative roughness of the initial seed point and the growth pixel point according to the difference of the edge roughness of the initial seed point and the growth pixel point comprises the following steps:
taking the absolute value of the difference between the edge roughness of the initial seed point and the edge roughness of the growing pixel point as the relative roughness of the initial seed point and the growing pixel point;
the method for acquiring the edge continuity of the initial seed point and the growing pixel point according to the gray value of the pixel point in the similar window comprises the following steps:
obtaining the gray value average value of pixel points in a similar window, and calculating an edge coherence factor:
$$D_n=\frac{1}{V}\sum_{v=1}^{V}\left|\left(f_v^{c}-\overline{f^{c}}\right)-\left(f_v^{n}-\overline{f^{n}}\right)\right|$$

where $f_v^{c}$ denotes the gray value of the $v$-th pixel point in the similar window corresponding to the initial seed point, $f_v^{n}$ denotes the gray value of the $v$-th pixel point in the similar window corresponding to the growing pixel point, $\overline{f^{c}}$ denotes the mean gray value of the pixel points in the similar window corresponding to the initial seed point, $\overline{f^{n}}$ denotes the mean gray value of the pixel points in the similar window corresponding to the growing pixel point, $V$ denotes the number of pixel points in a similar window, and $D_n$ denotes the edge coherence factor of the initial seed point and the growing pixel point; the similar windows are windows of size $k\times k$ obtained with the initial seed point and the growing pixel point as center points respectively;
Multiplying the edge coherence factor by gray value differences corresponding to the initial seed point and the growing pixel point to obtain the edge coherence of the initial seed point and the growing pixel point;
the method for obtaining the growth value comprises the following steps: and carrying out weighted summation on the gradient dispersion, the relative roughness and the edge continuity of the initial seed point and the growth pixel point to obtain a growth value.
2. The method for detecting the separation effect of the raw oil based on computer vision according to claim 1, wherein the method for acquiring the initial seed point in the ROI area is as follows:
among a preset number of pixel points with the largest gradient values, calculating for each such pixel point the standard deviation of the gray values of that pixel point and all pixel points in its eight-neighborhood, adding the gradient value and the standard deviation of each of these pixel points, and taking the pixel point with the maximum sum of gradient value and standard deviation as the initial seed point.
3. The method for detecting the separation effect of the raw oil based on computer vision according to claim 1, wherein the method for acquiring the growing pixel points based on the initial seed points is as follows:
for an initial seed point, acquiring a gradient direction of the initial seed point, and making a vertical line of the gradient direction, wherein the direction pointed by the vertical line is a growth direction, and the two growth directions are respectively marked as a first growth direction and a second growth direction;
the first pixel point encountered from the initial seed point along either growth direction is marked as a growth candidate point; a line perpendicular to the growth direction is drawn through the growth candidate point, and on each side of the growth candidate point the two pixel points on this perpendicular line closest to it are taken; the growth candidate point and the four pixel points closest to it are marked as growing pixel points.
4. The method for detecting the separation effect of the raw oil based on computer vision according to claim 1, wherein the method for acquiring the corresponding gray matrix according to each matrix window is as follows:
each matrix window acquires a preset number of gray level co-occurrence matrices along preset directions, each gray level co-occurrence matrix having only 4 gray levels and a step distance of 1; the gray matrix of each matrix window is obtained by weighting the preset number of gray level co-occurrence matrices.
5. The method for detecting a separation effect of raw oil based on computer vision according to claim 3, wherein the method for iteratively obtaining a growth edge according to a growth value is as follows:
taking the growing pixel point with the smallest growth value as the next seed point and marking it as the initial seed point, and repeating until no new pixel point exists in the growth direction; when no new pixel point exists in either the first growth direction or the second growth direction, combining all the obtained seed points to obtain the growth edge.
CN202311008220.4A 2023-08-11 2023-08-11 Method for detecting separation effect of raw oil based on computer vision Active CN116721391B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311008220.4A CN116721391B (en) 2023-08-11 2023-08-11 Method for detecting separation effect of raw oil based on computer vision

Publications (2)

Publication Number Publication Date
CN116721391A CN116721391A (en) 2023-09-08
CN116721391B (en) 2023-10-31

Family

ID=87873840

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5271064A (en) * 1991-06-14 1993-12-14 University Of Cincinnati Apparatus and method for smoothing regions and enhancing edges in gray scale images
CN106447688A (en) * 2016-03-31 2017-02-22 大连海事大学 Method for effectively segmenting hyperspectral oil-spill image
CN114627140A (en) * 2022-05-16 2022-06-14 新风光电子科技股份有限公司 Coal mine ventilator intelligent adjusting method based on high-voltage frequency converter
CN115082721A (en) * 2022-07-22 2022-09-20 南通仁源节能环保科技有限公司 Pressure control method for air-float decontamination of oil-containing sewage
CN115100201A (en) * 2022-08-25 2022-09-23 淄博齐华制衣有限公司 Blending defect detection method of flame-retardant fiber material
CN115690106A (en) * 2023-01-03 2023-02-03 菏泽城建新型工程材料有限公司 Deep-buried anchor sealing detection method based on computer vision
CN116363133A (en) * 2023-06-01 2023-06-30 无锡斯达新能源科技股份有限公司 Illuminator accessory defect detection method based on machine vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Automatic Road Crack Detection Using Random Structured Forests; Yong Shi et al.; IEEE Transactions on Intelligent Transportation Systems; 17(12); 3434-3445 *
Image segmentation of aluminum castings based on edge detection and automatic seed region growing algorithm; Wang Meng et al.; Journal of Xi'an Institute of Posts and Telecommunications; 16(6); 16-19 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant