CN118115823B - Intelligent classification method and system for construction waste

Info

Publication number
CN118115823B
Authority
CN
China
Prior art keywords
edge line
line segment
construction waste
distribution density
target
Prior art date
Legal status
Active
Application number
CN202410532856.7A
Other languages
Chinese (zh)
Other versions
CN118115823A (en)
Inventor
薛敬宾
郝鸿韬
郑丽丽
张收
靳磊
张伟
朱伟朋
揭亮
张新辉
张阳阳
Current Assignee
Shandong Yichang Prefabricated Building Technology Co ltd
Original Assignee
Shandong Yichang Prefabricated Building Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shandong Yichang Prefabricated Building Technology Co ltd
Priority to CN202410532856.7A
Publication of CN118115823A
Application granted
Publication of CN118115823B


Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to an intelligent classification method and system for construction waste, comprising the following steps: collecting an RGB image of the construction waste mixture and graying it to obtain a construction waste gray image; performing edge detection on the construction waste gray image to obtain a plurality of edge line segments of the construction waste gray image; acquiring a plurality of neighborhood windows for each edge line segment; performing principal component analysis on the pixel points of the edge line segments within the neighborhood windows to obtain the distribution significance of each edge line segment; obtaining the filtering adjustment degree of each pixel point on each edge line segment according to the distribution significance and the gray values of the pixel points on the edge line segment; obtaining a denoised construction waste gray image according to the filtering adjustment degree; and intelligently classifying the construction waste according to the denoised construction waste gray image. The method alleviates excessive filtering of construction waste edges and improves the intelligent classification effect for construction waste.

Description

Intelligent classification method and system for construction waste
Technical Field
The invention relates to the technical field of image processing, in particular to an intelligent classification method and system for construction waste.
Background
Construction waste contains many reusable materials, such as concrete, bricks and steel bars; classifying these materials allows them to be recovered and reused efficiently, reduces dependence on natural resources, and serves the goal of a recycling economy. In the recycling of construction waste, equipment such as crushers and sorters is generally used to sort the waste, but because the different materials in the construction waste crush differently, crushing is often incomplete, large fragments of wood, plastic and the like remain, and the vibration sorting effect is not ideal. Therefore, the crushed construction waste mixture needs to be classified intelligently with a machine vision device so as to improve the efficiency of crushing, sorting and recycling the construction waste.
In the prior art, a bilateral filtering algorithm is used to filter the construction waste image; it can effectively preserve the edge information of the various types of construction waste while suppressing interference noise, which facilitates intelligent classification of the construction waste. However, when the construction waste image is filtered with the bilateral filtering algorithm, the edge-preserving effect depends on the gradient distribution characteristics of the pixel points in the neighborhood window; when the gradient information of the pixel points in the neighborhood window is too complex, that is, when edge pixel points of several different types of construction waste are present, the edge information is excessively filtered, so the intelligent classification effect of the construction waste is poor and the utilization rate of the reusable materials is low.
Disclosure of Invention
In order to solve the problems, the invention provides an intelligent classification method and system for construction waste.
The intelligent classification method and system for the construction waste adopt the following technical scheme:
the embodiment of the invention provides an intelligent classification method for construction waste, which comprises the following steps:
Collecting RGB images of the construction waste mixture, and graying to obtain a construction waste gray image;
Performing edge detection on the construction waste gray level image to obtain a plurality of edge line segments of the construction waste gray level image; acquiring a plurality of neighborhood windows of each edge line segment according to the edge line segment; carrying out principal component analysis on pixel points of edge line segments in the neighborhood windows to obtain a distribution density sequence of each neighborhood window of each edge line segment; acquiring a first distribution density sequence and a second distribution density sequence of each neighborhood window of each edge line segment according to the distribution density sequences; obtaining the density difference of the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the first distribution density sequence and the second distribution density sequence; obtaining the distribution significance of each edge line segment according to the density difference;
Obtaining the filtering adjustment degree of each pixel point on each edge line segment according to the distribution saliency degree and the gray value of the pixel point on the edge line segment; obtaining bilateral filtering value domain weight coefficients after adjustment of each pixel point on each edge line segment according to the filtering adjustment degree; carrying out bilateral filtering on the construction waste gray image according to the adjusted bilateral filtering value domain weight coefficient to obtain a construction waste gray image after denoising;
and classifying the construction wastes intelligently according to the construction waste gray level images after denoising.
Further, the step of obtaining a plurality of neighborhood windows of each edge line segment according to the edge line segment comprises the following specific steps:
Marking any edge line segment as a target edge line segment, acquiring the minimum circumscribed square of the target edge line segment, marking the minimum circumscribed square as a first circumscribed square, and marking the side length of the first circumscribed square as L; the central pixel point of the first circumscribed square is marked as a first central point; a plurality of square windows are constructed with the first central point as the center and are marked as neighborhood windows of the target edge line segment, wherein the side length of the neighborhood windows changes in steps of k, and k is a preset first value.
Further, the principal component analysis is performed on the pixel points of the edge line segments in the neighborhood windows to obtain a distribution density sequence of each neighborhood window of each edge line segment, which comprises the following specific steps:
Marking any one neighborhood window of the target edge line segment as a target neighborhood window, carrying out principal component analysis on pixel points on all edge line segments in the target neighborhood window, obtaining a principal component direction with the maximum characteristic value, and marking the principal component direction as an initial principal component direction; projecting pixel points on all edge line segments in a target neighborhood window to the direction of an initial principal component to obtain a plurality of projection points in the direction of the initial principal component; the method comprises the steps of obtaining the number of pixel points corresponding to each projection point in the initial principal component direction, arranging all the numbers according to the sequence of the projection points in the initial principal component direction, obtaining a sequence, and recording the sequence as a distribution density sequence of a target neighborhood window.
Further, the method for obtaining the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the distribution density sequence comprises the following specific steps:
Acquiring a first projection point and a last projection point of pixel points of a target edge line segment in projection points corresponding to an initial principal component direction, marking the number corresponding to the first projection point in a distribution density sequence of a target neighborhood window as a first number, and marking partial distribution density sequences corresponding to all elements before the first number in the distribution density sequence of the target neighborhood window as a first distribution density sequence of the target neighborhood window; and marking the number corresponding to the last projection point in the distribution density sequence of the target neighborhood window as a second number, and marking the partial distribution density sequence corresponding to all elements after the second number in the distribution density sequence of the target neighborhood window as a second distribution density sequence of the target neighborhood window.
Further, the obtaining the density difference of the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the first distribution density sequence and the second distribution density sequence comprises the following specific steps:
And (3) marking the ratio of the average value of all the numbers in the first distribution density sequence to the number of the numbers in the first distribution density sequence as a first ratio, marking the ratio of the average value of all the numbers in the second distribution density sequence to the number of the numbers in the second distribution density sequence as a second ratio, subtracting the second ratio from the first ratio, taking an absolute value, and taking the absolute value as the density difference of the first distribution density sequence and the second distribution density sequence of the target neighborhood window of the target edge line segment.
Further, the obtaining the distribution significance of each edge line segment according to the density difference comprises the following specific steps:
Subtracting the density difference of the first distribution density sequence and the second distribution density sequence of the (r−1)-th neighborhood window of the target edge line segment from the density difference of the first distribution density sequence and the second distribution density sequence of the r-th neighborhood window of the target edge line segment, taking the absolute value of the result, and accumulating, summing and averaging over r = 2, …, M to obtain the distribution significant factor of the target edge line segment, where M is the number of neighborhood windows of the target edge line segment; obtaining the distribution significant factor of each edge line segment, and carrying out linear normalization processing on all the distribution significant factors, wherein the obtained results are used as the distribution significance of each edge line segment.
Further, the filtering adjustment degree of each pixel point on each edge line segment is obtained according to the distribution significance degree and the gray value of the pixel point on the edge line segment, and the specific steps are as follows:
Subtracting the average gray value of the pixel points in the neighborhood sliding window in which the target edge line segment is located from the gray value of the i-th pixel point on the target edge line segment, and marking the obtained difference as a first difference, wherein the neighborhood sliding window is the neighborhood sliding window of the bilateral filtering algorithm; subtracting the average gray value of all pixel points on the target edge line segment from the gray value of the i-th pixel point on the target edge line segment, and marking the obtained difference as a second difference; subtracting a preset significance threshold from the distribution significance of the target edge line segment, and marking the obtained difference as a third difference; obtaining the maximum of the differences obtained by subtracting the preset significance threshold from the distribution significance of all edge line segments, and marking the maximum as a fourth difference; marking the ratio of the third difference to the fourth difference as a first ratio, marking the product of the first difference, the second difference and the first ratio as a first factor, and taking the inverse proportional value of the first factor as the filtering adjustment degree of the i-th pixel point on the target edge line segment.
Further, the obtaining the bilateral filtering value domain weight coefficient after adjustment of each pixel point on each edge line segment according to the filtering adjustment degree includes the following specific steps:
If the filtering adjustment degree of the i-th pixel point on the target edge line segment is greater than or equal to the average filtering adjustment degree of all pixel points on the target edge line segment, the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is obtained as follows: the bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is obtained according to the bilateral filtering algorithm and recorded as an initial coefficient; the average filtering adjustment degree of all pixel points on the target edge line segment is subtracted from the filtering adjustment degree of the i-th pixel point on the target edge line segment, and the obtained difference is marked as a fifth difference; the ratio of the fifth difference to the average filtering adjustment degree of all pixel points on the target edge line segment is marked as a third ratio; the result of adding one to the third ratio is marked as a first added value; and the product of the first added value and the initial coefficient is marked as the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment;
If the filtering adjustment degree of the i-th pixel point on the target edge line segment is smaller than the average filtering adjustment degree of all pixel points on the target edge line segment, the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is obtained as follows: the ratio of the filtering adjustment degree of the i-th pixel point on the target edge line segment to the average filtering adjustment degree of all pixel points on the target edge line segment is recorded as a fourth ratio, and the product of the fourth ratio and the initial coefficient is taken as the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment.
Further, the intelligent classification of the construction waste according to the construction waste gray level image after denoising comprises the following specific steps:
Carrying out connected domain detection on the construction waste gray level image after denoising to obtain a plurality of connected domains of the construction waste gray level image after denoising; the method comprises the steps of obtaining the number of pixel points contained in each connected domain, presetting a first threshold value, taking the connected domain with the number of pixel points larger than the first threshold value as a light building material area, and taking the connected domain with the number of pixel points smaller than or equal to the first threshold value as a hard building material area.
The invention also provides an intelligent classification system for the construction waste, which comprises a memory and a processor, wherein the processor executes a computer program stored in the memory to realize the steps of the method.
The technical scheme of the invention has the following beneficial effects: after the RGB image of the construction waste mixture is acquired, it is grayed to obtain the construction waste gray image, which reduces the data volume and the processing complexity and facilitates filtering and denoising. Edge detection is performed on the construction waste gray image to obtain its edge line segments, and the distribution density sequences of the edge line segments in neighborhood windows of different sizes are analyzed through principal component analysis; from these, the density difference of the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment is obtained. This density difference characterizes the difference in edge density on the two sides of the edge line segment within the neighborhood window, and the more consistently this difference changes, the greater the possibility that the edge line segment lies in an area where the edges of light building materials and hard building materials overlap. The distribution significance of the edge line segments is then obtained, the filtering adjustment degree of each pixel point on the edge line segments is derived from it, and the bilateral filtering value domain weight coefficients of the pixel points on the edge line segments are adjusted accordingly, which makes the subsequent classification operation more efficient and accurate. Finally, the denoised construction waste gray image is obtained with the adjusted bilateral filtering value domain weight coefficients and classification is completed, realizing intelligent classification of the construction waste and improving the efficiency of crushing, sorting and recycling of construction waste.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of steps of an intelligent classification method for construction waste according to an embodiment of the present invention;
Fig. 2 is a characteristic relation flow chart of an intelligent classification method for construction waste according to an embodiment of the invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve its intended purpose, the specific implementation, structure, characteristics and effects of the intelligent classification method and system for construction waste according to the invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The invention provides a method and a system for intelligent classification of construction waste.
Referring to fig. 1 and 2, a step flow chart and a feature relation flow chart of a method for intelligent classification of construction waste according to an embodiment of the present invention are shown, where the method includes the following steps:
and S001, collecting RGB images of the construction waste mixture, and graying to obtain a construction waste gray image.
It should be noted that, in this embodiment, the purpose is to implement adjustment of bilateral filtering weights according to gradient distribution characteristics of pixel points in a local area in an image, so as to improve the effect of intelligent classification of building rubbish, and before starting analysis, the image is collected first.
Specifically, on the conveying path of the garbage conveyor belt at the discharge port of the crusher that processes the construction waste, an industrial camera is arranged in a well-lit area above the conveyor belt; the industrial camera captures a top-down RGB image of the construction waste mixture, gray processing is applied to the RGB image, and the obtained image is recorded as the construction waste gray image.
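A minimal sketch of this acquisition step, assuming an OpenCV-based pipeline; the file name merely stands in for a frame grabbed from the industrial camera and is illustrative only:

```python
import cv2

# Stand-in for a frame grabbed from the industrial camera above the conveyor belt.
frame_bgr = cv2.imread("waste_mixture.png")          # hypothetical file name
gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)   # construction waste gray image
```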
It should be noted that hard building materials such as concrete and bricks crush well: the crushed fragments are small, they are generally distributed discretely at the bottom of the construction waste mixture, their gray values in the construction waste gray image are small, and their edges are smooth. The crushing of light building materials such as wood and plastic is often incomplete, leaving saw-toothed or shattered fragments after crushing; their gray values in the image are larger, and because of their light texture they are distributed in the upper region of the construction waste mixture. This often causes the edges of local regions of the saw-toothed or shattered fragments to overlap with the edge information of the concrete and bricks below, so that the bilateral filtering algorithm excessively filters this edge information and the edges become indistinct.
Thus, the gray level image of the construction waste is obtained.
Step S002, carrying out edge detection on the construction waste gray level image to obtain a plurality of edge line segments of the construction waste gray level image; acquiring a plurality of neighborhood windows of each edge line segment according to the edge line segment; and carrying out principal component analysis on the pixel points of the edge line segments in the neighborhood window to obtain the distribution significance degree of each edge line segment.
It should be noted that, because of the saw-toothed or shattered fragment characteristics of light building materials such as wood and plastic, broken edges of the light building material and substantive edges of the hard building material are interleaved within a local area, and the edge directions are random; however, the edges present different distribution characteristics in local areas of different sizes, mainly because the fragment size of the light building material has a strong influence. Therefore, this embodiment analyzes the extension characteristics of different areas according to the consistency of gray level changes in local areas of different sizes, obtains the filtering adjustment degree of each pixel point, and thereby adjusts the bilateral filtering weights.
It should be further noted that, in the gray level image of the building rubbish, the edge information of the light building material and the edge information of the hard building material have local area overlapping, and the edge disorder degree under the neighborhood of smaller scale is higher, but the edge under the neighborhood of larger scale has regional distribution due to the larger size of the light building material, that is, the edge density of the broken area is the largest, the edge density of the hard building material area is the second, and the internal edge density of the light building material area is the smallest, so that the distribution significant degree of the edge line segments can be obtained by utilizing the distribution change condition of the edge points under different local window sizes.
Specifically, edge detection is performed on the construction waste gray level image to obtain a plurality of edge line segments of the construction waste gray level image, and the specific steps are as follows:
Edge detection is performed on the construction waste gray image with an edge detection operator to obtain a plurality of edge line segments of the construction waste gray image.
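The patent does not name the edge operator, so the sketch below assumes Canny edge detection and treats each 8-connected component of the edge map as one edge line segment; both choices are assumptions, not the patent's prescription:

```python
import cv2
import numpy as np

def extract_edge_segments(gray):
    """Return a list of edge line segments, each an (N, 2) array of (row, col)
    pixel coordinates, obtained from an edge map of the gray image."""
    edges = cv2.Canny(gray, 50, 150)                      # thresholds are illustrative
    n_labels, labels = cv2.connectedComponents(edges, connectivity=8)
    return [np.argwhere(labels == lab) for lab in range(1, n_labels)]
```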
further, a plurality of neighborhood windows of each edge line segment are obtained according to the edge line segment, and the neighborhood windows are specifically as follows:
Mark any edge line segment as a target edge line segment, acquire the minimum circumscribed square of the target edge line segment, mark it as a first circumscribed square, and mark the side length of the first circumscribed square as L; the central pixel point of the first circumscribed square is marked as a first central point. It should be noted that if the side length L of the first circumscribed square is even, the point at the upper left corner of the middle four points of the first circumscribed square is selected as the central pixel point. A plurality of square windows are constructed with the first central point as the center and are marked as neighborhood windows of the target edge line segment; the side length of the neighborhood windows changes in steps of k, where k is a preset first value (a specific value of k is used in this embodiment for description). It should be noted that the side length of the neighborhood window of the target edge line segment increases by k each time: for example, the first neighborhood window has size L×L and the second neighborhood window has size (L+k)×(L+k), increasing successively until a preset maximum side length is reached.
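A sketch of how the neighborhood windows of one edge line segment could be derived from its minimal circumscribed square; the step value k and the number of windows are illustrative choices rather than values taken from the patent:

```python
import numpy as np

def neighborhood_windows(segment, k=2, num_windows=5):
    """Centre pixel and side lengths of the square neighborhood windows of one
    edge line segment; `segment` is an (N, 2) array of (row, col) coordinates."""
    rows, cols = segment[:, 0], segment[:, 1]
    L = int(max(rows.max() - rows.min(), cols.max() - cols.min()) + 1)  # side of the minimal circumscribed square
    # Integer division picks the upper-left of the middle four points when L is even.
    center = ((int(rows.min()) + int(rows.max())) // 2,
              (int(cols.min()) + int(cols.max())) // 2)
    sides = [L + r * k for r in range(num_windows)]       # L, L+k, L+2k, ...
    return center, sides
```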
Further, principal component analysis is performed on the pixel points of the edge line segments in the neighborhood windows to obtain a distribution density sequence of each neighborhood window of each edge line segment, which is specifically as follows:
Marking any one neighborhood window of the target edge line segment as a target neighborhood window, carrying out principal component analysis on pixel points on all edge line segments in the target neighborhood window, obtaining a principal component direction with the maximum characteristic value, and marking the principal component direction as an initial principal component direction; it should be noted that, the principal component analysis is performed on the pixel points on all the edge line segments in the target neighborhood window, and the direction of obtaining the first principal component is the existing method, which is not repeated in this embodiment; projecting pixel points on all edge line segments in a target neighborhood window to the direction of an initial principal component to obtain a plurality of projection points in the direction of the initial principal component; it should be noted that, one projection point of the initial principal component direction may correspond to a plurality of pixel points on the edge line segment; the method comprises the steps of obtaining the number of pixel points corresponding to each projection point in the initial principal component direction, arranging all the numbers according to the sequence of the projection points in the initial principal component direction, obtaining a sequence, and recording the sequence as a distribution density sequence of a target neighborhood window.
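A sketch of the principal component projection and the resulting distribution density sequence; grouping the real-valued projections by rounding them to integer coordinates is an assumption, since the patent only states that one projection point may correspond to several edge pixels:

```python
import numpy as np

def distribution_density_sequence(edge_points):
    """`edge_points` holds all edge pixels inside one neighborhood window as an
    (N, 2) array.  Returns the sorted projection coordinates and, for each,
    the number of pixels projected onto it (the distribution density sequence)."""
    pts = edge_points.astype(float)
    centered = pts - pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
    principal = eigvecs[:, np.argmax(eigvals)]             # direction with the largest eigenvalue
    proj = np.rint(centered @ principal).astype(int)       # rounded projection coordinates
    coords, counts = np.unique(proj, return_counts=True)   # ordered along the principal direction
    return coords, counts
```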
Further, a first distribution density sequence and a second distribution density sequence of each neighborhood window of each edge line segment are obtained according to the distribution density sequences, and the method specifically comprises the following steps:
Acquiring a first projection point and a last projection point of pixel points of a target edge line segment in projection points corresponding to an initial principal component direction, marking the number corresponding to the first projection point in a distribution density sequence of a target neighborhood window as a first number, and marking partial distribution density sequences corresponding to all elements before the first number in the distribution density sequence of the target neighborhood window as a first distribution density sequence of the target neighborhood window; and (3) marking the number corresponding to the last projection point in the distribution density sequence of the target neighborhood window as a second number, and marking the partial distribution density sequence corresponding to all elements after the second number in the distribution density sequence of the target neighborhood window as a second distribution density sequence of the target neighborhood window.
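A sketch of the split into the first and second distribution density sequences; it assumes the target segment's pixels have been projected and rounded with the same centering and principal direction as in the previous sketch:

```python
import numpy as np

def split_density_sequence(coords, counts, target_proj):
    """Split one window's distribution density sequence at the first and last
    projection coordinates of the target edge line segment."""
    first_idx = int(np.searchsorted(coords, target_proj.min()))   # first number
    last_idx = int(np.searchsorted(coords, target_proj.max()))    # second number
    first_seq = counts[:first_idx]        # elements before the first number
    second_seq = counts[last_idx + 1:]    # elements after the second number
    return first_seq, second_seq
```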
Further, according to the first distribution density sequence and the second distribution density sequence, the density difference of the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment is obtained, which is specifically as follows:
The density difference of the first distribution density sequence and the second distribution density sequence of the target neighborhood window of the target edge line segment is computed as:

D = | μ1 / (n1 + ε) − μ2 / (n2 + ε) |

where μ1 is the average of all the numbers in the first distribution density sequence, n1 is the number of numbers in the first distribution density sequence, μ2 is the average of all the numbers in the second distribution density sequence, n2 is the number of numbers in the second distribution density sequence, | · | denotes taking the absolute value, ε is a preset hyperparameter whose purpose is to prevent the denominator from being 0 (a small preset value is used in this embodiment), and D is the density difference of the first distribution density sequence and the second distribution density sequence of the target neighborhood window of the target edge line segment.
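A direct transcription of this density difference into code; the value of ε is an illustrative choice:

```python
import numpy as np

def density_difference(first_seq, second_seq, eps=1e-6):
    """Density difference D of one neighborhood window, following the formula above."""
    r1 = (first_seq.mean() if first_seq.size else 0.0) / (first_seq.size + eps)
    r2 = (second_seq.mean() if second_seq.size else 0.0) / (second_seq.size + eps)
    return abs(r1 - r2)
```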
It should be noted that the density difference of the first distribution density sequence and the second distribution density sequence reflects how differently the edge points are distributed on the two sides of the target edge line segment within the target neighborhood window; the more consistently this inter-class density difference changes, the greater the possibility that the edge line segment lies in an area where the edges of the light building materials and the hard building materials overlap.
Further, the distribution significance of each edge line segment is obtained according to the density difference, and the distribution significance is specifically as follows:
The distribution significant factor of the target edge line segment is computed as:

Y = (1 / (M − 1)) · Σ_{r=2..M} | D_r − D_{r−1} |

where D_r is the density difference of the first distribution density sequence and the second distribution density sequence of the r-th neighborhood window of the target edge line segment, D_{r−1} is the density difference of the first distribution density sequence and the second distribution density sequence of the (r−1)-th neighborhood window of the target edge line segment, | · | denotes taking the absolute value, and M is the number of neighborhood windows of the target edge line segment. It should be noted that the side length of the r-th neighborhood window is larger than that of the (r−1)-th neighborhood window by k. Y is the distribution significant factor of the target edge line segment.
And obtaining the distribution significant factors of each edge line segment, and carrying out linear normalization processing on all the distribution significant factors, wherein the obtained result is used as the distribution significant degree of each edge line segment.
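A sketch that turns the per-window density differences into the distribution significant factor and applies linear (min-max) normalization; it assumes every edge line segment has at least two neighborhood windows:

```python
import numpy as np

def distribution_significance(density_diffs_per_segment):
    """`density_diffs_per_segment` holds, for each edge line segment, the array
    D_1..D_M of density differences ordered by increasing window size.
    Returns the normalized distribution significance of each segment."""
    factors = np.array([np.abs(np.diff(d)).mean() for d in density_diffs_per_segment])
    lo, hi = factors.min(), factors.max()
    return (factors - lo) / (hi - lo + 1e-12)   # linear normalization
```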
Note that the distribution significance of the edge line segment indicates the cumulative difference of the regions of the edge line segment under the condition of multiple neighborhood region changes, and the larger the difference is, the more significant the change is.
So far, the distribution significance degree of each edge line segment in the construction waste gray level image is obtained.
Step S003, obtaining the filtering adjustment degree of each pixel point on each edge line segment according to the distribution saliency degree and the gray value of the pixel point on the edge line segment; and obtaining the construction waste gray level image after denoising according to the filtering adjustment degree.
It should be noted that the distribution significance obtained above analyzes, from the distribution features of an edge line segment, the probability that the segment lies in an edge-overlap area. However, the edge points on such an edge line segment belong to different region categories and their gray values differ greatly; when the neighborhood sliding window of the bilateral filtering algorithm contains such an edge, the spatial-domain distances are too small for the edge-preserving effect to be achieved. Therefore, the filtering adjustment degree of the pixel points is obtained from the differences in gray-value variation of the edge points on the edge line segment.
It should be noted that, according to the obtained distribution significance degree of the edge line segments, the edge line segments with larger distribution significance degree are located in the bilateral filtering window, and at this time, the gray value of the edge point on the edge line segments can have a larger influence on the value range weight of the bilateral filtering, so that the gray information distribution condition of the edge point on the edge line segments needs to be further analyzed.
Specifically, according to the distribution saliency degree and the gray value of the pixel points on the edge line segments, the filtering adjustment degree of each pixel point on each edge line segment is obtained, and specifically, the method comprises the following steps:
The filtering adjustment degree of the i-th pixel point on the target edge line segment is computed as:

T_i = exp( − (g_i − μ_w) · (g_i − μ_e) · (S − S_0) / max_{j=1..n}( S_j − S_0 ) )

where g_i is the gray value of the i-th pixel point on the target edge line segment, μ_w is the average gray value of the pixel points in the neighborhood sliding window in which the target edge line segment is located (the neighborhood sliding window is that of the bilateral filtering algorithm, obtained by the existing bilateral filtering method and not repeated in this embodiment), μ_e is the average gray value of all pixel points on the target edge line segment, S is the distribution significance of the target edge line segment, S_0 is a preset significance threshold, S_j is the distribution significance of the j-th edge line segment, and n is the number of edge line segments of the construction waste gray image. This embodiment uses an exponential function with the natural constant as base, exp(−·), to present the inverse proportional relation; another inverse proportional function may be set according to the specific implementation conditions. T_i is the filtering adjustment degree of the i-th pixel point on the target edge line segment.
It should be noted that the obtained distribution significance of the edge line segments is combined with a preset significance threshold: when the distribution significance of an edge line segment is greater than the threshold, the edge line segment is considered to influence the value domain weights within the filtering window. The gray-difference features of the edge line segment are analyzed at the neighborhood window size for which the density difference characterized by the distribution significance is largest; at that size the gray values within the neighborhood window are mainly influenced by the gray information of the light building material region. Therefore, for the edge points on the same edge line segment, the smaller the difference between an edge point's gray value and the average gray value in the neighborhood window, and the smaller the difference between its gray value and the average gray value on the edge line segment, the greater the possibility that the edge point belongs to the broken-edge region of the light building material, and the greater its filtering adjustment degree. The filtering adjustment degree of each edge point is obtained in this way.
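A direct transcription of the filtering adjustment degree formula; the significance threshold S_0 and the small constant guarding the denominator are illustrative values supplied by the caller:

```python
import numpy as np

def filtering_adjustment_degree(g, window_mean, segment_mean, S, S0, S_all):
    """T_i for one edge pixel: g is its gray value, window_mean the mean gray
    value of the bilateral filter's sliding window around it, segment_mean the
    mean gray value of its edge line segment, S the segment's distribution
    significance, S0 the preset threshold and S_all the significances of all
    edge line segments in the image."""
    ratio = (S - S0) / (np.max(np.asarray(S_all) - S0) + 1e-12)   # third / fourth difference
    first_factor = (g - window_mean) * (g - segment_mean) * ratio
    return float(np.exp(-first_factor))                           # inverse-proportional mapping
```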
After the filtering adjustment degrees of the edge points are obtained, the value domain weights computed by the bilateral filtering algorithm are adjusted: when the central element of the filtering window is an edge pixel point, the value domain weight is adjusted using the differences in filtering adjustment degree between the edge points in the filtering window, yielding the adjusted weight coefficient.
Specifically, the bilateral filtering value range weight coefficient adjusted by each pixel point on each edge line segment is obtained according to the filtering adjustment degree, and specifically the method comprises the following steps:
The adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is computed as:

W_i = w_i · (1 + (T_i − T_avg) / T_avg), if T_i ≥ T_avg
W_i = w_i · (T_i / T_avg), if T_i < T_avg

where T_i is the filtering adjustment degree of the i-th pixel point on the target edge line segment, T_avg is the average filtering adjustment degree of all pixel points on the target edge line segment, and w_i is the bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment obtained according to the bilateral filtering algorithm, recorded as the initial coefficient. It should be noted that obtaining the bilateral filtering value domain weight coefficient of the i-th pixel point according to the bilateral filtering algorithm is an existing step of the bilateral filtering algorithm and is not repeated in this embodiment. W_i is the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment.
When T_i ≥ T_avg, the adjusted bilateral filtering value domain weight coefficient is increased; when T_i < T_avg, the adjusted bilateral filtering value domain weight coefficient is reduced, so that the broken edges of the light building materials in the area where the edge information of the light building materials overlaps the edge information of the hard building materials are better retained.
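The piecewise adjustment can be transcribed as follows; the standard value domain weight w_init is assumed to come from an ordinary bilateral filter implementation:

```python
import numpy as np

def adjusted_range_weight(w_init, T_i, T_segment):
    """Adjusted value domain weight W_i for one edge pixel, given its filtering
    adjustment degree T_i and the degrees T_segment of all pixels on the same
    edge line segment."""
    T_avg = float(np.mean(T_segment))
    if T_i >= T_avg:
        return w_init * (1.0 + (T_i - T_avg) / T_avg)   # weight is increased
    return w_init * (T_i / T_avg)                       # weight is reduced
```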
Further, the denoised construction waste gray image is obtained by performing bilateral filtering on the construction waste gray image with the adjusted bilateral filtering value domain weight coefficients. It should be noted that performing bilateral filtering on the construction waste gray image according to the adjusted bilateral filtering value domain weight coefficient of each pixel point on each edge line segment to obtain the denoised construction waste gray image follows the existing bilateral filtering algorithm and is not described in detail here.
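One possible sketch of the adjusted filter is given below. It rescales the range weights only when the window centre is an edge pixel, as the description suggests; the window half-size, the two sigmas, and the use of a precomputed ratio image W_i / w_i (1.0 at non-edge pixels) are assumptions of this sketch rather than details fixed by the patent:

```python
import numpy as np

def bilateral_filter_adjusted(gray, weight_ratio, edge_mask, half=3, sigma_s=3.0, sigma_r=25.0):
    """Bilateral filter whose value (range) weights are rescaled by
    `weight_ratio` whenever the window centre is an edge pixel."""
    h, w = gray.shape
    pad = np.pad(gray.astype(float), half, mode="reflect")
    rat = np.pad(weight_ratio.astype(float), half, mode="reflect")
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2.0 * sigma_s ** 2))   # spatial-domain weights
    out = np.empty_like(gray, dtype=float)
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * half + 1, j:j + 2 * half + 1]
            rng = np.exp(-(patch - pad[i + half, j + half]) ** 2 / (2.0 * sigma_r ** 2))
            if edge_mask[i, j]:                        # adjust only edge-centred windows
                rng = rng * rat[i:i + 2 * half + 1, j:j + 2 * half + 1]
            wgt = spatial * rng
            out[i, j] = float((wgt * patch).sum() / wgt.sum())
    return out
```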
Thus, the construction waste gray level image after denoising is obtained.
And S004, intelligently classifying the construction waste according to the construction waste gray level image after denoising.
The above-mentioned construction waste gray image after denoising is obtained, which realizes the effect of better retaining the broken edges of the light construction material in the overlapping area of the edge information of the light construction material and the edge information of the hard construction material, and further classifies the construction waste gray image after denoising.
Specifically, the construction waste is intelligently classified according to the construction waste gray level image after denoising, and the construction waste is specifically as follows:
Carrying out connected domain detection on the construction waste gray level image after denoising to obtain a plurality of connected domains of the construction waste gray level image after denoising; it should be noted that, performing connected domain detection on the construction waste gray image after denoising, and obtaining a plurality of connected domains of the construction waste gray image after denoising is an existing method, which is not described in detail in this embodiment; the number of the pixel points contained in each connected domain is obtained, a first threshold value is preset, the first threshold value is taken as 100 in the embodiment, the connected domain with the number of the pixel points larger than the first threshold value is used as a light building material area, and the connected domain with the number of the pixel points smaller than or equal to the first threshold value is used as a hard building material area.
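A sketch of this classification step with OpenCV connected components; the Otsu binarization used before connected-domain detection is an assumption, as the patent does not state how the connected domains are extracted from the denoised gray image:

```python
import cv2
import numpy as np

def classify_regions(denoised_gray, area_threshold=100):
    """Label connected domains and split them into light and hard building
    material regions by pixel count (threshold 100, as in the embodiment)."""
    img8 = np.clip(denoised_gray, 0, 255).astype(np.uint8)
    _, binary = cv2.threshold(img8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    light, hard = [], []
    for lab in range(1, n):                             # label 0 is the background
        area = int(stats[lab, cv2.CC_STAT_AREA])        # number of pixels in the domain
        (light if area > area_threshold else hard).append(lab)
    return light, hard, labels
```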
So far, the intelligent classification of the construction waste is completed by filtering and classifying the construction waste gray level images.
Another embodiment of the present invention provides an intelligent classification system for construction waste, the system comprising a memory and a processor, the processor executing a computer program stored in the memory, performing the following operations:
Collecting RGB images of the construction waste mixture, and graying to obtain a construction waste gray image; performing edge detection on the construction waste gray level image to obtain a plurality of edge line segments of the construction waste gray level image; acquiring a plurality of neighborhood windows of each edge line segment according to the edge line segment; carrying out principal component analysis on pixel points of edge line segments in the neighborhood windows to obtain a distribution density sequence of each neighborhood window of each edge line segment; acquiring a first distribution density sequence and a second distribution density sequence of each neighborhood window of each edge line segment according to the distribution density sequences; obtaining the density difference of the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the first distribution density sequence and the second distribution density sequence; obtaining the distribution significance of each edge line segment according to the density difference; obtaining the filtering adjustment degree of each pixel point on each edge line segment according to the distribution saliency degree and the gray value of the pixel point on the edge line segment; obtaining bilateral filtering value domain weight coefficients after adjustment of each pixel point on each edge line segment according to the filtering adjustment degree; carrying out bilateral filtering on the construction waste gray image according to the adjusted bilateral filtering value domain weight coefficient to obtain a construction waste gray image after denoising; and classifying the construction wastes intelligently according to the construction waste gray level images after denoising.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the invention, but any modifications, equivalent substitutions, improvements, etc. within the principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. The intelligent classification method for the construction waste is characterized by comprising the following steps of:
Collecting RGB images of the construction waste mixture, and graying to obtain a construction waste gray image;
Performing edge detection on the construction waste gray level image to obtain a plurality of edge line segments of the construction waste gray level image; acquiring a plurality of neighborhood windows of each edge line segment according to the edge line segment; carrying out principal component analysis on pixel points of edge line segments in the neighborhood windows to obtain a distribution density sequence of each neighborhood window of each edge line segment; acquiring a first distribution density sequence and a second distribution density sequence of each neighborhood window of each edge line segment according to the distribution density sequences; obtaining the density difference of the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the first distribution density sequence and the second distribution density sequence; obtaining the distribution significance of each edge line segment according to the density difference;
Obtaining the filtering adjustment degree of each pixel point on each edge line segment according to the distribution saliency degree and the gray value of the pixel point on the edge line segment; obtaining bilateral filtering value domain weight coefficients after adjustment of each pixel point on each edge line segment according to the filtering adjustment degree; carrying out bilateral filtering on the construction waste gray image according to the adjusted bilateral filtering value domain weight coefficient to obtain a construction waste gray image after denoising;
and classifying the construction wastes intelligently according to the construction waste gray level images after denoising.
2. The intelligent classification method of building rubbish according to claim 1, wherein the steps of obtaining a plurality of neighborhood windows of each edge line segment according to the edge line segment include the following specific steps:
marking any edge line segment as a target edge line segment, acquiring the minimum circumscribed square of the target edge line segment, marking the minimum circumscribed square as a first circumscribed square, and marking the side length of the first circumscribed square as L; marking the central pixel point of the first circumscribed square as a first central point; constructing a plurality of square windows with the first central point as the center, and marking the square windows as neighborhood windows of the target edge line segment, wherein the side length of the neighborhood windows changes in steps of k, and k is a preset first value.
3. The intelligent classification method of construction waste according to claim 2, wherein the principal component analysis is performed on the pixel points of the edge line segments in the neighborhood windows to obtain the distribution density sequence of each neighborhood window of each edge line segment, and the method comprises the following specific steps:
Marking any one neighborhood window of the target edge line segment as a target neighborhood window, carrying out principal component analysis on pixel points on all edge line segments in the target neighborhood window, obtaining a principal component direction with the maximum characteristic value, and marking the principal component direction as an initial principal component direction; projecting pixel points on all edge line segments in a target neighborhood window to the direction of an initial principal component to obtain a plurality of projection points in the direction of the initial principal component; the method comprises the steps of obtaining the number of pixel points corresponding to each projection point in the initial principal component direction, arranging all the numbers according to the sequence of the projection points in the initial principal component direction, obtaining a sequence, and recording the sequence as a distribution density sequence of a target neighborhood window.
4. The method for intelligently classifying construction waste according to claim 3, wherein the steps of obtaining the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the distribution density sequence include the following specific steps:
Acquiring a first projection point and a last projection point of pixel points of a target edge line segment in projection points corresponding to an initial principal component direction, marking the number corresponding to the first projection point in a distribution density sequence of a target neighborhood window as a first number, and marking partial distribution density sequences corresponding to all elements before the first number in the distribution density sequence of the target neighborhood window as a first distribution density sequence of the target neighborhood window; and marking the number corresponding to the last projection point in the distribution density sequence of the target neighborhood window as a second number, and marking the partial distribution density sequence corresponding to all elements after the second number in the distribution density sequence of the target neighborhood window as a second distribution density sequence of the target neighborhood window.
5. The intelligent classification method of construction waste according to claim 4, wherein the obtaining the density difference between the first distribution density sequence and the second distribution density sequence of each neighborhood window of each edge line segment according to the first distribution density sequence and the second distribution density sequence comprises the following specific steps:
And (3) marking the ratio of the average value of all the numbers in the first distribution density sequence to the number of the numbers in the first distribution density sequence as a first ratio, marking the ratio of the average value of all the numbers in the second distribution density sequence to the number of the numbers in the second distribution density sequence as a second ratio, subtracting the second ratio from the first ratio, taking an absolute value, and taking the absolute value as the density difference of the first distribution density sequence and the second distribution density sequence of the target neighborhood window of the target edge line segment.
6. The intelligent classification method of construction waste according to claim 2, wherein the obtaining the distribution significance of each edge line segment according to the density difference comprises the following specific steps:
Subtracting the density difference of the first distribution density sequence and the second distribution density sequence of the (r−1)-th neighborhood window of the target edge line segment from the density difference of the first distribution density sequence and the second distribution density sequence of the r-th neighborhood window of the target edge line segment, taking the absolute value of the result, and accumulating, summing and averaging over r = 2, …, M to obtain the distribution significant factor of the target edge line segment, where M is the number of neighborhood windows of the target edge line segment; and obtaining the distribution significant factor of each edge line segment, and carrying out linear normalization processing on all the distribution significant factors, wherein the obtained results are used as the distribution significance of each edge line segment.
7. The intelligent classification method of construction waste according to claim 2, wherein the obtaining the filtering adjustment degree of each pixel point on each edge line segment according to the distribution significance degree and the gray value of the pixel point on the edge line segment comprises the following specific steps:
Subtracting the average gray value of the pixel points in the neighborhood sliding window in which the target edge line segment is located from the gray value of the i-th pixel point on the target edge line segment, and marking the obtained difference as a first difference, wherein the neighborhood sliding window is the neighborhood sliding window of the bilateral filtering algorithm; subtracting the average gray value of all pixel points on the target edge line segment from the gray value of the i-th pixel point on the target edge line segment, and marking the obtained difference as a second difference; subtracting a preset significance threshold from the distribution significance of the target edge line segment, and marking the obtained difference as a third difference; obtaining the maximum of the differences obtained by subtracting the preset significance threshold from the distribution significance of all edge line segments, and marking the maximum as a fourth difference; marking the ratio of the third difference to the fourth difference as a first ratio, marking the product of the first difference, the second difference and the first ratio as a first factor, and taking the inverse proportional value of the first factor as the filtering adjustment degree of the i-th pixel point on the target edge line segment.
8. The method for intelligently classifying the construction waste according to claim 2, wherein the obtaining the bilateral filtering value domain weight coefficient after adjustment of each pixel point on each edge line segment according to the filtering adjustment degree comprises the following specific steps:
if the filtering adjustment degree of the i-th pixel point on the target edge line segment is greater than or equal to the average filtering adjustment degree of all pixel points on the target edge line segment, the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is obtained as follows: the bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is obtained according to the bilateral filtering algorithm and recorded as an initial coefficient; the average filtering adjustment degree of all pixel points on the target edge line segment is subtracted from the filtering adjustment degree of the i-th pixel point on the target edge line segment, and the obtained difference is marked as a fifth difference; the ratio of the fifth difference to the average filtering adjustment degree of all pixel points on the target edge line segment is marked as a third ratio; the result of adding one to the third ratio is marked as a first added value; and the product of the first added value and the initial coefficient is marked as the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment;
if the filtering adjustment degree of the i-th pixel point on the target edge line segment is smaller than the average filtering adjustment degree of all pixel points on the target edge line segment, the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment is obtained as follows: the ratio of the filtering adjustment degree of the i-th pixel point on the target edge line segment to the average filtering adjustment degree of all pixel points on the target edge line segment is recorded as a fourth ratio, and the product of the fourth ratio and the initial coefficient is taken as the adjusted bilateral filtering value domain weight coefficient of the i-th pixel point on the target edge line segment.
9. The intelligent classification method of the construction waste according to claim 1, wherein the intelligent classification of the construction waste according to the denoised construction waste gray level image comprises the following specific steps:
Carrying out connected domain detection on the construction waste gray level image after denoising to obtain a plurality of connected domains of the construction waste gray level image after denoising; the method comprises the steps of obtaining the number of pixel points contained in each connected domain, presetting a first threshold value, taking the connected domain with the number of pixel points larger than the first threshold value as a light building material area, and taking the connected domain with the number of pixel points smaller than or equal to the first threshold value as a hard building material area.
10. A smart construction waste classification system comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program when executed by the processor performs the steps of a smart construction waste classification method as claimed in any one of claims 1 to 9.
CN202410532856.7A 2024-04-30 2024-04-30 Intelligent classification method and system for construction waste Active CN118115823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410532856.7A CN118115823B (en) 2024-04-30 2024-04-30 Intelligent classification method and system for construction waste


Publications (2)

Publication Number Publication Date
CN118115823A (en) 2024-05-31
CN118115823B (en) 2024-06-28

Family

ID=91212615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410532856.7A Active CN118115823B (en) 2024-04-30 2024-04-30 Intelligent classification method and system for construction waste

Country Status (1)

Country Link
CN (1) CN118115823B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170040983A (en) * 2015-10-06 2017-04-14 한양대학교 산학협력단 Method and apparatus of image denoising using multi-scale block region detection
CN116029941A (en) * 2023-03-27 2023-04-28 湖南融城环保科技有限公司 Visual image enhancement processing method for construction waste

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210327030A1 (en) * 2020-04-20 2021-10-21 Varjo Technologies Oy Imaging system and method incorporating selective denoising


Also Published As

Publication number Publication date
CN118115823A (en) 2024-05-31

Similar Documents

Publication Publication Date Title
CN115861135B (en) Image enhancement and recognition method applied to panoramic detection of box body
Thanammal et al. Effective histogram thresholding techniques for natural images using segmentation
CN108460744B (en) Cement notch road surface image noise reduction enhancement and crack feature extraction method
Ahonen et al. Soft histograms for local binary patterns
CN107480643B (en) Intelligent garbage classification processing robot
CN101059425A (en) Method and device for identifying different variety green tea based on multiple spectrum image texture analysis
CN110766689A (en) Method and device for detecting article image defects based on convolutional neural network
CN102129572A (en) Face detection method and device adopting cascade classifier
Yamaguchi et al. Automated crack detection for concrete surface image using percolation model and edge information
CN103175844A (en) Detection method for scratches and defects on surfaces of metal components
CN110428450A (en) Dimension self-adaption method for tracking target applied to the mobile inspection image of mine laneway
Sayed et al. Image object extraction based on curvelet transform
CN118115823B (en) Intelligent classification method and system for construction waste
Doost et al. Texture classification with local binary pattern based on continues wavelet transformation
Kartsov et al. Non-local means denoising algorithm based on local binary patterns
Widynski et al. A contrario edge detection with edgelets
CN115063679B (en) Pavement quality assessment method based on deep learning
Dhar et al. Interval type-2 fuzzy set and human vision based multi-scale geometric analysis for text-graphics segmentation
CN101296312A (en) Wavelet and small curve fuzzy self-adapting conjoined image denoising method
Azzabou et al. Uniform and textured regions separation in natural images towards MPM adaptive denoising
CN100573551C (en) Phase displacement fuzzy difference binaryzation method during fingerprint digital picture is handled
Gautam et al. A GUI for automatic extraction of signature from image document
Mohebbian et al. Increase the efficiency of DCT method for detection of copy-move forgery in complex and smooth images
Vincent Local grayscale granulometries based on opening trees
CN118115497B (en) Quartz sand crushing and grinding detection method and device

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant