CN116682098A - Automatic urban household garbage identification and classification system and method - Google Patents

Automatic urban household garbage identification and classification system and method

Info

Publication number
CN116682098A
Authority
CN
China
Prior art keywords
garbage
image
delivery
loss
representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310562501.8A
Other languages
Chinese (zh)
Inventor
薛强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaoji Shanghai Environmental Protection Technology Co ltd
Original Assignee
Xiaoji Shanghai Environmental Protection Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaoji Shanghai Environmental Protection Technology Co ltd filed Critical Xiaoji Shanghai Environmental Protection Technology Co ltd
Priority to CN202310562501.8A
Publication of CN116682098A
Legal status: Pending

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F - GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00 - Refuse receptacles; Accessories therefor
    • B65F1/0033 - Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F - GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00 - Refuse receptacles; Accessories therefor
    • B65F1/14 - Other constructional features; Accessories
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/0464 - Convolutional networks [CNN, ConvNet]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F - GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00 - Equipment of refuse receptacles
    • B65F2210/138 - Identification means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F - GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00 - Equipment of refuse receptacles
    • B65F2210/176 - Sorting means
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02W - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
    • Y02W30/00 - Technologies for solid waste management
    • Y02W30/10 - Waste collection, transportation, transfer or storage, e.g. segregated refuse collecting, electric or hybrid propulsion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Mechanical Engineering (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of dustbin control systems, and particularly discloses an automatic urban household garbage identification and classification system and method. In the designed system, garbage is delivered into a garbage bin in an unattended intelligent garbage station, and a camera shoots the garbage delivery process to obtain an image. After the image is preprocessed by denoising and size normalization, a deep learning algorithm built on the YOLOv5 architecture, with a PyTorch-based neural network and OpenCV computer vision as its main components, is used to autonomously train a garbage identification model; the garbage identification model is then used to identify objects in the image and distinguish different garbage types. Finally, the garbage delivery classification information is stored and displayed. The system can thus automatically identify the types of delivered household garbage and judge whether the garbage is classified correctly, so that managers can know the garbage condition in the garbage can and treat the garbage in time.

Description

Automatic urban household garbage identification and classification system and method
Technical Field
The invention relates to the technical field of dustbin control systems, in particular to an automatic urban household garbage identification and classification system and an automatic urban household garbage identification and classification method.
Background
At present, household garbage can generally be divided into four main categories: recyclable garbage, wet garbage, hazardous garbage and other garbage. Recyclable garbage mainly includes newspapers, cartons, books, advertising leaflets, plastic bottles, plastic toys, oil drums, wine bottles, glass, pop-top cans, old iron pans, old clothing, bags, old dolls, old digital products, old household appliances and the like. Wet garbage mainly includes food waste, leftovers, expired food, vegetables and fruits, melon peels and fruit pits, flowers and green plants, Chinese medicine residues and the like. Hazardous garbage mainly includes waste batteries (rechargeable batteries, lead-acid batteries, nickel-cadmium batteries, button batteries, etc.), waste paint, disinfectants, fluorescent tubes, mercury thermometers, expired medicines and their packaging, and the like. Other garbage is dry garbage, which mainly includes cutlery boxes, napkins, wet wipes, toilet paper, plastic bags, food packaging bags, heavily soiled paper, cigarette butts, paper diapers, disposable cups, bones, shells, flowerpots and the like.
With the continuous development of society, garbage classification has become essential, so in daily life household garbage is sorted into the four categories above and delivered into the corresponding garbage cans. However, in existing unattended intelligent garbage stations, once household garbage has been delivered, the garbage can cannot identify the type or amount of the garbage; the condition of the garbage inside the can therefore cannot be judged, and the garbage cannot be treated in time. The garbage condition can only be observed if a manager travels to the garbage station and inspects the can directly, which greatly increases the manager's workload.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the invention and thus may include information that does not form the prior art that is already known to those of ordinary skill in the art.
Disclosure of Invention
In view of at least one of the above technical problems, the invention provides an automatic urban household garbage identification and classification system and an automatic urban household garbage identification and classification method.
According to one aspect of the invention, an automatic urban household garbage identification and classification system is provided, which comprises an image acquisition module, a login module, a preprocessing module, an image identification module and a recording module;
the image acquisition module is used for shooting or recording the garbage delivery process by using a camera so as to acquire images of garbage;
the preprocessing module is used for denoising the image and normalizing the size so that the image quality is suitable for algorithm analysis;
the image recognition module is used for recognizing objects in the image to distinguish different garbage types;
the login module is used for managing account login of the user, entering a management main interface and checking delivery records in delivery management;
and the recording module is used for recording the information of garbage delivery.
An automatic urban household garbage identification and classification method comprises the following steps:
s1, collecting images of garbage articles: delivering the garbage into a garbage can, and shooting the garbage delivering process by a camera to obtain an image;
s2, image preprocessing: image denoising and size normalization make the image quality suitable for algorithm analysis;
s3, image identification: a deep learning algorithm is adopted, using the YOLOv5 architecture with a PyTorch-based neural network and OpenCV computer vision as its main components; a garbage identification model is trained autonomously, and the garbage identification model is then used to identify objects in the image to distinguish different garbage types;
s4, delivery classification record: the garbage delivery classification information is stored and displayed, including the IMEI (International Mobile Equipment Identity) of the device, the delivery number, the delivery time, the delivery picture, the type of garbage in the picture analyzed in step S3, and the specific coordinates and label of the garbage item.
Further, in step S3, the model loaded by the detect.py file is the model specially trained for garbage recognition; with the command python detect.py --source 0, the system automatically invokes the camera and automatically starts to recognize items that enter the camera's range.
Further, for the image recognition in step S3, the YOLOv5 architecture is used together with the following training strategies: multi-scale training, automatic tuning, warm-up with a cosine LR scheduler, EMA (exponential moving average), mixed precision, hyperparameter evolution, loss computation, loss balancing, grid sensitivity elimination, and target building.
Further, the YOLOv5 architecture performs the following steps:
an image is input, resized, and passed through the network;
the input image passes through a series of convolution layers and CSPNet modules, which extract the feature information in the image;
a prediction branch on the feature map extracted by CSPNet outputs the predicted boxes, categories and confidence scores;
the predicted boxes are screened and adjusted to obtain the detection result.
Further, the loss of the YOLOv5 architecture consists of three parts: classification loss, target existence loss and localization loss. The YOLOv5 architecture loss is calculated as:

Loss = λ1·L_cls + λ2·L_obj + λ3·L_loc

where L_cls is the classification loss: the cross-entropy loss between the class of the predicted bounding box and the class of the actual bounding box; L_obj is the target existence loss: the binary cross-entropy loss between the confidence scores of the predicted box and the actual box; and L_loc is the localization loss: the squared loss between the coordinates of the predicted bounding box and the coordinates of the actual bounding box.
Further, the weights of the target existence loss differ across the three prediction layers (P3, P4, P5); the balance weights are 4.0, 1.0 and 0.4 respectively, so the balanced loss is:

L_obj = 4.0·L_obj^small + 1.0·L_obj^medium + 0.4·L_obj^large

where L_obj^small, L_obj^medium and L_obj^large denote the target existence loss values for small, medium and large objects respectively, and the parameters 4.0, 1.0 and 0.4 are the weight coefficients corresponding to the objects of different sizes.
Further, in the YOLOv5 architecture, grid sensitivity is eliminated by decoding the prediction boxes as:

b_x = (2·σ(t_x) - 0.5) + c_x
b_y = (2·σ(t_y) - 0.5) + c_y
b_w = p_w·(2·σ(t_w))²
b_h = p_h·(2·σ(t_h))²

where b_x is the abscissa of the center point of the prediction box, c_x is the abscissa of the top-left corner of the corresponding grid cell in the feature map, t_x is an offset parameter learned by the model during training, and σ is the sigmoid function; the term (2·σ(t_x) - 0.5) maps the learned offset into the range (-0.5, 1.5) to obtain the offset of the center point;
likewise, b_y is the ordinate of the center point of the prediction box, c_y is the ordinate of the top-left corner of the grid cell, t_y is the corresponding learned offset parameter, and (2·σ(t_y) - 0.5) maps it into (-0.5, 1.5) to obtain the offset of the center point;
b_w is the width of the prediction box, p_w is the prior (anchor) width of the prediction box, and t_w is the width offset parameter learned during training; the factor (2·σ(t_w))² maps the width scale into the range (0, 4);
b_h is the height of the prediction box, p_h is the prior (anchor) height of the prediction box, and t_h is the height offset parameter learned during training; the factor (2·σ(t_h))² maps the height scale into the range (0, 4).
Further, the YOLOv5 architecture includes a train.py script, a val.py script and a detect.py script;
the running workflow is designed around these three scripts:
the train.py, val.py and detect.py scripts are run in sequence so that the different models can be exercised; train.py is run to train the garbage identification model; val.py is used to evaluate trained garbage identification models so that the correct garbage identification model can be selected; and detect.py performs detection on the images acquired by the real-time camera.
Further, in step S4, an IMEI screening function is provided, through which the garbage delivery classification information of a given device can be queried by the device's IMEI; a time-range screening function is also provided, through which the garbage delivery classification information within a time range can be retrieved by a start date and an end date.
The invention has the following technical effects:
according to the invention, an automatic urban household garbage identification and classification system and a method thereof are designed, garbage is delivered into a garbage bin in an unattended intelligent garbage station, a camera shoots a garbage delivery process, an image is obtained, after the pretreatment of denoising and size normalization of the image, a deep learning algorithm is adopted, a YOLOv5 architecture is used, a neural network artificial intelligent system based on PyTroch and Open-CV computer vision are used as main materials, a garbage identification model is trained autonomously, and then the garbage identification model is utilized to identify objects in the image so as to distinguish different garbage types; finally, storing and displaying the garbage delivery classification information, so that the function of automatically identifying the types of delivered household garbage can be realized, whether the classification of garbage is correct and the classification of garbage are judged, and therefore, a manager can know the garbage condition in the garbage can in time and process the garbage in time.
The invention will be further described with reference to the drawings and examples.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of steps of an automatic identification and classification method for urban household garbage in the invention;
FIG. 2 is a block diagram of an automatic recognition and classification system for urban household garbage according to the invention;
FIG. 3 is a schematic diagram of a garbage image collected by a camera and a picture after identification and analysis;
FIG. 4 is a diagram of the garbage delivery classification information recorded by the recording module in the management platform in the present invention.
Detailed Description
In order that the above objects, features and advantages of the invention will be readily understood, a more particular description of the invention will be rendered by reference to the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The present invention may be embodied in many other forms than described herein and similarly modified by those skilled in the art without departing from the spirit of the invention, whereby the invention is not limited to the specific embodiments disclosed below.
In one embodiment of the present invention, as shown in fig. 1 to 2, an automatic recognition and classification system for municipal solid waste and an automatic recognition and classification method for municipal solid waste are provided.
As shown in fig. 2, the automatic recognition and classification system for the urban household garbage comprises front-end equipment and a management background.
The front-end equipment comprises an image acquisition module (namely a camera), a delivery channel and a dustbin. Specifically:
the image acquisition module is used for shooting or recording the garbage delivery process by using a camera so as to acquire images of garbage, and the camera is arranged on the delivery channel and can shoot garbage articles delivered on the delivery channel;
the dustbin has four major classes rubbish branch casees, does respectively: the garbage can comprises a recyclable garbage can, a wet garbage can, a harmful garbage can and other garbage cans, and each garbage can is provided with a delivery channel.
The management background comprises a preprocessing module, an image recognition module, a login module, a recording module and an early warning module. Specifically:
the preprocessing module is used for denoising the image and normalizing the size so that the image quality is suitable for algorithm analysis;
the image recognition module is used for recognizing objects in the image to distinguish different garbage types;
the login module is used for managing account login of the user, entering a management main interface and checking delivery records in delivery management;
the recording module is used for recording the garbage delivery information; the garbage delivery classification information includes the IMEI (International Mobile Equipment Identity) of the device, the delivery number, the delivery time, the delivery picture, the type of garbage in the picture analyzed in step S3, and the specific coordinates and label of the identified garbage item;
and the early warning module is used for giving an early warning when garbage is delivered incorrectly and reminding managers to handle it in time.
The system of the invention also comprises an online input module which is arranged in the front-end equipment and is used for people to select the garbage types to be delivered online before delivering the garbage.
As shown in fig. 1, an automatic recognition and classification method for urban household garbage comprises the following steps:
s1, collecting images of garbage articles: the waste is delivered to a recyclable waste bin, wet waste bin, hazardous waste bin or other waste bin, and a camera shoots the waste delivery process and obtains an image, as shown in fig. 3.
S2, image preprocessing: image denoising and size normalization make image quality suitable for algorithmic analysis.
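By way of illustration, a minimal OpenCV sketch of this preprocessing step is given below; the non-local-means denoising parameters and the 640×640 target size are assumptions for the example, not values specified by the invention.

```python
import cv2

def preprocess(image_path, size=(640, 640)):
    """Denoise a delivery photo and normalize it to a fixed input size."""
    img = cv2.imread(image_path)
    # Remove sensor/compression noise (parameter values are illustrative)
    img = cv2.fastNlMeansDenoisingColored(img, None, 10, 10, 7, 21)
    # Size normalization so the image suits the recognition model's input
    return cv2.resize(img, size)
```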
S3, image identification: a deep learning algorithm is adopted, using the YOLOv5 architecture with a PyTorch-based neural network and OpenCV computer vision as its main components; a garbage identification model is trained autonomously, and the garbage identification model is then used to identify objects in the image to distinguish different garbage types.
the image recognition is described in detail:
the model of the detect. Py file is selected as the garbage recognition training model, and using the command python detect. Py—source 0, the system will automatically invoke the camera and the system will begin automatically recognizing items that are coming into range of the camera.
For the image recognition, the YOLOv5 architecture is used together with the following training strategies: multi-scale training, automatic tuning, warm-up with a cosine LR scheduler, EMA (exponential moving average), mixed precision, hyperparameter evolution, loss computation, loss balancing, grid sensitivity elimination, and target building.
The YOLOv5 architecture adopts a lightweight CSPNet backbone, which ensures high detection precision while improving the running speed of the network. The YOLOv5 pipeline comprises the following steps. First: an image is input, resized to a suitable size, and passed through the network. Second: the input image passes through a series of convolution layers and CSPNet modules, which extract the feature information in the image. Third: a prediction branch on the feature map extracted by CSPNet outputs the predicted boxes, categories and confidence scores. Fourth: the predicted boxes are screened and adjusted to obtain the detection result. To further improve detection accuracy, the YOLOv5 architecture also adopts strategies such as multi-scale detection and data augmentation, detecting targets from multiple scales and angles and improving the robustness and generalization ability of the model. The entire process is end-to-end, with no need to extract features manually or use other algorithms.
The loss of the YOLOv5 architecture consists of three parts: classification loss, target existence loss and localization loss. The YOLOv5 architecture loss is calculated as:

Loss = λ1·L_cls + λ2·L_obj + λ3·L_loc

where L_cls is the classification loss: the cross-entropy loss between the category of the predicted box and the category of the actual box, i.e. the loss from classification errors; L_obj is the target existence loss: the binary cross-entropy loss between the confidence scores of the predicted bounding box and the actual bounding box, i.e. the loss from the target's presence or absence; and L_loc is the localization loss: the squared loss between the coordinates of the predicted bounding box and the coordinates of the actual bounding box, i.e. the loss from positioning errors.
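As a minimal PyTorch-style sketch of this weighted sum (the λ values and target formats here are illustrative assumptions; the per-term loss functions follow the description above):

```python
import torch.nn.functional as F

def yolo_total_loss(pred_cls, tgt_cls, pred_obj, tgt_obj, pred_box, tgt_box,
                    lambdas=(1.0, 1.0, 1.0)):  # λ1, λ2, λ3 are illustrative values
    l_cls = F.cross_entropy(pred_cls, tgt_cls)                     # classification loss
    l_obj = F.binary_cross_entropy_with_logits(pred_obj, tgt_obj)  # target existence loss
    l_loc = F.mse_loss(pred_box, tgt_box)                          # squared localization loss
    return lambdas[0] * l_cls + lambdas[1] * l_obj + lambdas[2] * l_loc
```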
The weights of the target existence loss differ across the three prediction layers (P3, P4, P5); the balance weights are 4.0, 1.0 and 0.4 respectively, so the balanced loss is:

L_obj = 4.0·L_obj^small + 1.0·L_obj^medium + 0.4·L_obj^large

where L_obj^small, L_obj^medium and L_obj^large denote the target existence loss values for small, medium and large objects respectively, and the parameters 4.0, 1.0 and 0.4 are the weight coefficients corresponding to objects of different sizes; these coefficients can be adjusted to suit different data sets or training targets.
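A sketch of this balancing step, applying the [4.0, 1.0, 0.4] weights to the per-layer target existence losses:

```python
def balanced_obj_loss(l_obj_p3, l_obj_p4, l_obj_p5, weights=(4.0, 1.0, 0.4)):
    # P3 detects small objects, P4 medium, P5 large; small objects get the largest weight
    return weights[0] * l_obj_p3 + weights[1] * l_obj_p4 + weights[2] * l_obj_p5
```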
In the YOLOv5 architecture, grid sensitivity is eliminated by decoding the prediction boxes as:

b_x = (2·σ(t_x) - 0.5) + c_x
b_y = (2·σ(t_y) - 0.5) + c_y
b_w = p_w·(2·σ(t_w))²
b_h = p_h·(2·σ(t_h))²

where b_x is the abscissa of the center point of the prediction box, c_x is the abscissa of the top-left corner of the corresponding grid cell in the feature map, t_x is an offset parameter learned by the model during training, and σ is the sigmoid function; the term (2·σ(t_x) - 0.5) maps the learned offset into the range (-0.5, 1.5) to obtain the offset of the center point;
likewise, b_y is the ordinate of the center point of the prediction box, c_y is the ordinate of the top-left corner of the grid cell, t_y is the corresponding learned offset parameter, and (2·σ(t_y) - 0.5) maps it into (-0.5, 1.5) to obtain the offset of the center point;
b_w is the width of the prediction box, p_w is the prior (anchor) width, and t_w is the width offset parameter learned during training; the factor (2·σ(t_w))² maps the width scale into (0, 4), and the square in the formula spreads the scale over this wider range while keeping the change in box width smooth;
b_h is the height of the prediction box, p_h is the prior (anchor) height, and t_h is the height offset parameter learned during training; the factor (2·σ(t_h))² likewise maps the height scale into (0, 4), making the change in box height smooth.
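The four decoding formulas can be written as a short PyTorch sketch (the tensor layout is an assumption for the example):

```python
import torch

def decode_boxes(t_xy, t_wh, grid_xy, anchor_wh):
    """Decode raw outputs into box centers and sizes with reduced grid sensitivity.

    t_xy, t_wh -- raw network outputs for a cell
    grid_xy    -- (c_x, c_y): top-left corner of the grid cell in the feature map
    anchor_wh  -- (p_w, p_h): prior (anchor) width and height
    """
    b_xy = (2.0 * torch.sigmoid(t_xy) - 0.5) + grid_xy   # center offset in (-0.5, 1.5) per cell
    b_wh = anchor_wh * (2.0 * torch.sigmoid(t_wh)) ** 2  # size scaled into (0, 4) x anchor
    return b_xy, b_wh
```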
The YOLOv5 architecture includes a train.py script, a val.py script and a detect.py script, and the running workflow is designed around these three scripts: they are run in sequence so that the different models can be exercised. train.py is run to train the garbage identification model; val.py is used to evaluate trained garbage identification models so that the correct garbage identification model can be selected; and detect.py performs detection on the images acquired by the real-time camera, as in the example command sequence below.
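By way of illustration, a typical command sequence for the three scripts might look as follows; the dataset configuration garbage.yaml, the flag values and the run paths are assumptions for this example, not values fixed by the invention.

```
# 1. Train the garbage identification model (garbage.yaml is an assumed dataset config)
python train.py --img 640 --batch 16 --epochs 100 --data garbage.yaml --weights yolov5s.pt

# 2. Validate candidate weights so the correct model can be selected
python val.py --weights runs/train/exp/weights/best.pt --data garbage.yaml

# 3. Run real-time detection on the camera (source 0)
python detect.py --weights runs/train/exp/weights/best.pt --source 0
```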
S4, delivery classification record: the garbage delivery classification information is stored and displayed, including the IMEI (International Mobile Equipment Identity) of the device, the delivery number, the delivery time, the delivery picture, the type of garbage in the picture analyzed in step S3, and the specific coordinates and label of the garbage item, as shown in fig. 4. From the delivery classification information it can thus be determined which garbage bin the garbage was delivered into, when it was delivered, the image of the garbage, the type of garbage, whether the garbage was classified correctly, and so on.
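For illustration, one possible shape for a stored record is sketched below; the field names are assumptions mirroring the information listed above, not identifiers defined by the invention.

```python
from dataclasses import dataclass

@dataclass
class DeliveryRecord:
    device_imei: str            # IMEI of the front-end device
    delivery_number: str        # serial number of this delivery
    delivery_time: str          # e.g. "2023-05-18 10:32:05"
    picture_path: str           # delivery picture, annotated with type and time
    garbage_type: str           # e.g. "recyclable"
    bbox: tuple                 # (x1, y1, x2, y2) coordinates of the item in the picture
    label: str                  # model label, e.g. "plastic bottle"
    classified_correctly: bool  # whether the item went into the right bin
```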
In step S4, an IMEI screening function is provided, through which the garbage delivery classification information of a given device can be queried by the device's IMEI; a time-range screening function is also provided, through which the garbage delivery classification information within a time range can be retrieved by a start date and an end date.
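The two screening functions could then be simple filters over the stored records, as in this sketch (which assumes the DeliveryRecord shape above and ISO-formatted date strings):

```python
def filter_by_imei(records, imei):
    # IMEI screening: all delivery records from one device
    return [r for r in records if r.device_imei == imei]

def filter_by_time(records, start_date, end_date):
    # Time-range screening by delivery date, e.g. "2023-05-01" to "2023-05-31"
    return [r for r in records if start_date <= r.delivery_time[:10] <= end_date]
```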
The working process of the invention is further described below on the basis of the automatic urban household garbage identification and classification system and method above:
the method comprises the steps that recyclable garbage 'plastic bottles' are directly delivered to delivery channels of recyclable garbage sub-boxes, a camera shoots a garbage delivery process, images are obtained and transmitted to a management background, and after the images are denoised and normalized in size through a preprocessing module, the images are analyzed and identified by an image identification module to distinguish the plastic bottles. At this time, in the front-end equipment, the "plastic bottle" enters the recyclable waste bin; in a management background, a recording module records garbage delivery classification information of a ' plastic bottle ', wherein the garbage delivery classification information comprises corresponding equipment IEMI delivered by the ' plastic bottle ', a delivery number of the ' plastic bottle ', a delivery time of the ' plastic bottle ', a delivery picture of the ' plastic bottle (the type and the delivery time of garbage are displayed on the picture), the type of the ' plastic bottle ' is recyclable garbage, a position coordinate of the ' plastic bottle ' and finally the classification of the ' plastic bottle ' is recorded correctly;
the method comprises the steps that harmful garbage 'batteries' are directly delivered to a delivery channel of a recyclable garbage sub-box, a camera shoots a garbage delivery process, an image is obtained and is transmitted to a management background, and after the image is denoised and normalized in size through a preprocessing module, the image recognition module analyzes and recognizes articles in the image to distinguish the images as 'batteries'. At this time, in the front-end equipment, the "battery" enters the recyclable waste bin; in the management background, the recording module records garbage delivery classification information of the battery, wherein the garbage delivery classification information comprises corresponding equipment IEMI delivered by the battery, the serial number delivered by the battery, the delivery time of the battery, pictures (the types of garbage and the delivery time are displayed on the pictures) delivered by the battery, the types of garbage of the battery are harmful garbage, the position coordinates of the battery and finally the classification error of the battery is recorded. At the moment, the early warning module reminds a manager to process in time, and the manager can purposefully pick out the battery with the delivery error when arriving at the dustbin according to the record of the system;
in addition, even when hazardous garbage is delivered into the correct hazardous garbage sub-bin, the early warning module reminds the manager to treat it in time once a certain number of batteries have accumulated, the batteries have been stored for a certain time, or the storage temperature changes sharply, so as to avoid explosion.
The above description is only of the preferred embodiment of the present invention, and is not intended to limit the present invention in any way. Any person skilled in the art can make many possible variations and modifications to the technical solution of the present invention or modifications to equivalent embodiments using the methods and technical contents disclosed above, without departing from the scope of the technical solution of the present invention. Therefore, all equivalent changes according to the shape, structure and principle of the present invention are covered in the protection scope of the present invention.

Claims (10)

1. The automatic urban household garbage identification and classification system comprises an image acquisition module and a login module, and is characterized by further comprising a preprocessing module, an image identification module and a recording module;
the image acquisition module is used for shooting or recording the garbage delivery process by using a camera so as to acquire images of garbage;
the preprocessing module is used for denoising the image and normalizing the size so that the image quality is suitable for algorithm analysis;
the image recognition module is used for recognizing objects in the image to distinguish different garbage types;
the login module is used for managing account login of the user, entering a management main interface and checking delivery records in delivery management;
and the recording module is used for recording the information of garbage delivery.
2. The automatic urban household garbage identifying and classifying method is characterized by comprising the following steps:
s1, collecting images of garbage articles: delivering the garbage into a garbage can, and shooting the garbage delivering process by a camera to obtain an image;
s2, image preprocessing: image denoising and size normalization make the image quality suitable for algorithm analysis;
s3, image identification: a deep learning algorithm is adopted, using the YOLOv5 architecture with a PyTorch-based neural network and OpenCV computer vision as its main components; a garbage identification model is trained autonomously, and the garbage identification model is then used to identify objects in the image to distinguish different garbage types;
s4, delivery classification record: the garbage delivery classification information is stored and displayed, including the IMEI (International Mobile Equipment Identity) of the device, the delivery number, the delivery time, the delivery picture, the type of garbage in the picture analyzed in step S3, and the specific coordinates and label of the garbage item.
3. The automatic urban household garbage recognition and classification method according to claim 2, wherein in step S3 the model loaded by the detect.py script is the model specially trained for garbage recognition; with the command python detect.py --source 0, the system automatically invokes the camera and automatically starts to recognize the objects entering the camera's range.
4. The automatic urban household garbage recognition and classification method according to claim 2, wherein the image recognition in step S3 uses the YOLOv5 architecture together with the following training strategies: multi-scale training, automatic tuning, warm-up with a cosine LR scheduler, EMA (exponential moving average), mixed precision, hyperparameter evolution, loss computation, loss balancing, grid sensitivity elimination, and target building.
5. The automatic urban household garbage identification and classification method according to claim 4, wherein the YOLOv5 architecture performs the following steps:
an image is input, resized, and passed through the network;
the input image passes through a series of convolution layers and CSPNet modules, which extract the feature information in the image;
a prediction branch on the feature map extracted by CSPNet outputs the predicted boxes, categories and confidence scores;
the predicted boxes are screened and adjusted to obtain the detection result.
6. The automatic urban household garbage identification and classification method according to claim 5, wherein the loss of the YOLOv5 architecture consists of three parts: classification loss, target existence loss and localization loss, and the YOLOv5 architecture loss is calculated as:

Loss = λ1·L_cls + λ2·L_obj + λ3·L_loc

where L_cls is the classification loss: the cross-entropy loss between the class of the predicted bounding box and the class of the actual bounding box; L_obj is the target existence loss: the binary cross-entropy loss between the confidence scores of the predicted box and the actual box; and L_loc is the localization loss: the squared loss between the coordinates of the predicted bounding box and the coordinates of the actual bounding box.
7. The automatic urban household garbage recognition and classification method according to claim 5, wherein the weights of the target existence loss differ across the three prediction layers (P3, P4, P5), the balance weights being 4.0, 1.0 and 0.4 respectively, and the balanced loss is:

L_obj = 4.0·L_obj^small + 1.0·L_obj^medium + 0.4·L_obj^large

where L_obj^small, L_obj^medium and L_obj^large denote the target existence loss values for small, medium and large objects respectively, and the parameters 4.0, 1.0 and 0.4 are the weight coefficients corresponding to objects of different sizes.
8. The automatic urban household garbage identification and classification method according to claim 5, wherein in the YOLOv5 architecture grid sensitivity is eliminated by decoding the prediction boxes as:

b_x = (2·σ(t_x) - 0.5) + c_x
b_y = (2·σ(t_y) - 0.5) + c_y
b_w = p_w·(2·σ(t_w))²
b_h = p_h·(2·σ(t_h))²

where b_x is the abscissa of the center point of the prediction box, c_x is the abscissa of the top-left corner of the corresponding grid cell in the feature map, t_x is an offset parameter learned by the model during training, and σ is the sigmoid function; the term (2·σ(t_x) - 0.5) maps the learned offset into the range (-0.5, 1.5) to obtain the offset of the center point;
b_y is the ordinate of the center point of the prediction box, c_y is the ordinate of the top-left corner of the grid cell, t_y is the corresponding learned offset parameter, and (2·σ(t_y) - 0.5) maps it into (-0.5, 1.5) to obtain the offset of the center point;
b_w is the width of the prediction box, p_w is the prior (anchor) width, and t_w is the width offset parameter learned during training; the factor (2·σ(t_w))² maps the width scale into (0, 4);
b_h is the height of the prediction box, p_h is the prior (anchor) height, and t_h is the height offset parameter learned during training; the factor (2·σ(t_h))² maps the height scale into (0, 4).
9. The automatic urban household garbage recognition and classification method according to claim 5, wherein the YOLOv5 architecture includes a train.py script, a val.py script and a detect.py script;
the running workflow is designed around these three scripts:
the train.py, val.py and detect.py scripts are run in sequence so that the different models can be exercised; train.py is run to train the garbage identification model; val.py is used to evaluate trained garbage identification models so that the correct garbage identification model can be selected; and detect.py performs detection on the images acquired by the real-time camera.
10. The automatic urban household garbage identification and classification method according to claim 9, wherein in step S4 an IMEI screening function is provided, through which the garbage delivery classification information of a given device can be queried by the device's IMEI, and a time-range screening function is provided, through which the garbage delivery classification information within a time range can be retrieved by a start date and an end date.
CN202310562501.8A 2023-05-18 2023-05-18 Automatic urban household garbage identification and classification system and method Pending CN116682098A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310562501.8A CN116682098A (en) 2023-05-18 2023-05-18 Automatic urban household garbage identification and classification system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310562501.8A CN116682098A (en) 2023-05-18 2023-05-18 Automatic urban household garbage identification and classification system and method

Publications (1)

Publication Number Publication Date
CN116682098A true CN116682098A (en) 2023-09-01

Family

ID=87777870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310562501.8A Pending CN116682098A (en) 2023-05-18 2023-05-18 Automatic urban household garbage identification and classification system and method

Country Status (1)

Country Link
CN (1) CN116682098A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117522388A (en) * 2023-11-08 2024-02-06 永昊环境科技(集团)有限公司 Intelligent sanitation processing method for urban environment
CN117522388B (en) * 2023-11-08 2024-04-12 永昊环境科技(集团)有限公司 Intelligent sanitation processing method for urban environment
CN117292207A (en) * 2023-11-24 2023-12-26 杭州臻善信息技术有限公司 Garbage identification method and system based on big data image processing
CN117292207B (en) * 2023-11-24 2024-03-15 杭州臻善信息技术有限公司 Garbage identification method and system based on big data image processing

Similar Documents

Publication Publication Date Title
CN116682098A (en) Automatic urban household garbage identification and classification system and method
US11610185B2 (en) System and method for waste management
Yang et al. WasNet: a neural network-based garbage collection management system
Zhang et al. Computer vision based two-stage waste recognition-retrieval algorithm for waste classification
AU2018355910A1 (en) Systems and methods for detecting waste receptacles using convolutional neural networks
CN110263675A (en) A kind of the rubbish target identification system and its recognition methods of community security robot
CN110697273A (en) Intelligent household garbage identification and automatic classification system and method based on iterative learning control
CN110087193A (en) Information uploading method, device, electronic equipment and the readable storage medium storing program for executing of dustbin
CN110238078A (en) Method for sorting, device, system and storage medium
CN114275416B (en) Kitchen waste classification method, device, equipment and medium based on image recognition
CN107480643A (en) A kind of robot of Intelligent refuse classification processing
CN110466911A (en) Automatic sorting garbage bin and classification method
CN111611970A (en) Urban management monitoring video-based disposable garbage behavior detection method
CN110516768A (en) A kind of method, apparatus and artificial intelligence robot of garbage classification management
CN110110752A (en) A kind of identification of rubbish and classification method, device and terminal device
CN113003054A (en) Garbage classification method
CN113469264A (en) Construction method of automatic garbage classification model, garbage sorting method and system
Sirawattananon et al. Designing of IoT-based smart waste sorting system with image-based deep learning applications
CN113076805A (en) Robot-based garbage treatment method and system
Sultana et al. Trash and recycled material identification using convolutional neural networks (CNN)
Faria et al. Classification of organic and solid waste using deep convolutional neural networks
Gupta et al. Smart robot for collection and segregation of garbage
CN113371363A (en) Classified garbage can, intelligent classified garbage can based on deep learning and classification method
CN114283387B (en) Intelligent garbage point cleaning work order generation method and device and related medium
CN217576626U (en) Classification garbage can

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination