CN111217062A - Garbage can garbage identification method based on edge calculation and deep learning - Google Patents

Garbage can garbage identification method based on edge calculation and deep learning

Info

Publication number
CN111217062A
CN111217062A (application CN202010168548.2A)
Authority
CN
China
Prior art keywords
garbage
box
prediction
confidence
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010168548.2A
Other languages
Chinese (zh)
Inventor
李清秋 (Li Qingqiu)
张鹏程 (Zhang Pengcheng)
赵齐 (Zhao Qi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202010168548.2A priority Critical patent/CN111217062A/en
Publication of CN111217062A publication Critical patent/CN111217062A/en
Pending legal-status Critical Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00: Refuse receptacles; Accessories therefor
    • B65F1/14: Other constructional features; Accessories
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00: Refuse receptacles; Accessories therefor
    • B65F1/0033: Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
    • B65F1/0053: Combination of several receptacles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00: Refuse receptacles; Accessories therefor
    • B65F1/0033: Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
    • B65F2001/008: Means for automatically selecting the receptacle in which refuse should be placed
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00: Equipment of refuse receptacles
    • B65F2210/128: Data transmitting means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00: Equipment of refuse receptacles
    • B65F2210/138: Identification means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00: Equipment of refuse receptacles
    • B65F2210/176: Sorting means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F: GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00: Equipment of refuse receptacles
    • B65F2210/178: Steps

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a garbage can garbage identification method based on edge calculation and deep learning, which comprises the following steps: garbage thrown by a user is scanned and photographed at a fixed angle by a high-resolution camera mounted on the garbage can, completing data collection; a distributed edge server carrying the garbage recognition model identifies the thrown garbage, displays the detection result on a display screen together with a voice broadcast, stores the garbage picture data, and uploads unlabeled garbage data to the cloud server; the cloud server trains the garbage detection model with the SSD algorithm and periodically updates it with the new data uploaded by the edge servers, so that the recognition results are accurate and real-time.

Description

Garbage can garbage identification method based on edge calculation and deep learning
Technical Field
The invention relates to a garbage identification method for street garbage cans, in particular to an unknown-garbage detection method based on mobile edge computing and deep learning, and belongs to the technical field of information processing.
Background
Classified garbage collection reduces the amount of garbage to be treated and the treatment equipment required, lowers treatment costs, and reduces the consumption of land resources, bringing social, economic and ecological benefits. However, because there are so many types of garbage, sorting it costs people a great deal of time and effort; especially on streets with heavy pedestrian traffic, people often find it difficult to judge the type of garbage they are discarding and to put it into the correct street-side garbage can. Researchers have therefore modified conventional street garbage cans, adding a high-resolution camera to the garbage can to identify the type of garbage being thrown away and to collect garbage images.
With the rapid development of the Internet of Things and the arrival of 5G networks, linearly growing centralized cloud computing capacity can no longer keep up with the explosive growth of massive edge data. Edge computing is therefore becoming a supporting platform for the development of the Internet of Things. Edge computing means processing data at the edge of the network, which reduces request response time, improves battery life, reduces network bandwidth, and protects the security and privacy of data.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the problems of existing methods, the invention provides a garbage can garbage identification method based on edge calculation and deep learning. The method mainly comprises: scanning the garbage presented by a user and collecting images through a high-resolution camera mounted on the garbage can; completing identification and temporary storage on an edge server; displaying the garbage type detection result on a display screen; and periodically transmitting the data to the cloud server, where the SSD image recognition algorithm identifies garbage categories and the results are used to update the garbage training model, making the recognition results more accurate and efficient.
A garbage bin garbage identification method based on edge calculation and deep learning comprises the following steps:
step 1: the cloud server trains a garbage detection model by applying the SSD algorithm according to a garbage classification standard;
SSD is short for Single Shot MultiBox Detector;
step 2: a thrower places the garbage to be thrown in front of a high-resolution camera for scanning; the high-resolution camera is mounted on the garbage can;
step 3: the distributed edge server identifies the garbage to be thrown in according to the trained garbage detection model, stores the garbage picture data, and uploads unlabeled garbage data to the cloud server;
step 4: the garbage detection result is displayed on a display screen together with a voice broadcast;
step 5: the cloud server periodically updates the garbage training model.
The step 1 of training the garbage detection model on the cloud server with the SSD algorithm according to the garbage classification standard comprises the following steps:
step 11: when training the garbage detection model, the cloud server uses the SSD framework structure: in a VGG16 network the last two fully-connected layers are converted into convolutional layers, 4 further convolutional layers are added to construct the network, feature maps are extracted from these 6 layers in turn, and finally 8732 default detection boxes are obtained.
Step 12: the prior box is a bounding box of the used default detection box, and the prior box and the actual target box are matched according to the size of the IOU value. When matching, starting from the perspective of the actual target frame, finding the default detection frame with the largest IOU value matched with the target to enter the candidate positive sample set. The IOU value refers to the overlapping rate between the real frame of the target and the prediction frame obtained by the detection algorithm, and the specific calculation formula is as follows:
Figure BDA0002408307230000021
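For illustration, the IOU computation can be sketched as follows (not part of the patent; the function name and the (xmin, ymin, xmax, ymax) box representation are assumptions):

```python
def iou(box_a, box_b):
    """Overlap ratio of two axis-aligned boxes given as (xmin, ymin, xmax, ymax)."""
    # Coordinates of the intersection rectangle.
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```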
step 13: since the number of positive samples determined in step 12 is too small, matching then proceeds from the perspective of the default detection boxes: among the default detection boxes whose IOU with a ground-truth target box satisfies IOU > 0.5, the default detection box with the largest IOU is also placed in the positive-sample set.
step 14: finally, the default detection boxes corresponding to each object are sorted by confidence, and a certain number of them are selected from low to high and stored in the negative-sample set, so that the ratio of positive to negative samples among the prior boxes is 1:3, thereby obtaining good training samples (a sketch of this selection logic follows).
Step 15: and solving a loss function, wherein the loss function is a weighted sum of the position error and the confidence error, and the formula is as follows:
Figure BDA0002408307230000022
wherein: l isconfAnd Llocthe method comprises the steps of respectively representing a category confidence loss function and a search box position loss function, wherein N represents the number of prior boxes matched with a real box, c is confidence, l is a garbage prediction box, g is a garbage real box, and α parameter is used for adjusting the proportion between confidence loss and location loss, and the default α is 1;
the search-box position loss function is:

$$L_{loc}(x, l, g) = \sum_{i \in Pos}^{N} \sum_{m \in \{cx, cy, w, h\}} x_{ij}^{k} \, \mathrm{smooth}_{L1}\left(l_i^{m} - \hat{g}_j^{m}\right)$$

where $x_{ij}^{k} \in \{0, 1\}$ indicates whether the i-th garbage prediction box matches the j-th garbage ground-truth box with respect to category k (0: not matched, 1: matched); cx, cy, w, h denote the center abscissa of the garbage bounding box, its center ordinate, the width of the bounding box and the height of the bounding box respectively; m has no specific meaning on its own, the parameter m ∈ {cx, cy, w, h}, i.e. it takes each of the four values cx, cy, w and h in turn;

the $\mathrm{smooth}_{L1}(x)$ loss function is defined as

$$\mathrm{smooth}_{L1}(x) = \begin{cases} 0.5x^{2}, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$

$l^{m}$ denotes the offset of the predicted target box relative to the default box, and $\hat{g}^{m}$ denotes the offset of the ground-truth target box relative to the default box; $\hat{g}^{m}$ is computed as:

$$\hat{g}_j^{cx} = \frac{g_j^{cx} - d_i^{cx}}{d_i^{w}}, \qquad \hat{g}_j^{cy} = \frac{g_j^{cy} - d_i^{cy}}{d_i^{h}}, \qquad \hat{g}_j^{w} = \log\frac{g_j^{w}}{d_i^{w}}, \qquad \hat{g}_j^{h} = \log\frac{g_j^{h}}{d_i^{h}}$$

where d denotes the parameters of the default (prior) box and g denotes the parameters of the garbage ground-truth box;
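Under the same illustrative assumptions, the position loss and the offset encoding above can be sketched as follows (a sketch only; function and variable names are not from the patent):

```python
import numpy as np

def smooth_l1(x):
    """Piecewise smooth-L1 loss, as defined above."""
    x = np.abs(x)
    return np.where(x < 1.0, 0.5 * x ** 2, x - 0.5)

def encode_offsets(gt, default):
    """Offsets g-hat of a ground-truth box relative to a default box; boxes are (cx, cy, w, h)."""
    return np.array([
        (gt[0] - default[0]) / default[2],   # g-hat cx
        (gt[1] - default[1]) / default[3],   # g-hat cy
        np.log(gt[2] / default[2]),          # g-hat w
        np.log(gt[3] / default[3]),          # g-hat h
    ])

def location_loss(pred_offsets, gt_boxes, default_boxes, matches):
    """Sum of smooth-L1 over matched (positive) boxes.
    matches: list of (default box index i, ground-truth box index j)."""
    total = 0.0
    for i, j in matches:
        g_hat = encode_offsets(gt_boxes[j], default_boxes[i])
        total += smooth_l1(pred_offsets[i] - g_hat).sum()
    return total
```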
the class-confidence loss function is:

$$L_{conf}(x, c) = -\sum_{i \in Pos}^{N} x_{ij}^{p} \log\left(\hat{c}_i^{p}\right) - \sum_{i \in Neg} \log\left(\hat{c}_i^{0}\right), \qquad \hat{c}_i^{p} = \frac{\exp(c_i^{p})}{\sum_{p}\exp(c_i^{p})}$$

where $x_{ij}^{p}$ indicates that garbage prediction box i and garbage ground-truth box j match with respect to category p; the higher the predicted probability of p, the smaller the loss. $\hat{c}_i^{0}$ corresponds to a garbage prediction box containing no object; the higher the probability predicted for the background, the smaller the loss. The confidence c is generated by SoftMax.
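A corresponding sketch of the class-confidence loss, with the confidences produced by SoftMax as stated above; class index 0 is assumed to be the background, and the names are illustrative:

```python
import numpy as np

def confidence_loss(logits, positive_idx, positive_cls, negative_idx):
    """Softmax cross-entropy over positives (their matched class) and negatives (background).
    logits: (num_default_boxes, num_classes) raw scores c."""
    # SoftMax produces the confidences c-hat, as in the text (stabilized by subtracting the max).
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    c_hat = e / e.sum(axis=1, keepdims=True)
    loss_pos = -np.log(c_hat[positive_idx, positive_cls]).sum()
    loss_neg = -np.log(c_hat[negative_idx, 0]).sum()  # class 0 assumed to be background
    return loss_pos + loss_neg
```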
Step 16: and (4) pre-training. For each prediction box, firstly, the class and the confidence value of the prediction box are determined according to the class confidence, and the prediction boxes belonging to the background are filtered out. The prediction blocks with lower thresholds are then filtered out according to IOU (═ 0.5). And decoding the residual prediction frame, and obtaining the real position parameter of the prediction frame according to the prior frame. After decoding, it needs to be sorted in descending order according to confidence, and then only Top-k (═ 400) prediction boxes are kept. And finally, carrying out a non-maximum inhibition algorithm and filtering out a prediction box with a large overlapping degree. The last remaining prediction box is the detection result.
The step 2 comprises the following: a high-resolution camera is mounted on the garbage can, with a shooting resolution of 300 × 300. The camera uploads the scanned image to the edge server through a socket; the edge server uses the SSD algorithm to detect whether the image is labeled in the existing garbage model; images that are not labeled are stored in binary form in the edge server's non-volatile memory and uploaded to the cloud server, and the cloud server trains and updates the garbage detection model using the SSD framework (a minimal sketch of the socket upload appears after the step list below). The specific procedure after the camera scans an image is as follows:
step 320: the distributed edge server crops the scanned image to a size of 300 × 300;
step 330: the distributed edge server identifies the garbage to be thrown in according to the existing model, stores the garbage picture data, and uploads unlabeled garbage to the cloud server.
step 331: a high-resolution 300 × 300 garbage image is taken as input;
step 332: feature extraction is performed with the SSD reference neural network VGG16 to obtain the feature map layers;
step 333: the category scores of the garbage are predicted on the different feature layers, while the default detection boxes on the feature maps are predicted using small (3 × 3) convolution kernels;
step 334: to improve the precision of the detection result, predicted values for different aspect ratios are obtained at the same time when predicting the garbage types on feature maps of different levels;
step 335: the prediction results are fed into the loss layer, and the final prediction result is obtained through non-maximum suppression.
step 336: garbage image data that cannot be matched to any category is uploaded to the cloud server.
The deployment of the edge servers comprises the following steps:
step 31: first, the number t of mobile edge servers is determined according to the storage requirements of the garbage cans and network-delay considerations;
step 32: the pairwise distances between the garbage cans in the set R of garbage cans in the designated area are calculated, the two closest garbage cans are found and form a set $T_i$ (i = 1, 2, 3, …), and $T_i$ is deleted from the set R;
step 33: the distance between each remaining garbage can in the set R and the center position of $T_i$ is calculated; the garbage can in R closest to the center of $T_i$ is found, deleted from R and stored into $T_i$;
step 34: step 33 is repeated until the number of garbage cans in $T_i$ reaches a certain threshold;
step 35: step 34 is repeated until i equals t, forming t sets $T_i$, i.e. all garbage cans are clustered into t clusters;
step 36: according to the formula

$$m_i = \frac{1}{|T_i|} \sum_{x \in T_i} x$$

the center position of the garbage-can positions in each cluster is recalculated, where x is a data item belonging to the i-th data set $T_i$ and $m_i$ is the center of $T_i$;
step 37: step 36 is repeated until the criterion function, with the squared error as the objective function to be minimized, converges;
step 38: the Euclidean distance between each final center position and each small base station is calculated, and the small base station closest to each center position is selected in turn;
step 39: step 38 is repeated to complete the selection of the edge server locations (a clustering sketch follows).
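Steps 31–39 amount to clustering the garbage-can positions into t clusters and snapping each cluster center to the nearest small base station. A minimal sketch under that reading (the random seeding and iteration limit are assumptions; the patent's greedy pairwise seeding in steps 32–35 is simplified here for brevity):

```python
import numpy as np

def place_edge_servers(bin_coords, t, seed=0):
    """Cluster garbage-can coordinates into t clusters with the standard K-means
    iteration that steps 36-37 describe; bin_coords: (n, 2) array of positions."""
    rng = np.random.default_rng(seed)
    centers = bin_coords[rng.choice(len(bin_coords), t, replace=False)]
    for _ in range(100):                     # iterate until the centers stabilize
        d = np.linalg.norm(bin_coords[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([
            bin_coords[labels == i].mean(axis=0) if np.any(labels == i) else centers[i]
            for i in range(t)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

def nearest_base_station(centers, stations):
    """Step 38: index of the closest small base station for each cluster center."""
    d = np.linalg.norm(centers[:, None] - stations[None], axis=2)
    return d.argmin(axis=1)
```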
Drawings
FIG. 1 is a frame diagram of a garbage bin garbage identification method based on edge calculation and deep learning.
FIG. 2 is a flowchart of a garbage bin garbage identification method based on edge calculation and deep learning.
Detailed description of the preferred embodiments
As shown in fig. 1, a garbage bin garbage identification method based on edge calculation and deep learning includes the following steps:
step 1: a thrower places the garbage to be thrown in front of the high-resolution camera for scanning; the high-resolution camera is mounted on the garbage can;
step 2: the edge server identifies the garbage to be thrown in according to the trained garbage detection model, stores the garbage picture data and uploads the unlabeled garbage data to the cloud server;
step 3: the garbage category detection result is displayed on the display screen together with a voice broadcast.
The cloud server trains the garbage detection model by applying an SSD algorithm according to the garbage classification standard, and the method specifically comprises the following steps:
step 11: an image of 3-channel RGB with 300 × 300 resolution is input, using the first 5 layers of the VGG16 network; the fully-connected layers fc6 and fc7 of VGG16 are then converted into a 3 × 3 convolutional layer Conv6 and a 1 × 1 convolutional layer Conv7, and 3 further convolutional layers and one average-pooling layer are added (Conv8_2, Conv9_2, Conv10_2 and Conv11_2, respectively).
First, feature maps of size 38 × 38 are extracted from the Conv4_3 layer of VGG16; feature maps of sizes 19 × 19, 10 × 10, 5 × 5, 3 × 3 and 1 × 1 are then extracted in turn from the Conv7, Conv8_2, Conv9_2, Conv10_2 and Conv11_2 layers, finally giving 38 × 38 × 4 + 19 × 19 × 6 + 10 × 10 × 6 + 5 × 5 × 6 + 3 × 3 × 4 + 1 × 1 × 4 = 8732 default detection boxes.
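As a quick arithmetic check of this count:

```python
# Feature map sizes and boxes-per-location for the six detection layers listed above.
layers = [(38, 4), (19, 6), (10, 6), (5, 6), (3, 4), (1, 4)]
total = sum(size * size * boxes for size, boxes in layers)
print(total)  # 8732
```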
Step 12: the prior box is a bounding box of the used default detection box, and the prior box is matched with the actual target box according to the size of the IOU value; when matching, starting from the angle of an actual target frame, finding a default detection frame with the maximum IOU value matched with the target and entering a candidate positive sample set; the IOU value refers to the overlapping rate between the real frame of the target and the prediction frame obtained by the detection algorithm, and the specific calculation formula is as follows:
Figure BDA0002408307230000061
step 13: since the number of positive samples determined in step 12 is too small, matching then proceeds from the perspective of the default detection boxes: among the default detection boxes whose IOU with a ground-truth target box satisfies IOU > 0.5, the default detection box with the largest IOU is also placed in the positive-sample set.
step 14: finally, the default detection boxes corresponding to each object are sorted by confidence, and a certain number of them are selected from low to high and stored in the negative-sample set, so that the ratio of positive to negative samples among the prior boxes is 1:3, thereby obtaining good training samples.
Step 15: and solving a loss function, wherein the loss function is a weighted sum of the position error and the confidence error, and the formula is as follows:
Figure BDA0002408307230000062
wherein: l isconfAnd LlocRespectively representing the category confidencethe method comprises the steps of obtaining a confidence coefficient, a rubbish prediction frame, a rubbish real frame, a degree loss function and a search frame position loss function, wherein N represents the number of prior boxes matched with the real frame, c is the confidence coefficient, l is the rubbish prediction frame, g is the rubbish real frame, and α is used for adjusting the proportion between confidence loss and location loss, and the default alpha is 1;
the search-box position loss function is:

$$L_{loc}(x, l, g) = \sum_{i \in Pos}^{N} \sum_{m \in \{cx, cy, w, h\}} x_{ij}^{k} \, \mathrm{smooth}_{L1}\left(l_i^{m} - \hat{g}_j^{m}\right)$$

where $x_{ij}^{k} \in \{0, 1\}$ indicates whether the i-th garbage prediction box matches the j-th garbage ground-truth box with respect to category k (0: not matched, 1: matched); cx, cy, w, h denote the center abscissa of the garbage bounding box, its center ordinate, the width of the bounding box and the height of the bounding box respectively; m has no specific meaning on its own, the parameter m ∈ {cx, cy, w, h}, i.e. it takes each of the four values cx, cy, w and h in turn;

the $\mathrm{smooth}_{L1}(x)$ loss function is defined as

$$\mathrm{smooth}_{L1}(x) = \begin{cases} 0.5x^{2}, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$

$l^{m}$ denotes the offset of the predicted target box relative to the default box, and $\hat{g}^{m}$ denotes the offset of the ground-truth target box relative to the default box; $\hat{g}^{m}$ is computed as:

$$\hat{g}_j^{cx} = \frac{g_j^{cx} - d_i^{cx}}{d_i^{w}}, \qquad \hat{g}_j^{cy} = \frac{g_j^{cy} - d_i^{cy}}{d_i^{h}}, \qquad \hat{g}_j^{w} = \log\frac{g_j^{w}}{d_i^{w}}, \qquad \hat{g}_j^{h} = \log\frac{g_j^{h}}{d_i^{h}}$$

where d denotes the parameters of the default (prior) box and g denotes the parameters of the garbage ground-truth box;
the class-confidence loss function is:

$$L_{conf}(x, c) = -\sum_{i \in Pos}^{N} x_{ij}^{p} \log\left(\hat{c}_i^{p}\right) - \sum_{i \in Neg} \log\left(\hat{c}_i^{0}\right), \qquad \hat{c}_i^{p} = \frac{\exp(c_i^{p})}{\sum_{p}\exp(c_i^{p})}$$

where $x_{ij}^{p}$ indicates that garbage prediction box i and garbage ground-truth box j match with respect to category p; the higher the predicted probability of p, the smaller the loss. $\hat{c}_i^{0}$ corresponds to a garbage prediction box containing no object; the higher the probability predicted for the background, the smaller the loss. The confidence c is generated by SoftMax;
step 16: prediction. For each prediction box, its category and confidence value are first determined from the class confidences, and prediction boxes belonging to the background are filtered out; prediction boxes below the IOU threshold (0.5) are then filtered out; the remaining prediction boxes are decoded and their true position parameters are recovered from the prior boxes; after decoding, they are sorted in descending order of confidence and only the Top-k (k = 400) prediction boxes are kept; finally, a non-maximum suppression algorithm is run to filter out prediction boxes with a large degree of overlap, and the prediction boxes that remain at the end are the detection result.
As shown in fig. 2, a high-resolution camera is mounted on the garbage can, with a shooting resolution of 300 × 300. The camera uploads the scanned image to the edge server through a socket; the edge server uses the SSD algorithm to detect whether the image is labeled in the existing garbage model; images that are not labeled are stored in binary form in the edge server's non-volatile memory and uploaded to the cloud server, and the cloud server trains and updates the garbage detection model using the SSD framework.
Garbage classification chambers are set up in the garbage can according to the national garbage classification standard: a recyclable-garbage chamber, a harmful-garbage chamber, a kitchen-garbage chamber, an other-garbage chamber, and so on. The thrower hears the voice broadcast announcing the type of the garbage being thrown and only needs to place the garbage in the corresponding garbage classification chamber.
A passerby holds a milk tea cup to be discarded in front of the high-resolution camera carried by the garbage can; the camera uploads the scanned image to the distributed edge server, which judges from the existing model that the milk tea cup is recyclable garbage and displays the result on the screen, and the passerby puts the milk tea cup into the recyclable-garbage classification chamber accordingly. If the distributed edge server cannot match the milk tea cup to a type, it displays that the garbage could not be judged and uploads the image data of the milk tea cup to the cloud center, which trains and updates the model; the passerby then judges for themselves, from the result displayed on the screen or announced by voice, which chamber of the garbage can to put the garbage into.
The specific procedure after the camera scans the image is as follows:
step 320: the distributed edge server crops the scanned image to a size of 300 × 300;
step 330: the distributed edge server identifies the garbage to be thrown in according to the existing model, stores the garbage picture data, and uploads unlabeled garbage to the cloud server.
step 331: a high-resolution 300 × 300 garbage image is taken as input;
step 332: feature extraction is performed with the SSD reference neural network VGG16 to obtain the feature map layers;
step 333: the category scores of the garbage are predicted on the different feature layers, while the default detection boxes on the feature maps are predicted using small (3 × 3) convolution kernels;
step 334: to improve the precision of the detection result, predicted values for different aspect ratios are obtained at the same time when predicting the garbage types on feature maps of different levels;
step 335: the prediction results are fed into the loss layer; through non-maximum suppression the final prediction is obtained that the milk tea cup is empty and is recyclable garbage, and the result is displayed on the screen.
step 336: if the distributed edge server cannot match the milk tea cup to a type, it displays that the garbage could not be judged and uploads the image data of the milk tea cup to the cloud server, which trains and updates the model.
The deployment of the distributed edge servers specifically comprises the following steps:
step 31: first, the number t of mobile edge servers is determined according to the storage requirements of the garbage cans (intelligent terminals) and network-delay considerations;
step 32: the pairwise distances between the garbage cans in the set R of garbage cans in the designated area are calculated, the two closest garbage cans are found and form a set $T_i$ (i = 1, 2, 3, …), and $T_i$ is deleted from the set R;
step 33: the distance between each remaining garbage can in the set R and the center position of $T_i$ is calculated; the garbage can in R closest to the center of $T_i$ is found, deleted from R and stored into $T_i$;
step 34: step 33 is repeated until the number of garbage cans in $T_i$ reaches a certain threshold;
step 35: step 34 is repeated until i equals t, forming t sets $T_i$, i.e. all garbage cans are clustered into t clusters;
step 36: according to the formula

$$m_i = \frac{1}{|T_i|} \sum_{x \in T_i} x$$

the center position of the garbage-can positions in each cluster is recalculated, where x is a data item belonging to the i-th data set $T_i$ and $m_i$ is the center of $T_i$;
step 37: step 36 is repeated until the criterion function, with the squared error as the objective function to be minimized, converges;
step 38: the Euclidean distance between each final center position and each small base station is calculated, and the small base station closest to each center position is selected in turn;
step 39: step 38 is repeated to complete the selection of the edge server locations.

Claims (6)

1. A garbage bin garbage identification method based on edge calculation and deep learning is characterized by comprising the following steps:
step 1: a thrower places the garbage to be thrown in front of a high-resolution camera for scanning; the high-resolution camera is mounted on a garbage can;
step 2: the edge server identifies the garbage to be thrown in according to the trained garbage detection model, stores the garbage picture data and uploads the unlabeled garbage data to the cloud server;
step 3: the garbage category detection result is displayed on the display screen together with a voice broadcast.
2. The garbage bin garbage identification method based on edge computing and deep learning as claimed in claim 1, wherein the cloud server trains the garbage detection model by using an SSD algorithm according to the garbage classification standard, specifically:
step 11: using the SSD framework structure, the last two fully-connected layers of a VGG16 network are changed into convolutional layers, 4 further convolutional layers are added to construct the network, feature maps are extracted from these 6 layers in turn, and finally 8732 default detection boxes are obtained;
step 12: the prior box is the bounding box of the default detection box used; prior boxes are matched to ground-truth target boxes according to the IOU value; during matching, starting from the perspective of the ground-truth target box, the default detection box with the largest IOU with the target is found and entered into the candidate positive-sample set; the IOU value is the overlap ratio between the ground-truth box of the target and the prediction box produced by the detection algorithm:

$$\mathrm{IOU} = \frac{\operatorname{area}(B_{p} \cap B_{gt})}{\operatorname{area}(B_{p} \cup B_{gt})}$$

step 13: since the number of positive samples determined in step 12 is too small, matching then proceeds from the perspective of the default detection boxes: among the default detection boxes whose IOU with a ground-truth target box satisfies IOU > 0.5, the default detection box with the largest IOU is also placed in the positive-sample set;
step 14: finally, the default detection boxes corresponding to each object are sorted by confidence, and a certain number of them are selected from low to high and stored in the negative-sample set, so that the ratio of positive to negative samples among the prior boxes is 1:3, thereby obtaining good training samples;
step 15: the loss function is solved; it is a weighted sum of the position error and the confidence error:

$$L(x, c, l, g) = \frac{1}{N}\left(L_{conf}(x, c) + \alpha L_{loc}(x, l, g)\right)$$

where $L_{conf}$ and $L_{loc}$ denote the class-confidence loss function and the search-box position loss function respectively; N is the number of prior boxes matched to a ground-truth box; c is the confidence; l is the garbage prediction box; g is the garbage ground-truth box; and the parameter α adjusts the ratio between the confidence loss and the location loss (α = 1 by default);
the search-box position loss function is:

$$L_{loc}(x, l, g) = \sum_{i \in Pos}^{N} \sum_{m \in \{cx, cy, w, h\}} x_{ij}^{k} \, \mathrm{smooth}_{L1}\left(l_i^{m} - \hat{g}_j^{m}\right)$$

where $x_{ij}^{k} \in \{0, 1\}$ indicates whether the i-th garbage prediction box matches the j-th garbage ground-truth box with respect to category k (0: not matched, 1: matched); cx, cy, w, h denote the center abscissa of the garbage bounding box, its center ordinate, the width of the bounding box and the height of the bounding box respectively; m has no specific meaning on its own, the parameter m ∈ {cx, cy, w, h}, i.e. it takes each of the four values cx, cy, w and h in turn;

the $\mathrm{smooth}_{L1}(x)$ loss function is defined as

$$\mathrm{smooth}_{L1}(x) = \begin{cases} 0.5x^{2}, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$

$l^{m}$ denotes the offset of the predicted target box relative to the default box, and $\hat{g}^{m}$ denotes the offset of the ground-truth target box relative to the default box; $\hat{g}^{m}$ is computed as:

$$\hat{g}_j^{cx} = \frac{g_j^{cx} - d_i^{cx}}{d_i^{w}}, \qquad \hat{g}_j^{cy} = \frac{g_j^{cy} - d_i^{cy}}{d_i^{h}}, \qquad \hat{g}_j^{w} = \log\frac{g_j^{w}}{d_i^{w}}, \qquad \hat{g}_j^{h} = \log\frac{g_j^{h}}{d_i^{h}}$$

where d denotes the parameters of the default (prior) box and g denotes the parameters of the garbage ground-truth box;
the class-confidence loss function is:

$$L_{conf}(x, c) = -\sum_{i \in Pos}^{N} x_{ij}^{p} \log\left(\hat{c}_i^{p}\right) - \sum_{i \in Neg} \log\left(\hat{c}_i^{0}\right), \qquad \hat{c}_i^{p} = \frac{\exp(c_i^{p})}{\sum_{p}\exp(c_i^{p})}$$

where $x_{ij}^{p}$ indicates that garbage prediction box i and garbage ground-truth box j match with respect to category p; the higher the predicted probability of p, the smaller the loss. $\hat{c}_i^{0}$ corresponds to a garbage prediction box containing no object; the higher the probability predicted for the background, the smaller the loss. The confidence c is generated by SoftMax;
step 16: prediction; for each prediction box, its category and confidence value are first determined from the class confidences, and prediction boxes belonging to the background are filtered out; prediction boxes below the IOU threshold (0.5) are then filtered out; the remaining prediction boxes are decoded and their true position parameters are recovered from the prior boxes; after decoding, they are sorted in descending order of confidence and only the Top-k (k = 400) prediction boxes are kept; finally, a non-maximum suppression algorithm is run to filter out prediction boxes with a large degree of overlap, and the prediction boxes that remain at the end are the detection result.
3. The garbage bin garbage identification method based on edge calculation and deep learning as claimed in claim 1, wherein step 1 specifically comprises: a high-resolution camera is mounted on the garbage can, with a shooting resolution of 300 × 300; the camera uploads the scanned image to the edge server through a socket; the edge server uses the SSD algorithm to detect whether the image is labeled in the existing garbage model; images that are not labeled are stored in binary form in the edge server's non-volatile memory and uploaded to the cloud server, and the cloud server trains and updates the garbage detection model using the SSD framework.
4. The trash recognition method based on edge calculation and deep learning of claim 3, wherein the specific procedure after the camera scans the image comprises:
step 320: the distributed edge server crops the scanned image to a size of 300 × 300;
step 330: the distributed edge server identifies the garbage to be thrown in according to the existing model, stores the garbage picture data, and uploads unlabeled garbage to the cloud server;
step 331: a high-resolution 300 × 300 garbage image is taken as input;
step 332: feature extraction is performed with the SSD reference neural network VGG16 to obtain the feature map layers;
step 333: the category scores of the garbage are predicted on the different feature layers, while the default detection boxes on the feature maps are predicted using small (3 × 3) convolution kernels;
step 334: to improve the precision of the detection result, predicted values for different aspect ratios are obtained at the same time when predicting the garbage types on feature maps of different levels;
step 335: the prediction results are fed into the loss layer, and the final prediction result is obtained through non-maximum suppression;
step 336: garbage image data that cannot be matched to any category is uploaded to the cloud server.
5. The garbage can garbage identification method based on edge calculation and deep learning as claimed in claim 1, wherein the edge servers are deployed based on the K-means clustering algorithm.
6. The garbage bin garbage recognition method based on edge calculation and deep learning as claimed in claim 5, wherein the deployment of the edge servers specifically comprises the following steps:
step 31: first, the number t of mobile edge servers is determined according to the storage requirements of the garbage cans and network-delay considerations;
step 32: the pairwise distances between the garbage cans in the set R of garbage cans in the designated area are calculated, the two closest garbage cans are found and form a set $T_i$ (i = 1, 2, 3, …), and $T_i$ is deleted from the set R;
step 33: the distance between each remaining garbage can in the set R and the center position of $T_i$ is calculated; the garbage can in R closest to the center of $T_i$ is found, deleted from R and stored into $T_i$;
step 34: step 33 is repeated until the number of garbage cans in $T_i$ reaches a certain threshold;
step 35: step 34 is repeated until i equals t, forming t sets $T_i$, i.e. all garbage cans are clustered into t clusters;
step 36: according to the formula

$$m_i = \frac{1}{|T_i|} \sum_{x \in T_i} x$$

the center position of the garbage-can positions in each cluster is recalculated, where x is a data item belonging to the i-th data set $T_i$ and $m_i$ is the center of $T_i$;
step 37: step 36 is repeated until the criterion function, with the squared error as the objective function to be minimized, converges;
step 38: the Euclidean distance between each final center position and each small base station is calculated, and the small base station closest to each center position is selected in turn;
step 39: step 38 is repeated to complete the selection of the edge server locations.
CN202010168548.2A 2020-03-12 2020-03-12 Garbage can garbage identification method based on edge calculation and deep learning Pending CN111217062A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010168548.2A CN111217062A (en) 2020-03-12 2020-03-12 Garbage can garbage identification method based on edge calculation and deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010168548.2A CN111217062A (en) 2020-03-12 2020-03-12 Garbage can garbage identification method based on edge calculation and deep learning

Publications (1)

Publication Number Publication Date
CN111217062A true CN111217062A (en) 2020-06-02

Family

ID=70826425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010168548.2A Pending CN111217062A (en) 2020-03-12 2020-03-12 Garbage can garbage identification method based on edge calculation and deep learning

Country Status (1)

Country Link
CN (1) CN111217062A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111768005A (en) * 2020-06-19 2020-10-13 北京百度网讯科技有限公司 Training method and device for lightweight detection model, electronic equipment and storage medium
CN112802006A (en) * 2021-02-07 2021-05-14 南通大学 Deep learning-based edge calculation motor oil stain identification method
CN112884033A (en) * 2021-02-06 2021-06-01 浙江净禾智慧科技有限公司 Household garbage classification detection method based on convolutional neural network

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120118226A (en) * 2011-04-18 2012-10-26 윤원섭 Apparatus for separate garbage collection using image process technique
CN109344894A (en) * 2018-09-28 2019-02-15 广州大学 Garbage classification recognition methods and device based on Multi-sensor Fusion and deep learning
CN109389161A (en) * 2018-09-28 2019-02-26 广州大学 Rubbish identification evolutionary learning method, apparatus, system and medium based on deep learning
CN109784190A (en) * 2018-12-19 2019-05-21 华东理工大学 A kind of automatic Pilot scene common-denominator target Detection and Extraction method based on deep learning
CN110569874A (en) * 2019-08-05 2019-12-13 深圳大学 Garbage classification method and device, intelligent terminal and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120118226A (en) * 2011-04-18 2012-10-26 윤원섭 Apparatus for separate garbage collection using image process technique
CN109344894A (en) * 2018-09-28 2019-02-15 广州大学 Garbage classification recognition methods and device based on Multi-sensor Fusion and deep learning
CN109389161A (en) * 2018-09-28 2019-02-26 广州大学 Rubbish identification evolutionary learning method, apparatus, system and medium based on deep learning
CN109784190A (en) * 2018-12-19 2019-05-21 华东理工大学 A kind of automatic Pilot scene common-denominator target Detection and Extraction method based on deep learning
CN110569874A (en) * 2019-08-05 2019-12-13 深圳大学 Garbage classification method and device, intelligent terminal and storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WEI LIU et al.: "SSD: Single Shot MultiBox Detector", arXiv *
ZHOU Yao: "Ship Target Detection and Recognition Based on Deep Learning", China Master's Theses Full-text Database, Information Science and Technology *
PENG Xinyun et al.: "Research on Garbage Recognition and Classification Based on the SSD Algorithm", Journal of Shaoguan University (Natural Science) *
XUE Duan: "Research on Edge Server Deployment Based on the K-means Algorithm", China Master's and Doctoral Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111768005A (en) * 2020-06-19 2020-10-13 北京百度网讯科技有限公司 Training method and device for lightweight detection model, electronic equipment and storage medium
CN111768005B (en) * 2020-06-19 2024-02-20 北京康夫子健康技术有限公司 Training method and device for lightweight detection model, electronic equipment and storage medium
CN112884033A (en) * 2021-02-06 2021-06-01 浙江净禾智慧科技有限公司 Household garbage classification detection method based on convolutional neural network
CN112884033B (en) * 2021-02-06 2021-10-22 浙江净禾智慧科技有限公司 Household garbage classification detection method based on convolutional neural network
CN112802006A (en) * 2021-02-07 2021-05-14 南通大学 Deep learning-based edge calculation motor oil stain identification method
CN112802006B (en) * 2021-02-07 2024-03-22 南通大学 Edge calculation motor oil stain identification method based on deep learning

Similar Documents

Publication Publication Date Title
CN111217062A (en) Garbage can garbage identification method based on edge calculation and deep learning
CN110941594B (en) Splitting method and device of video file, electronic equipment and storage medium
CN108921083B (en) Illegal mobile vendor identification method based on deep learning target detection
CN102346847B (en) License plate character recognizing method of support vector machine
CN110458082B (en) Urban management case classification and identification method
CN104239867B (en) License plate locating method and system
CN109684922B (en) Multi-model finished dish identification method based on convolutional neural network
CN107480643B (en) Intelligent garbage classification processing robot
CN104463196A (en) Video-based weather phenomenon recognition method
US11335086B2 (en) Methods and electronic devices for automated waste management
CN106339657B (en) Crop straw burning monitoring method based on monitor video, device
CN112560576B (en) AI map recognition garbage classification and intelligent recovery method
CN109242826B (en) Mobile equipment end stick-shaped object root counting method and system based on target detection
CN116189099B (en) Method for detecting and stacking exposed garbage based on improved yolov8
CN109063619A (en) A kind of traffic lights detection method and system based on adaptive background suppression filter and combinations of directions histogram of gradients
CN111186656A (en) Target garbage classification method and intelligent garbage can
CN112241692B (en) Channel foreign matter intelligent detection and classification method based on aerial image super-pixel texture
CN111597875A (en) Traffic sign identification method, device, equipment and storage medium
Djamaluddin et al. The simulation of vehicle counting system for traffic surveillance using Viola Jones method
WO2022104798A1 (en) 5g-based unmanned electronic traffic police duty system
CN114506591A (en) Intelligent garbage can design method based on image classification technology
CN112508103B (en) Perishable garbage image identification and assessment management method based on garbage collection and transportation vehicle
CN112620165B (en) Garbage classification method
CN113239962A (en) Traffic participant identification method based on single fixed camera
CN112875077A (en) Garbage classification method and classification system for large garbage station

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200602