CN108682006A - Contactless canned compost maturity judgment method

Contactless canned compost maturity judgment method

Info

Publication number
CN108682006A
Authority
CN
China
Prior art keywords: compost, data, layer, network, image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810379431.1A
Other languages
Chinese (zh)
Other versions
CN108682006B (en)
Inventor
薛卫
胡雪娇
徐阳春
韦中
梅新兰
陈行健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Agricultural University
Original Assignee
Nanjing Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Agricultural University
Priority to CN201810379431.1A
Publication of CN108682006A
Application granted
Publication of CN108682006B
Expired - Fee Related
Anticipated expiration

Classifications

    • G06T7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T7/44: Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • G06T7/90: Determination of colour characteristics
    • G06T2207/10024: Image acquisition modality; color image
    • G06T2207/20032: Special algorithmic details; median filtering
    • G06T2207/20081: Training; learning
    • G06T2207/20084: Artificial neural networks [ANN]
    • G06T2207/30188: Subject of image; vegetation; agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a non-contact method for judging the maturity of tank (canned) compost, comprising the following steps: S1, extracting image data at time t; S2, preprocessing; S3, building a convolutional neural network (CNN) on the data obtained in S2 to extract compost image features and obtain a 255-dimensional feature vector; S4, combining the thermal-image color histogram data from S1 with the feature vector output by the image-feature-extraction CNN in S3 into a real-time compost feature, and normalizing it; S5, predicting with a long short-term memory (LSTM) network; S6, outputting the judgment result. The method detects the compost state in real time from temperature and appearance; its results are accurate and it is easy to operate.

Description

Contactless canned compost maturity judgment method
Technical field
The present invention relates to a method that takes the natural images of a compost pile and the thermal images captured by a thermal imager, performs feature extraction and classification with deep neural networks, and thereby judges the degree of compost maturity in real time. It belongs to the field of agricultural information technology.
Background technology
Fully decomposed (mature) fertilizer is particularly important for crop growth. Immature fertilizer not only fails to promote crop growth, it can attract flies whose eggs and larvae damage crop roots, and the mass propagation of microorganisms can deplete soil oxygen. Fully decomposed fertilizer, by contrast, does no harm to the environment, is convenient to transport, increases soil fertility and promotes plant growth.
Maturity is the index that reflects how stable the composting process has become. Existing maturity criteria include physical, chemical and biological indicators. Under normal circumstances the temperature change, a physical indicator, is taken as the most important index for evaluating compost maturity. During composting, the heat generated by microbial decomposition of organic matter raises the internal temperature of the pile; in the thermophilic stage it reaches 60-70 °C or even 80 °C, after which the compost temperature gradually falls. The temperature rises again after the pile is turned and then falls again. After several turnings and repeated temperature rises and falls, the easily degradable organic matter in the composting material gradually disappears, and the temperature no longer rises even if the pile is turned again. Traditional contact methods for judging maturity from temperature are manual thermometry and temperature sensors buried in the compost. In addition, the surface appearance of mature and immature fertilizer also differs.
Invention content
Aiming at the problems described in the background, the present invention uses the natural-image features and the thermal-image features of the compost surface as the feature description of the compost, learns these image features with deep-learning methods, and thereby predicts the degree of maturity.
Technical solution: a non-contact method for judging the maturity of tank compost comprises the following steps:
S1, extracting image data at time t, including 255 × 3-dimensional thermal-image color histogram data and RGB image data of the compost surface;
S2, preprocessing: applying median filtering to the RGB image data of the compost surface;
S3, based on the data obtained in S2, building a convolutional neural network CNN to extract compost image features and obtain a 255-dimensional feature vector;
S4, combining the 255 × 3-dimensional thermal-image color histogram data from S1 with the 255-dimensional feature vector output by the image-feature-extraction CNN in S3 into a 255 × 4-dimensional real-time compost feature, and normalizing it;
S5, predicting with a long short-term memory network LSTM, taking the data obtained in S4 as input; the input gate, forget gate and output gate also take the data of previous time steps as input; the LSTM prediction model is obtained by repeatedly training the network and updating its parameter values;
S6, outputting the judgment result.
Specifically, in S1, QR, QG and QB in the thermal-image color histogram data are the chroma probability values on the R, G and B components respectively. When shooting, the thermal imager is placed at the top of the composting tank facing the compost surface, 15-100 cm from the surface.
Specifically, in S1, the RGB image data of the compost surface at time t are extracted as follows: Pt is the RGB color image matrix of the compost natural image. When shooting, an ordinary digital camera shoots the compost surface from the top of the tank, 15-100 cm from the surface, with LED fill lighting inside the tank; the central 90 × 90-pixel region of the captured image is taken, i.e. n = 90.
Specifically, in S2, the median filtering is performed as follows: at each position (i, j), the filter-window convolution results on the three channels of the RGB image are summed, and an activation function is then applied to obtain the value of the filtered image L(i, j).
Specifically, in S3, the CNN feature extraction consists of two steps:
First, training: N labelled mature and immature sample images are fed into the CNN, and the parameters of the image-feature-extraction network are obtained by training with a gradient-descent algorithm;
Then, the trained CNN model parameters are used for feature extraction during monitoring;
The CNN comprises 3 convolutional layers, 3 pooling layers, 2 fully connected layers and 1 classification layer; the 255-dimensional vector of the second fully connected layer is the final image feature.
Specifically, in the CNN, the input is the three-channel RGB image data matrix, and the output size of a convolutional layer is calculated as:
W2 = (W1 - F + 2P)/S + 1 (formula 1)
In formula 1, W1 is the input matrix width, W2 is the matrix width after convolution, F is the convolution kernel size, P indicates whether zero padding is used (P = 1 with zero padding, P = 0 without), and S is the stride;
The output size of a pooling layer is calculated as:
W2 = (W1 - F)/S + 1 (formula 2)
In formula 2, W1 is the input matrix width, W2 is the matrix width after the pooling layer, F is the filter size, and S is the stride.
Specifically, in S4, the normalization function is:
X1 = (X0 - μ)/σ (formula 3)
where X0 and X1 are the real-time compost feature vectors before and after normalization respectively, μ is the mean of all sample data, and σ is the standard deviation of all sample data.
Specifically, in S5, the long short-term memory network LSTM has three layers: an input layer, a hidden layer and an output layer; the input dimension is 255 × 4 = 1020, the hidden layer has 500 neurons, and a single value is output. When the LSTM is used for the first time its network parameters are trained: the compost state feature vector xt at time t is fed in; the data in the forget gate are multiplied by the forgetting coefficient ft, the input data are multiplied by the input-gate attenuation coefficient it, and the output-gate data are multiplied by the output coefficient ot; the data are integrated through the tanh activation function to obtain the network output. The network back-propagates according to the label values, computes the network error, and reduces the error by repeatedly training the network and updating the parameter values, yielding the LSTM prediction model.
Specifically, the back-propagation algorithm is as follows:
Wf, bf, Wi, bi, Wo and bo are the network parameters; their initial values are random, and they are updated continuously during training, the back-propagation algorithm computing the gradient of each parameter to realize the parameter update.
In the LSTM, the gradients of the hidden state h(t) and of the cell state C(t) are propagated backwards step by step, and the gradient formulas are obtained by differentiating the loss function:
at the last time step, the gradient of h(t) is determined by the output gradient error of that layer;
at earlier time steps, the reverse gradient error consists of two parts, the gradient error from the output of the current layer and the gradient error passed back from h(t) of the next time step;
based on the gradients of h(t) and C(t), the gradients of the parameters are readily obtained.
Specifically, the back-propagation training procedure is:
1) initialize the forget-gate parameters Wf, bf, the output-gate parameters Wo, bo, the input-gate parameters Wi, bi, and the output parameters V, c;
2) preprocess the data: convert the image data into tensors that can be processed, and normalize them;
3) for iter = 1 to the number of training iterations
4) for start = 1 to the length of the training set
5) compute the prediction y^(t) at time t with the forward-propagation algorithm;
6) compute the loss function L;
7) using the partial derivatives of the output-layer nodes and the chain rule of differentiation, compute the partial derivatives of all hidden-layer nodes;
8) update the parameters Wf, bf, Wi, bi, Wo, bo step by step with the optimization function;
end of inner loop;
end of outer loop;
end.
Specifically, in S6, the forward propagation of the LSTM outputs the prediction of the fertilizer maturity: the compost state feature vector xt at time t is fed into the network and the prediction y^(t) is output.
Beneficial effects of the present invention
The present invention takes in-tank (canned) composting as its object of study. Because the tank is relatively closed, both conventional contact thermometry methods are difficult to operate. A thermal imager is a non-contact temperature-measurement device that captures thermal images of an object's surface in real time, with different colors representing different temperatures. Although only the surface temperature is obtained, which entails some loss of information compared with manual thermometry or temperature sensors inside the pile, a large number of measurement points can still broadly reflect the temperature of the compost and its variation, and the measurement is accurate and easy to install. For tank composting, chemical and biological methods of judging compost maturity are difficult to operate, and a single evaluation index gives a large judgment error. The non-contact method based on temperature and appearance detects the compost state in real time; its results are accurate and it is easy to operate.
Description of the drawings
Fig. 1 is the maturity-judgment flow of the present invention.
Fig. 2 is the CNN filtering principle diagram.
Fig. 3 is the CNN network structure.
Fig. 4 is the LSTM internal structure diagram.
Fig. 5 is a natural image of the compost surface in the embodiment.
Specific implementation mode
The invention is further described below with reference to an embodiment, but the scope of the invention is not limited thereto:
Taking a composting plant as an example, the composting material is animal manure and waste vegetables. A camera and a thermal imager are installed at the top of the fermentation tank to monitor one production cycle; the data-acquisition interval is set to 2 hours, and 1500 records of thermal images and compost-surface images are collected in total, of which 300 are test samples and 1200 are training samples.
With reference to Fig. 1, the non-contact method for judging the maturity of tank compost comprises the following steps:
S1, extracting image data at time t, including 255 × 3-dimensional thermal-image color histogram data and RGB image data of the compost surface.
A camera with network-communication capability is installed to shoot natural images of the compost, together with a thermal imager to shoot thermal images. The program stores the thermal-image name, natural-image name, acquisition time and other data in a database for data scheduling, and the images are saved in JPG file format.
S1-1, extracting the thermal-image color histogram data at time t:
The thermal imager is placed at the top of the compost tank facing the compost surface, 15-100 cm from the surface, and captures a thermal image of the compost surface at time t. The distribution of the thermal image over the three RGB components is described with a color histogram, which gives the proportion of each chroma value in the image.
The value range of the three RGB components is [0, 255], so each component yields 255-dimensional data; QR, QG and QB are the chroma probability values on the R, G and B components respectively.
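A minimal sketch of this histogram extraction, written in Python; it assumes an 8-bit thermal image read with OpenCV, the 255-bin convention follows the patent text, and the file name is hypothetical:

    import cv2
    import numpy as np

    def thermal_color_histogram(path, bins=255):
        """Per-channel chroma probability histogram of a thermal image (S1-1)."""
        img = cv2.imread(path)                       # OpenCV loads channels as B, G, R
        if img is None:
            raise FileNotFoundError(path)
        total = img.shape[0] * img.shape[1]
        channels = cv2.split(img)[::-1]              # reorder to R, G, B
        hist = []
        for ch in channels:                          # Q_R, Q_G, Q_B in turn
            counts, _ = np.histogram(ch, bins=bins, range=(0, 255))
            hist.append(counts / total)              # proportion of each chroma value
        return np.concatenate(hist)                  # 255 * 3 = 765-dimensional vector

    # q_t = thermal_color_histogram("thermal_t.jpg")   # hypothetical file name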
S1-2, extracting the RGB image data of the compost surface at time t:
Pt is the RGB color image matrix of the compost natural image. When shooting, an ordinary digital camera shoots the compost surface from the top of the tank, 15-100 cm from the surface, with LED fill lighting inside the tank; the central 90 × 90-pixel region of the captured image is taken, i.e. n = 90.
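A corresponding sketch of the center-crop step for the natural image (again assuming OpenCV; the file name is hypothetical):

    import cv2

    def center_crop_rgb(path, n=90):
        """Take the central n x n pixel region of the compost natural image (S1-2)."""
        img = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
        h, w = img.shape[:2]
        top, left = (h - n) // 2, (w - n) // 2
        return img[top:top + n, left:left + n]        # the n x n RGB matrix Pt

    # P_t = center_crop_rgb("natural_t.jpg")           # hypothetical file name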
S2, preprocessing: applying median filtering to the RGB image data of the compost surface:
The compost-surface RGB image is filtered with a trained convolutional filter window. As shown in Fig. 2, taking a 3 × 3 filter window as an example, the value of the filtered image L(i, j) is obtained by summing the filter-window convolution results at (i, j) on the three channels of the RGB image and then applying an activation function. The resulting image is later fed into the fully connected layers to obtain the 255-dimensional feature vector.
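A minimal sketch of this filtering step, summing 3 × 3 window convolutions over the three RGB channels and applying an activation; the single learned 3 × 3 × 3 kernel and the ReLU activation are assumptions, since the patent only states that the window is trained:

    import torch
    import torch.nn as nn

    # One trained 3x3 window: the convolution results of the three RGB channels
    # at each (i, j) are summed and passed through an activation function.
    filter_window = nn.Conv2d(in_channels=3, out_channels=1, kernel_size=3,
                              padding=1, bias=False)
    activation = nn.ReLU()

    def filter_rgb(image):
        """image: float tensor of shape (3, H, W) -> filtered image L of shape (H, W)."""
        with torch.no_grad():
            L = activation(filter_window(image.unsqueeze(0)))   # (1, 1, H, W)
        return L.squeeze()

    # L = filter_rgb(torch.rand(3, 90, 90))   # e.g. the cropped 90 x 90 surface image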
S3, based on the data obtained in S2, building a convolutional neural network CNN to extract compost image features and obtain a 255-dimensional feature vector:
A convolutional neural network imitates the way human visual information processing proceeds from low-order to high-order features, using several successive convolutional layers to extract features of progressively increasing complexity. The filters of the front convolutional layers detect low-order features; as convolutional layers are added, the filter banks produce increasingly complex image feature representations. This method therefore extracts the compost image features with a convolutional neural network. The CNN feature extraction consists of two steps: first, training, in which N labelled mature and immature sample images are fed into the CNN and the parameters of the feature-extraction network are obtained by gradient descent; then the trained CNN model parameters are used for feature extraction during monitoring.
A convolutional neural network is a multi-layer neural network; each layer consists of several planes, and each level has a number of independent neurons. The module types are input, convolutional layer, pooling layer, fully connected layer and output. The convolutional neural network used for compost image feature extraction comprises 3 convolutional layers, 3 pooling layers, 2 fully connected layers and 1 classification layer; the 255-dimensional vector of the second fully connected layer is the final image feature. The network structure is shown in Fig. 3.
The network input is the three-channel RGB image data matrix. The first convolutional layer has 5 × 5 convolution kernels and outputs 32 feature maps of size 88 × 88; the first pooling layer has a 2 × 2 filter and outputs 32 feature maps of size 44 × 44. The second convolutional layer has 3 × 3 kernels and outputs 64 feature maps of size 44 × 44; the second pooling layer has a 2 × 2 filter and outputs 64 feature maps of size 22 × 22. The third convolutional layer has 3 × 3 kernels and outputs 128 feature maps of size 22 × 22; the third pooling layer has a 2 × 2 filter and outputs 128 feature maps of size 11 × 11. The last two fully connected layers are 15488-dimensional and 255-dimensional respectively.
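A PyTorch sketch of the feature-extraction network just described. The layer counts, kernel sizes and feature-map sizes follow the text; the ReLU activations, max pooling, zero padding P = 1 in every convolutional layer, the 90 × 90 input and the 2-class classification layer are assumptions, as is the reading that the first fully connected layer carries the 15488-dimensional flattened feature and the second produces the 255-dimensional feature vector:

    import torch
    import torch.nn as nn

    class CompostCNN(nn.Module):
        """3 convolutional + 3 pooling + 2 fully connected + 1 classification layer."""
        def __init__(self, n_class=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, padding=1), nn.ReLU(),    # 90 -> 88
                nn.MaxPool2d(2),                                          # 88 -> 44
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),   # 44 -> 44
                nn.MaxPool2d(2),                                          # 44 -> 22
                nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),  # 22 -> 22
                nn.MaxPool2d(2),                                          # 22 -> 11
            )
            self.fc = nn.Linear(128 * 11 * 11, 255)      # 15488 -> 255-dim image feature
            self.classifier = nn.Linear(255, n_class)    # mature / immature

        def forward(self, x, return_features=False):
            x = torch.flatten(self.features(x), 1)       # 15488-dimensional vector
            feature = self.fc(x)                         # final 255-dim feature (S3 output)
            if return_features:
                return feature
            return self.classifier(torch.relu(feature))

    # cnn = CompostCNN()
    # feature = cnn(torch.rand(1, 3, 90, 90), return_features=True)   # shape (1, 255)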
The output size of a convolutional layer is calculated as:
W2 = (W1 - F + 2P)/S + 1 (formula 1)
In formula 1, W1 is the input matrix width, W2 is the matrix width after convolution, F is the convolution kernel size, P indicates whether zero padding is used (P = 1 with zero padding, P = 0 without), and S is the stride.
The output size of a pooling layer is calculated as:
W2 = (W1 - F)/S + 1 (formula 2)
In formula 2, W1 is the input matrix width, W2 is the matrix width after the pooling layer, F is the filter size, and S is the stride.
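Applying formulas 1 and 2 to the layer sizes listed above (a small check; the padding and stride values are the assumptions already noted):

    def conv_out(w1, f, p, s=1):
        """Formula 1: W2 = (W1 - F + 2P) / S + 1."""
        return (w1 - f + 2 * p) // s + 1

    def pool_out(w1, f, s):
        """Formula 2: W2 = (W1 - F) / S + 1."""
        return (w1 - f) // s + 1

    w = conv_out(90, f=5, p=1)    # 88
    w = pool_out(w, f=2, s=2)     # 44
    w = conv_out(w, f=3, p=1)     # 44
    w = pool_out(w, f=2, s=2)     # 22
    w = conv_out(w, f=3, p=1)     # 22
    w = pool_out(w, f=2, s=2)     # 11 -> 128 * 11 * 11 = 15488 flattened features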
S4, combining the 255 × 3-dimensional thermal-image color histogram data from S1 with the 255-dimensional feature vector output by the image-feature-extraction CNN in S3 into a 255 × 4-dimensional real-time compost feature, and normalizing it:
At this point the thermal image has been characterized as three 255-dimensional vectors (the color probability distributions on the three RGB components) and the natural image as one 255-dimensional feature vector; together they form the real-time compost feature. Because these dimensions differ in order of magnitude, the data are standardized before being fed into the model for training; the processed data follow a standard normal distribution with mean 0 and standard deviation 1. The normalization function is:
X1 = (X0 - μ)/σ (formula 3)
where X0 and X1 are the real-time compost feature vectors before and after normalization respectively, μ is the mean of all sample data, and σ is the standard deviation of all sample data.
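A numpy sketch of the feature fusion, standardization and sequence grouping; the sliding-window length of 10 matches the LSTM time step given below, while the helper names and the windowing strategy are assumptions:

    import numpy as np

    def fuse_features(thermal_hist, cnn_feature):
        """Concatenate the 255 x 3 histogram and the 255-dim CNN feature -> 1020-dim vector."""
        return np.concatenate([np.ravel(thermal_hist), np.ravel(cnn_feature)])

    def standardize(samples):
        """Formula 3: X1 = (X0 - mu) / sigma over all sample data."""
        mu, sigma = samples.mean(axis=0), samples.std(axis=0) + 1e-8
        return (samples - mu) / sigma, mu, sigma

    def make_sequences(samples, labels, time_step=10):
        """Group consecutive acquisitions into fixed-length sequences for the LSTM."""
        xs = [samples[i:i + time_step] for i in range(len(samples) - time_step + 1)]
        ys = [labels[i + time_step - 1] for i in range(len(samples) - time_step + 1)]
        return np.stack(xs), np.asarray(ys)

    # features = np.stack([fuse_features(h, f) for h, f in zip(hists, cnn_features)])
    # features, mu, sigma = standardize(features)
    # X, y = make_sequences(features, maturity_labels)     # X has shape (N, 10, 1020)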
S5, predicting with the long short-term memory network LSTM, taking the data obtained in S4 as input; the input gate, forget gate and output gate also take the data of previous time steps as input; the LSTM prediction model is obtained by repeatedly training the network and updating its parameter values:
Deep-learning models have developed rapidly in recent years. The long short-term memory network (LSTM) combines the concepts of sequence order and forgetting in its network-structure design, which gives it strong adaptability to time-series data and to data with longer time intervals. The characteristic of the LSTM is that, by adding an input gate, a forget gate and an output gate, the weight of the self-loop becomes variable, so that with the model parameters fixed, the scale of integration at different moments can change dynamically, avoiding the problems of vanishing or exploding gradients. At each time point the LSTM receives the compost state input vector; the input gate, forget gate and output gate also take the important data of previous moments as input, and the current important data are preserved. The data update the states of the hidden-layer nodes through activation functions, and predictions are made by the output layer. The hidden state of each layer is always passed on, so the hidden-layer state stores the information of past compost moments and the relationship between historical and current information can be mined.
1) Working principle of the forget gate
When the network information of time t-1 is passed to the network at time t, its degree of forgetting must first be determined: the memory state from before time t is multiplied by an attenuation coefficient between 0 and 1, the memory learned at time t is added, and the result is passed on to the memory unit of the network at time t+1. The attenuation coefficient is computed by concatenating the network output ht-1 of time t-1 with the network input xt of the current step, applying a linear transformation and then the sigmoid activation function, which maps the result to a value between 0 and 1 used as the memory attenuation coefficient, denoted ft. The attenuation coefficient is calculated as:
ft = σ(Wf·[ht-1, xt] + bf)
2) Working principle of the input gate
First the content learned at the current time, C̃t, and its corresponding attenuation coefficient it are computed. The memory C̃t learned in the current state is obtained by applying a linear transformation to [ht-1, xt] followed by the tanh activation function.
The attenuation coefficient it is computed in the same way as the forget-gate attenuation coefficient:
it = σ(Wi·[ht-1, xt] + bi)
Finally the forgetting coefficient ft is multiplied by the memory Ct-1 of time t-1, and the memory C̃t learned at time t, multiplied by its attenuation coefficient, is added to obtain the memory state Ct at time t:
Ct = ft * Ct-1 + it * C̃t
3) Working principle of the output gate
First the output coefficient ot is computed with a method similar to that used for the forgetting coefficient; this coefficient determines how much is output. The output value ht is then obtained from ot. The formulas for the coefficient ot and the network output ht are:
ot = σ(Wo·[ht-1, xt] + bo)
ht = ot * tanh(Ct)
The prediction output of the current sequence index is then updated, where V is the coefficient vector and c the bias:
y^(t) = σ(V h(t) + c)
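The gate equations above can be traced step by step in the following numpy sketch; the cell-candidate parameters Wc and bc are named here by analogy with the other gates, since the text describes that linear transformation without naming its parameters:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x_t, h_prev, C_prev, p):
        """One LSTM time step following the forget-, input- and output-gate equations."""
        z = np.concatenate([h_prev, x_t])              # [h_{t-1}, x_t]
        f_t = sigmoid(p["Wf"] @ z + p["bf"])           # forgetting coefficient f_t
        i_t = sigmoid(p["Wi"] @ z + p["bi"])           # input-gate attenuation coefficient i_t
        C_tilde = np.tanh(p["Wc"] @ z + p["bc"])       # content learned at the current time
        C_t = f_t * C_prev + i_t * C_tilde             # updated memory state C_t
        o_t = sigmoid(p["Wo"] @ z + p["bo"])           # output coefficient o_t
        h_t = o_t * np.tanh(C_t)                       # network output h_t
        y_t = sigmoid(p["V"] @ h_t + p["c"])           # prediction y^(t)
        return h_t, C_t, y_t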
The LSTM network parameters need to be trained before first use.
With reference to Fig. 4, the long short-term memory neural network LSTM used by the compost maturity prediction method of the present invention has three layers: an input layer, a hidden layer and an output layer; the input dimension is 255 × 4 = 1020, the hidden layer has 500 neurons, and the time step is 10.
The LSTM network parameters are set as follows:
LSTM(input_size=1020, hidden_size=500, num_layers=2, batch_first=True)
Linear(hidden_size=500, n_class=2)
Here, input_size is the dimension of the input data; hidden_size is the output (hidden) dimension; num_layers is the number of stacked LSTM layers (default 1); batch_first is True or False: nn.LSTM by default expects input of shape (sequence length, batch, input dimension), and with batch_first=True the input becomes (batch, sequence length, input dimension); n_class is the number of output classes.
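Putting the configuration above together, a minimal PyTorch sketch of the prediction model and its training loop; the optimizer, learning rate, loss function and the use of the last time step's hidden state are assumptions, as the patent fixes only the layer sizes:

    import torch
    import torch.nn as nn

    class CompostLSTM(nn.Module):
        """LSTM maturity predictor: 1020-dim input, 500 hidden units, 2 output classes."""
        def __init__(self, input_size=1020, hidden_size=500, n_class=2):
            super().__init__()
            self.lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size,
                                num_layers=2, batch_first=True)
            self.out = nn.Linear(hidden_size, n_class)

        def forward(self, x):                     # x: (batch, time_step = 10, 1020)
            h, _ = self.lstm(x)
            return self.out(h[:, -1, :])          # predict from the last time step

    model = CompostLSTM()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # X_train: float tensor (N, 10, 1020); y_train: long tensor (N,) of 0/1 labels
    # for epoch in range(num_epochs):
    #     optimizer.zero_grad()
    #     loss = criterion(model(X_train), y_train)   # forward propagation + loss L
    #     loss.backward()                             # back-propagation of the gradients
    #     optimizer.step()                            # update Wf, bf, Wi, bi, Wo, bo, V, c
    #
    # At detection time (S6): y_hat = model(x_sequence).argmax(dim=1)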
S6, outputting the judgment result: in the detection state, the forward propagation of the LSTM outputs the prediction of the fertilizer maturity; the compost information of time t, i.e. xt, is fed into the network and the prediction y^(t) is output.
The specific embodiment described here is only an illustration of the spirit of the invention. Those skilled in the art to which the invention belongs may make various modifications or additions to the described embodiment, or substitute it in a similar manner, without departing from the spirit of the invention or exceeding the scope defined by the appended claims.

Claims (11)

1. A non-contact method for judging the maturity of tank compost, characterized by comprising the following steps:
S1, extracting image data at time t, including 255 × 3-dimensional thermal-image color histogram data and RGB image data of the compost surface;
S2, preprocessing: applying median filtering to the RGB image data of the compost surface;
S3, based on the data obtained in S2, building a convolutional neural network CNN to extract compost image features and obtain a 255-dimensional feature vector;
S4, combining the 255 × 3-dimensional thermal-image color histogram data from S1 with the 255-dimensional feature vector output by the image-feature-extraction CNN in S3 into a 255 × 4-dimensional real-time compost feature, and normalizing it;
S5, predicting with a long short-term memory network LSTM, taking the data obtained in S4 as input; the input gate, forget gate and output gate also take the data of previous time steps as input; the LSTM prediction model is obtained by repeatedly training the network and updating its parameter values;
S6, outputting the judgment result.
2. The method according to claim 1, characterized in that, in S1, QR, QG and QB in the thermal-image color histogram data are the chroma probability values on the R, G and B components respectively; when shooting, the thermal imager is placed at the top of the compost tank facing the compost surface, 15-100 cm from the surface.
3. The method according to claim 1, characterized in that, in S1, the compost-surface RGB image data at time t are extracted as follows: Pt is the RGB color image matrix of the compost natural image; when shooting, an ordinary digital camera shoots the compost surface from the top of the tank, 15-100 cm from the surface, with LED fill lighting inside the tank, and the central 90 × 90-pixel region of the captured image is taken, n = 90.
4. The method according to claim 1, characterized in that, in S2, the median filtering is: summing the filter-window convolution results at (i, j) on the three channels of the RGB image, and then applying an activation function to obtain the value of the filtered image L(i, j).
5. The method according to claim 1, characterized in that, in S3, the CNN feature extraction consists of two steps:
first, training: N labelled mature and immature sample images are fed into the CNN, and the parameters of the image-feature-extraction network are obtained by training with a gradient-descent algorithm;
then, the trained CNN model parameters are used for feature extraction during monitoring;
the CNN comprises 3 convolutional layers, 3 pooling layers, 2 fully connected layers and 1 classification layer, and the 255-dimensional vector of the second fully connected layer is the final image feature.
6. The method according to claim 5, characterized in that, in the CNN, the input is the three-channel RGB image data matrix, and the output size of a convolutional layer is calculated as:
W2 = (W1 - F + 2P)/S + 1 (formula 1)
in formula 1, W1 is the input matrix width, W2 is the matrix width after convolution, F is the convolution kernel size, P indicates whether zero padding is used (P = 1 with zero padding, P = 0 without), and S is the stride;
the output size of a pooling layer is calculated as:
W2 = (W1 - F)/S + 1 (formula 2)
in formula 2, W1 is the input matrix width, W2 is the matrix width after the pooling layer, F is the filter size, and S is the stride.
7. The method according to claim 5, characterized in that, in S4, the normalization function is:
X1 = (X0 - μ)/σ (formula 3)
where X0 and X1 are the real-time compost feature vectors before and after normalization respectively, μ is the mean of all sample data, and σ is the standard deviation of all sample data.
8. The method according to claim 1, characterized in that, in S5, the long short-term memory network LSTM has three layers: an input layer, a hidden layer and an output layer; the input dimension is 255 × 4 = 1020, the hidden layer has 500 neurons, and a single value is output; when the LSTM is used for the first time its network parameters are trained: the compost state feature vector xt at time t is fed in; the data in the forget gate are multiplied by the forgetting coefficient ft, the input data are multiplied by the input-gate attenuation coefficient it, and the output-gate data are multiplied by the output coefficient ot; the data are integrated through the tanh activation function to obtain the network output; the network back-propagates according to the label values, computes the network error, and reduces the error by repeatedly training the network and updating the parameter values, obtaining the LSTM prediction model.
9. The method according to claim 8, characterized in that the back-propagation algorithm is:
Wf, bf, Wi, bi, Wo and bo are the network parameters; their initial values are random, and they are updated continuously during training, the back-propagation algorithm computing the gradient of each parameter to realize the parameter update;
in the LSTM, the gradients of the hidden state h(t) and of the cell state C(t) are propagated backwards step by step, and the gradient formulas are obtained by differentiating the loss function:
at the last time step, the gradient of h(t) is determined by the output gradient error of that layer;
at earlier time steps, the reverse gradient error consists of two parts, the gradient error from the output of the current layer and the gradient error passed back from h(t) of the next time step;
based on the gradients of h(t) and C(t), the gradients of the parameters are readily obtained.
10. The method according to claim 8, characterized in that the back-propagation algorithm process is:
11. The method according to claim 1, characterized in that, in S6, the forward propagation of the LSTM outputs the prediction of the fertilizer maturity: the compost state feature vector xt at time t is fed into the network and the prediction y^(t) is output.
CN201810379431.1A 2018-04-25 2018-04-25 Non-contact type canned compost maturity judging method Expired - Fee Related CN108682006B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810379431.1A CN108682006B (en) 2018-04-25 2018-04-25 Non-contact type canned compost maturity judging method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810379431.1A CN108682006B (en) 2018-04-25 2018-04-25 Non-contact type canned compost maturity judging method

Publications (2)

Publication Number Publication Date
CN108682006A 2018-10-19
CN108682006B 2021-07-20

Family

ID=63801750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810379431.1A Expired - Fee Related CN108682006B (en) 2018-04-25 2018-04-25 Non-contact type canned compost maturity judging method

Country Status (1)

Country Link
CN (1) CN108682006B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106442381A (en) * 2016-07-06 2017-02-22 中国农业大学 Characterization method for biogas residue aerobic composting fermentation maturity
CN107463919A (en) * 2017-08-18 2017-12-12 深圳市唯特视科技有限公司 A kind of method that human facial expression recognition is carried out based on depth 3D convolutional neural networks
CN107590799A (en) * 2017-08-25 2018-01-16 山东师范大学 The recognition methods of banana maturity period and device based on depth convolutional neural networks
CN107862326A (en) * 2017-10-30 2018-03-30 昆明理工大学 A kind of transparent apple recognition methods based on full convolutional neural networks

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
宁尚晓 (Ning Shangxiao): "Experimental and Evaluation Study on the Maturity of Municipal Solid Waste Compost" (城市生活垃圾堆肥腐熟度试验与评价研究), China Master's Theses Full-text Database, Agricultural Science and Technology series *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110059759A (en) * 2019-04-25 2019-07-26 南京农业大学 Compost maturity prediction technique based on weighting LBP- color moment
CN110377691A (en) * 2019-07-23 2019-10-25 上海应用技术大学 Method, apparatus, equipment and the storage medium of text classification
CN111028893A (en) * 2019-10-28 2020-04-17 山东天岳先进材料科技有限公司 Crystal growth prediction method and device
CN111028893B (en) * 2019-10-28 2023-09-26 山东天岳先进科技股份有限公司 Crystal growth prediction method and device
CN112633292A (en) * 2020-09-01 2021-04-09 广东电网有限责任公司 Method for measuring temperature of oxide layer on metal surface
CN112378527A (en) * 2020-11-27 2021-02-19 深圳市同为数码科技股份有限公司 Method and device for improving non-contact temperature measurement precision
CN112378527B (en) * 2020-11-27 2022-06-21 深圳市同为数码科技股份有限公司 Method and device for improving non-contact temperature measurement precision
CN113139342A (en) * 2021-04-23 2021-07-20 上海交通大学 Aerobic compost monitoring system and result prediction method
CN113139342B (en) * 2021-04-23 2022-12-27 上海交通大学 Aerobic compost monitoring system and result prediction method
CN117976081A (en) * 2024-04-02 2024-05-03 北京市农林科学院 Composting formula method, system, equipment and medium based on model predictive optimization

Also Published As

Publication number Publication date
CN108682006B (en) 2021-07-20

Similar Documents

Publication Publication Date Title
CN108682006A (en) Contactless canned compost maturity judgment method
Zhou et al. Near infrared computer vision and neuro-fuzzy model-based feeding decision system for fish in aquaculture
CN106202997B (en) A kind of cell division detection method based on deep learning
CN104217214B (en) RGB D personage's Activity recognition methods based on configurable convolutional neural networks
CN106203331B (en) A kind of crowd density evaluation method based on convolutional neural networks
Chang et al. Artificial intelligence approaches to predict growth, harvest day, and quality of lettuce (Lactuca sativa L.) in a IoT-enabled greenhouse system
CN109508655A (en) The SAR target identification method of incomplete training set based on twin network
Loehle Challenges of ecological complexity
CN107909566A (en) A kind of image-recognizing method of the cutaneum carcinoma melanoma based on deep learning
CN106651830A (en) Image quality test method based on parallel convolutional neural network
CN109102515A (en) A kind of method for cell count based on multiple row depth convolutional neural networks
CN109325495A (en) A kind of crop image segmentation system and method based on deep neural network modeling
CN109117877A (en) A kind of Pelteobagrus fulvidraco and its intercropping kind recognition methods generating confrontation network based on depth convolution
Chen et al. Diagnosing of rice nitrogen stress based on static scanning technology and image information extraction
Li et al. Regression and analytical models for estimating mangrove wetland biomass in South China using Radarsat images
Chopra et al. Analysis of tomato leaf disease identification techniques
Laktionov et al. Planning of remote experimental research on effects of greenhouse microclimate parameters on vegetable crop-producing
Elsherbiny et al. A novel hybrid deep network for diagnosing water status in wheat crop using IoT-based multimodal data
CN110349668A (en) A kind of therapeutic scheme aid decision-making method and its system based on BP neural network
Cao et al. Recognition of common insect in field based on deep learning
CN114942951A (en) Fishing vessel fishing behavior analysis method based on AIS data
Islam et al. HortNet417v1—A deep-learning architecture for the automatic detection of pot-cultivated peach plant water stress
Zhang et al. Deep convolutional neural networks for shark behavior analysis
Lauguico et al. Indirect measurement of dissolved oxygen based on algae growth factors using machine learning models
Alshahrani et al. Chaotic Jaya optimization algorithm with computer vision-based soil type classification for smart farming

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210720