CN116563714A - Method and system for automatically distinguishing growth stage of rice - Google Patents


Info

Publication number: CN116563714A
Application number: CN202310597792.4A
Authority: CN (China)
Prior art keywords: rice; growth stage; convolution; plant; classification model
Legal status: Pending (the legal status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 沈婧芳, 魏晋启, 李镜希, 李燕, 龙容, 陈秋剑
Original Assignee and Current Assignee: Huazhong Agricultural University
Application filed by Huazhong Agricultural University
Priority to CN202310597792.4A; published as CN116563714A

Classifications

    • G06V 20/188: Scenes; terrestrial scenes; vegetation
    • G06N 3/045: Neural network architectures; combinations of networks
    • G06N 3/0464: Convolutional networks [CNN, ConvNet]
    • G06N 3/084: Learning methods; backpropagation, e.g. using gradient descent
    • G06V 10/42: Global feature extraction by analysis of the whole pattern
    • G06V 10/454: Filters integrated into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G06V 10/764: Recognition using pattern recognition or machine learning; classification
    • G06V 10/771: Feature selection, e.g. selecting representative features from a multi-dimensional feature space
    • G06V 10/774: Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V 10/82: Recognition using neural networks
    • Y02A 40/10: Adaptation technologies in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and a system for automatically determining the growth stage of rice. The invention determines the growth stage automatically, with high accuracy and recognition speed, and can therefore serve automated rice cultivation. The method is simple to operate: for rice planted over a large area, the growth stage can be determined merely from data on the basic traits of the rice or from photographs of whole rice plants, which reduces manual workload and improves working efficiency.

Description

Method and system for automatically distinguishing growth stage of rice
Technical Field
The invention belongs to the technical field of automatic identification of plant growth stages, and in particular relates to a method and a system for automatically determining the growth stage of rice.
Background
In agricultural production and related scientific research, obtaining crop traits is often the basis of both production and study. Mechanized rice planting is not yet widespread in China, so improving the accuracy of rice plant phenotype identification is key to automating rice cultivation, and it is valuable for downstream work such as rice breeding and yield estimation. For example, in automated cultivation, plants require different management measures at each growth stage; traditionally, the stage is judged manually from the plants' actual trait expression, the environmental climate, and the date. For the modern mode of large-area cultivation, however, manual identification imposes a heavy workload, is inefficient, and risks delaying time-critical cultivation steps. Moreover, soil physicochemical properties, light intensity, and the proportions of the components of the air all have marked effects on rice growth, and their patterns of variation are hard to master, which makes crop growth monitoring challenging and greatly reduces the efficiency of mechanized planting. At the same time, rice traits can be correlated with one another; a simple linear model ignores these influences and treats the variation it cannot explain as a random error term. These factors undermine model stability, cannot guarantee the fitting precision of linear regression, and limit the usefulness of linear models for intelligent, automated agricultural management.
Disclosure of Invention
In order to improve the accuracy of judging the growth stage of the rice and reduce the manual work load, the invention provides a method and a system for automatically judging the growth stage of the rice.
The invention discloses a method for automatically distinguishing the growth stage of rice, which comprises the following steps:
s1, taking a plurality of pictures of the whole rice plant in different growth stages as variables of an input layer of a convolutional neural network, obtaining a plurality of convolutional features through operation of the convolutional layer, and selecting one or a plurality of convolutional features from the plurality of convolutional features;
s2, selecting a plurality of characterization attributes of different growth stages of the rice and training a classification model together with the selected one or more convolution features to obtain a trained classification model, wherein the classification model is used for judging at which growth stage the rice is in.
Further, the characterization attributes include: plant width PW; plant vertical height PH(V); the ratio of plant vertical height PH(V) to plant width PW; plant height PH after straightening the rice leaves; the ratio of plant height PH to plant width PW; the projected area SA of the side view of the rice plant; the ratio of SA to the product of PH(V) and PW, i.e. SA/(PH(V)×PW); the fractal dimension IFD and the box-counting dimension SFD based on the minimum bounding rectangle of the binary image; the texture parameter g_g; fresh weight; dry weight; plant height in the natural state; and green leaf area.
The fractal dimension measures regular, repeated variation of an object's surface and describes geometric features that repeat according to some rule, such as the edge of a snowflake or the contour of a leaf. In the invention it measures the complexity of the rice leaf contour; introducing this characterization parameter further improves the precision of automatic growth-stage discrimination. The box-counting dimension based on the bounding rectangle of the binary image is a fractal dimension computed by a method different from IFD: the figure to be measured is covered with a square grid, the grid scale is repeatedly reduced, and the number of grid cells touched by the boundary is counted. This method is suitable for computing the fractal dimension of exactly self-similar figures and of particle distributions.
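The box-counting procedure described above (cover the figure with a square grid, shrink the grid scale, count the touched cells, fit the slope) can be sketched in numpy as follows; the grid sizes and the helper name are illustrative, not taken from the patent.

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a binary image.

    Cover the image with square grids of decreasing cell size, count the
    cells containing any foreground pixel, and take the slope of
    log(count) against log(1/size).
    """
    img = np.asarray(img, dtype=bool)
    counts = []
    for s in sizes:
        # trim so the image tiles evenly, then pool each s-by-s cell
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        cells = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(cells.any(axis=(1, 3)).sum())
    sizes = np.asarray(sizes, dtype=float)
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# sanity checks: a filled square is ~2-dimensional, a line ~1-dimensional
square = np.ones((64, 64), dtype=bool)
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True
```

For simple shapes such as the filled square the estimate is exact; for leaf contours it approximates the SFD described in the text.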
The texture parameter, i.e. the gradient information of the rice ears, reflects the number of rice ears.
Further, the method for selecting one or more convolution features comprises: calculating the variance of each convolution feature, ordering the features from largest to smallest variance, and selecting the one or more convolution features at the front of the sequence.
Further, the method for selecting one or more convolution features includes: and calculating the variance of each convolution characteristic, and selecting the convolution characteristic larger than a set value.
Further, when selecting the convolution features whose variance exceeds a set value, repeated experiments show that the optimal set value is 0.03.
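Both selection rules above (keep features whose variance exceeds 0.03, or keep the top-ranked features by variance) can be sketched as follows; the feature matrix, its shape, and the number of informative columns are invented stand-ins, with only the 0.03 threshold taken from the text.

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical feature matrix: one row per rice image, one column per
# convolution feature (shapes are illustrative, not the patent's data)
features = rng.normal(scale=0.1, size=(1035, 200))
features[:, :5] *= 3.0  # pretend a few features vary strongly across stages

variances = features.var(axis=0)   # variance of each convolution feature
keep = variances > 0.03            # the set value the patent reports as optimal
selected = features[:, keep]

# alternatively, rank by variance and keep the k features at the front
order = np.argsort(variances)[::-1]
top_k = features[:, order[:5]]
```

With these synthetic values only the five high-variance columns survive either rule.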
Further, the following experiments show that the accuracy of discriminating the rice growth stage is highest when a two-layer neural network is adopted.
Experiment 1: single-layer neural network, structured as follows:
TABLE 1
Number of neurons: 5
Threshold: 0.01
Maximum steps: 1e+05
Learning rate: 0.01
A single-layer network with 5 neurons, maximum steps 1e+05 and learning rate 0.01 discriminates the rice growth stage with an accuracy of 96.62%.
Experiment 2: single-layer neural network, structured as follows:
TABLE 2
Number of neurons: 10
Threshold: 0.01
Maximum steps: 1e+05
Learning rate: 0.01
A single-layer network with 10 neurons, maximum steps 1e+05 and learning rate 0.01 reaches an accuracy of 97.1%.
Experiment 3: single-layer neural network, structured as follows:
TABLE 3
Number of neurons: 15
Threshold: 0.01
Maximum steps: 1e+05
Learning rate: 0.01
A single-layer network with 15 neurons, maximum steps 1e+05 and learning rate 0.01 reaches an accuracy of 97.1%.
Experiment 4: single-layer neural network, structured as follows:
TABLE 4
Number of neurons: 15
Threshold: 0.01
Maximum steps: 1e+05
Learning rate: 0.001
A single-layer network with 15 neurons, maximum steps 1e+05 and learning rate 0.001 reaches an accuracy of 97.1%.
Experiment 5: two-layer neural network, structured as follows:
TABLE 5
Number of neurons: 10, 10
Threshold: 0.01
Maximum steps: 1e+05
Learning rate: 0.001
A two-layer network with 10 neurons per layer, threshold 0.01, maximum steps 1e+05 and learning rate 0.001 reaches an accuracy of 96.62%.
Experiment 6: two-layer neural network, structured as follows:
TABLE 6
Number of neurons: 32, 32
Threshold: 0.01
Maximum steps: 1e+05
Learning rate: 0.001
A two-layer network with 32 neurons per layer, threshold 0.01, maximum steps 1e+05 and learning rate 0.001 reaches an accuracy of 97.58%.
Comparing these six experiments, the two-layer network of experiment 6, with 32 neurons per layer, threshold 0.01, maximum steps 1e+05 and learning rate 0.001, discriminates the rice growth stage with the highest accuracy.
Further, the neural network is a two-layer neural network.
Further, the number of neurons of the convolutional layer is 16.
Further, the classification model is a two-layer neural network trained with the BP algorithm.
Further, the number of neurons per layer in the two-layer neural network is 32.
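A minimal numpy sketch of the preferred architecture stated above, two hidden layers of 32 neurons feeding a three-stage output; the input width, the random weights, and the function names are placeholders, not the patent's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# illustrative sizes: 14 trait variables plus 5 selected convolution
# features in, three growth stages out; weights are random placeholders
n_in, n_hidden, n_out = 14 + 5, 32, 3
W1 = rng.normal(size=(n_in, n_hidden)) * 0.1
W2 = rng.normal(size=(n_hidden, n_hidden)) * 0.1
W3 = rng.normal(size=(n_hidden, n_out)) * 0.1
b1, b2, b3 = np.zeros(n_hidden), np.zeros(n_hidden), np.zeros(n_out)

def predict_stage(x):
    h1 = relu(x @ W1 + b1)         # first hidden layer, 32 neurons
    h2 = relu(h1 @ W2 + b2)        # second hidden layer, 32 neurons
    p = softmax(h2 @ W3 + b3)      # class probabilities for the 3 stages
    return p.argmax(axis=1) + 1    # growth stages numbered 1..3

batch = rng.normal(size=(4, n_in))  # four hypothetical rice plants
stages = predict_stage(batch)
```

In practice the weights would be fitted with the BP algorithm described in the detailed description.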
The system for automatically determining the rice growth stage comprises a characterization attribute selection module and a rice growth stage identification module. The characterization attribute selection module selects, from the characterization attributes of rice at different growth stages, the attributes related to the growth stage; the rice growth stage identification module trains the classification model with the selected attributes, obtaining a trained classification model that determines the growth stage of the rice from the input characterization attributes.
Further, the method comprises a convolution characteristic acquisition module, wherein the convolution characteristic acquisition module is used for taking a plurality of pictures of the whole rice plant in different growth stages as variables of an input layer of a convolution neural network, obtaining a plurality of convolution characteristics through operation of the convolution layer, and selecting one or more convolution characteristics from the convolution characteristics, wherein the one or more convolution characteristics are used for training a classification model together with a plurality of characterization attributes of the selected rice plant in different growth stages.
Further, the classification model in the rice growth stage identification module is a double-layer neural network adopting BP algorithm.
The beneficial effects are that:
the invention can automatically judge the growth stage of the rice and has higher accuracy and identification speed, thereby serving the automatic cultivation process of the rice; and the method is simple and easy to operate, and for large-area planted rice, the discrimination can be performed only by providing basic characters of the rice or pictures of the whole rice, so that the manual workload is reduced, and the working efficiency is improved.
Drawings
Fig. 1: a flow diagram of the method of the invention;
fig. 2: a flow of extracting convolution characteristics;
fig. 3: a method of extracting convolution features;
fig. 4: relation between the extracted convolution characteristics and the growth stage of rice.
Detailed Description
The following detailed description is provided to explain the claimed invention and to enable those skilled in the art to practice it. The scope of the invention is not limited to the specific embodiments below; it is defined by the claims, and modifications that those skilled in the art make within the scope of the claims also fall within the scope of the invention.
One embodiment of the method of the present invention is described below in conjunction with fig. 1.
Selecting a plurality of characterization attributes of different growth stages of rice, and training a classification model to obtain a classification model after training, wherein the classification model is used for judging the growth stage of the rice;
the multiple characterization attributes of the rice at different growth stages in this embodiment are derived from RAP (rice phenotype automatic measurement system) developed by Yang (2014) to perform trait measurement on rice planted in a pot plant of the agricultural university in China. The RAP can measure 32 rice characters such as plant width, plant vertical height, plant height/plant width, projection area of side view of rice plant, projection area and the like. The characteristics of fresh weight, dry weight, plant height, tillering number, green leaf area, leaf number and the like of the rice need to be measured manually. RAP also shoots a rice plant image (Yang et al 2014) under the condition that the surrounding environment of the rice is relatively fixed, and fits the phenotypic characteristic data such as fresh weight, dry weight, texture parameters and the like of the rice, which can be obtained only by manual experiments or high-precision instruments.
The original phenotype data set contains a large number of external characterization attributes of rice: the traits of 521 rice varieties measured by RAP at three different growth stages. The first stage is the tillering stage; the second covers the booting and jointing stages; the third covers the heading and grain-filling stages. The 14 variables relevant to discriminating the rice growth stage were extracted for model fitting, as shown in Table 7 below:
TABLE 7
1. Plant width PW
2. Plant vertical height PH(V)
3. PH(V)/PW
4. Plant height PH (leaves straightened)
5. PH/PW
6. Projected side-view area SA
7. SA/(PH(V)×PW)
8. Fractal dimension IFD
9. Box-counting dimension SFD
10. Texture parameter g_g
11. Fresh weight
12. Dry weight
13. Plant height in the natural state
14. Green leaf area
The classification model may use the following algorithms: naive Bayes, logistic regression, support vector machine, random forest, decision tree, and neural network. The experiments shown in Table 8 compare them; with the convolution features included (Table 9), the prediction accuracy is highest when the neural network is adopted.
A neural network is a broadly interconnected network of simple adaptive units whose organization can simulate the interactive response of a biological nervous system to real-world objects.
By analogy with biological neurons, a neural network defines the neuron as its central processing unit: a mathematical operation mapping a set of inputs to an output. The output of a neuron is a function of the weighted sum of its inputs plus a bias; when the total received signal exceeds the activation threshold, the neuron fires.
The error back-propagation (BP) algorithm is the training algorithm used for the neural network in this embodiment. It is widely used for regression prediction and classification analysis: by training on a large amount of data and continually adjusting the parameters in the process, it yields a prediction model with small error.
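The BP idea just described, adjusting parameters step by step to shrink the prediction error, can be sketched on a single sigmoid neuron; the toy data and learning settings are illustrative only, not the patent's training setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: one neuron learning a linearly separable rule
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)             # forward pass
    err = p - y                        # error signal (gradient wrt the logit)
    w -= lr * (X.T @ err) / len(X)     # propagate the error back to the weights
    b -= lr * err.mean()               # and to the bias

acc = ((sigmoid(X @ w + b) > 0.5) == (y > 0.5)).mean()
```

A multi-layer network applies the same update layer by layer via the chain rule.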
TABLE 8
Model: Accuracy
Naive Bayes: 88.6%
Logistic regression: 96.4%
Support vector machine: 94.9%
Random forest: 92.3%
Decision tree: 93.8%
Neural network: 94.7%
Preferably, a two-layer neural network further improves the accuracy, discriminating the rice growth stage with an accuracy of 97.58%.
Before training with the two-layer neural network, the method further comprises: taking pictures of whole rice plants at different growth stages as the input-layer variables of a convolutional neural network, obtaining a plurality of convolution features through the convolutional-layer operation, and selecting one or more of them; the selected convolution features are used, together with the selected characterization attributes of rice at different growth stages, to train the classification model. As shown in Table 9, after introducing the convolution features, the rice data are divided into a training set and a test set in the proportion 1:9; the classification model is trained on the training set and evaluated on the test set, and the experiments show that the accuracy of every model improves:
TABLE 9
Model: Original accuracy / Accuracy after adding convolution features
Naive Bayes: 88.6% / 94.6%
Logistic regression: 96.4% / 97.8%
Support vector machine: 94.9% / 97.5%
Random forest: 92.3% / 96.1%
Decision tree: 93.8% / 96.4%
Neural network: 94.7% / 99%
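The evaluation protocol just described, concatenating trait variables with selected convolution features, splitting in the stated 1:9 train/test proportion, and measuring test accuracy, can be sketched as follows. The synthetic data and the nearest-centroid stand-in classifier are assumptions; the patent's models and data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical stand-ins for the patent's data: 14 trait variables and a
# few selected convolution features per plant, with a known stage label
n = 1035
stages = rng.integers(1, 4, size=n)                  # three growth stages
traits = rng.normal(size=(n, 14)) + stages[:, None]  # traits shift with stage
conv_feats = rng.normal(size=(n, 5)) + 0.5 * stages[:, None]
X = np.hstack([traits, conv_feats])                  # joint input, as in the text

# split in the 1:9 train/test proportion reported in the description
idx = rng.permutation(n)
n_train = n // 10
train, test = idx[:n_train], idx[n_train:]

# a deliberately simple stand-in classifier: nearest class centroid
centroids = np.stack(
    [X[train][stages[train] == s].mean(axis=0) for s in (1, 2, 3)])
d = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
pred = d.argmin(axis=1) + 1
accuracy = (pred == stages[test]).mean()
```

Any of the Table 9 models could be substituted for the centroid classifier without changing the protocol.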
The advantage of a convolutional neural network (CNN) is that it automatically extracts the features the machine needs, so the features that best discriminate the rice growth stage can be obtained; they are produced by the operation layer called the convolutional layer. In the original algorithm, a CNN discriminates the growth stage by extracting such features in the convolutional layer and feeding them into the operation layer called the fully connected layer, which is essentially a fully connected neural network. The invention instead takes the extracted features directly, without passing them through the fully connected layer; these features are referred to herein as "convolution features". Ranking the convolution features by variance shows that the distribution of the high-variance features is closely related to the rice growth stage, as shown in fig. 4; selecting these closely related features and introducing them into the training of the classification model markedly improves the precision of growth-stage discrimination.
The convolution features to be ranked by variance are illustrated in Table 10 below:
TABLE 10
No. | V1 | V2 | V3
1 | 0.030251 | 0.153884 | 0.181889
2 | 0.028547 | 0.155679 | 0.181633
3 | 0.026744 | 0.155695 | 0.178999
4 | 0.026886 | 0.154684 | 0.178141
5 | 0.028747 | 0.152735 | 0.178065
6 | 0.032206 | 0.159223 | 0.181642
… | … | … | …
It shows the first three "convolution features", V1, V2 and V3, extracted by the convolutional layer; the first column gives the number of the rice image. Taking these three features as an example, each rice plant has corresponding V1, V2 and V3 values. To screen, out of hundreds of thousands of convolution features, those related to the rice growth stage, variance is used as the clue: the variance of each convolution feature is computed across samples, and the features whose values are closely related to the growth stage are found. The variance of a convolution feature is

s² = (1/n) Σᵢ₌₁ⁿ (xᵢ − x̄)²

wherein:
n is the number of samples, i.e. the number of rice plants used, 1035 in this example;
xᵢ is the value of the convolution feature for sample i, 1 ≤ i ≤ n;
x̄ is the mean of the convolution feature over all samples.
Through trial and error, as shown in fig. 2 and fig. 3, the convolutional-layer parameters of the convolutional neural network are set as follows:
TABLE 11
Convolutional layer: 16 neurons, input size (256, 256, 3) (the picture size)
Activation function: ReLU
Pooling layer: size (2, 2)
Flatten layer
The convolution features are taken directly from the convolutional layer without passing through the fully connected layer. The variance of every convolution feature is then computed; the features whose variance exceeds a set value, for which repeated experiments give an optimal value of 0.03, are selected as closely related to the rice growth stage and used, together with the characterization attributes of rice at different growth stages shown in Table 7, as the input variables of the classification model. The classification model outputs the growth stage of the rice from these features: the first stage is the tillering stage; the second, the booting and jointing stages; the third, the heading and grain-filling stages.
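The Table 11 pipeline can be sketched in numpy as follows: 16 convolution filters, ReLU, (2, 2) max pooling and a flatten step, with no fully connected layer afterwards. The 3×3 filter size, the random filter values, and the small input image are assumptions, since the patent does not state them.

```python
import numpy as np

rng = np.random.default_rng(3)

def conv_features(img, filters):
    """Valid 2-D convolution with ReLU, 2x2 max pooling, then flatten.

    Mirrors the Table 11 settings: 16 filters, ReLU activation, (2, 2)
    pooling, a flatten layer, and no fully connected layer afterwards.
    """
    n_f, k = filters.shape[0], filters.shape[1]
    h, w = img.shape[0] - k + 1, img.shape[1] - k + 1
    out = np.empty((n_f, h, w))
    for f in range(n_f):
        for i in range(h):
            for j in range(w):
                out[f, i, j] = (img[i:i + k, j:j + k, :] * filters[f]).sum()
    out = np.maximum(out, 0.0)                       # ReLU
    h2, w2 = (h // 2) * 2, (w // 2) * 2              # 2x2 max pooling
    pooled = out[:, :h2, :w2].reshape(
        n_f, h2 // 2, 2, w2 // 2, 2).max(axis=(2, 4))
    return pooled.ravel()                            # flatten: the "convolution features"

filters = rng.normal(size=(16, 3, 3, 3)) * 0.1       # 16 filters over RGB channels
img = rng.random((32, 32, 3))                        # small stand-in for a 256x256x3 photo
feats = conv_features(img, filters)
```

The flattened vector is what the variance screening with the 0.03 threshold would then be applied to.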

Claims (10)

1. The method for automatically distinguishing the growth stage of the rice is characterized by comprising the following steps of:
s1, taking a plurality of pictures of the whole rice plant in different growth stages as variables of an input layer of a convolutional neural network, obtaining a plurality of convolutional features through operation of the convolutional layer, and selecting one or a plurality of convolutional features from the plurality of convolutional features;
s2, selecting a plurality of characterization attributes of different growth stages of the rice and training a classification model together with the selected one or more convolution features to obtain a trained classification model, wherein the classification model is used for judging at which growth stage the rice is in.
2. The method for automatically determining the growth stage of rice of claim 1, wherein said characterization attributes comprise: plant width; plant vertical height; the ratio of plant vertical height to plant width; plant height of the rice leaves after being straightened; the ratio of the plant height to the plant width; the projection area of the side view of the rice plant; the ratio of the projected area of the side view of the rice plant to the product of the vertical height of the plant and the width of the plant; fractal dimension; box counting dimension based on circumscribed rectangle under binary diagram; texture parameters; fresh weight; dry weight; plant height in a natural state; green leaf area.
3. The method for automatically determining the growth stage of rice of claim 1, wherein the method for selecting one or more convolution characteristics comprises: the variance of each convolution feature is calculated and ordered from large to small, and one or more convolution features located at the forefront of the sequence are selected.
4. The method for automatically determining the growth stage of rice of claim 1, wherein the method for selecting one or more convolution characteristics comprises: and calculating the variance of each convolution characteristic, and selecting the convolution characteristic larger than a set value.
5. The method for automatically determining a rice growth stage according to claim 4, wherein said set value is 0.03.
6. The method for automatically determining a stage of rice growth according to any one of claims 1 to 5, wherein the number of neurons in the convolutional layer is 16.
7. The method for automatically determining the growth stage of rice according to any one of claims 1 to 5, wherein the classification model is a two-layer neural network.
8. The method for automatically determining the growth stage of rice of claim 7, wherein the number of neurons per layer in the two-layer neural network is 32.
9. A system for automatically discriminating a rice growth stage according to claim 1 including a characterization attribute selection module for selecting a plurality of characterization attributes associated with a growth stage from a plurality of characterization attributes of different growth stages of rice and a rice growth stage identification module; the rice growth stage identification module is used for training the classification model with a plurality of characterization attributes of selected rice at different growth stages to obtain a classification model after training, and the classification model after training is used for obtaining the growth stage of the rice according to the input characterization attributes of the rice.
10. The system for automatically distinguishing the growth stage of rice according to claim 9, further comprising a convolution feature acquisition module configured to take pictures of whole rice plants at different growth stages as variables of the input layer of a convolutional neural network, obtain a plurality of convolution features through the operation of the convolutional layer, and select one or more convolution features from them, the selected convolution features being used together with the selected characterization attributes of rice at different growth stages to train the classification model.
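The feature-selection and classification steps recited in claims 3 to 10 can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation: it shows the variance ranking and thresholding of claims 3 to 5 (with the 0.03 set value of claim 5), a two-layer network with 32 neurons per layer as in claims 7 and 8, and the joining of convolution features with characterization attributes of claims 9 and 10. The ReLU and softmax activations, the initialization, and all array shapes are assumptions not fixed by the claims.

```python
import numpy as np

def select_by_variance(features, top_k=None, threshold=0.03):
    """Claims 3-5: keep the highest-variance convolution features.

    features: (n_samples, n_features) array, one column per convolution
    feature. With top_k set, keep the top_k columns ranked by variance
    in descending order (claim 3); otherwise keep the columns whose
    variance exceeds `threshold` (claim 4), 0.03 being the set value of
    claim 5.
    """
    var = features.var(axis=0)
    if top_k is not None:
        keep = np.argsort(var)[::-1][:top_k]   # descending variance
    else:
        keep = np.flatnonzero(var > threshold)
    return features[:, keep], keep

def init_two_layer_net(n_in, n_classes, hidden=32, seed=0):
    """Claims 7-8: a two-layer network, 32 neurons per hidden layer."""
    rng = np.random.default_rng(seed)
    sizes = [n_in, hidden, hidden, n_classes]
    return [(rng.normal(scale=0.1, size=(m, n)), np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def classify(x, params):
    """Forward pass: ReLU hidden layers, softmax over growth stages."""
    h = x
    for W, b in params[:-1]:
        h = np.maximum(W @ h + b, 0.0)         # hidden layer + ReLU
    W, b = params[-1]
    logits = W @ h + b
    e = np.exp(logits - logits.max())
    return e / e.sum()                         # probability per stage

def build_input(conv_features, attributes):
    """Claims 9-10: the classifier input joins the selected convolution
    features with the selected characterization attributes of a plant."""
    return np.concatenate([conv_features, attributes])
```

For a plant described by, say, 16 selected convolution features and 4 characterization attributes, `classify(build_input(f, a), init_two_layer_net(20, 4))` yields a probability distribution over four hypothetical growth stages; training the weights (e.g. by gradient descent on a cross-entropy loss) is omitted from this sketch.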
CN202310597792.4A 2023-05-25 2023-05-25 Method and system for automatically distinguishing growth stage of rice Pending CN116563714A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310597792.4A CN116563714A (en) 2023-05-25 2023-05-25 Method and system for automatically distinguishing growth stage of rice


Publications (1)

Publication Number Publication Date
CN116563714A true CN116563714A (en) 2023-08-08

Family

ID=87486058

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310597792.4A Pending CN116563714A (en) 2023-05-25 2023-05-25 Method and system for automatically distinguishing growth stage of rice

Country Status (1)

Country Link
CN (1) CN116563714A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117726052A * 2024-02-08 2024-03-19 北京市农林科学院信息技术研究中心 Yield prediction method, device, equipment and medium based on fractal dimension
CN117726052B * 2024-02-08 2024-06-11 北京市农林科学院信息技术研究中心 Yield prediction method, device, equipment and medium based on fractal dimension

Similar Documents

Publication Publication Date Title
CN111476713B (en) Intelligent weather image identification method and system based on multi-depth convolution neural network fusion
CN107316289B (en) Method for dividing rice ears in field based on deep learning and superpixel division
CN108346142B (en) Plant growth state identification method based on plant illumination image
CN106845497B (en) Corn early-stage image drought identification method based on multi-feature fusion
CN102524024B (en) Crop irrigation system based on computer vision
CN110222215B (en) Crop pest detection method based on F-SSD-IV3
CN116563714A (en) Method and system for automatically distinguishing growth stage of rice
CN112347894A (en) Single-plant vegetation extraction method based on transfer learning and Gaussian mixture model separation
CN111291686B (en) Extraction method and system for crop root-fruit phenotype parameters and root-fruit phenotype discrimination method and system
CN114693616A (en) Rice disease detection method, equipment and medium based on improved target detection model and convolutional neural network
CN115860581A (en) Method, device, equipment and storage medium for evaluating suitability of crop variety
CN115393631A (en) Hyperspectral image classification method based on Bayesian layer graph convolution neural network
CN106326914B (en) A kind of more classification methods of pearl based on SVM
Zhao et al. Transient multi-indicator detection for seedling sorting in high-speed transplanting based on a lightweight model
CN114140403A (en) Plant leaf disease detection method based on convolutional neural network
Kolhar et al. Phenomics for Komatsuna plant growth tracking using deep learning approach
CN104990891A (en) Method for establishing seed near infrared spectrum and spectral image qualitative analysis model
Suwarningsih et al. Ide-cabe: chili varieties identification and classification system based leaf
Miao et al. Crop weed identification system based on convolutional neural network
CN112364710A (en) Plant electric signal classification and identification method based on deep learning algorithm
CN116310338A (en) Single litchi red leaf tip segmentation method based on examples and semantic segmentation
CN114511850A (en) Method for identifying image of fruit size and granule of sunshine rose grape
CN111723737B (en) Target detection method based on multi-scale matching strategy deep feature learning
Gao et al. Classification Method of Rape Root Swelling Disease Based on Convolution Neural Network
CN109308936B (en) Grain crop production area identification method, grain crop production area identification device and terminal identification equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination