CN114741548A - Mulberry leaf disease and insect pest detection method based on small sample learning - Google Patents

Mulberry leaf disease and insect pest detection method based on small sample learning

Info

Publication number
CN114741548A
Authority
CN
China
Prior art keywords
model
dcnet
training
feature map
data set
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210458831.8A
Other languages
Chinese (zh)
Inventor
吴琪
吴云志
曾涛
乐毅
张友华
余克健
胡楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Agricultural University AHAU
Original Assignee
Anhui Agricultural University AHAU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Agricultural University AHAU filed Critical Anhui Agricultural University AHAU
Priority to CN202210458831.8A priority Critical patent/CN114741548A/en
Publication of CN114741548A publication Critical patent/CN114741548A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20132 Image cropping
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Library & Information Science (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a mulberry leaf disease and insect pest detection method based on small sample learning, which comprises the following steps: step 1, data set acquisition: acquire a base class data set and a new class data set; step 2, preprocess the data sets; step 3, construct an RP-DCNet model; step 4, train the RP-DCNet model on the base class data set in the meta-learning stage; step 5, train the RP-DCNet model on the new class data set, together with an equal number of base class samples, in the meta-fine-tuning stage; step 6, adjust the parameters of the RP-DCNet model through training to obtain the optimal configuration parameters; and step 7, detect mulberry leaf diseases and insect pests with the model under the optimal configuration parameters. The invention maintains high detection accuracy even when only a small number of samples is available.

Description

Mulberry leaf disease and insect pest detection method based on small sample learning
Technical Field
The invention relates to a disease and insect pest image detection method, in particular to a mulberry leaf disease and insect pest detection method based on small sample learning.
Background
During mulberry cultivation, mulberry diseases refer to poor growth and development, low mulberry leaf yield and poor leaf quality caused by infection with pathogenic microorganisms or by unsuitable environmental conditions. Controlling mulberry leaf diseases and insect pests has therefore always been a central problem: if control is not carried out in time, the income of mulberry growers falls, so timely prevention and control is the key to fundamentally reducing losses. Common mulberry leaf diseases and insect pests include mulberry dwarf, mulberry blight and mulberry brown spot, while parasitic diseases include mulberry root rot, mulberry stem blight, mulberry plaster disease, mulberry powdery mildew, mulberry leaf blight and mulberry sclerotinia. With the increasing computing power of computers, applications of computing to agriculture are becoming ever more widespread; artificial intelligence algorithms can be applied effectively to specific agricultural scenarios and help agricultural workers improve product quality and yield scientifically.
Because some mulberry leaf diseases and insect pests are rare, the number of samples that can be collected from the Internet or in the field is extremely small. In the field of mulberry leaf pest and disease identification there is at present no large pest and disease data set for training machine learning models; only a few pictures exist for some pest and disease categories, and conventional object detection frameworks achieve poor accuracy when so few samples are available.
At present, a large number of researchers apply computer vision methods to the identification of crop diseases and insect pests, for example to tomato leaf disease images and rice disease and pest images. However, the techniques for crop disease images, and for mulberry leaves in particular, are not yet mature, and conventional deep learning usually needs the support of a large data set to judge the disease and pest category accurately. In addition, because few people work on these fields, collection costs are high and some disease and pest categories are rare, no related disease and pest data set is available for researchers to refer to, which results in low model accuracy, low efficiency and high cost.
Disclosure of Invention
The invention aims to provide a mulberry leaf disease and insect pest detection method based on small sample learning, so as to solve the problems of low model accuracy, low efficiency and high cost that arise when the prior art uses machine learning to detect and identify mulberry leaf diseases and insect pests from only a small number of samples.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a mulberry leaf disease and pest detection method based on small sample learning comprises the following steps:
step 1, data acquisition:
acquiring an existing pest and disease image data set of other crop leaves as a base class data set, and acquiring a mulberry leaf pest and disease image data set as a new class data set;
step 2, preprocessing a data set:
respectively dividing the base class data set and the new class data set obtained in step 1 into a training set and a test set, and respectively preprocessing the training set and the test set of the base class data set and the new class data set;
step 3, constructing an RP-DCNet model based on the DCNet model:
the DCNet model comprises feature extractors, a dense relation distillation module and a context-aware aggregation module; the query feature extractor in the DCNet model takes a query picture as input to obtain a query feature map, and the query feature map is processed by the key encoder and the value encoder of the dense relation distillation module to obtain a query key feature map and a query value feature map; the support feature extractor in the DCNet model takes support pictures and their corresponding binary mask pictures as input to obtain a support feature map, and the support feature map is processed by the key encoder and the value encoder of the dense relation distillation module to obtain a support key feature map and a support value feature map;
a relative position coding module is added to the dense relation distillation module of the DCNet model, thereby obtaining the RP-DCNet model; the key encoder and the value encoder in the dense relation distillation module generate the corresponding key feature maps and value feature maps from the query feature map and the support feature map, and the relative position coding module encodes with reference to the dimensions of the key feature maps generated from the query feature map and the support feature map, so that a relative position code is established on the key feature maps;
step 4, training the RP-DCNet model comprises two stages, meta-learning and meta-fine-tuning; in the meta-learning stage, the training set of the base class data set is input into the RP-DCNet model for training, the error between the output of the RP-DCNet model during training and the test set of the base class data set is calculated, and the parameters of the RP-DCNet model are adjusted to the optimal parameters based on the error calculation result;
step 5, in the meta-fine-tuning stage, the training set of the new class data set and an equal number of samples from the training set of the base class data set are input into the RP-DCNet model for training, the error between the output of the RP-DCNet model during training and the test set of the new class data set is calculated, and the parameters of the RP-DCNet model are adjusted to the optimal parameters again based on the error calculation result;
step 6, detecting the mulberry leaf disease and insect pest image to be detected with the RP-DCNet model adjusted to the optimal parameters in steps 4 and 5 to obtain the disease and insect pest detection result.
Further, the preprocessing in step 2 includes Mosaic data augmentation, random flipping, random cropping and scaling.
Further, during the preprocessing in step 2, Mosaic data augmentation is applied; at the same time, random flipping is performed with a set probability, one of several scales is randomly selected to rescale the data in the training set, and a part of each picture is randomly cropped out as a new picture.
Further, in step 3, a key feature map and a value feature map are extracted from the query feature map and from the support feature map respectively. The relative position coding module is embedded after the extracted key feature maps have been multiplied together and the result has been output by the Softmax function;
the relative position coding module establishes the relative position code over the last two dimensions of the key feature map by calculating the relative coordinates between the current position and the other positions on those two dimensions;
the relative position coding module then adds the resulting relative position code, as a matrix, to the output of the Softmax function in the dense relation distillation module to obtain the output result.
Further, in the meta-learning stage of step 4, the base class data set is input into the RP-DCNet model, and in this stage the query feature extractor and the support feature extractor are trained jointly. Similarly, in the meta-fine-tuning stage, the dense relation distillation module, the context-aware aggregation module and the other basic model components of the RP-DCNet model also learn along with the training in this stage.
Further, the error calculation in steps 4 and 5 includes a classification error calculation and a regression error calculation.
Further, in steps 4 and 5, the number of training epochs in the meta-fine-tuning stage is smaller than the number of training epochs in the meta-learning stage.
Compared with the prior art, the invention has the advantages that:
the invention provides a mulberry leaf disease and insect pest detection method with small samples, which can still effectively identify the types of the mulberry leaf disease and insect pest under the condition of small sample number, and help mulberry leaf farmers to accurately identify rare disease and insect pest, so that corresponding prevention measures are taken, the loss of crops caused by the wrong type identification of the disease and insect pest is avoided, and higher detection accuracy can still be maintained even under the condition that the sample number is only a few.
Drawings
FIG. 1 is a block diagram of the process flow of the present invention.
FIG. 2 is a diagram of the RP-DCNet framework proposed by the method of the present invention.
FIG. 3 is a flow chart of the construction of relative position codes proposed by the method of the present invention.
FIG. 4 is a specific structure diagram of the relative position code proposed by the method of the present invention in RP-DCNet.
FIG. 5 is a flow chart of the training steps of the method of the present invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
As shown in FIG. 1, the mulberry leaf pest and disease detection method based on small sample learning in the embodiment of the invention comprises the following steps:
1. A large number of existing disease and insect pest image data sets of other crop leaves are collected from the Internet as the base class data set of the model, and at the same time mulberry leaf disease and insect pest images are collected from the Internet and in the field as the new class data set of the model.
2. The data in the base class data set are divided into a training set, a test set and a validation set in the ratio 7:2:1, and the data in the new class data set are likewise divided into a training set, a test set and a validation set in the ratio 7:2:1.
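A minimal sketch of this 7:2:1 split is given below, assuming each data set is available as a plain list of sample paths; the helper name, random seed and shuffling scheme are illustrative assumptions and not taken from the patent.

```python
import random

def split_7_2_1(samples, seed=0):
    """Split a list of samples into training, test and validation subsets (7:2:1)."""
    rng = random.Random(seed)
    samples = list(samples)
    rng.shuffle(samples)
    n = len(samples)
    n_train, n_test = int(0.7 * n), int(0.2 * n)
    train = samples[:n_train]
    test = samples[n_train:n_train + n_test]
    val = samples[n_train + n_test:]
    return train, test, val
```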
3. Mosaic data augmentation, random flipping, random cropping and scaling are applied to the training sets and test sets of the base class data set and the new class data set respectively.
Mosaic data augmentation randomly crops four pictures and stitches them into one new picture that is used as additional data; at the same time, pictures are randomly flipped with a probability of 0.5, a part of each picture is randomly cropped out as a new picture, and one of several scales is randomly selected to rescale the data in the training set.
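The following is a minimal sketch of these augmentations, assuming OpenCV-style NumPy images. The output size, crop ratio and scale list are illustrative assumptions, and the adjustment of detection-box annotations is omitted for brevity.

```python
import random
import numpy as np
import cv2

def mosaic(imgs, out_size=640):
    """Stitch four (cropped/resized) images into one 2x2 mosaic picture."""
    assert len(imgs) == 4
    half = out_size // 2
    canvas = np.zeros((out_size, out_size, 3), dtype=np.uint8)
    for k, img in enumerate(imgs):
        patch = cv2.resize(img, (half, half))
        r, c = divmod(k, 2)
        canvas[r * half:(r + 1) * half, c * half:(c + 1) * half] = patch
    return canvas

def augment(img, scales=(0.5, 0.75, 1.0, 1.25)):
    """Random flip (p=0.5), random crop as a new picture, random multi-scale resize."""
    if random.random() < 0.5:
        img = img[:, ::-1].copy()                 # random horizontal flip
    h, w = img.shape[:2]
    ch, cw = int(h * 0.8), int(w * 0.8)           # crop ratio is an assumption
    y, x = random.randint(0, h - ch), random.randint(0, w - cw)
    img = img[y:y + ch, x:x + cw]                 # random crop as a new picture
    s = random.choice(scales)                     # randomly chosen scale
    return cv2.resize(img, (int(cw * s), int(ch * s)))
```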
4. The RP-DCNet model is constructed; its overall structure is shown in FIG. 2, and it is obtained by improving the DCNet model, which serves as the base model. The DCNet model comprises feature extractors, a dense relation distillation module and a context-aware aggregation module. The feature extractors in the DCNet model take the query picture, the support pictures and the binary mask pictures corresponding to the support pictures as input, and the query feature map and the support feature map are obtained through the corresponding weight-sharing 3x3 convolution layers in the feature extractors. The query feature map and the support feature map are then input into the key encoder and the value encoder of the dense relation distillation module respectively, giving the key feature map and value feature map corresponding to each; the key features are used to quantify the feature similarity between the query picture and the support picture set, which activates the corresponding regions in the value feature maps.
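As a rough illustration only, the sketch below shows how the shared feature extraction and the key/value encoders could be wired up in PyTorch. The single convolution layers, the value-channel count and the way the binary mask is concatenated to the support input are assumptions made for illustration; only the C/8 key dimension is taken from the text that follows.

```python
import torch
import torch.nn as nn

class KeyValueEncoder(nn.Module):
    """Weight-shared feature extraction followed by key/value encoding (sketch)."""
    def __init__(self, c_feat=256):
        super().__init__()
        # Stand-in for the weight-sharing 3x3 convolution feature extractor.
        self.backbone = nn.Conv2d(4, c_feat, kernel_size=3, padding=1)
        # Key encoder: C/8 channels, as stated for k_q and k_s below.
        self.key_enc = nn.Conv2d(c_feat, c_feat // 8, kernel_size=3, padding=1)
        # Value encoder: the channel count here is an assumption.
        self.val_enc = nn.Conv2d(c_feat, c_feat // 2, kernel_size=3, padding=1)

    def forward(self, image, mask=None):
        # The support branch concatenates the binary mask as an extra channel;
        # an all-ones mask is assumed here for the query branch.
        if mask is None:
            mask = torch.ones_like(image[:, :1])
        feat = self.backbone(torch.cat([image, mask], dim=1))
        return self.key_enc(feat), self.val_enc(feat)
```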
A relative position coding module is added to the dense relation distillation module of the DCNet model, thereby obtaining the RP-DCNet model. A key feature map and a value feature map are extracted from the query feature map and from the support feature map respectively. The relative position coding module is embedded after the extracted key feature maps have been multiplied together and the result has been output by the Softmax function, and it encodes with reference to the dimensions of those key feature maps, so that a relative position code is established on the key feature maps.
For the query feature map, let the key feature map after key encoding be k_q, where k_q ∈ R^(C/8×H×W). For the support feature map, let the key feature map after key encoding be k_s, where k_s ∈ R^(N×C/8×H×W). After the matrix multiplication and the Softmax operation, the output has dimension k_{q,s} ∈ R^(N×H×W×H×W). A relative position code R is therefore established on the key feature map of the support feature map, where R ∈ R^(N×H×W×H×W).
The relative position code in the RP-DCNet model is obtained by calculating the relative coordinates between the current position and the other positions and then performing a few arithmetic operations on them. The relative position code is calculated over the last two dimensions of the key feature map; for a position with coordinates (p, q), the relative position difference with respect to a position (i, j) is
P_{i,j,p,q} = (p − i, q − j),
where P_{i,j,p,q} is the relative position difference between the position in row p, column q and the position in row i, column j, with p, i ∈ {0, 1, …, W−1} and q, j ∈ {0, 1, …, H−1}; H denotes the height and W the width of the key feature map generated from the support feature map. W−1 is added to the row and column offsets so that their values are not less than 0, the row offsets are multiplied by 2H−1, the row and column terms are added together, and the W×H matrices are spliced together, giving a relative position code of dimension WH×WH. The values of the relative position code range from 0 to (2H−1)×(2H−1)+2×(W−H).
The generation of the relative position code is guided by a relative position code table, which is obtained by randomly initializing a learnable parameter matrix; the dimension of the relative position code table is (2H−1)×(2H−1)+2×(W−H). The value at the corresponding position is looked up in the relative position code table and used as the final relative position code. Since there are N classes in the support set, N relative position code tables are required, and the relative position code finally obtained has dimension WH×WH×N. The construction of the relative position code is shown in FIG. 3, where W = 2 and H = 2 are taken as an example.
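A minimal sketch (assuming PyTorch) of this construction is given below. The index arithmetic follows the text: relative coordinates are shifted by W−1, row offsets are scaled by 2H−1, the two are summed, and the resulting WH×WH index matrix looks values up in one randomly initialised learnable table per support class. The class and variable names are illustrative; the table length is derived here from the largest index actually produced, which corresponds to the range stated above for square feature maps.

```python
import torch
import torch.nn as nn

class RelativePositionCode(nn.Module):
    """Builds the relative position code r of shape (WH, WH, N) (sketch)."""
    def __init__(self, H: int, W: int, num_classes: int):
        super().__init__()
        # Row index p/i and column index q/j of each of the W*H positions.
        rows = torch.arange(W).repeat_interleave(H)      # p, i in {0, ..., W-1}
        cols = torch.arange(H).repeat(W)                 # q, j in {0, ..., H-1}
        d_row = rows[:, None] - rows[None, :]            # p - i
        d_col = cols[:, None] - cols[None, :]            # q - j
        # Shift by W-1 so no value is negative, scale rows by 2H-1, then add.
        index = (d_row + W - 1) * (2 * H - 1) + (d_col + W - 1)
        self.register_buffer("index", index)             # (WH, WH)
        # One learnable relative position code table per support class.
        table_size = int(index.max()) + 1
        self.tables = nn.Parameter(torch.randn(num_classes, table_size))

    def forward(self) -> torch.Tensor:
        r = self.tables[:, self.index]                   # (N, WH, WH)
        return r.permute(1, 2, 0)                        # (WH, WH, N)
```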
A relative position coding module is added on the basis of DCNet; the relative position code is embedded at the point where the key features of the query and the key features of the support have been multiplied together. The pixel-level feature similarity between the query features and the support features is thereby obtained, and the calculation formula is as follows:
F(k_qi, k_sj) = φ(k_qi)^T φ′(k_sj)
where i and j are the position indices of the query and support features, φ and φ′ denote different linear operations, and F(k_qi, k_sj) is the similarity calculation function; φ(k_qi) = a·k_qi + b and φ′(k_sj) = c·k_sj + d, where k_qi denotes the i-th value of the query key feature map and k_sj the j-th value of the support key feature map, and the coefficients a, b, c and d of the linear operations φ and φ′ are learned continuously by the model during training. After Softmax normalization, the final similarity weight W_ij is output, calculated as
W_ij = exp(F(k_qi, k_sj)) / Σ_j exp(F(k_qi, k_sj)).
and carrying out matrix addition on the relative position code r with the dimension of WH multiplied by N and the similarity weight W and outputting to the next module. The modified portion is shown in fig. 4.
7. In the meta-learning stage, the large base class data set constructed from the disease and insect pest leaves of other crops is input into the model. In this stage the query feature extractor and the support feature extractor are trained jointly; similarly, the dense relation distillation module, the context-aware aggregation module and the other basic model components of the RP-DCNet model are learned along with the training in this stage.
In the meta-fine-tuning stage, because there are few mulberry leaf pest and disease data while the leaf pest and disease data sets of other crops are much larger, the invention balances the sample difference between the two data sets by selecting, from the base class data set, samples equal in number to the annotated detection boxes of the new class data set. The new class data set and the selected base class samples are then input into the model for training. As in the meta-learning stage, the basic modules of the model keep learning and updating their parameters during meta-fine-tuning, so that the new mulberry leaf pest and disease categories can be detected. To avoid overfitting, fewer training epochs are used in the meta-fine-tuning stage than in the meta-learning stage. The meta-learning and meta-fine-tuning training procedure of the model is shown in FIG. 5.
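The two stages could be organised as in the sketch below (PyTorch assumed). The loss helpers, data-loader factory, optimiser settings and epoch counts are illustrative assumptions; the points taken from the text are the base-class-only meta-learning stage, the balanced sampling for meta-fine-tuning, and the smaller number of meta-fine-tuning epochs.

```python
import random
import torch

def train_stage(model, loader, optimizer, epochs):
    for _ in range(epochs):
        for query_img, support_imgs, targets in loader:
            preds = model(query_img, support_imgs)
            # Classification error plus regression error (helper names assumed).
            loss = model.cls_loss(preds, targets) + model.reg_loss(preds, targets)
            optimizer.zero_grad()
            loss.backward()          # errors are propagated back to every parameter
            optimizer.step()

def run_training(model, base_train, new_train, make_loader):
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
    # Meta-learning: the base class training set only.
    train_stage(model, make_loader(base_train), optimizer, epochs=20)
    # Meta-fine-tuning: new class samples plus an equal number of base class
    # samples, trained for fewer epochs to avoid overfitting.
    balanced = list(new_train) + random.sample(list(base_train), len(new_train))
    train_stage(model, make_loader(balanced), optimizer, epochs=5)
```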
8. After the RP-DCNet model generates prediction targets, the classification error and the regression error are calculated in both the meta-learning stage and the meta-fine-tuning stage; the error results are propagated back to every parameter of the RP-DCNet model, and the network parameters of the RP-DCNet model are updated so that the expected output is produced. The validation set is then used as input to verify the model and test its robustness, and the verified model is taken as the final detection model.
9. Finally, the mulberry leaf disease and insect pest image to be detected is passed through the RP-DCNet model adjusted to the optimal parameters to obtain the disease and insect pest detection result.
The embodiments described above only illustrate the preferred implementations of the present invention and do not limit its concept or scope. Modifications and improvements made to the technical solutions of the present invention by those skilled in the art without departing from its design concept shall all fall within the scope of protection defined by the claims.

Claims (7)

1. A mulberry leaf disease and pest detection method based on small sample learning is characterized by comprising the following steps:
step 1, data acquisition:
acquiring an existing pest image dataset of other crop leaves as a base class dataset, and acquiring a mulberry leaf pest image dataset as a new class dataset;
step 2, preprocessing a data set:
respectively dividing the base class data set and the new class data set obtained in step 1 into a training set and a test set, and respectively preprocessing the training set and the test set of the base class data set and the new class data set;
step 3, constructing an RP-DCNet model based on the DCNet model:
the DCNet model comprises feature extractors, a dense relation distillation module and a context-aware aggregation module, wherein the query feature extractor in the DCNet model takes a query picture as input to obtain a query feature map, and the query feature map is processed by the key encoder and the value encoder of the dense relation distillation module to obtain a query key feature map and a query value feature map;
the support feature extractor in the DCNet model takes support pictures and their corresponding binary mask pictures as input to obtain a support feature map, and the support feature map is processed by the key encoder and the value encoder of the dense relation distillation module to obtain a support key feature map and a support value feature map;
a relative position coding module is added to the dense relation distillation module of the DCNet model, thereby obtaining the RP-DCNet model; the key encoder and the value encoder in the dense relation distillation module generate the corresponding key feature maps and value feature maps from the query feature map and the support feature map, and the relative position coding module encodes with reference to the dimensions of the key feature maps generated from the query feature map and the support feature map, so that a relative position code is established on the key feature maps;
step 4, training the RP-DCNet model comprises two stages, meta-learning and meta-fine-tuning, wherein in the meta-learning stage the training set of the base class data set is input into the RP-DCNet model for training, the error between the output of the RP-DCNet model during training and the test set of the base class data set is calculated, and the parameters of the RP-DCNet model are adjusted to the optimal parameters based on the error calculation result;
step 5, in the meta-fine-tuning stage, the training set of the new class data set and an equal number of samples from the training set of the base class data set are input into the RP-DCNet model for training, the error between the output of the RP-DCNet model during training and the test set of the new class data set is calculated, and the parameters of the RP-DCNet model are adjusted to the optimal parameters again based on the error calculation result;
step 6, detecting the mulberry leaf disease and insect pest image to be detected with the RP-DCNet model adjusted to the optimal parameters in steps 4 and 5 to obtain the disease and insect pest detection result.
2. The mulberry leaf pest and disease detection method based on small sample learning according to claim 1, wherein the preprocessing in step 2 comprises Mosaic data augmentation, random flipping, random cropping and scaling.
3. The mulberry leaf pest and disease detection method based on small sample learning according to claim 2, wherein during the preprocessing in step 2, Mosaic data augmentation is applied, random flipping is performed with a set probability, one of several scales is randomly selected to rescale the data in the training set, and a part of each picture is randomly cropped out as a new picture.
4. The mulberry leaf pest and disease detection method based on small sample learning according to claim 1, wherein in step 3 a key feature map and a value feature map are extracted from the query feature map and from the support feature map respectively, and the relative position coding module is embedded after the extracted key feature maps have been multiplied together and the result has been output by the Softmax function;
the relative position coding module establishes the relative position code over the last two dimensions of the key feature map by calculating the relative coordinates between the current position and the other positions on those two dimensions;
the relative position coding module then adds the resulting relative position code, as a matrix, to the output of the Softmax function in the dense relation distillation module to obtain the output result.
5. The mulberry leaf pest and disease detection method based on small sample learning according to claim 1, wherein in the meta-learning stage of step 4 the base class data set is input into the RP-DCNet model and the query feature extractor and the support feature extractor are trained jointly; similarly, in the meta-fine-tuning stage, the dense relation distillation module, the context-aware aggregation module and the other basic model components of the RP-DCNet model also learn along with the training in this stage.
6. The mulberry leaf pest detection method based on small sample learning according to claim 1, wherein the error calculation in the steps 4 and 5 comprises classification error calculation and regression error calculation.
7. The mulberry leaf pest and disease detection method based on small sample learning according to claim 1, wherein in steps 4 and 5 the number of training epochs in the meta-fine-tuning stage is smaller than the number of training epochs in the meta-learning stage.
CN202210458831.8A 2022-04-27 2022-04-27 Mulberry leaf disease and insect pest detection method based on small sample learning Pending CN114741548A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210458831.8A CN114741548A (en) 2022-04-27 2022-04-27 Mulberry leaf disease and insect pest detection method based on small sample learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210458831.8A CN114741548A (en) 2022-04-27 2022-04-27 Mulberry leaf disease and insect pest detection method based on small sample learning

Publications (1)

Publication Number Publication Date
CN114741548A true CN114741548A (en) 2022-07-12

Family

ID=82284254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210458831.8A Pending CN114741548A (en) 2022-04-27 2022-04-27 Mulberry leaf disease and insect pest detection method based on small sample learning

Country Status (1)

Country Link
CN (1) CN114741548A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115457042A (en) * 2022-11-14 2022-12-09 四川路桥华东建设有限责任公司 Method and system for detecting surface defects of thread bushing based on distillation learning

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210142097A1 (en) * 2017-06-16 2021-05-13 Markable, Inc. Image processing system
CN113609941A (en) * 2021-07-27 2021-11-05 江苏师范大学 Crop disease and insect pest identification method based on deep learning
US20210349945A1 (en) * 2020-05-11 2021-11-11 Arizona Board Of Regents On Behalf Of Arizona State University Selective sensing: a data-driven nonuniform subsampling approach for computation-free on-sensor data dimensionality reduction
KR20210153270A (en) * 2020-06-10 2021-12-17 비전커넥트 주식회사 Logistics picking monitoring system using image recognition based on artificial intelligence and method for processing thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210142097A1 (en) * 2017-06-16 2021-05-13 Markable, Inc. Image processing system
US20210349945A1 (en) * 2020-05-11 2021-11-11 Arizona Board Of Regents On Behalf Of Arizona State University Selective sensing: a data-driven nonuniform subsampling approach for computation-free on-sensor data dimensionality reduction
KR20210153270A (en) * 2020-06-10 2021-12-17 비전커넥트 주식회사 Logistics picking monitoring system using image recognition based on artificial intelligence and method for processing thereof
CN113609941A (en) * 2021-07-27 2021-11-05 江苏师范大学 Crop disease and insect pest identification method based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
温长吉; 娄月; 张笑然; 杨策; 刘淑艳; 于合龙: "基于改进稠密胶囊网络模型的植物识别方法" (Plant recognition method based on an improved dense capsule network model), 农业工程学报 (Transactions of the Chinese Society of Agricultural Engineering), no. 08, 23 April 2020 (2020-04-23) *
温长吉 et al.: "基于改进稠密胶囊网络模型的植物识别方法" (Plant recognition method based on an improved dense capsule network model), 《农业工程学报》, 30 April 2020 (2020-04-30), pages 143-149 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115457042A (en) * 2022-11-14 2022-12-09 四川路桥华东建设有限责任公司 Method and system for detecting surface defects of thread bushing based on distillation learning

Similar Documents

Publication Publication Date Title
CN112364931B (en) Few-sample target detection method and network system based on meta-feature and weight adjustment
CN112070069A (en) Method and device for identifying remote sensing image
EP3971767A1 (en) Method for constructing farmland image-based convolutional neural network model, and system thereof
CN114708903A (en) Method for predicting distance between protein residues based on self-attention mechanism
CN114676769A (en) Visual transform-based small sample insect image identification method
CN114741548A (en) Mulberry leaf disease and insect pest detection method based on small sample learning
CN117557914A (en) Crop pest identification method based on deep learning
CN113723541B (en) Slope displacement prediction method based on hybrid intelligent algorithm
CN112884135B (en) Data annotation correction method based on frame regression
CN116383437A (en) Cross-modal material recommendation method based on convolutional neural network
CN116778391A (en) Multi-mode crop disease phenotype collaborative analysis model and device
CN111563180A (en) Trademark image retrieval method based on deep hash method
CN114519402B (en) Citrus disease and insect pest detection method based on neural network
CN116340039A (en) Log anomaly detection method based on pretrained BERT sentence vector and Informar-encoder
CN114764827A (en) Mulberry leaf disease and insect pest detection method under self-adaptive low-illumination scene
CN115100246A (en) Cross-modal retrieval method and system for language-vision target tracking
CN114549536A (en) Microbial colony segmentation method based on attention mechanism
CN114818945A (en) Small sample image classification method and device integrating category adaptive metric learning
CN114723998A (en) Small sample image classification method and device based on large-boundary Bayes prototype learning
CN111259176B (en) Cross-modal Hash retrieval method based on matrix decomposition and integrated with supervision information
CN111882441A (en) User prediction interpretation Treeshap method based on financial product recommendation scene
Lohi et al. Empirical Analysis of Crop Yield Prediction and Disease Detection Systems: A Statistical Perspective
CN110795591A (en) Image retrieval method based on discrete gradient back propagation
CN113378936B (en) Faster RCNN-based few-sample target detection method
CN117011719B (en) Water resource information acquisition method based on satellite image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination