CN114782840A - Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images - Google Patents

Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images

Info

Publication number
CN114782840A
CN114782840A (application number CN202210414686.3A)
Authority
CN
China
Prior art keywords
features
aerial vehicle
unmanned aerial
feature
period
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210414686.3A
Other languages
Chinese (zh)
Inventor
姚霞
周萌
杨涛
刘鹏
郑恒彪
李栋
程涛
朱艳
曹卫星
王雪
郭彩丽
张羽
马吉峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Agricultural University
Original Assignee
Nanjing Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Agricultural University filed Critical Nanjing Agricultural University
Priority to CN202210414686.3A priority Critical patent/CN114782840A/en
Publication of CN114782840A publication Critical patent/CN114782840A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/211 Selection of the most significant subset of features
    • G06F18/2113 Selection of the most significant subset of features by ranking or filtering the set of features, e.g. using a measure of variance or of feature cross-correlation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images, which comprises the following steps: (1) acquiring time-series high-spatial-resolution RGB images according to the actual growth conditions of wheat fields under different sowing-date treatments, and preprocessing the images to obtain unmanned aerial vehicle images of the same area in different years; (2) extracting spectral information and texture information of the time-series unmanned aerial vehicle images, and taking all derived spectral features and texture features as a feature complete set; (3) ranking the importance of all features with a feature selection algorithm based on a compact-separation principle, and determining the optimal features and the number of features; (4) automatically classifying and identifying the features of different phenological stages with an mRVM classifier to obtain the overall classification precision and the classification precision of each period. The classification method constructed by the invention is simple and efficient, can obtain timely crop phenology information, and provides a basis for effectively guiding agricultural management decisions at specific stages, such as irrigation, fertilization and pesticide management activities.

Description

Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images
Technical Field
The invention belongs to the technical field of precision agriculture, mainly relates to a wheat phenological period monitoring method, and particularly relates to a wheat phenological period classification method based on RGB images and machine learning.
Background
Accurate information related to field management, including crop water consumption, phenological periods, and yield data, is critical to managing crop growth with sustainable and precise farming methods. One aspect of farm management concerns information about the phenological period of crops, which is one of the most important applications in agriculture. The phenological period is a seasonal biological life stage driven by environmental factors and is considered a sensitive and accurate indicator of climate change. In China, climate warming has accelerated since the 1980s, affecting crop development and productivity, so crop phenology monitoring can serve as a measure of climate change. Moreover, changes in the phenological period can have wide impacts on terrestrial ecosystems and human society, for example by altering global carbon, water and nitrogen cycles, crop yield (risk of frost damage), the duration of the pollination season, disease, and the like. The study of phenological periods has therefore become an important focus of recent ecological research. Wheat is an important grain crop worldwide: its planting area ranks first among all crops, and it is the third-largest grain crop in China. Under the new situation, food security has become the core of national grain policy, and the position of wheat in food security will become increasingly prominent. In view of this, evaluation of the wheat phenological period is of great significance for steadily improving China's wheat production capacity and ensuring food security.
At present, threshold methods, maximum-slope methods and curve-fitting methods based on optical remote sensing are used to detect crop phenology. However, these methods can only detect the start of growth, the greening peak and the end of the season from inflection points, and therefore cannot detect other critical stages that are closely related to agricultural practice (such as the jointing, flowering and grain-filling stages). Secondly, empirical models and curve-fitting methods based on time-series vegetation index curves are sensitive to environmental noise. Most importantly, these approaches share an unavoidable problem: the phenological period can only be retrieved from historical data after the whole growth period has ended, so there is a lag and timely guidance for field management cannot be provided. Real-time monitoring is therefore important for overcoming this bottleneck in phenology detection.
In recent years, the rapid development of unmanned aerial vehicle technology has provided favorable conditions for acquiring real-time, high-precision remote sensing image data. Observing plant phenology through multi-temporal digital-camera imaging saves labor, provides phenology data with high frequency and high spatial resolution, adapts to complex farmland environments, is efficient for rapid analysis of field crop phenotypic information, and is low in cost. Accordingly, a digital camera carried on an unmanned aerial vehicle is an effective tool for monitoring the crop phenological period.
Disclosure of Invention
The invention aims to provide a real-time classification method for the wheat phenological period based on unmanned aerial vehicle RGB images, which realizes real-time discrimination of the phenological period by using screened spectral and textural features of high-resolution images together with an advanced classification algorithm.
In order to achieve this purpose, the invention adopts the following technical scheme: a real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images, in which the original RGB images obtained by the unmanned aerial vehicle are stitched and preprocessed (e.g., radiometric correction and geometric correction), spectral and texture features of different growth stages are extracted, the importance of each feature is measured with a feature selection algorithm based on the compact-separation principle, and the optimal features and the number of features are then determined. An mRVM classifier is used to classify eight important growth periods of wheat (seedling emergence, tillering, jointing, booting, heading, flowering, grain filling and maturity). The method comprises the following specific steps:
step 1: acquiring time sequence high spatial resolution RGB images according to actual growth conditions of wheat fields processed in different sowing periods and preprocessing the images to obtain unmanned aerial vehicle images from the same area in different years;
step 2: extracting spectral information and texture information of the time sequence unmanned aerial vehicle image, and taking all derived spectral features and texture features as a feature complete set;
step 3: ranking the importance of all features with a feature selection algorithm based on the compact-separation principle, and determining the optimal features and the number of features;
step 4: automatically classifying and identifying the features of the different phenological stages with an mRVM classifier to obtain the overall classification precision and the classification precision of each period.
Further, in step 1, the preprocessing specifically includes:
step 1-1: performing image splicing on the single high-spatial-resolution images at the same time to obtain an orthoimage of a complete test area;
step 1-2: and respectively carrying out geographic registration and radiometric calibration on the orthographic images of each period to obtain spectral reflectivity data of the unmanned aerial vehicle.
Furthermore, the ground resolution of the acquired image data is 2-3 cm. After image splicing, the first image is selected as the reference image, and all remaining images are registered to it in turn, so as to reduce the coordinate errors caused by environmental conditions during actual shooting.
Further, in the step 2, the specific step of generating the feature complete set includes:
step 2-1: calculating color spectral indices (including Rcc, Gcc, GRVI, VARI and ExG) from the preprocessed reflectance of each band, and taking these indices together with the raw band reflectance data as the spectral feature complete set;
step 2-2: respectively calculating gray level co-occurrence matrixes of an R channel, a G channel and a B channel of the corrected image;
step 2-3: respectively calculating texture features from the gray level co-occurrence matrices, with a 3 × 3 window and a moving direction of D1 (0°); the texture features comprise: Mean (MEA), Variance (VAR), Homogeneity (HOM), Contrast (CON), angular second moment (SEC), Correlation (COR), Dissimilarity (DIS) and Entropy (ENT); the normalized difference texture index is then calculated between every two texture features, and all of the above features are taken as the texture feature complete set.
Further, in step 3, the specific step of determining the optimal features and the number of features includes:
step 3-1: taking the spectral features and the texture features as input data;
step 3-2: applying a feature selection algorithm, respectively calculating compact-separation coefficients feature by feature, obtaining importance scores of each feature and sequencing;
step 3-3: and determining the optimal feature quantity by taking the obtained highest classification precision as a final target.
Further, in the step 3-2, the compact-separation coefficient is calculated by the following formula:
(The expressions for ρc(k) and ρs(k) are given as images in the original publication and are not reproduced here.)
ρcs(k) = αρs(k) - (1 - α)ρc(k)
wherein the compactness coefficient ρc and the separation coefficient ρs measure, respectively, the compactness within each stage and the degree of separation between different stages. Given a training sample set X = {x1, x2, …, xn}, xi ∈ R^N, where n is the number of training samples and N is the number of features; the phenological-stage class labels are Y = {y1, y2, …, yM}, where M is the number of phenological stages and yi is the set of training samples belonging to the i-th stage; the number of training samples in each stage is {βi}, i = 1, 2, …, M. ρcs is the compact-separation coefficient, and α is a factor between 0 and 1 that adjusts the relative weight of ρc and ρs. The higher a feature's ρcs value, the stronger its ability to discriminate the phenological stages.
Further, in the step 4, the specific step of obtaining the classification result includes:
step 4-1: taking the optimal feature subset and the phenological period labels of all sample points as input data;
step 4-2: constructing the wheat phenological period classification model with an mRVM classifier, using a Gaussian kernel function as the kernel; the Gaussian kernel parameter of the mRVM model is set to 1/D, where D is the number of features; ten-fold cross-validation is used for verification, and the optimal overall classification precision and the classification precision of each period are finally obtained and saved.
Compared with the prior art, the technical scheme adopted by the invention has the following technical effects:
(1) the real-time wheat phenological period classification and identification method based on the unmanned aerial vehicle RGB images is low in cost, simple and efficient to operate, capable of achieving automation and easy to popularize.
(2) The real-time wheat phenological period classification and identification method based on the unmanned aerial vehicle RGB images overcomes the hysteresis of the previous growth period monitoring, and realizes real-time phenological period discrimination;
additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
FIG. 1 is a schematic of the experimental design of the present invention. Two-year wheat experiments were conducted in 2017-2018 and 2018-2019 at the grain industry park of Kyoho, Taizhou City, Jiangsu Province. The experiments involved different sowing-date treatments. There were 12 plots in total; each plot covered 2700 m² (90 m × 30 m) in the first year and 1350 m² (45 m × 30 m) in the second year.
Fig. 2 is a schematic diagram of the real-time wheat phenological period classification method based on the unmanned aerial vehicle RGB image.
FIG. 3 shows the spatial distribution of the original unmanned aerial vehicle RGB images and of the spectral and texture features as the phenological period changes, taking as an example the plot of wheat variety Yangmai 23 under sowing date I and the high-nitrogen treatment in 2017; a1-a8 are the original RGB images, b1-b8 are a spectral feature (GRVI), and c1-c8 are a texture feature (the NDTI computed from the CON and SEC features of the G channel).
FIG. 4 shows the response of the spectral and texture features derived from the unmanned aerial vehicle to the phenological period, where (a) is a spectral feature (GRVI) and (b) is a texture feature (the NDTI computed from the CON and SEC features of the G channel).
FIG. 5 shows the scores of all 308 features produced by the compact-separation based feature selection algorithm of the present invention; the black portion corresponds to spectral features and the gray portion to texture features. Features whose scores lie above the dotted line (ρcs = 0.5560) are the selected optimal features.
Fig. 6 is a confusion matrix result diagram of real-time classification of wheat phenological period based on unmanned aerial vehicle RGB images.
Fig. 7 is a comparison of the classification results of the phenology classification based on different feature inputs according to the present invention.
In FIG. 8: (a) is a curve of overall classification precision versus the number of input features under different values of the parameter α in the feature selection algorithm, used to determine the optimal number of features; the top dots indicate, for different values of α, where the optimal precision is reached fastest within the first 60 features. (b) is a curve of overall classification precision versus the number of input features obtained with other feature selection methods (mRMR, ReliefF, RFE, RF_GINI) and with the feature selection method of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are exemplary only for explaining the present invention and are not construed as limiting the present invention.
In the implementation of the invention, a wheat growing area is taken as an example; the study area is shown in FIG. 1, and the unmanned aerial vehicle data are images obtained with an RGB camera (FC300X, Shenzhen, China) carried by a DJI Phantom 3 (Shenzhen, China). The unmanned aerial vehicle flew along an automatically planned route with a field of view of 94°, a flight height of 60-70 m and an endurance of 5-20 minutes, capturing images while hovering at fixed points; the side overlap and forward overlap were 82% and 75%, respectively.
The technical solution of the present invention is further described in detail with reference to the accompanying drawings and specific embodiments.
Fig. 2 is a schematic flow chart of a real-time wheat phenological period monitoring method based on unmanned aerial vehicle RGB images, which may include the following steps:
the method comprises the following steps: and acquiring a time sequence RGB image with high spatial resolution in the whole growth period of the wheat and preprocessing the time sequence RGB image.
Step 1-1: performing image splicing on the single high-spatial-resolution images at the same time period to obtain an orthoimage of a complete test area;
step 1-2: respectively carrying out geographic registration on the ortho-images in each period to obtain spectral reflectivity data of the unmanned aerial vehicle;
it should be added that the ground resolution of the acquired image data is 2 to 3 cm. After the images are spliced, the first image is selected as a reference image, and all the rest images are sequentially subjected to image registration with the reference image, so that the coordinate error caused by the influence of environmental conditions in actual shooting is reduced.
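As an illustration of this registration step, the following is a minimal sketch that aligns each date's orthomosaic to the first-date reference image using ORB feature matching and a RANSAC homography in OpenCV; the function name, parameter values and the use of ORB are illustrative assumptions, not the exact workflow of the invention.

```python
# Hedged sketch: co-register each date's orthomosaic to the first-date reference
# image with ORB feature matching + RANSAC homography (OpenCV). Illustrative only.
import cv2
import numpy as np

def register_to_reference(reference_bgr, moving_bgr, max_features=5000):
    """Warp `moving_bgr` into the coordinate frame of `reference_bgr`."""
    ref_gray = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)
    mov_gray = cv2.cvtColor(moving_bgr, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(max_features)
    kp_ref, des_ref = orb.detectAndCompute(ref_gray, None)
    kp_mov, des_mov = orb.detectAndCompute(mov_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_mov, des_ref), key=lambda m: m.distance)[:500]

    src = np.float32([kp_mov[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # moving -> reference mapping

    h, w = reference_bgr.shape[:2]
    return cv2.warpPerspective(moving_bgr, H, (w, h))
```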
Step 2: extracting the spectral information and texture information of the time-series unmanned aerial vehicle images and taking all derived spectral features and texture features as the feature complete set; the specific steps include:
step 2-1: calculating color spectral indices (including Rcc, Gcc, GRVI, VARI and ExG) from the corrected reflectance of the respective bands, and taking these indices together with the raw band reflectance data as the spectral feature complete set;
step 2-2: respectively calculating gray level co-occurrence matrixes of an R channel, a G channel and a B channel of the corrected image;
step 2-3: respectively calculating texture features from the gray level co-occurrence matrices, with a window size of 3 × 3 and a moving direction of D1 (0°); the texture features comprise: Mean (MEA), Variance (VAR), Homogeneity (HOM), Contrast (CON), angular second moment (SEC), Correlation (COR), Dissimilarity (DIS) and Entropy (ENT); the normalized difference texture index (NDTI) is then calculated between every two texture features, and all of the above features are taken as the texture feature complete set;
It should be added that ENVI/IDL software is used to extract the original DN values of the regions of interest (ROIs) in the digital images and to compute the spectral features; the calculation methods are shown in Table 1.
TABLE 1 spectral features based on unmanned aerial vehicle RGB image acquisition
(Table 1 appears as an image in the original publication and is not reproduced here.)
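Table 1 is only available as an image, but the five colour indices named in the text have widely used published definitions. The sketch below computes them from the mean R, G, B values of a region of interest, under the assumption that the table uses those standard formulas.

```python
# Hedged sketch: the five colour indices named in the text, computed per plot
# from mean R, G, B values. The formulas are the commonly published definitions
# and are assumed here; Table 1 of the original filing is only available as an image.
def colour_indices(r, g, b):
    """r, g, b: mean band values (DN or reflectance) of one region of interest."""
    total = r + g + b
    rn, gn, bn = r / total, g / total, b / total   # chromatic coordinates
    return {
        "Rcc":  rn,                                # red chromatic coordinate
        "Gcc":  gn,                                # green chromatic coordinate
        "GRVI": (g - r) / (g + r),                 # green-red vegetation index
        "VARI": (g - r) / (g + r - b),             # visible atmospherically resistant index
        "ExG":  2 * gn - rn - bn,                  # excess green index
    }

# Example: one plot whose mean band values were extracted from the orthomosaic
print(colour_indices(r=92.0, g=118.0, b=64.0))
```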
The gray level co-occurrence matrix involves three important parameters: the moving step size, the moving direction and the window size. The window size is related to the image resolution; smaller windows characterize finer textures and larger windows characterize coarser textures. The moving step size depends on the texture coarseness of the image. There are four moving directions: 0°, 45°, 90° and 135°. Texture features extracted perpendicular to the row direction have proven effective for monitoring the growth parameters of row crops. In the present invention, the test plots were planted in north-south rows, so only the texture features in the D1 direction are extracted; the calculation methods of the texture features are shown in Table 2. In addition, the normalized difference texture index (NDTI) can improve sensitivity to phenology, and 276 NDTIs are obtained in total, so that the total number of texture features is 300.
TABLE 2 texture features based on unmanned aerial vehicle RGB image acquisition
(Table 2 appears as an image in the original publication and is not reproduced here.)
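Table 2 is likewise only available as an image. The sketch below computes the eight named GLCM statistics for one channel with a 1-pixel offset in the 0° direction (over a whole region of interest rather than the 3 × 3 moving window described above, for brevity), and then forms the pairwise NDTI(T1, T2) = (T1 - T2) / (T1 + T2); the statistic definitions and the use of scikit-image's graycomatrix are assumptions, not the patented implementation.

```python
# Hedged sketch: eight GLCM statistics (MEA, VAR, HOM, CON, SEC, COR, DIS, ENT)
# at a 0-degree offset, plus the pairwise normalised difference texture index.
import numpy as np
from itertools import combinations
from skimage.feature import graycomatrix

def glcm_stats(gray_u8, levels=32):
    """GLCM statistics for one 8-bit channel quantised to `levels` grey levels."""
    q = (gray_u8.astype(np.float64) * (levels - 1) / 255).astype(np.uint8)
    p = graycomatrix(q, distances=[1], angles=[0], levels=levels,
                     symmetric=True, normed=True)[:, :, 0, 0]
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    var_i, var_j = ((i - mu_i) ** 2 * p).sum(), ((j - mu_j) ** 2 * p).sum()
    nz = p[p > 0]
    return {
        "MEA": mu_i,
        "VAR": var_i,
        "HOM": (p / (1 + (i - j) ** 2)).sum(),
        "CON": ((i - j) ** 2 * p).sum(),
        "SEC": (p ** 2).sum(),
        "COR": (((i - mu_i) * (j - mu_j) * p).sum()) / np.sqrt(var_i * var_j),
        "DIS": (np.abs(i - j) * p).sum(),
        "ENT": -(nz * np.log(nz)).sum(),
    }

def ndti_pairs(features):
    """Normalised difference texture index for every pair of texture features."""
    return {f"NDTI({a},{b})": (features[a] - features[b]) / (features[a] + features[b])
            for a, b in combinations(features, 2)}
```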
Step 3: ranking the importance of all features with the feature selection algorithm based on the compact-separation principle and determining the optimal features and the number of features; the specific steps include:
step 3-1: taking the spectral characteristics and the texture characteristics as input data, and carrying out normalization processing;
step 3-2: applying a feature selection algorithm, respectively calculating compact-separation coefficients feature by feature, obtaining importance scores of each feature and sequencing;
it should be added that the specific calculation process of the compaction-separation coefficient is as follows:
(The expressions for ρc(k) and ρs(k) are given as images in the original publication and are not reproduced here.)
ρcs(k) = αρs(k) - (1 - α)ρc(k)
wherein the compactness coefficient ρc and the separation coefficient ρs measure, respectively, the compactness within each stage and the degree of separation between different stages. Given a training sample set X = {x1, x2, …, xn}, xi ∈ R^N, where n is the number of training samples and N is the number of features; the phenological-stage class labels are Y = {y1, y2, …, yM}, where M is the number of phenological stages and yi is the set of training samples belonging to the i-th stage; the number of training samples in each stage is {βi}, i = 1, 2, …, M. ρcs is the compact-separation coefficient, and α is a factor between 0 and 1 that adjusts the relative weight of ρc and ρs; in the present invention, α is set to 0.6. The higher a feature's ρcs value, the stronger its ability to discriminate the phenological stages (an illustrative computation sketch is given after step 3-3).
Step 3-3: and determining the optimal feature quantity by taking the obtained highest classification precision as a final target.
Step 4: automatically classifying and identifying the features of the different phenological stages with the mRVM classifier to obtain the overall classification precision and the classification precision of each period; the specific steps are as follows:
step 4-1: taking the optimal feature subset and the phenological period labels of all sample points as input data;
step 4-2: training and verifying the model with the mRVM classifier, using a Gaussian kernel function as the kernel; the Gaussian kernel parameter of the mRVM model is set to 1/D, where D is the number of features; ten-fold cross-validation is used for verification, and the optimal overall classification precision and the classification precision of each period are finally obtained and saved.
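The evaluation protocol of step 4 can be sketched as follows. Since mRVM has no standard Python implementation, an RBF-kernel support vector classifier is used here purely as a stand-in; only the Gaussian kernel parameter 1/D and the ten-fold cross-validation follow the description above, and the rest is an illustrative assumption.

```python
# Hedged sketch of the evaluation protocol only; the SVC is a stand-in for mRVM.
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import accuracy_score, confusion_matrix

def evaluate(X_opt, y):
    """X_opt: array of samples x optimal features; y: phenological-stage labels."""
    gamma = 1.0 / X_opt.shape[1]                      # Gaussian kernel parameter set to 1/D
    clf = SVC(kernel="rbf", gamma=gamma)              # stand-in for the mRVM classifier
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)   # ten-fold cross-validation
    y_pred = cross_val_predict(clf, X_opt, y, cv=cv)

    overall = accuracy_score(y, y_pred)               # overall classification precision
    cm = confusion_matrix(y, y_pred)
    per_stage = cm.diagonal() / cm.sum(axis=1)        # classification precision of each period
    return overall, per_stage, cm
```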
The specific feature screening results of this example are shown in fig. 5 and table 3:
TABLE 3 optimal feature selection results
(Table 3 appears as an image in the original publication and is not reproduced here.)
Both texture features and spectral features contribute to the classification of the wheat phenological stages; the first 13 features are finally selected as the optimal features. The performance of the phenology-sensitive optimal features screened by the feature selection method FS-CS proposed by the invention is compared with that of the features screened by the mRMR, ReliefF, RFE and RF_GINI feature selection methods. FIG. 8(b) summarizes this evaluation for the wheat phenological stages: the feature selection method of the present invention achieves the highest classification precision with the smallest number of input features.
The classification results for the wheat phenological stages in this example are shown in FIG. 6; the confusion matrix shows that mRVM classifies the seedling, tillering, grain-filling and maturity stages well. The classification algorithm mRVM proposed by the invention is compared with the BN, KNN and SVM algorithms for wheat phenological period classification based on unmanned aerial vehicle RGB images. The classification precision of the eight important phenological stages and the overall precision obtained by the prior art and by the method of the present invention are shown in Table 4. As can be seen from Table 4, the overall classification precision obtained by the method of the present invention is the highest. The method can improve the accuracy of wheat phenology monitoring and provide a timely basis for management decisions over large field areas.
TABLE 4 comparison of classification results for different classifiers
(Table 4 appears as an image in the original publication and is not reproduced here.)
The foregoing describes only embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent substitutions and improvements made within the spirit and principles of the invention shall fall within the scope of protection of the invention.
The parts not involved in the present invention are the same as or can be implemented using the prior art.

Claims (7)

1. A wheat phenological period real-time classification method based on unmanned aerial vehicle RGB images is characterized by comprising the following steps:
step 1: acquiring time sequence high spatial resolution RGB images according to actual growth conditions of wheat fields processed in different sowing periods, and preprocessing to obtain unmanned aerial vehicle images from the same region in different years;
step 2: extracting spectral information and texture information of the time-series unmanned aerial vehicle images, and taking all derived spectral features and texture features as a feature complete set;
step 3: ranking the importance of all features with a feature selection algorithm based on the compact-separation principle, and determining the optimal features and the number of features;
step 4: automatically classifying and identifying the features of the different phenological stages with an mRVM classifier to obtain the overall classification precision and the classification precision of each period.
2. The real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images as claimed in claim 1, wherein in step 1, the preprocessing comprises the following specific steps:
step 1-1: performing image splicing on the single high-spatial-resolution images at the same time period to obtain an orthoimage of a complete test area;
step 1-2: and respectively carrying out geographic registration and radiometric calibration on the ortho-images in each period to obtain spectral reflectivity data of the unmanned aerial vehicle.
3. The real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images as claimed in claim 2, wherein the ground resolution of the acquired image data is 2-3 cm, after image stitching, the first image is selected as a reference image, and all the other images are sequentially subjected to image registration with the reference image, so that coordinate errors caused by the influence of environmental conditions in actual shooting are reduced.
4. The real-time wheat phenology period classification method based on unmanned aerial vehicle RGB images as claimed in claim 1, wherein in step 2, the specific step of generating a feature complete set includes:
step 2-1: calculating color spectral indices (including Rcc, Gcc, GRVI, VARI and ExG) from the preprocessed reflectance of each band, and taking these indices together with the original band reflectance data as the spectral feature complete set;
step 2-2: respectively calculating gray level co-occurrence matrixes of an R channel, a G channel and a B channel of the corrected image;
step 2-3: respectively calculating texture features from the gray level co-occurrence matrices, with a 3 × 3 window and a moving direction of D1 (0°); the texture features comprise: Mean (MEA), Variance (VAR), Homogeneity (HOM), Contrast (CON), angular second moment (SEC), Correlation (COR), Dissimilarity (DIS) and Entropy (ENT); the normalized difference texture index is then calculated between every two texture features, and all of the above features are taken as the texture feature complete set.
5. The real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images as claimed in claim 1, wherein in step 3, the specific steps of determining the optimal features and the number of features include:
step 3-1: taking the spectral features and the texture features as input data;
step 3-2: applying a feature selection algorithm, respectively calculating compact-separation coefficients feature by feature, obtaining importance scores of each feature and sequencing;
step 3-3: and determining the optimal feature quantity by taking the obtained highest classification precision as a final target.
6. The real-time wheat phenology period classification method based on unmanned aerial vehicle RGB images as claimed in claim 4, wherein in the step 3-2, the calculation formula of the compaction-separation coefficient is as follows:
(The expressions for ρc(k) and ρs(k) are given as images in the original publication and are not reproduced here.)
ρcs(k) = αρs(k) - (1 - α)ρc(k)
wherein the compactness coefficient ρc and the separation coefficient ρs measure, respectively, the compactness within each stage and the degree of separation between different stages; given a training sample set X = {x1, x2, …, xn}, xi ∈ R^N, n is the number of training samples and N is the number of features; the phenological-stage class labels are Y = {y1, y2, …, yM}, M is the number of phenological stages and yi is the set of training samples belonging to the i-th stage; the number of training samples in each stage is {βi}, i = 1, 2, …, M; ρcs is the compact-separation coefficient, and α is a factor between 0 and 1 that adjusts the relative weight of ρc and ρs; the higher a feature's ρcs value, the stronger its ability to discriminate the phenological stages.
7. The real-time wheat phenology period classification method based on unmanned aerial vehicle RGB images as claimed in claim 1, wherein in step 4, the specific step of obtaining the classification result includes:
step 4-1: taking the optimal feature subset and the phenological period labels of all sample points as input data;
step 4-2: constructing the wheat phenological period classification model with an mRVM classifier, using a Gaussian kernel function as the kernel; the Gaussian kernel parameter of the mRVM model is set to 1/D, where D is the number of features; ten-fold cross-validation is used for verification, and finally the optimal overall classification precision and the classification precision of each period are obtained and saved.
CN202210414686.3A 2022-04-20 2022-04-20 Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images Pending CN114782840A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210414686.3A CN114782840A (en) 2022-04-20 2022-04-20 Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210414686.3A CN114782840A (en) 2022-04-20 2022-04-20 Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images

Publications (1)

Publication Number Publication Date
CN114782840A true CN114782840A (en) 2022-07-22

Family

ID=82430391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210414686.3A Pending CN114782840A (en) 2022-04-20 2022-04-20 Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images

Country Status (1)

Country Link
CN (1) CN114782840A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116597157A (en) * 2023-07-11 2023-08-15 中国科学院地理科学与资源研究所 Plant climate extraction method and device based on characteristic spectrum change
CN116597157B (en) * 2023-07-11 2023-09-26 中国科学院地理科学与资源研究所 Plant climate extraction method and device based on characteristic spectrum change
CN117689959A (en) * 2024-01-30 2024-03-12 中交第二公路勘察设计研究院有限公司 Remote sensing classification method for fusing vegetation life cycle features

Similar Documents

Publication Publication Date Title
Tetila et al. Automatic recognition of soybean leaf diseases using UAV images and deep convolutional neural networks
CN106372592B (en) A kind of winter wheat planting area calculation method based on winter wheat area index
CN105678281B (en) Remote sensing monitoring method for mulching film farmland based on spectrum and texture characteristics
CN111241912A (en) Multi-vegetation index rice yield estimation method based on machine learning algorithm
CN114782840A (en) Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images
CN114821362B (en) Multi-source data-based rice planting area extraction method
CN105758806B (en) Remote sensing monitoring method for mulching film farmland based on spectral characteristics
CN112986158B (en) Beet nitrogen nutrition detection method and system based on unmanned aerial vehicle multispectral data
CN114140695B (en) Prediction method and system for tea tree nitrogen diagnosis and quality index determination based on unmanned aerial vehicle multispectral remote sensing
Liu et al. Estimating maize seedling number with UAV RGB images and advanced image processing methods
Zhou et al. Wheat phenology detection with the methodology of classification based on the time-series UAV images
CN116188793A (en) Astragalus sinicus planting area monitoring method based on satellite remote sensing image
CN109960972A (en) A kind of farm-forestry crop recognition methods based on middle high-resolution timing remotely-sensed data
CN117036861A (en) Corn crop line identification method based on Faster-YOLOv8s network
Wei et al. Application of remote sensing technology in crop estimation
CN115205688A (en) Tea tree planting area extraction method and system
Yang et al. Feature extraction of cotton plant height based on DSM difference method
CN112488230A (en) Crop water stress degree judging method and device based on machine learning
Ding et al. Rice lodging area extraction based on YCbCr spatial and texture features
Guo et al. High-throughput estimation of plant height and above-ground biomass of cotton using digital image analysis and Canopeo
Liu et al. The estimation of wheat tiller number based on UAV images and gradual change features (GCFs)
CN111027523A (en) Satellite remote sensing monitoring method for carotenoid content in cotton canopy
CN117391315B (en) Agricultural meteorological data management method and device
CN115909062A (en) Wheat yield estimation method based on multi-point fusion
CN117371603A (en) Wheat lodging parameter prediction method based on multi-mode data and ensemble learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination