CN114282609B - Method for extracting indexes of area and phenology of crops - Google Patents

Method for extracting indexes of area and phenology of crops

Info

Publication number
CN114282609B
CN114282609B
Authority
CN
China
Prior art keywords
crop
image
time sequence
convolution
phenological
Prior art date
Legal status
Active
Application number
CN202111591712.1A
Other languages
Chinese (zh)
Other versions
CN114282609A (en)
Inventor
张瑞
王婷
庞嘉泰
展润青
李涛
王晓文
张伦宁
Current Assignee
Southwest Jiaotong University
Original Assignee
Southwest Jiaotong University
Priority date
Filing date
Publication date
Application filed by Southwest Jiaotong University
Priority to CN202111591712.1A
Publication of CN114282609A
Application granted
Publication of CN114282609B
Legal status: Active

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a method for extracting crop area and phenological indicators. The SAR image is first preprocessed, including orbit correction, thermal noise removal, radiometric calibration, speckle filtering, terrain correction, and decibel processing, to obtain the VV/VH backscattering coefficients of the crop. Feature vectors are then constructed from these backscattering coefficients and fed to a proposed time-series-based fully convolutional neural network model, which maps the crop planting area from the time-series SAR image data. Finally, an SAR VV/VH ratio method is combined with the growth cycle of the regional crop to extract the crop phenological indicators. Compared with the existing random forest algorithm, this fully-convolutional-network-based method achieves higher crop-extraction accuracy and retains the spatial texture information of crop plots relatively completely, thereby enabling accurate crop mapping, and it shows high applicability in crop extraction.

Description

Method for extracting crop area and phenological index
Technical Field
The invention relates to the technical field of agricultural remote sensing spatial information data measurement, in particular to a method for extracting crop area and phenological indicators.
Background
Nowadays, crop planting patterns play an important role in the dynamic monitoring of urban vegetation, and the dynamic monitoring of crop growth and yield estimation have always been core problems in agricultural remote sensing. Crop identification algorithms for spectral images have been studied extensively by researchers in China and abroad and are now relatively mature. In summary, they mainly fall into the following categories: threshold methods, the Support Vector Machine (SVM), the Decision Tree (DT), the Random Forest (RF), and so on; the fully convolutional neural network builds on the idea of spectral neural networks and is combined with SAR images for crop identification and extraction.
The advantages and disadvantages of these crop extraction methods are as follows:
Threshold-based algorithms can identify targets directly from the gray-level characteristics of the image; they are computationally simple, efficient, and fast, but they are sensitive to noise and perform poorly when gray-level differences are not obvious or the gray values of different targets overlap.
The support vector machine algorithm is simple and robust, and its key strength is that it can capture key samples and eliminate a large number of redundant samples. However, the SVM struggles to classify large-scale training samples quickly, and its performance depends on the choice of parameters and kernel function, so it is strongly affected by human factors.
The decision tree algorithm is efficient, easy to understand and implement, able to handle irrelevant features, and insensitive to missing values. However, it performs poorly on data with strongly correlated features, splits on only one attribute at a time during classification, and requires considerable preprocessing for time-series data.
Random-forest-based methods train quickly, are simple to implement, and can handle high-dimensional data, data with missing features, and unbalanced data. However, they may overfit when modeling particularly noisy data whose samples fall outside the range of the training set.
Therefore, it is necessary to develop a new method for extracting crop area and phenological indicators to solve the above problems.
Disclosure of Invention
The invention aims to solve the above problems and provides a method for extracting crop area and phenological indicators.
The invention achieves this purpose through the following technical scheme:
a method for extracting crop area and phenological indicators comprises the following steps:
s1, preprocessing the SAR image to obtain the backscattering coefficient of the crop in the VV/VH polarization mode; the preprocessing comprises orbit correction, thermal noise removal, radiometric calibration, coherent speckle filtering, terrain correction and decibel processing;
s2, constructing a feature vector from the backscattering coefficient of the crop in the VV/VH polarization mode, feeding the feature vector and the time-series SAR images into a time-series-based fully convolutional neural network model, and obtaining the crop classification result, i.e., the crop planting situation, through the internal training of the model; finally, the classification result is imported into mapping software, where a title and legend are added, to obtain a crop planting area map;
s3, extracting the phenological indicators of the crop according to the growth-cycle situation of the crop in the area, combined with the VV/VH ratio in the SAR image.
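As a concrete illustration of the decibel-processing step in S1: the calibrated linear backscattering coefficient is commonly converted to decibels as sigma0(dB) = 10 * log10(sigma0). The sketch below assumes this standard conversion; the function name and sample values are illustrative, not taken from the patent.

```python
import math

def to_decibels(sigma0_linear):
    """Convert linear backscattering coefficients to decibel values.

    Mirrors the 'decibel processing' step of S1 under the standard
    assumption sigma0_dB = 10 * log10(sigma0_linear).
    """
    return [10.0 * math.log10(v) for v in sigma0_linear]

# Hypothetical VV backscatter values in linear power units
vv_linear = [1.0, 0.1, 0.01]
vv_db = to_decibels(vv_linear)
print(vv_db)  # approximately [0.0, -10.0, -20.0]
```

After this conversion, the VV and VH decibel images feed the feature-vector construction in S2.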
Specifically, step S2 includes:
s21, inputting the time sequence vector of each pixel in the whole SAR image into a full convolution neural network model as an input layer, and then performing convolution on the input time sequence vector to extract the time sequence characteristics of all pixels in the SAR image;
s22, inputting the timing sequence characteristics extracted from the convolution layer into a pooling layer, and pooling the timing sequence characteristics;
s23, merging the first three layers of the constructed spectral FCN into a merged layer by point-wise addition, then connecting the fourth convolutional layer and performing convolution and classification on it;
s24, when the final classification is performed in the fully convolutional neural network classification model and the number of training samples input for hyperspectral image classification is sparse, introducing a mask matrix to filter the image pixels; here a sparse number of training samples means that the number of training samples in each category is less than 10 times the number of remote-sensing data bands.
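The temporal convolution of S21 operates on the per-pixel time-series vector. A miniature single-pixel sketch is shown below; the six-date VH series and the kernel values are hypothetical (a real model learns its kernels during training), and the difference kernel is chosen only to show that such a convolution can respond to the temporal slope of the backscatter.

```python
def conv1d_valid(ts, kernel, bias=0.0):
    """1-D 'valid' convolution of one pixel's backscatter time series (S21 sketch)."""
    F = len(kernel)
    return [sum(ts[i + k] * kernel[k] for k in range(F)) + bias
            for i in range(len(ts) - F + 1)]

# Hypothetical 6-date VH backscatter series (dB) and a 1x3 difference kernel
ts = [-18.0, -17.0, -14.0, -12.0, -12.5, -13.0]
features = conv1d_valid(ts, [-1.0, 0.0, 1.0])  # responds to the temporal slope
print(features)  # [4.0, 5.0, 1.5, -1.0]
```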
Specifically, step S21 convolves the input time-series vector, and the size of the image after convolution is calculated as:

W' = (W - F + 2P)/S + 1,  H' = (H - F + 2P)/S + 1

where W is the image length, H is the image width, F is the convolution kernel size, S is the stride, and P is the number of zero-padding layers added to each edge of the input;

each channel of the convolutional layer is expressed as:

C_i = Σ_j (w_i * X_j) + b_i

where C_i is the ith channel of the convolutional feature map, w_i is the ith convolution kernel, X_j is the jth channel of the previous layer, b_i is the bias term of the ith feature map, and * denotes the convolution operation.
Specifically, the formula for pooling the time-series characteristics of the extracted pixels in step S22 is as follows:

W' = (W - F)/S + 1,  H' = (H - F)/S + 1

where W is the image length, H is the image width, F is the convolution kernel size, and S is the stride.
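Assuming the standard CNN output-size relations for convolution ((W - F + 2P)/S + 1) and pooling ((W - F)/S + 1), a small helper can sanity-check layer dimensions. The function names and sample sizes are illustrative, and integer division assumes the sizes divide evenly:

```python
def conv_out_size(w, h, f, s, p):
    """Output size after convolution: (W - F + 2P)/S + 1 per dimension."""
    return ((w - f + 2 * p) // s + 1, (h - f + 2 * p) // s + 1)

def pool_out_size(w, h, f, s):
    """Output size after pooling: (W - F)/S + 1 per dimension."""
    return ((w - f) // s + 1, (h - f) // s + 1)

# A 1x1 kernel with stride 1 and no padding keeps the image size,
# which lets a fully convolutional model classify every pixel.
print(conv_out_size(256, 256, 1, 1, 0))  # (256, 256)
print(pool_out_size(256, 256, 2, 2))     # (128, 128)
```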
Specifically, the formula for performing the convolution operation on the pooled data in step S23 is as follows:

C_i^(4) = Σ_j (w_4 * C_j^(k)) + b_4

where C_i^(4) is the ith channel of the fourth convolutional layer, C_j^(k) is the jth channel of the kth convolutional layer, and w_4 and b_4 are the corresponding weight and bias terms.
Specifically, in step S24, a mask matrix is introduced, and the following (masked squared-error) formula is adopted for image pixel filtering:

L = (1/2) Σ_{u=1}^{r} Σ_{v=1}^{c} M(u,v) · (y(u,v) - ŷ(u,v))²

where M(u,v) marks the training set, y(u,v) is the true value of the pixel at image position (u,v), ŷ(u,v) is the predicted value of the pixel at (u,v), and r and c are the number of rows and columns of the image, respectively.
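The masking idea of S24 can be sketched as follows. Since the patent's exact expression is reproduced only as an image in the original publication, the masked squared error below is an assumed reconstruction, not the authors' verbatim formula; it shows how the mask M confines the loss to labelled training pixels.

```python
def masked_loss(mask, y_true, y_pred):
    """Masked squared error over an r x c image (assumed reconstruction).

    Only pixels flagged in the training-set mask M contribute, which
    'filters' the image pixels so unlabeled positions do not affect
    training. mask, y_true, y_pred are r x c nested lists.
    """
    r, c = len(y_true), len(y_true[0])
    return 0.5 * sum(
        mask[u][v] * (y_true[u][v] - y_pred[u][v]) ** 2
        for u in range(r) for v in range(c)
    )

M = [[1, 0], [0, 1]]           # only two labelled training pixels
y = [[1.0, 0.0], [0.0, 1.0]]   # true values
p = [[0.8, 0.9], [0.4, 0.6]]   # predicted values
print(masked_loss(M, y, p))    # errors at unmasked pixels are ignored
```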
Specifically, step S3 includes:
s31, reconstructing a time sequence by utilizing the backscattering coefficient images of the VV/VH polarization mode, and carrying out S-G filtering on the backscattering time sequence;
s32, deriving the phenological indicators of the crop from the smoothed backscattering time series; the correspondence between phenological stages and specific periods of the time series is established using prior knowledge acquired from ground phenological observation: according to the smoothed backscattering-coefficient curve and the time-series change of its slope, the difference between the VH and VV polarizations of the crop and the time-series slope of the ratio of the VH to VV image decibel values are used, in combination with the crop growth cycle, to analyze the phenological characteristics of the crop.
Specifically, in step S31, the following formula is adopted for S-G filtering of the backscatter time series:

Ŷ = X (XᵀX)⁻¹ Xᵀ Y

where Ŷ is the S-G filtered value, X is the set of all data points in the filtering window (arranged as a design matrix), Y is the filtered value after K-1 fits to X, and Xᵀ is the transpose of matrix X.
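A minimal S-G smoothing sketch: for a window of 5 and polynomial order 2, the least-squares solution X(XᵀX)⁻¹XᵀY reduces to the well-known fixed weights [-3, 12, 17, 12, -3]/35, which the code applies directly. The window/order choice is an assumption for illustration, and edge samples are left unfiltered in this sketch.

```python
def savgol5(y):
    """Savitzky-Golay smoothing, window 5, polynomial order 2.

    The weights [-3, 12, 17, 12, -3]/35 are the closed-form solution of
    the least-squares fit X (X^T X)^-1 X^T Y for this window and order;
    the first and last two samples are left unchanged here.
    """
    w = [-3.0, 12.0, 17.0, 12.0, -3.0]
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(w[k] * y[i - 2 + k] for k in range(5)) / 35.0
    return out

# An order-2 S-G filter reproduces any quadratic exactly (here y = t^2),
# while damping high-frequency noise in a real backscatter series.
ts = [float(t * t) for t in range(8)]
print(savgol5(ts))
```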
The invention has the beneficial effects that:
compared with the existing random forest algorithm, the fully-convolutional-neural-network-based method achieves higher crop-extraction accuracy and retains the spatial texture information of crop plots relatively completely, thereby enabling accurate crop mapping, and it shows high applicability in crop extraction.
Drawings
FIG. 1 is a block flow diagram of the present application;
FIG. 2 is a diagram of a full convolution neural network model as used in the present application.
FIG. 3 shows the result of the first example using the fully convolutional neural network (taking rice in Fujin City as an example);
FIG. 4 shows the results of the first example, wherein (a) is a rice crop map of the northeast region produced by a prior research group; (b) is the rice crop map from the random forest model; (c) is the rice planting information map from the fully convolutional network model;
FIG. 5 shows the results of the second example (taking rice in Fujin City as an example), wherein (a) is the time-series change of the mean value and slope of the rice VH/VV; (b) is the rice VV/VH time series with S-G smoothing filtering.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it is to be understood that the terms "upper", "lower", "inside", "outside", "left", "right", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, or the orientations or positional relationships that the products of the present invention are conventionally placed in use, or the orientations or positional relationships that are conventionally understood by those skilled in the art, and are used for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the devices or elements referred to must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
Furthermore, the terms "first," "second," and the like are used merely to distinguish one description from another, and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it is also to be noted that, unless otherwise explicitly stated or limited, the terms "disposed" and "connected" are to be interpreted broadly, and for example, "connected" may be a fixed connection, a detachable connection, or an integral connection; can be mechanically or electrically connected; the connection may be direct or indirect via an intermediate medium, and may be a communication between the two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
The following detailed description of embodiments of the invention refers to the accompanying drawings.
As shown in fig. 1, a method for extracting crop area and phenology indexes includes the following steps:
s1, preprocessing the SAR image, including orbit correction, thermal noise removal, radiometric calibration, speckle filtering, terrain correction and decibel processing, to obtain backscattering coefficients of VV and VH of crops;
s2, constructing a feature vector from the backscattering coefficient of the crop in the VV/VH polarization mode, feeding the feature vector and the time-series SAR images into a time-series-based fully convolutional neural network model (input layer → convolutional layer → pooling layer → convolutional layer → mask matrix → classifier), obtaining the crop classification result, i.e., the crop planting situation (shown in figure 2), through the internal training of the model, and finally importing the classification result into mapping software (ArcGIS) and adding a title and legend to obtain a crop planting area map;
s3, extracting the phenological indicators of the crop according to the growth-cycle situation of the crop in the area, combined with the VV/VH ratio in the SAR image.
Step S2 specifically includes:
s21, inputting the time sequence vector of each pixel in the whole SAR image into a full convolution neural network model as an input layer, and then performing convolution on the input time sequence vector to extract the time sequence characteristics of all pixels in the SAR image;
s22, inputting the timing sequence characteristics extracted from the convolution layer into a pooling layer, and pooling the timing sequence characteristics;
s23, merging the first three layers of the constructed spectral FCN into a merged layer by point-wise addition, then connecting the fourth convolutional layer and performing convolution and classification on it.
S24, when the final classification is performed in the fully convolutional neural network classification model, the number of training samples input for HSI (hyperspectral image) classification is sparse; to ensure the classification accuracy of the crop, a mask matrix is introduced here to filter the image pixels, thereby highlighting the ground objects to be extracted.
S25, according to S21 and S22, after the SAR images are input into the fully convolutional neural network model, the crop planting situation is finally obtained through a series of processing steps (input, convolution, pooling, convolution, and classification), and the result is then imported into mapping software (ArcGIS), where a title and legend are added, to obtain a crop planting area map.
Specifically, in step S21, the input SAR image is convolved, and the image size after convolution is calculated by the following formula:

W' = (W - F + 2P)/S + 1,  H' = (H - F + 2P)/S + 1

Let the size of the image be W × H, where W is the image length, H is the image width, F is the convolution kernel size (1 × 1 in this scheme), S is the stride, P is the number of zero-padding layers added to each edge of the input, and C is the number of channels. Each channel of the convolutional layer is expressed as:

C_i = Σ_j (w_i * X_j) + b_i

where C_i is the ith channel of the convolutional feature map, w_i is the ith convolution kernel, X_j is the jth channel of the previous layer, and b_i is the bias term of the ith feature map;
specifically, the formula for pooling the time-series characteristics of the extracted pixels in step S22 is as follows:
W' = (W - F)/S + 1,  H' = (H - F)/S + 1
where W is the image length, H is the image width, F is the convolution kernel size, and S is the stride.
Specifically, in step S23, the pooled result is convolved, and the ith channel of the convolutional layer is expressed as:

C_i^(4) = Σ_j (w_4 * C_j^(k)) + b_4

where C_i^(4) is the ith channel of the fourth convolutional layer, C_j^(k) is the jth channel of the kth convolutional layer, and w_4 and b_4 are the corresponding weight and bias terms.
Specifically, step S24 includes: because the number of training samples input for HSI (hyperspectral image) classification is sparse, a mask matrix is introduced here to ensure the classification accuracy of the crop, and the following (masked squared-error) formula is adopted for image pixel filtering:

L = (1/2) Σ_{u=1}^{r} Σ_{v=1}^{c} M(u,v) · (y(u,v) - ŷ(u,v))²

where M(u,v) marks the training set, y(u,v) is the true value of the pixel at image position (u,v), ŷ(u,v) is the predicted value of the pixel at (u,v), and r and c are the number of rows and columns of the image, respectively.
In step S25, the extracted result is imported into mapping software (ArcGIS) to map the crop area, as shown in fig. 3.
In order to verify the effect of the time-series-based crop area and phenological indicator extraction method using the fully convolutional neural network, the extraction result of this algorithm is compared with the rice planting information extracted by a dynamic time warping algorithm (DTW) combined with a random forest. The basic information of the backscattering-coefficient time-series images used as experimental data is shown in Table 1.
TABLE 1 date of data acquisition (DOY, days of year)
The extraction results of the two experiments are shown in fig. 3, and the accuracy of the two experiments is shown in table 2.
TABLE 2 precision comparison of different experimental algorithms
In terms of rice-extraction effect, as can be seen from fig. 4, the random-forest-based method identifies rice information quickly when processing the SAR image, but its classification result is coarse, whereas the crop area and phenological indicator extraction method based on the time-series fully convolutional neural network achieves higher classification accuracy and completely retains the spatial texture information of the rice plots, thus achieving accurate rice mapping.
Step S3 specifically includes:
S31, reconstructing the time series using the backscattering-coefficient images of the VV/VH polarization mode and performing S-G filtering on the backscattering time series, as shown in fig. 5(a); the following formula is adopted for the S-G filtering:

Ŷ = X (XᵀX)⁻¹ Xᵀ Y

where Ŷ is the S-G filtered value, X is the set of all data points in the filtering window (arranged as a design matrix), Y is the filtered value after K-1 fits to X, and Xᵀ is the transpose of matrix X.
S32, the phenological indicators of the crop are derived from the smoothed backscattering time series, as shown in fig. 5(b). The correspondence between phenological stages and specific periods of the time series is established mainly from prior knowledge acquired from ground phenological observation. At different rice phenological stages, the σ_VH and σ_VV values show distinct peaks and troughs with clear regularity (σ_VH and σ_VV denote the decibel values of the VH and VV images). The VV and VH backscattering coefficients of the rice field both decrease from the sowing period to the transplanting period and are both essentially rising from the transplanting period to the maturity period; the rise is larger from transplanting to heading, while the VV and VH backscattering-coefficient values increase relatively slowly from heading to maturity. Therefore, the difference (σ_d) between the VH and VV polarizations of rice and the ratio (σ_a) of the decibel values of the VH and VV images of rice can be used to study the temporal behavior of the crop, as follows:

σ_d (dB) = σ_VH - σ_VV
σ_a = σ_VH / σ_VV

where σ_d denotes the difference between VH and VV, σ_a denotes the ratio of the decibel values of the VH and VV images, and σ_VH and σ_VV denote the decibel values of the VH and VV images.

Finally, according to the smoothed backscattering-coefficient curves and the time-series changes in the slopes of σ_d and σ_a, i.e., the maximum and minimum values of the VH/VV values, the growth cycle (i.e., phenological stage) of the crop is determined, as shown in Tables 3 and 4.
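The two indicators σ_d and σ_a can be computed per acquisition date from the VH and VV decibel series. The values below are hypothetical, not measured rice data, and the function name is illustrative:

```python
def vh_vv_indicators(vh_db, vv_db):
    """Per-date difference (sigma_d) and ratio (sigma_a) of VH and VV dB values."""
    sigma_d = [a - b for a, b in zip(vh_db, vv_db)]  # sigma_VH - sigma_VV
    sigma_a = [a / b for a, b in zip(vh_db, vv_db)]  # sigma_VH / sigma_VV
    return sigma_d, sigma_a

# Hypothetical dB values over three acquisition dates
vh = [-20.0, -16.0, -14.0]
vv = [-10.0, -8.0, -7.0]
d, a = vh_vv_indicators(vh, vv)
print(d)  # [-10.0, -8.0, -7.0]
print(a)  # [2.0, 2.0, 2.0]
```

Peaks, troughs, and slope changes of these series are then matched against the crop growth cycle.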
TABLE 3 Crop phenological stages

Time:               April    | May           | June-July | August-September
Phenological stage: Seedling | Transplanting | Growth    | Maturity
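The month-to-stage correspondence of Table 3 can be expressed as a simple lookup; the month boundaries are taken directly from the table, and the fallback value for months outside April to September is an assumption:

```python
def phenological_stage(month):
    """Map a calendar month (1-12) to the rice phenological stage from Table 3."""
    if month == 4:
        return "seedling"
    if month == 5:
        return "transplanting"
    if month in (6, 7):
        return "growth"
    if month in (8, 9):
        return "maturity"
    return "outside the monitored growing season"

print([phenological_stage(m) for m in range(4, 10)])
```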
TABLE 4 extraction of the phenological indicators
The method introduces a neural network model and takes time-series SAR image data into account to map the rice planting area; it extracts the rice phenological indicators through the mathematical VH/VV ratio method of the SAR image while considering the rice growth cycle in Northeast China; the experimental results effectively demonstrate the application potential of Sentinel-1 SAR data in crop monitoring and related phenological-information monitoring.

Claims (7)

1. A method for extracting crop area and phenological index is characterized by comprising the following steps:
s1, preprocessing the SAR image to obtain a backscattering coefficient of the crop in a VV/VH polarization mode; the preprocessing comprises track correction, thermal noise removal, radiometric calibration, coherent speckle filtering, terrain correction and decibel processing;
s2, constructing a feature vector according to a backscattering coefficient of the crop in a VV/VH polarization mode, introducing the feature vector and a time sequence SAR image into a full convolution neural network model based on a time sequence, and obtaining a classification result of the crop, namely a crop planting condition, through internal training of the full convolution neural network model; finally, the classification result is led into drawing software to add a graph name legend, and a crop planting area graph can be obtained;
s3, extracting the phenological index of the crop according to the growth cycle condition of the crop in the area where the crop is located and by combining the VV and VH ratio method in the SAR image;
step S2 includes:
s21, inputting the time sequence vector of each pixel in the whole SAR image into a full convolution neural network model as an input layer, and then performing convolution on the input time sequence vector to extract the time sequence characteristics of all pixels in the SAR image;
s22, inputting the timing sequence characteristics extracted from the convolution layer into a pooling layer, and pooling the timing sequence characteristics;
s23, combining the first three layers in the constructed frequency spectrum FCN into a combined layer, adding the combined layers point by point, combining the fourth convolution layer, and performing convolution and classification on the fourth convolution layer;
s24, when the final classification is performed in the fully convolutional neural network classification model and the number of training samples input for hyperspectral image classification is sparse, introducing a mask matrix and filtering the image pixels; wherein a sparse number of training samples means that the number of training samples in each category is less than 10 times the number of remote-sensing data bands.
2. The method as claimed in claim 1, wherein step S21 convolves the input time-series vector, and the image size after convolution is calculated by the following formula:

W' = (W - F + 2P)/S + 1,  H' = (H - F + 2P)/S + 1

in the above formula, W is the image length, H is the image width, F is the convolution kernel size, S is the stride, and P is the number of zero-padding layers added to each edge of the input;

each channel of the convolutional layer is expressed as:

C_i = Σ_j (w_i * X_j) + b_i

where C_i is the ith channel of the convolutional feature map, w_i is the ith convolution kernel, X_j is the jth channel of the previous layer, and b_i is the bias term of the ith feature map.
3. The method for extracting the crop area and phenology index according to claim 1, wherein the formula for pooling the time-series characteristics of the extracted pixels in step S22 is as follows:

W' = (W - F)/S + 1,  H' = (H - F)/S + 1

where W is the image length, H is the image width, F is the convolution kernel size, and S is the stride.
4. The method for extracting the crop area and phenology indicator according to claim 1, wherein the formula for performing the convolution operation on the pooled data in step S23 is as follows:

C_i^(4) = Σ_j (w_i^(4) * C_j^(k)) + b_i^(4)

where C_i^(4) is the ith channel of the fourth convolutional layer, C_j^(k) is the jth channel of the kth convolutional layer, w_i^(4) is the weight corresponding to the ith channel of the fourth convolutional layer, and b_i^(4) is the bias term of the ith feature map of the fourth convolutional layer.
5. The method as claimed in claim 1, wherein in step S24, a mask matrix is introduced, and the following (masked squared-error) formula is used for image pixel filtering:

L = (1/2) Σ_{u=1}^{r} Σ_{v=1}^{c} M(u,v) · (y(u,v) - ŷ(u,v))²

where M(u,v) marks the training set, y(u,v) is the true value of the pixel at image position (u,v), ŷ(u,v) is the predicted value of the pixel at (u,v), and r and c are the number of rows and columns of the image, respectively.
6. The method as claimed in claim 1, wherein the step S3 includes:
s31, reconstructing a time sequence by utilizing the backscattering coefficient images of the VV/VH polarization mode, and carrying out S-G filtering on the backscattering time sequence;
s32, deriving the phenological indicators of the crop from the smoothed backscattering time series; the correspondence between phenological stages and specific periods of the time series is established using prior knowledge acquired from ground phenological observation: according to the smoothed backscattering-coefficient curve and the time-series change of its slope, the difference between the VH and VV polarizations of the crop and the time-series slope of the ratio of the VH to VV image decibel values are used, in combination with the crop growth cycle, to analyze the phenological characteristics of the crop.
7. The method as claimed in claim 6, wherein in step S31, the backscatter time series is S-G filtered using the following formula:

Ŷ = X (XᵀX)⁻¹ Xᵀ Y

where Ŷ is the S-G filtered value, X is the set of all data points in the filtering window (arranged as a design matrix), Y is the filtered value after K-1 fits to X, and Xᵀ is the transpose of matrix X.
CN202111591712.1A 2021-12-23 2021-12-23 Method for extracting indexes of area and phenology of crops Active CN114282609B (en)

Publications (2)

Publication Number Publication Date
CN114282609A CN114282609A (en) 2022-04-05
CN114282609B true CN114282609B (en) 2022-09-20





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant