CN113591946A - Phoenix-eye blue biomass monitoring platform - Google Patents

Phoenix-eye blue biomass monitoring platform

Info

Publication number
CN113591946A
CN113591946A (application CN202110803560.0A)
Authority
CN
China
Prior art keywords
model
module
phoenix
blue
biomass
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110803560.0A
Other languages
Chinese (zh)
Inventor
Jiang Hongwei
Zhang Yuanqing
Ding Junkai
Ma Chun
Li Yanzhou
Wang Hanjie
Li Xiang
Wei Yacai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi University
Original Assignee
Guangxi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangxi University filed Critical Guangxi University
Priority to CN202110803560.0A priority Critical patent/CN113591946A/en
Publication of CN113591946A publication Critical patent/CN113591946A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217Validation; Performance evaluation; Active pattern learning techniques
    • G06F18/2193Validation; Performance evaluation; Active pattern learning techniques based on specific statistical tests
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning


Abstract

The invention discloses an Eichhornia crassipes (water hyacinth) biomass monitoring platform, and belongs to the technical field of aquatic biomass monitoring. An image sensor is mounted on an unmanned aerial vehicle and placed in data connection with a data analysis system; after the image sensor captures a picture, the picture data are transmitted to the data analysis system. The data analysis system is equipped with a machine learning module, a feature recognition module and a risk assessment module: the machine learning module classifies and identifies water hyacinth, the feature recognition module identifies the color features of water hyacinth, and the risk assessment module establishes an invasion-risk assessment for alien plants adapted to the region. The biomass of water hyacinth is estimated through these three modules without manual observation or counting, with high efficiency, providing strong reference data for subsequent water-hyacinth clearance work.

Description

Phoenix-eye blue biomass monitoring platform
Technical Field
The invention relates to the technical field of aquatic organism monitoring, and in particular to an Eichhornia crassipes (water hyacinth) biomass monitoring platform.
Background
Eichhornia crassipes (water hyacinth) was listed by the state in 2003 in the first batch of alien invasive species. It was introduced into various Chinese provinces in the 1930s as livestock and poultry feed and widely planted, and subsequently escaped into the wild. Because of its fast propagation, water hyacinth is now widely distributed across 19 provinces and municipalities in north, east, south and southwest China; the invasion is most serious in Yunnan, Jiangsu, Zhejiang, Fujian and other provinces, posing a great threat to the local environment. Experts and scholars at home and abroad have studied the invasion of Eichhornia crassipes extensively, and existing prevention and control methods include mechanical, chemical and biological control. Owing to the superior adaptability and reproductive capacity of Eichhornia crassipes, eradication is very difficult once an invasion is established. Monitoring the biomass of Eichhornia crassipes before an outbreak is therefore an effective means of prevention and control.
At present, the prior art performs feature extraction on water hyacinth using traditional image processing methods to identify its biomass, but image recognition of water hyacinth based on multi-target detection is lacking; the identification is therefore inaccurate and cannot meet the requirement of monitoring water-hyacinth biomass.
Disclosure of Invention
The invention aims to provide an Eichhornia crassipes (water hyacinth) biomass monitoring platform. By performing multi-target-detection image recognition on water hyacinth, the platform can accurately identify the biomass of water hyacinth, facilitating subsequent clearance and research. The technical scheme adopted by the invention is as follows:
according to one aspect of the invention, an Eichhornia crassipes biomass monitoring platform is provided, and comprises an unmanned aerial vehicle, an image sensor and a data analysis system, wherein the image sensor is mounted on the unmanned aerial vehicle and is in data connection with the data analysis system, and the data analysis system comprises a machine learning module, a feature recognition module and a risk assessment module;
the machine learning module is used for classifying and identifying the Eichhornia crassipes;
the feature recognition module is used for identifying the color features of the water hyacinth;
and the risk assessment module is used for establishing an invasion-risk assessment for alien plants adapted to the region.
Preferably, the machine learning module comprises a cluster analysis unit; the cluster analysis unit comprises a K-means model whose clustering criterion function is the sum-of-squared-errors criterion J_K:

J_K = \sum_{j=1}^{K} \sum_{x \in \omega_j} \left\| x - Z_j \right\|^2
Preferably, the feature recognition module includes an RGB color space model: F = r[R] + g[G] + b[B].
Preferably, the risk assessment module comprises a fuzzy hierarchical analysis model and a biomass estimation model; the fuzzy hierarchical analysis model is used for constructing a fuzzy complementary matrix and a fuzzy consistent judgment matrix and for studying the fuzzy relation, and the biomass estimation model is used for calculating the biomass of the water hyacinth in the monitoring area.
Preferably, the data analysis system further comprises a multi-target detection module, wherein the multi-target detection module comprises a YOLO-v4-tiny model, and the YOLO-v4-tiny model is used for identifying the Eichhornia crassipes.
The technical scheme adopted by the invention has the following remarkable effects:
(1) According to the invention, an image sensor is installed on the unmanned aerial vehicle and placed in data connection with the data analysis system. After the image sensor takes a picture, the picture data are transmitted to the data analysis system, which is equipped with the machine learning module, the feature identification module and the risk assessment module. The machine learning module classifies and identifies water hyacinth, the feature identification module identifies its color features, and the risk assessment module establishes an invasion-risk assessment for alien plants adapted to the area. The biomass of water hyacinth is estimated through the three modules without manual observation or counting, with high efficiency, and strong reference data can be provided for subsequent clearance work.
(2) The machine learning module is provided with a cluster analysis unit. Pixels with similar values in the picture are segmented by the K-means model of the cluster analysis unit, and a machine learning algorithm performs the evaluation. In practice this shows good performance, applies well to image segmentation and recognition, and achieves high recognition precision and efficiency.
(3) Because feature information such as texture, shape and spatial relationship is not prominent in high-altitude photographs and is unsuitable as an identification feature of water hyacinth, an RGB color space model is provided in the feature identification module. Using the RGB color space model, the leaf color of water hyacinth serves as the feature distinguishing it from other aquatic plants, so applying the color space model makes identification more convenient and improves precision and accuracy.
(4) A fuzzy hierarchical analysis model and a biomass estimation model are provided in the risk assessment module. A fuzzy complementary matrix and a fuzzy consistent judgment matrix are constructed through the fuzzy hierarchical analysis model and the fuzzy relation is studied, while the biomass of water hyacinth in the monitoring area is estimated through the biomass estimation model, facilitating subsequent assessment and clearance of water hyacinth.
(5) Water hyacinth is identified through the YOLO-v4-tiny model in the multi-target detection module, providing accurate positioning data for later clearance and preventing harmless or ornamental aquatic plants in the water area from being cut by mistake; both the identification precision and the identification speed are high.
Drawings
FIG. 1 is a schematic structural view of the present invention;
FIG. 2 is a schematic diagram of the K-means model implementation of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings by way of examples of preferred embodiments. It should be noted, however, that the numerous details set forth in the description are merely for the purpose of providing the reader with a thorough understanding of one or more aspects of the present invention, which may be practiced without these specific details.
As shown in fig. 1, the water-hyacinth biomass monitoring platform according to the present invention comprises an unmanned aerial vehicle, an image sensor and a data analysis system. The image sensor is installed on the unmanned aerial vehicle and is in data connection with the data analysis system. The unmanned aerial vehicle collects high-altitude images over the target clearance water area, the image sensor sends the collected image information to the data analysis system, and the data analysis system performs image recognition to extract the water-hyacinth images. The data analysis system comprises a machine learning module, a feature recognition module and a risk assessment module.
The machine learning module comprises a cluster analysis unit, and the cluster analysis unit comprises a K-means model, which is centroid-based. The model takes K as an input parameter and divides a set of n objects into K clusters, such that similarity within each cluster is high and similarity between clusters is low. The clustering criterion function used by the K-means model is the sum-of-squared-errors criterion J_K:
J_K = \sum_{j=1}^{K} \sum_{x \in \omega_j} \left\| x - Z_j \right\|^2
To optimize the clustering result, the criterion J_K should be minimized. The steps are as follows:
1) Given n mixed samples, let I = 1 denote the iteration number, and select K initial cluster centers Z_j(I), j = 1, 2, ..., K;
2) Compute the distance of each sample x_k from every cluster center:

D(x_k, Z_j(I)) = \left\| x_k - Z_j(I) \right\|, \quad j = 1, 2, \ldots, K.

If D(x_k, Z_i(I)) = \min_j D(x_k, Z_j(I)), then x_k ∈ ω_i;
3) Compute K new cluster centers:

Z_j(I+1) = \frac{1}{n_j} \sum_{x \in \omega_j} x, \quad j = 1, 2, \ldots, K,

where n_j is the number of samples in cluster ω_j;
4) Judgment: if Z_j(I+1) ≠ Z_j(I) for any j (j = 1, 2, ..., K), set I = I + 1 and return to step 2); otherwise, the procedure ends. The above steps are illustrated in fig. 2.
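The four steps above can be sketched in Python. This is an illustrative implementation of plain K-means following the patent's procedure, not code from the patent itself (which gives only the steps):

```python
import numpy as np

def kmeans(samples, k, max_iter=100, seed=0):
    """Plain K-means following the four steps above.

    samples: (n, d) array; returns (centers, labels)."""
    rng = np.random.default_rng(seed)
    # 1) select K initial cluster centers Z_j from the samples
    centers = samples[rng.choice(len(samples), k, replace=False)]
    labels = np.zeros(len(samples), dtype=int)
    for _ in range(max_iter):
        # 2) assign each sample to its nearest center (Euclidean distance)
        d = np.linalg.norm(samples[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # 3) recompute each center as the mean of its cluster
        new_centers = np.array([
            samples[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        # 4) stop when the centers no longer move
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```

Minimizing the sum-of-squared-errors criterion J_K is implicit here: each assignment and each mean update can only decrease it.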
Based on image analysis, the leaf color of Eichhornia crassipes can be used as the feature that distinguishes it from other aquatic plants; compared with color, feature information such as texture, shape and spatial relationship is not prominent in high-altitude photographs and is unsuitable as an identification feature of water hyacinth. Therefore, for the color features of water hyacinth, the RGB color space model is mainly applied. A color space consists of a color model and a mapping function, and the RGB color space model is composed of three components: red R, green G and blue B. Visible colors can be reproduced by additively mixing different intensities of red, green and blue, F = r[R] + g[G] + b[B], yielding a rich and broad spectrum of colors.
Through the fusion of the unsupervised K-means model and the color space model, more problems are solved in practical applications; compared with using either model alone, the fused model has greater potential and value.
Experiments were carried out on a river channel. The shooting location was the Bachi River channel in Yongning District, Nanning, Guangxi Zhuang Autonomous Region; aerial images were taken about 30 m above the water surface. The images were collected in the RGB color space, and the R, G, B three-channel values of 100 pixel points in total, drawn at random from 10 images in the water-hyacinth image dataset, were extracted for numerical analysis and calculation.
TABLE 1
(Table shown as an image in the original: per-channel mean and standard deviation of the R, G and B values of the sampled water-hyacinth pixels.)
Based on the segmentation result of the K-means model on the whole monitoring-area image, the identification thresholds set under the RGB color space model in Table 1, namely the per-channel mean ± 2 × standard deviation of water hyacinth in the R, G and B channels (for R, 65-129; for G, an upper bound of 176; the remaining bounds are given in Table 1), are applied to identify the water-hyacinth image.
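As a sketch, the per-channel mean ± 2 × standard-deviation windows can be applied directly to produce a water-hyacinth pixel mask. The numeric bounds below are placeholders (only the R window, 65-129, is legible in the text), so treat them as assumptions rather than the patent's calibrated values:

```python
import numpy as np

# Hypothetical per-channel windows (mean ± 2*std of water-hyacinth pixels).
# Only the R bounds are legible in the text; G and B are illustrative.
R_LO, R_HI = 65, 129
G_LO, G_HI = 110, 176
B_LO, B_HI = 40, 120

def hyacinth_mask(img):
    """Boolean mask of pixels whose R, G and B values all fall inside
    the per-channel [mean - 2*std, mean + 2*std] windows.

    img: (H, W, 3) array in RGB channel order."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return ((R_LO <= r) & (r <= R_HI) &
            (G_LO <= g) & (g <= G_HI) &
            (B_LO <= b) & (b <= B_HI))
```

In the platform's pipeline this mask would be applied to the K-means-segmented regions rather than to raw pixels.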
The risk assessment module comprises a fuzzy hierarchical analysis model and a biomass estimation model. The fuzzy hierarchical analysis model is used to construct a fuzzy complementary matrix and a fuzzy consistent judgment matrix and to study the fuzzy relation; it can convert qualitative assessment into quantitative assessment. In ecological risk assessment, uncertain factors often distort assessment results, and fuzzy theory can improve the reliability of those results. Fuzzy evaluation is based mainly on the membership theory of fuzzy mathematics: applying the principle of fuzzy relation synthesis, the evaluation target is assessed scientifically and comprehensively through a number of systematic indexes. The hierarchical analysis model decomposes the elements relevant to a decision into levels such as target, criterion and scheme, and carries out qualitative and quantitative analysis on that basis. Building on a deep analysis of the essence, influencing factors and internal relations of a complex decision problem, it mathematizes the thinking process of decision-making using relatively little quantitative information, providing a simple decision method for complex problems with multiple targets, multiple criteria or no structural characteristics. It is particularly suitable where the decision result is difficult to measure directly and accurately.
As a model for determining the weights of multiple indexes in an evaluation, the hierarchical analysis model has several disadvantages. Chief among them is that it is difficult to check whether the AHP judgment matrices are consistent: once a level contains many indexes, the judges find it hard to determine the weight relationships among them while keeping their thinking consistent, and such subjective uncertainty easily makes the consistency of the whole system poor, so the expected evaluation effect cannot be achieved.
Therefore, the fuzzy hierarchical analysis model (FAHP) modifies the original hierarchical analysis method at the judgment matrix. In the fuzzy hierarchical analysis model, the importance of two elements is not compared as a multiple (ratio) relation; instead, the importances of the two compared elements sum to 1, and values are assigned according to the degree of influence to compare the importance of different indexes. The fuzzy hierarchical analysis model provides a basis for quantifying evaluation indexes and selecting an optimal scheme. Its key lies in the study of the fuzzy relation and the construction of the fuzzy complementary matrix and the fuzzy consistent judgment matrix. The specific method is as follows:
First, the evaluated object is studied in detail; the indexes determining it are screened out and divided into a target layer, a criterion layer and a measure layer according to factors such as their inclusion relations, and the inclusion relation between the criterion layer and the target layer is clarified;
TABLE 2
(Table shown as an image in the original: the 0.1-0.9 scale used to compare the relative importance of paired elements.)
Secondly, within each layer the weight relationship between every two elements is compared, and the relationship is determined using the 0.1-0.9 scale shown in Table 2.
And thirdly, establishing a fuzzy complementary matrix according to the weight relation obtained in the last step.
Fuzzy complementary matrix:
(Matrix shown as an image in the original: a 4 × 4 fuzzy complementary matrix R = (r_ij).)
The above is a fuzzy complementary matrix obtained from a judgment layer composed of four indexes; each element r_ij expresses how important the i-th element is compared with the j-th. It follows that the importances of each pair of corresponding elements (lower-left versus upper-right) sum to 1, i.e. r_ij + r_ji = 1.
Fourthly, the fuzzy complementary matrix is adjusted into a fuzzy consistent matrix. The fuzzy complementary matrix obtained in the previous step is not, strictly speaking, consistent, so a consistency transformation is required.
Finally, on the basis of the consistent judgment matrix, the weight of each index is obtained through a formula, and the weights are then sorted to obtain the importance relation of the indexes.
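The weight-derivation steps can be sketched as follows. The row-sum transformation and weight formula used here are a common FAHP recipe assumed for illustration, since the patent does not reproduce its own formula:

```python
import numpy as np

def fahp_weights(A):
    """Weights from a fuzzy complementary judgment matrix A (a_ij + a_ji = 1).

    Uses the standard row-sum transformation: the fuzzy consistent matrix
    b_ij = (r_i - r_j)/(2n) + 0.5, and the weights
    w_i = (r_i + n/2 - 1) / (n*(n - 1)).
    This particular recipe is a common FAHP method, not taken verbatim
    from the patent."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    r = A.sum(axis=1)                                # row sums
    B = (r[:, None] - r[None, :]) / (2 * n) + 0.5    # fuzzy consistent matrix
    w = (r + n / 2 - 1) / (n * (n - 1))              # normalized index weights
    return B, w
```

Sorting `w` then gives the importance ranking of the indexes described in the final step.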
The biomass is then calculated by combining the numerical relationship between the above-water biomass accumulation of Eichhornia crassipes and the attached host plants under different environmental conditions with quantities such as the water-hyacinth coverage area, the number of flowers and the total number of flowers.
TABLE 3
(Table shown as an image in the original: biomass measurements from the quadrat samples.)
Table 3 shows the measurement results for 20 quadrat samples of 0.25 m². Combining the area of the monitoring region with the pixel count of the monitoring-region image determines the ground area of a unit pixel, the water-hyacinth biomass corresponding to a unit pixel, and the total biomass.
TABLE 4
(Table shown as an image in the original: water-hyacinth biomass per unit pixel of the monitoring-region image.)
As shown in Table 4, the water-hyacinth biomass of the monitoring region can be calculated from the biomass corresponding to a unit pixel in Table 4, combined with a count of the pixels in the water-hyacinth identification result.
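The calculation described around Tables 3 and 4 reduces to pixel counting times two density parameters. The parameter values are placeholders here, since the tables themselves are images not reproduced in the text:

```python
import numpy as np

def estimate_biomass(mask, area_per_pixel_m2, biomass_per_m2_kg):
    """Total water-hyacinth biomass (kg) in the monitored region.

    mask: array marking pixels identified as water hyacinth;
    area_per_pixel_m2: ground area covered by one image pixel;
    biomass_per_m2_kg: biomass density from quadrat sampling (Table 3).
    Both density parameters stand in for the patent's measured values."""
    n_pixels = int(np.count_nonzero(mask))
    return n_pixels * area_per_pixel_m2 * biomass_per_m2_kg
```

For example, three identified pixels at 0.25 m² per pixel and 8 kg/m² give 3 × 0.25 × 8 = 6 kg.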
The data analysis system further comprises a multi-target detection module containing a YOLO-v4-tiny model. YOLO-v4-tiny adopts CSPDarknet53-tiny as the backbone feature extraction network, with five residual modules in total. The outputs of the last three residual modules are fed into a feature pyramid structure; the feature pyramid part of YOLO-v4-tiny adopts a PANet structure to perform feature fusion on the feature layers. After feature fusion, the sizes of the three output feature layers are 1/8, 1/16 and 1/32 of the original input size respectively, which is equivalent to dividing the picture into an S × S grid. In each cell there are three prior boxes of different sizes; finally, the prior box is adjusted by the predicted values to obtain the bounding box.
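The grid-plus-prior-box adjustment described above can be sketched as follows. This is the standard YOLO decoding rule, shown as an assumption since the patent does not write the formula out:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_box(tx, ty, tw, th, cx, cy, pw, ph, stride):
    """Decode one YOLO prediction into a bounding box (x, y, w, h) in pixels.

    (cx, cy) is the grid cell index, (pw, ph) the prior (anchor) size in
    pixels, and stride the downsampling factor of the feature layer
    (8, 16 or 32 here).  Standard YOLO decoding, for illustration only."""
    x = (sigmoid(tx) + cx) * stride   # box centre, offset inside its cell
    y = (sigmoid(ty) + cy) * stride
    w = pw * np.exp(tw)               # prior scaled by the predicted log-factor
    h = ph * np.exp(th)
    return x, y, w, h
```

The sigmoid keeps the predicted centre inside its grid cell, which is what makes the S × S grid decomposition work.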
The multi-target detection model construction follows the following steps:
1. Collecting a dataset of Eichhornia crassipes and other aquatic plants: ten common aquatic plants were selected as identification targets, including water hyacinth (Eichhornia crassipes), water lettuce (Pistia stratiotes), royal water lily (Victoria amazonica), Gordon euryale (Euryale ferox), lotus, water lily, flowering rush (Butomus umbellatus), giant reed and common reed. The ten species span three growth forms (emergent, floating-leaved and free-floating), all of which can be observed at the water surface. Water hyacinth and water lettuce are both regarded as clearance objects. After the ten species were determined, an image dataset of these aquatic plants was collected and organized; the numbers of pictures are 400 for water hyacinth, 318 for boats, and 200 for each of the remaining nine aquatic plants.
2. Labeling of images: since a picture may contain multiple objects, both target and non-target, the target objects are labeled and named before training. The pictures are labeled with the tool LabelImg for multi-target detection, and the labeling result of each picture is stored in an xml file whose name matches the picture. The file records the picture name, the labeled object class, the pixel coordinates and other information; the labeled class, coordinates, and box width and height are the important information.
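A minimal reader for the annotations described above can be written with the standard library, since LabelImg writes PASCAL-VOC-style xml; the tag names follow the VOC convention, and the class name in the example is hypothetical:

```python
import xml.etree.ElementTree as ET

def read_labelimg_xml(xml_text):
    """Parse one LabelImg (PASCAL VOC style) annotation and return the
    labelled objects as (name, xmin, ymin, xmax, ymax) tuples: the class,
    coordinate and box-size information the text calls important."""
    root = ET.fromstring(xml_text)
    boxes = []
    for obj in root.iter("object"):
        name = obj.findtext("name")
        bb = obj.find("bndbox")
        boxes.append((
            name,
            int(bb.findtext("xmin")), int(bb.findtext("ymin")),
            int(bb.findtext("xmax")), int(bb.findtext("ymax")),
        ))
    return boxes
```

In practice the same function would be run over every xml file in the dataset directory to build the training label list.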
3. YOLO-v4 network construction: the YOLO-v4-tiny network is built under the PyTorch deep learning framework with an input size of 608 × 608 pixels. The backbone feature extraction network yields the last two effective feature layers of CSPDarknet53-tiny, which are passed into the enhanced feature extraction network to build the FPN.
4. Model training: the label files and training dataset obtained in step 2 are fed into the YOLO-v4-tiny network; after 400 training epochs the final weight file is obtained, and the weight file (.pth) is imported into the YOLO-v4-tiny model to perform prediction. During training, quality is reflected by two indexes: the training error (train loss) and the validation error (validation loss). The training error is the error between the model's predictions and the true results on the training set, measuring the model's fit to the training data. The validation error is the error between predictions and true results on the validation set, measuring the fitting or generalization ability on unseen data and qualitatively evaluating the training effect. When both train loss and validation loss are decreasing, the network is still learning and fitting the model correctly.
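The train-loss/validation-loss reading described above can be turned into a simple stopping heuristic. This sketch, with an assumed `patience` window, flags the point where validation loss stalls while training loss keeps falling:

```python
def overfitting_started(train_losses, val_losses, patience=3):
    """Flag when validation loss has stopped improving for `patience`
    epochs while training loss is still decreasing, i.e. the model keeps
    fitting the training set but no longer generalizes."""
    if len(val_losses) <= patience or len(train_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])   # best val loss seen earlier
    best_recent = min(val_losses[-patience:])   # best val loss in the window
    val_stalled = best_recent >= best_before
    train_improving = train_losses[-1] < train_losses[-patience - 1]
    return val_stalled and train_improving
```

While both curves decrease the check stays False, matching the text's criterion that the network is still learning correctly.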
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be construed as the protection scope of the present invention.

Claims (5)

1. An Eichhornia crassipes (water hyacinth) biomass monitoring platform, characterized in that it comprises an unmanned aerial vehicle, an image sensor and a data analysis system, wherein the image sensor is installed on the unmanned aerial vehicle and is in data connection with the data analysis system, and the data analysis system comprises a machine learning module, a feature recognition module and a risk assessment module;
the machine learning module is used for classifying and identifying the Eichhornia crassipes;
the feature recognition module is used for identifying the color features of the water hyacinth;
and the risk assessment module is used for establishing an invasion-risk assessment for alien plants adapted to the region.
2. The water-hyacinth biomass monitoring platform according to claim 1, characterized in that: the machine learning module comprises a cluster analysis unit, the cluster analysis unit comprises a K-means model, and the clustering criterion function of the K-means model is the sum-of-squared-errors criterion J_K:

J_K = \sum_{j=1}^{K} \sum_{x \in \omega_j} \left\| x - Z_j \right\|^2
3. The water-hyacinth biomass monitoring platform according to claim 1, characterized in that: the feature recognition module comprises an RGB color space model: F = r[R] + g[G] + b[B].
4. The water-hyacinth biomass monitoring platform according to claim 1, characterized in that: the risk assessment module comprises a fuzzy hierarchical analysis model and a biomass estimation model; the fuzzy hierarchical analysis model is used for constructing a fuzzy complementary matrix and a fuzzy consistent judgment matrix and for studying the fuzzy relation, and the biomass estimation model is used for calculating the water-hyacinth biomass of the monitoring area.
5. The water-hyacinth biomass monitoring platform according to claim 1, characterized in that: the data analysis system further comprises a multi-target detection module, the multi-target detection module comprises a YOLO-v4-tiny model, and the YOLO-v4-tiny model is used for identifying the Eichhornia crassipes.
CN202110803560.0A 2021-07-15 2021-07-15 Phoenix-eye blue biomass monitoring platform Pending CN113591946A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110803560.0A CN113591946A (en) 2021-07-15 2021-07-15 Phoenix-eye blue biomass monitoring platform


Publications (1)

Publication Number Publication Date
CN113591946A true CN113591946A (en) 2021-11-02

Family

ID=78247672

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110803560.0A Pending CN113591946A (en) 2021-07-15 2021-07-15 Phoenix-eye blue biomass monitoring platform

Country Status (1)

Country Link
CN (1) CN113591946A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101814063A (en) * 2010-05-24 2010-08-25 天津大学 Global K-means clustering algorithm based on distance weighting
CN108020211A (en) * 2017-12-01 2018-05-11 云南大学 A kind of method of unmanned plane aeroplane photography estimation instruction plant biomass
CN111163628A (en) * 2017-05-09 2020-05-15 蓝河技术有限公司 Automatic plant detection using image data


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhu Hongwei: "Comprehensive assessment of the invasion risk of Sonneratia apetala on Qi'ao Island, Zhuhai", China Master's Theses Full-text Database, Basic Sciences, pages 85-87 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination