CN113177911A - Method for nondestructive evaluation of ozone sensitivity of plants by leaves - Google Patents


Info

Publication number
CN113177911A
CN113177911A
Authority
CN
China
Prior art keywords
damage
leaf
ozone
image
plants
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110391446.1A
Other languages
Chinese (zh)
Inventor
张巍巍
方得安
许佳瑶
孙子程
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang University
Original Assignee
Shenyang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang University filed Critical Shenyang University
Priority to CN202110391446.1A priority Critical patent/CN113177911A/en
Publication of CN113177911A publication Critical patent/CN113177911A/en
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/136 Segmentation involving thresholding
    • G06T 7/194 Segmentation involving foreground-background segmentation
    • G06T 7/40 Analysis of texture
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/90 Determination of colour characteristics
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30181 Earth observation
    • G06T 2207/30188 Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention relates to a method for nondestructive evaluation of plant ozone sensitivity by leaves, belongs to the field of biological monitoring of atmospheric pollution, and can be applied to nondestructive, continuous measurement of injury symptoms in ozone-sensitive plants such as soybean, poplar, white birch, Tilia amurensis, and Quercus liaotungensis. The method comprises the following steps: classifying damage types from leaf-surface images, segmenting the images, extracting and analyzing features such as symptom shape and area, establishing an extraction method based on the ISODATA algorithm in the Lab color mode, and grading the degree of injury after obtaining the lesion rate (FIG. 1). Compared with existing visual-observation or destructive-sampling methods, the disclosed method can analyze the damaged area and proportion of the leaf surface without harming the leaf, realize continuous monitoring and accurate grading, solve the problem that injury degrees cannot be compared quantitatively because ozone injury symptoms differ between plants, and judge plant ozone injury simply and directly, with the advantages of speed, nondestructiveness, and continuous observation.

Description

Method for nondestructive evaluation of ozone sensitivity of plants by leaves
Technical Field
The invention relates to a method for nondestructive evaluation of plant ozone sensitivity by leaves, belongs to the field of air pollution biological monitoring, and is particularly suitable for nondestructive, continuous monitoring and quantitative evaluation of plant ozone damage under natural conditions.
Background
High concentrations of atmospheric ozone cause oxidative damage to plants, alter plant metabolism to varying degrees, change physiological and biochemical processes such as pigment content, tissue structure, and cell ultrastructure, and make plants show external injury symptoms. These external symptoms appear in images as differences in color, shape, and so on, providing a basis for evaluating plant ozone sensitivity with image-processing technology. Ozone injury leaves no detectable residue in the plant, but it causes the upper leaves in the canopy to show symptoms such as spots, plaques, and desiccation. Moreover, the degree of injury is usually proportional to the superficial damaged area of the leaf surface: as the ozone concentration rises, the color and shape of the lesions change and their area increases. Therefore, under natural conditions, the degree to which ozone affects a plant can be judged and evaluated from changes in foliar injury symptoms.
At present, plant ozone injury is mainly evaluated by observing leaves with the naked eye, or by cutting them off and scanning them, then roughly estimating the damaged proportion and grading it coarsely. These approaches are highly subjective and cannot provide rapid, nondestructive quantitative detection; once leaves are removed, the development of injury symptoms on the same leaf can no longer be monitored continuously; and removal also affects plant growth, is time-consuming and labor-intensive, and is unsuitable for large-scale deployment. The invention solves these problems of strong subjectivity, lack of quantification, high labor cost, and interference with plant growth. The method can analyze the damaged area and proportion of the leaf surface without harming the leaf, realize continuous monitoring and accurate grading, solve the problem that injury degrees cannot be compared quantitatively because ozone injury symptoms differ between plants, and judge plant ozone injury simply and directly, with the advantages of speed, nondestructiveness, and continuous observation.
Disclosure of Invention
(1) Data acquisition: under natural conditions, use a camera or mobile phone to continuously collect color image data (RGB, JPG) of the leaf surfaces of ozone-sensitive plants every 7-20 days, with the target leaf surface required to occupy most of the image.
(2) Image processing: classify the leaf-surface damage types, segment the leaf-surface damage images, extract and analyze shape and area characteristics with image-processing software, separate the normal part of the leaf surface from the lesion area, and establish an extraction method based on the ISODATA algorithm in the Lab color mode.
(3) Quantitative analysis of injury: for the same leaf, or leaves at corresponding positions on the same plant, calculate the difference between the normal-leaf image and the whole-leaf image to obtain the damaged area and proportion, and quantitatively analyze the degree of leaf injury (FIG. 1). The specific operation steps are as follows.
Step 1: collection plant leaf image (RGB, JPG)
And acquiring a blade color image under a natural condition by using a color digital camera or a mobile phone camera. It is desirable to capture the image in as simple a background as possible so that the target leaf for analysis can occupy a large area of the image.
Step 2: segmenting the blade image, removing the background, and extracting the target blade from the complex background;
and 3, extracting the blades shot in a natural state from a complex background by using drawing 3D software built in a win10 system. The specific operation is as follows: firstly, determining a target blade in an image, detecting a blade boundary by using a painting tool (magic painting brush), and obtaining a complete target blade image with a complex background removed by adjusting, adding and deleting complex backgrounds such as soil, branches and sky, so that the gray level of the background is 0, and the target blade presents an original color image.
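The result of this step (background gray level 0, leaf in original color) can also be reproduced programmatically. A minimal sketch in Python with NumPy, assuming the leaf mask is already available; in the text the selection is interactive in Windows 10's built-in drawing software, which has no scripting interface, so `leaf_mask` here is a hypothetical input:

```python
import numpy as np

def remove_background(rgb, leaf_mask):
    """Zero out background pixels so the background gray level is 0,
    while the target leaf keeps its original color.

    rgb       : (H, W, 3) uint8 color image
    leaf_mask : (H, W) boolean array, True on the target leaf
                (hypothetical input standing in for the interactive
                selection described in the text)
    """
    out = rgb.copy()
    out[~leaf_mask] = 0  # background becomes black (gray level 0)
    return out

# toy 2x2 image: left column is "leaf", right column is "background"
img = np.full((2, 2, 3), 200, dtype=np.uint8)
mask = np.array([[True, False],
                 [True, False]])
clean = remove_background(img, mask)
```

Any segmentation tool that yields a boolean foreground mask could supply `leaf_mask`; the interactive step in the patent plays exactly that role.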
Step 3: Symptom classification
Ozone generally does not affect the veins; injury symptoms usually appear in the mesophyll between the veins, and old leaves are affected more severely than young ones. Leaves with obvious insect spots, raised mold spots, and the like are excluded. Using color characteristics as the criterion, ozone-injured leaves are divided into two categories: spotted leaves and ulcerated leaves (FIG. 2). The invention takes these two types of leaves as the analysis objects:
a. Spotted leaves: fine white, yellow-brown, or tan spots scattered over the leaf tips and margins; sometimes spots cover the whole leaf.
b. Ulcerated leaves: yellow-brown or brown plaques of irregular shape and varying size in the gaps between veins; in severe cases the lesions darken and dry out, and the leaf tip scorches, curls, and falls off.
Step 4: Image segmentation to distinguish the leaf region from the lesion region
Lesion features (color, shape, and texture) are extracted, and the complete color picture of the target leaf is converted into a grayscale picture containing only black, white, and shades of gray. ImageJ software is then used with the ISODATA algorithm to quickly find the optimal threshold and separate the normal part of the leaf from the lesion area, establishing an ozone-injury-symptom extraction method based on the Lab color mode and the ISODATA (iterative self-organizing data analysis) algorithm.
ImageJ analysis steps:
① Open: open the target leaf image.
② Image/Type/Lab Stack: convert the color image from the RGB model to an L*a*b* image, and convert the L* or a* component to an 8-bit image.
③ Image/Adjust/Threshold/Default: apply the ISODATA threshold-segmentation method to the histogram of the L* or a* component to obtain the optimal-threshold image, whose area is the area of green leaf tissue with the lesions removed. Filling the whole leaf instead yields the image of the maximum leaf area.
④ Process/Binary/Make Binary: binarize the optimal-threshold image obtained by segmentation so that it shows a clear black-white separation, giving the image of the normal part of the leaf. After binarizing the maximum-leaf image, if holes are found in the leaf, fill them with Fill Holes to obtain the whole-leaf image.
⑤ Process/Image Calculator: because ozone often causes plaques at the leaf margin, directly inverting the binarized normal-part image would underestimate the area of marginal plaques; an accurate lesion image is therefore obtained by subtracting the normal-part image from the whole-leaf image.
⑥ Analyze/Analyze Particles: calculate the area of each lesion, the total lesion area, and the maximum leaf area.
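The thresholding and subtraction of steps ③-⑤ can be sketched outside ImageJ as well. The following Python/NumPy sketch implements the iterative-intermeans loop commonly identified with the ISODATA method behind ImageJ's Default threshold; the function names, the toy data, and the assumption that normal tissue falls below the threshold are illustrative, not part of the patent:

```python
import numpy as np

def isodata_threshold(gray):
    """Iterative-intermeans (ISODATA-style) threshold: start at the
    global mean, then move the threshold to the midpoint of the two
    class means until it stops changing."""
    t = gray.mean()
    while True:
        lo = gray[gray <= t]
        hi = gray[gray > t]
        if lo.size == 0 or hi.size == 0:
            break
        t_new = (lo.mean() + hi.mean()) / 2.0
        if abs(t_new - t) < 0.5:
            break
        t = t_new
    return t

def lesion_areas(channel, whole_leaf_mask):
    """Steps ③-⑤ in miniature: threshold one Lab component inside the
    leaf, call the darker class "normal tissue" (an assumption; which
    side is normal depends on the component and the species), and take
    lesions as whole leaf minus normal part, as in step ⑤.

    Returns (normal_area, lesion_area, leaf_area) in pixels.
    """
    t = isodata_threshold(channel[whole_leaf_mask].astype(float))
    normal = (channel.astype(float) <= t) & whole_leaf_mask
    # subtracting the normal part from the whole leaf, rather than
    # inverting it, avoids underestimating plaques on the leaf margin
    lesion = whole_leaf_mask & ~normal
    return int(normal.sum()), int(lesion.sum()), int(whole_leaf_mask.sum())

# toy leaf: three dark "normal" pixels and three bright "lesion" pixels
channel = np.array([[10, 10, 200],
                    [10, 200, 200]], dtype=np.uint8)
mask = np.ones_like(channel, dtype=bool)
areas = lesion_areas(channel, mask)  # (3, 3, 6)
```

Normal area plus lesion area equals the whole-leaf area by construction, mirroring the Image Calculator subtraction of step ⑤.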
Step 5: Quantitative analysis of the degree of ozone injury to the leaf
The size and proportion of the damaged area are the indexes measuring the degree of ozone injury to the leaf. The lesion rate is calculated by the formula: lesion rate (%) = total lesion area / total leaf area × 100. This allows rapid, nondestructive quantitative detection of lesions on large numbers of leaves.
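As a check on this grading formula, the lesion rates reported in the two worked examples later in this document follow directly from their pixel counts:

```python
def lesion_rate(total_lesion_area, total_leaf_area):
    """Lesion rate (%) = total lesion area / total leaf area x 100."""
    return total_lesion_area / total_leaf_area * 100

# pixel counts taken from the two worked examples in this document
rate_oak   = lesion_rate(46150, 3215182)   # Quercus mongolica, ~1.44
rate_birch = lesion_rate(471836, 2987415)  # white birch, ~15.8
```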
Drawings
FIG. 1 is a flow chart of the method for nondestructive evaluation of plant ozone sensitivity by leaves.
FIG. 2 shows two typical ozone injury symptoms.
FIG. 3 is a color picture of ozone foliar injury on Quercus mongolica.
FIG. 4 is the target leaf injury image of Quercus mongolica.
FIG. 5 is the ozone injury area analysis of the Quercus mongolica target leaf.
FIG. 6 is a color picture of ozone foliar injury on white birch.
FIG. 7 is the target leaf injury image of white birch.
FIG. 8 is the normal-area image of the white birch target leaf.
FIG. 9 is the injury area analysis of the white birch target leaf.
Detailed Description
The invention will be further illustrated with reference to the following specific examples.
Example analysis 1
In this example, Quercus mongolica was selected as the subject, and foliar injury analysis and ozone sensitivity evaluation were performed by the method of the invention:
Step 1: Collect plant leaf images (RGB, JPG), with the target leaf spread out as much as possible (FIG. 3);
Step 2: In Windows 10, blacken the background with the Magic Select function of Paint 3D to obtain the target leaf and extract it from the complex background (FIG. 4);
Step 3: Classify the symptoms by color and texture characteristics; the leaf is judged to be a spotted leaf;
Step 4: Perform image segmentation to distinguish the leaf region from the lesion region, following ImageJ operation steps ①-⑥ to obtain the lesion area and the whole-leaf area. First open the target leaf file, convert the color image from the RGB model to an L*a*b* image, and convert the a* component to an 8-bit image. Run Image/Adjust/Threshold/Default to obtain the normal-area image of the leaf. Convert the a* component to an 8-bit image again, run Image/Adjust/Threshold/Default, fill the whole leaf, click Apply, and run Process/Binary/Fill Holes to obtain the whole-leaf image. Run Process/Image Calculator to obtain an accurate image of the lesion area. Run Analyze/Analyze Particles to calculate the area of each lesion, the total lesion area, and the maximum leaf area (FIG. 5);
Step 5: Quantitatively analyze the degree of ozone injury to the leaf. By the formula lesion rate (%) = total lesion area / total leaf area × 100: lesion rate (%) = 46150 ÷ 3215182 × 100 ≈ 1.44.
Example analysis 2
In this example, white birch was selected as the subject, and foliar injury analysis and ozone sensitivity evaluation were performed by the method of the invention:
Step 1: Collect plant leaf images (RGB, JPG), with the target leaf spread out as much as possible (FIG. 6);
Step 2: In Windows 10, blacken the background with the Magic Select function of Paint 3D to obtain the target leaf and extract it from the complex background (FIG. 7);
Step 3: Classify the symptoms by color and texture characteristics; the leaf is judged to be an ulcerated leaf;
Step 4: Perform image segmentation to distinguish the leaf region from the lesion region, following ImageJ operation steps ①-⑥ to obtain the lesion area and the whole-leaf area. First open the target leaf file, convert the color image from the RGB model to an L*a*b* image, and convert the L* component to an 8-bit image. Run Image/Adjust/Threshold/Default to obtain the normal-area image of the leaf (FIG. 8). Convert the a* component to an 8-bit image again, run Image/Adjust/Threshold/Default, fill the whole leaf, click Apply, and run Process/Binary/Fill Holes to obtain the whole-leaf image. Run Process/Image Calculator to obtain an accurate image of the lesion area. Run Analyze/Analyze Particles to calculate the area of each lesion, the total lesion area, and the maximum leaf area (FIG. 9);
Step 5: Quantitatively analyze the degree of ozone injury to the leaf. By the formula lesion rate (%) = total lesion area / total leaf area × 100: lesion rate (%) = 471836 ÷ 2987415 × 100 ≈ 15.8.
The above describes only a preferred embodiment of the present invention; the scope of the invention is not limited thereto, and any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed herein shall fall within the scope of protection of the invention.

Claims (4)

1. A method for nondestructive evaluation of plant ozone sensitivity by leaves, characterized by comprising the following steps: (1) data acquisition: under natural conditions, use a camera or mobile phone to continuously collect color image data (RGB, JPG) of the leaf surfaces of ozone-sensitive plants, with the target leaf surface required to occupy most of the image; (2) image processing: classify the leaf-surface damage types, segment the leaf-surface damage images, extract and analyze shape and area characteristics with image-processing software, separate the normal part of the leaf surface from the lesion area, and establish an extraction method based on the ISODATA algorithm in the Lab color mode; (3) quantitative analysis of injury: for the same leaf, or leaves at corresponding positions on the same plant, calculate the difference between the normal-leaf image and the whole-leaf image, obtain the damaged area and proportion, and quantitatively analyze the degree of leaf injury.
2. The method for nondestructive evaluation of plant ozone sensitivity by leaves according to claim 1, wherein the ozone injury symptoms include the spotted and ulcerated damage types.
3. The method for nondestructive evaluation of plant ozone sensitivity by leaves according to claim 1, wherein an ozone-injury-symptom extraction method based on the ISODATA algorithm in the Lab color mode (L* or a* component) is established for the leaf damage image.
4. The method for nondestructive evaluation of plant ozone sensitivity by leaves according to claim 1, wherein the quantitative analysis of leaf injury comprises: for the same leaf, or leaves at corresponding positions on the same plant, calculating the difference between the normal-leaf image and the whole-leaf image, obtaining the damaged area and proportion, and quantitatively analyzing the degree of leaf injury.
CN202110391446.1A 2021-04-13 2021-04-13 Method for nondestructive evaluation of ozone sensitivity of plants by leaves Withdrawn CN113177911A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110391446.1A CN113177911A (en) 2021-04-13 2021-04-13 Method for nondestructive evaluation of ozone sensitivity of plants by leaves

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110391446.1A CN113177911A (en) 2021-04-13 2021-04-13 Method for nondestructive evaluation of ozone sensitivity of plants by leaves

Publications (1)

Publication Number Publication Date
CN113177911A (en) 2021-07-27

Family

ID=76924894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110391446.1A Withdrawn CN113177911A (en) 2021-04-13 2021-04-13 Method for nondestructive evaluation of ozone sensitivity of plants by leaves

Country Status (1)

Country Link
CN (1) CN113177911A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7305902B1 (en) * 2023-04-21 2023-07-10 株式会社レフ・テクノロジー Yellowing information acquisition method and yellowing information acquisition system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101539531A (en) * 2009-04-09 2009-09-23 浙江大学 Rice leaf blast detection and classification method based on multi-spectral image processing
CN101819039A (en) * 2010-04-19 2010-09-01 虞毅 Method for analyzing and evaluating earth surface coarse graining degree by using digital image
CN105844610A (en) * 2016-01-23 2016-08-10 重庆布委科技有限公司 Plant leaf rusty stain automatic detection system and method based on machine vision
CN106910214A (en) * 2017-02-09 2017-06-30 中国林业科学研究院资源信息研究所 A kind of santal trunk insect pest degree of injury level images method of discrimination
CN107330892A (en) * 2017-07-24 2017-11-07 内蒙古工业大学 A kind of sunflower disease recognition method based on random forest method
CN107507175A (en) * 2017-08-18 2017-12-22 潘荣兰 A kind of device for being used to calculate Maize Leaf helminthosporium maydis scab occupied area ratio
CN110544237A (en) * 2019-08-06 2019-12-06 广州林猫物种自动识别技术有限公司 Oil tea pest model training method and recognition method based on image analysis
CN111967357A (en) * 2020-08-05 2020-11-20 茅台学院 Intelligent sorghum disease identification system and identification method based on machine vision


Similar Documents

Publication Publication Date Title
CN113989279B (en) Plastic film quality detection method based on artificial intelligence and image processing
Bock et al. From visual estimates to fully automated sensor-based measurements of plant disease severity: status and challenges for improving accuracy
CN109978822B (en) Banana maturity judging modeling method and judging method based on machine vision
Barbedo An automatic method to detect and measure leaf disease symptoms using digital image processing
Patil et al. Leaf disease severity measurement using image processing
CN107665492B (en) Colorectal panoramic digital pathological image tissue segmentation method based on depth network
US8068132B2 (en) Method for identifying Guignardia citricarpa
CN108181316B (en) Bamboo strip defect detection method based on machine vision
CN107860722B (en) Method and system for online detection of internal quality of honeydew melons
Chopin et al. RootAnalyzer: a cross-section image analysis tool for automated characterization of root cells and tissues
CN111852792B (en) Fan blade defect self-diagnosis positioning method based on machine vision
CN114723704A (en) Textile quality evaluation method based on image processing
Laga et al. Image-based plant stomata phenotyping
AU2020103260A4 (en) Rice blast grading system and method
CN111160451A (en) Flexible material detection method and storage medium thereof
Tech et al. Methods of image acquisition and software development for leaf area measurements in pastures
CN105067532A (en) Method for identifying early-stage disease spots of sclerotinia sclerotiorum and botrytis of rape
CN106780347A (en) A kind of loquat early stage bruise discrimination method based on OCT image treatment
Wah et al. Analysis on feature extraction and classification of rice kernels for Myanmar rice using image processing techniques
CN114359539A (en) Intelligent identification method for high-spectrum image of parasite in sashimi
CN115587988A (en) Method for distinguishing maturity and height of tobacco leaves based on digital image processing
CN113177911A (en) Method for nondestructive evaluation of ozone sensitivity of plants by leaves
CN113129281B (en) Wheat stem section parameter detection method based on deep learning
Janardhana et al. Computer aided inspection system for food products using machine vision—a review
Yadav et al. An automated image processing method for segmentation and quantification of rust disease in maize leaves

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210727