AU2020103332A4 - IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System - Google Patents

IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System

Info

Publication number
AU2020103332A4
Authority
AU
Australia
Prior art keywords
regions
green vegetation
soil
plant
memory map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2020103332A
Inventor
Kaushal Bhatt
Nitin Chhimwal
Prashant Mishra
Abhay Kumar Sharma
Sudhir Sharma
Peeyush Tewari
Sandesh Tripathi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tewari Peeyush Dr
Tripathi Sandesh Dr
Original Assignee
Tewari Peeyush Dr
Tripathi Sandesh Dr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tewari Peeyush Dr and Tripathi Sandesh Dr
Priority to AU2020103332A
Application granted
Publication of AU2020103332A4
Legal status: Ceased (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M - CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 21/00 - Apparatus for the destruction of unwanted vegetation, e.g. weeds
    • A01M 21/04 - Apparatus for destruction by steam, chemicals, burning, or electricity
    • A01M 21/043 - Apparatus for destruction by steam, chemicals, burning, or electricity by chemicals
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/40 - Extraction of image or video features
    • G06V 10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 1/00 - Dropping, ejecting, releasing, or receiving articles, liquids, or the like, in flight
    • B64D 1/16 - Dropping or releasing powdered, liquid, or gaseous matter, e.g. for fire-fighting
    • B64D 1/18 - Dropping or releasing powdered, liquid, or gaseous matter, e.g. for fire-fighting, by spraying, e.g. insecticides
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00 - Equipment not otherwise provided for
    • B64D 47/08 - Arrangements of cameras
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 - UAVs specially adapted for particular uses or applications
    • B64U 2101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G06T 2207/10012 - Stereo images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30181 - Earth observation
    • G06T 2207/30188 - Vegetation; Agriculture
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/40 - Analysis of texture
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/90 - Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Image Processing (AREA)

Abstract

Our invention "IMLS-Weed Recognition/Classification" is an intelligent weed recognition and identification system comprises a chlorophyll sensor for detecting green vegetation and memory map means for storing images which contain different forms of green vegetation. The IMLS-weed recognition/classification is also the memory maps stored in memory are processed to eliminate the background information and leave a memory map containing only green vegetation and the enhanced memory map is further processed by an operation of segmentation into identifiable regions and the identifiable green vegetation regions are processed to identify unique attributes for each of the regions. The IMLS-weed recognition/classification also unique attributes for each of the regions is stored in a reference data base library and are used as reference data for comparing other green vegetation with the data stored in the base model by a processor which matches green vegetation in other regions with the green vegetation stored in said reference data base model and further produces decision data signals which are used by a controller to control a plurality of spray nozzles covering the area sensed and for dispensing a plurality of selectable controlled chemicals. The IMLS-weed recognition/classification is also including a method of determining the position of weeds growing in soil amongst crops comprising the steps of: 1. Using a camera to acquiring a stereoscopic image of plants growing on soil. 2. Segmenting plant pixels and soil pixels. 3. Creating a modelised height profile of the soil at positions at which soil pixels have not been captured by the stereoscopic image based on information derived from adjacent soil pixels. 4. Determining a corrected plant height at plant pixels representing the distance between the plant pixel and the modelised soil underneath. 5. Differentiating between weeds and crops by comparing the corrected plant height with an expected crop height. Once the position of a weed has been determined, it may be destroyed, for example by heat applied to the identified position or by a robotic arm. 21 LOORNRL X R E IM E EST A SUBTRACT SEGMEN FLANTS SpEa6D .R o MOe a T ONOU UN:KGUIN NT0 CROUPS. amTO sunAE C:B U.NT ROILRD SPEED COAND S 34 IDEgg:Y: 13ECHN PROCESS SET WP PLAT coEC4e0mN TYPE5c NTC~i Ft ENTTY p FATFO L $ATA DE5 OOR RO TYPES DAA ATAK O LC 2 TE9 4EGIO ETC - RE0 Weedmappng wth UV imgeryand bag Z-4sa orsbsdiag asfe R-4 17? ..1RAY .HEM MEmeLL imagecpbagn of visua words sased spagaclasppfng FIG.1: S A CHEATICBLOK DIGRA OF HE RESET IVENTON OVELSMAT RCgToNrAN CLASSIFCATIONSYSTEM

Description

FIG. 1: A schematic block diagram of the present invention's novel smart weed recognition and classification system.
IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System
FIELD OF THE INVENTION
Our invention "IMLS-Weed Recognition/Classification" is related to intelligent weed recognition/classification using machine learning system and also relates to systems used to recognize and classify weeds by type, size, shape and texture. More particularly, the present invention relates to a dynamic on-line and real time weed recognition system which is adapted to control the type, amount and timing of herbicide spray heads on a moving platform.
BACKGROUND OF THE INVENTION
It is well known that agricultural sprays for the control of weeds are usually applied over a complete area in which weeds are to be controlled. More recently, spray systems have been developed which employ sensors and control systems as well as a plurality of individually selectable spray heads. These systems are adapted to first locate the vegetation to be sprayed and to activate selected spray heads so as to spray only those areas in which vegetation is sensed or recognized. Such systems are shown in U.S. Pat. No. 5,144,767, which is classified in International Class A01G, Sub-Class 1/12 and U.S. Class 47, Sub-Classes 1.43 and 1.7. Such prior art systems decide to spray if a percentage level of green vegetation is recognizable in the area to be sprayed. While such prior art systems are an improvement over the complete-area coverage spraying used previously, they fail to recognize the types of weeds, or the size and texture of the weeds as distinguished from other vegetation.
Computer vision systems have been successfully used for automatically guiding the mechanical or chemical destruction of weeds between rows of crops; knowledge of the position of the rows where crops should be growing and the assumption that plants growing outside such positions are weeds may be used in such systems. However, for many horticultural crops the removal of weeds from inside a row or band of crops in which the weeds are mixed in with plants in a random manner remains a long and costly manual operation.
The use of stereoscopic images to assess the distance between a plant and a camera has been used with the aim of differentiating between crops and weeds and thus allowing the position of weeds to be determined. This is based on the assumption that the weeds and the crops grow at different speeds and the distance between the plant and the camera as determined from the image may be used to differentiate between crops and weeds. One aim of this invention is to improve the accuracy of the identification of weeds or of the differentiation between crops and weeds, particularly in the case of crops sown in bands in a substantially random distribution within the band with weeds present within the band of crops. Other aims will be clear from the following description.
According to one aspect, the present invention provides a method of differentiating between weeds and crops as defined in claim 1, particularly with a view to determining the position of weeds growing in soil amongst crops.
According to another aspect, the present invention provides a method of selectively destroying weeds, preferably individually, using such a method to determine the position of weeds. The invention also extends to equipment adapted to identify and/or destroy weeds using such methods. The dependent claims define preferred or alternative embodiments of the invention.
The development of an automated in-row weeding device requires identifying the precise position of weeds. Once the position of a weed has been determined, an automated system may be used to destroy the weed, for example, by delivering or spraying a chemical at the precise location of the weed (for example by controlling an array of micro-sprayers rather than blanket spraying of chemicals), by thermal destruction for example using a flame or heated probe, or by using a robotic arm (for example using a mechanical cutter having a width of perhaps 1-2 cm). The possibility of thermal or non-chemical destruction of individual weeds has particular application for organic agriculture. The position of weeds may be generated in the form of a weed map or spray map.
The invention is based on the premise that improved accuracy in the automated identification of weeds and/or in the differentiation between weeds and crops and/or in the identification of their position can be achieved by estimating the height of plants protruding from the soil rather than the distance between the top of a plant and a fixed measurement device. This is particularly the case when the level of the soil is not even or planar and/or the soil has a non-planar profile or surface roughness. It is even more the case in such conditions when the plants are young and therefore of small size compared with the irregularities of the ground.
The invention is particularly applicable where the difference in height between weeds and crops is of the same order of magnitude as the surface roughness of the soil, or where the ratio a/b between:
1. the variation in the soil height, for example within the area of a captured image, and
2. the average crop height
is greater than 1/8, 1/5, 1/4 or 1/3. This may be the case during the early stages of growth of the crops.
The height of the crops being analysed may be greater than 2 mm, 5 mm, 10 mm or 20 mm; it may be less than 200 mm, 150 mm, 100 mm, 80 mm or 50 mm. The invention is of particular use in relation to:
1) crops sown in a line or band along a ridge or raised strip of soil, for example a ridge of a ploughed ridge and furrow; and/or
2) crops sown in rows or bands having a width of less than 100 cm, particularly less than 70 cm, less than 60 cm or less than 50 cm.
The average density of crops within a band may be greater than 10 crop plants per m²; it may be greater than 20, 50, 100 or 150 crop plants per m². This average density may be assessed, for example for carrots, by considering ten separate areas 30 cm long by 5 cm wide at which crops have been sown along the centre line of the band, counting the number of crop plants in each of these areas and calculating the mean average density of crops per m². The average crop density, particularly when seeds are sown and/or in the initial part of their growth cycle, may be greater, for example greater than 500 or 1000 per m². Early weeding is particularly beneficial for horticultural crops, for example carrots: some common annual weeds have their peak of germination around the same time as crop sowing and affect the crop growth.
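As a short worked example of this density assessment (the plant counts below are hypothetical, chosen only to illustrate the arithmetic):

```python
# Ten hypothetical counts of crop plants in 30 cm x 5 cm sampling areas
# (0.015 m2 each), following the assessment procedure described above.
counts = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]
area_m2 = 0.30 * 0.05                               # one sampling area in m2
density = sum(counts) / (len(counts) * area_m2)     # mean crop plants per m2
print(round(density))                               # about 193 plants per m2
```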
It has been shown that there is a significant effect of weed removal timing on the yield of, for example, carrots: plots weeded 3 and 5 weeks after sowing have a significantly greater yield than those weeded at 7 weeks. Furthermore, carrots are sown in a relatively dense, irregular pattern. Consequently, the invention is particularly applicable to differentiation between plants comprising mixed weeds and crops early in the crop growing cycle. This may be within the time period starting 7 days, 10 days or 14 days after sowing of the crops and/or ending 60 days, 90 days or 100 days after sowing of the crops.
The invention is particularly applicable to identification of weeds and/or determination of weed position where the weeds are mixed with crops, particularly where the weeds are within a crop line or band. This is often the case with horticultural crops. The invention may be of particular application for use with one or more of the following crops: one or more apiaceae (also called umbellifers); carrots; celery; fennel; medicinal plants; cumin; parsley; coriander; parsnip; common beans. The apiaceae class is intended to include:
Anethum graveolens - Dill, Anthriscus cerefolium - Chervil, Angelica spp. - Angelica, Apium graveolens - Celery, Arracacia xanthorrhiza - Arracacha, Carum carvi - Caraway, Centella asiatica - Gotu kola (pennywort), Coriandrum sativum - Coriander, Cuminum cyminum - Cumin, Daucus carota - Carrot, Eryngium spp. - Sea holly, Foeniculum vulgare - Fennel, Myrrhis odorata - Cicely, Ferula gummosa - Galbanum, Pastinaca sativa - Parsnip, Petroselinum crispum - Parsley, Pimpinella anisum - Anise, Levisticum officinale - Lovage. The invention may also be used in relation to: beans; potatoes; chicory; beetroot.
Automated weeding equipment according to the invention may be arranged to work its way along a line or band of crops; it may be self-propelling. It may operate on a step-by-step basis so that it remains stationary at a first position along a line of crops whilst detecting the position of weeds and dealing with them before moving to a second position further along the line of crops and repeating this procedure. Alternatively, the weeding equipment may function whilst it is moving.
The weeding equipment may be arranged to travel along a line of crops several times during a critical weeding period. Thus, for example, if a weed is not correctly identified as such during a first passage, it may be correctly identified during a second passage, for example 2, 3, 4, 5 or more days later. Such a system may be used to reduce the risk of inadvertently destroying crops through incorrect identification: the first time the weeding equipment passes, a weed having a corrected plant height similar to the expected crop height may not be identified as a weed. Nevertheless, at the subsequent passage of the weeding equipment, given the difference in growth rates of the weeds and the crops, the difference between the corrected plant height and the expected crop height will be greater, thus facilitating correct identification of that particular weed.
The weeding equipment may be partially or entirely solar powered; it may comprise one or more solar panels. It may use solar energy to supplement and/or replenish an electrical energy storage device, for example a battery or accumulator. Such arrangements may facilitate the autonomy of the weeding device. The weeding equipment may comprise a guidance system to direct its movement, for example, along a line of crops.
The stereoscopic data of plants growing on soil may be acquired in the form of an image, for example using a camera as a sensor. In this case, the captured plant data points (ie data points representing a position at which the presence of a plant has been detected) and the captured soil data points (ie data points representing a position at which the presence of soil has been detected) may be represented by pixels of the image.
The data analysis and/or weed detection is preferably implemented using a computer, for example using computer software. Where structured light is projected, this may be coded structured light. Nevertheless, non-coded structured light may be used and may allow faster data processing. A line scanner system may be used to acquire stereoscopic information, for example using common structured light. When using a line scanner, segmentation of each image may be used to find the scanner line, but this may result in reliability problems due to occlusions, discontinuities and large differences in reflectance between soil and plants. Alternatively, temporal analysis of image sequences may be used to find the lines and may actually give better results than using coded structured light. Passive stereoscopy may also be used to acquire stereoscopic information.
PRIOR ART SEARCH
US5144767A * 1988-06-22 1992-09-08 The Minister For Agriculture & Rural Affairs For The State Of New South Wales - Controller for agricultural sprays
US5222324A * 1991-02-21 1993-06-29 Neall Donald L. - Crop spraying system
US5278423A * 1992-12-30 1994-01-11 Schwartz Electro-Optics, Inc. - Object sensor and method for use in controlling an agricultural sprayer
US8295979B2 * 2010-01-06 2012-10-23 Deere & Company - Adaptive scheduling of a service robot
US8285460B2 2010-01-06 2012-10-09 Deere & Company - Varying irrigation scheduling based on height of vegetation
DE102011120858A1 * 2011-12-13 2013-06-13 Yara International Asa - Method and device for contactless determination of plant parameters and for processing this information
CN107846848A * 2015-07-02 2018-03-27 - Robotic vehicle and method for automatically processing plant organisms using a robot
US10123475B2 2017-02-03 2018-11-13 Cnh Industrial America Llc - System and method for automatically monitoring soil surface roughness
US10262206B2 2017-05-16 2019-04-16 Cnh Industrial America Llc - Vision-based system for acquiring crop residue data and related calibration methods
OBJECTIVES OF THE INVENTION
1. The objective of the invention is to provide a system for recognizing weeds by shape, type, size, color, texture and structural properties of texture. It is a principal object of the present invention to provide a system for recognizing and identifying green vegetation by individual plants grouped into regions for purposes of analysis.
2. The other objective of the invention is to provide a plurality of image analysis functions on potential green vegetation regions established within a plurality of memory map frames.
3. The other objective of the invention is to identify and classify green vegetation regions and potentially occluded plants.
4. The other objective of the invention is to identify and classify the same type of plant even though it may have a different leaf configuration.
5. The other objective of the invention is to generate control signals for spraying a plurality of weed-specific chemicals simultaneously onto a predetermined area.
6. The other objective of the invention is to dynamically build an image base data bank of useful vegetation attributes which permits refinement and optimization of the functions used in the process of recognizing certain types of vegetation.
SUMMARY OF THE INVENTION
In the invention, a smart weed recognition and identification system is provided which comprises a green vegetation sensor adapted to supply standard TV color signals and a vegetation index ratio to a frame grabber and memory storage device that is adapted to store a memory map of the sensed vegetation. The stored vegetation data in the memory map is first enhanced, and then a plurality of image analysis functions are performed on the vegetation regions identified in the memory map frame, including texture segmentation and occlusion resolution, edge analysis and region clustering. During a teaching operation, the results of the analysis functions are stored in a plant library as base models. Subsequently, other areas of vegetation are sensed as regions, subjected to the plural image analysis functions, and the results are compared with the reference base models learned during the teaching operation. Matching is employed to identify plant type, shape, size, color and texture and to generate control signals capable of controlling selectable plural spray heads and selectable plural chemicals.
Apparatus for recognizing and controlling weeds growing in a field of desired vegetation, comprising: sensor means for detecting green vegetation in said field, first memory means for storing a vegetation memory map of said field to provide a vegetation image representative of different forms of green vegetation in said field, means for eliminating background information from said vegetation memory map to produce an enhanced vegetation memory map in said memory means, means for segmenting said enhanced vegetation memory map into identifiable map regions corresponding to unique field regions of said field, means for establishing data representative of unique attributes for each of said map regions, second memory means for storing reference data base models of the green vegetation defining each of said regions, processing means for matching each enhanced vegetation memory map region with said stored reference data base model of the corresponding region, said processing means being coupled to a dynamic controller means for controlling the ejection of weed-controlling fluids through a plurality of spray nozzles wherein said spray is dispensed into said regions, wherein said sensor means comprises a chlorophyll sensor for generating TV color compatible signals red, blue and green and an index ratio NIR/R, wherein said processing means further comprises means for identifying the regions of green vegetation by size, shape, color, texture and spectral properties, and wherein said dynamic controller means is constructed to control chemicals that are ejected from a plurality of spray heads which are constructed to spray selected isolated green vegetation with different chemicals.
A method of recognizing, classifying and controlling weeds in an area containing green vegetation, comprising the steps of: sensing green vegetation in said area with a green vegetation sensor, storing data from the sensed area in a digital memory map in digital form, processing the digital memory map of the sensed area to eliminate background information from the digital memory map to produce a digital memory map of only the green vegetation, segmenting the green vegetation into identifiable regions, establishing unique attributes for the green vegetation for each of said regions, matching the unique attributes of regions sensed with previously stored unique attributes to determine if any of said regions contain weeds to be controlled, and generating signals for controlling a plurality of spray nozzles covering the area sensed for dispensing weed controlling chemicals into said weeds.
Problem - Solution

Presence of occlusions, thin objects - Per-pixel decoding
Shallow projector depth of field - Per-pixel decoding, weakly correlated codes
High dynamic range - High dynamic range acquisition, correlation-based decoding
Internal reflections - Pseudorandom pattern

Because of the large number of occlusions and thin objects (such as plant bracts), we chose to use a per-pixel decoding scheme, where the code is decoded at every camera pixel rather than using neighbouring pixels. The nature of the code was chosen to give robust results in the presence of high dynamic range scenes. We used weakly correlated codes with a minimum Hamming distance of 8 (empirically determined).
The Hamming distance between two codes is the number of corresponding bits that are different. The length of the code used was 22 bits, which allowed for the minimum Hamming distance requirement and gave good decoding results. The codes were decoded by correlation: the signal received by a single camera pixel over time was compared with all possible signals. As correlation also gives a measure of the reliability of the decoding, it was used to remove spurious measurements by applying a threshold.
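The following Python sketch, which is not part of the specification, illustrates the idea of weakly correlated pseudorandom codes and per-pixel decoding by correlation. The demonstration table size, retry cap and reliability threshold are illustrative assumptions; the actual system used 768 fringes, which would require a longer search or a structured code construction.

```python
import numpy as np

rng = np.random.default_rng(0)
CODE_LENGTH = 22      # bits per fringe code, as stated above
MIN_HAMMING = 8       # empirically determined minimum pairwise distance

def hamming(a, b):
    """Number of corresponding bits that differ between two binary codes."""
    return int(np.sum(a != b))

def generate_codes(num_codes, length=CODE_LENGTH, min_dist=MIN_HAMMING,
                   max_tries=200_000):
    """Greedily draw random binary codes, keeping only candidates that are at
    least min_dist away (in Hamming distance) from every code accepted so far."""
    codes, tries = [], 0
    while len(codes) < num_codes and tries < max_tries:
        tries += 1
        candidate = rng.integers(0, 2, size=length)
        if all(hamming(candidate, c) >= min_dist for c in codes):
            codes.append(candidate)
    return np.array(codes)

def decode_pixel(signal, codes, reliability_threshold=3.0):
    """Correlate the temporal signal of one camera pixel against every code;
    the best score gives the fringe index, and the score itself is used to
    reject unreliable pixels (the threshold value is an assumption)."""
    centred = signal - signal.mean()
    scores = (codes - 0.5) @ centred
    best = int(np.argmax(scores))
    if scores[best] < reliability_threshold:
        return None, float(scores[best])
    return best, float(scores[best])

# Small demonstration table (64 codes rather than the full 768 fringes).
codes = generate_codes(64)
```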
Usually, in time-multiplexing binary or Gray code techniques, the projected images are composed of black and white bands of first larger and then finer width. The wider bands cause problems when the scene is prone to internal reflections (light from one illuminated part of the scene illuminates other parts of the scene). The code used here also happened to be pseudorandom (i.e. with no apparent structure), which resulted in a more uniform illumination.
The scenes presented a high dynamic range, since the reflectance of soil can vary greatly with its moisture content and certain plant species had a highly reflective surface. We thus chose to acquire high dynamic range images using multiple-exposure blending. Four exposures of each pattern were taken at different exposure times and linearly blended. The number of exposures and the exposure times were determined empirically on potted plants. The high dynamic range acquisition allowed us to have a strong signal-to-noise ratio for all pixels of the image. An equipment-related problem encountered was the shallow depth of field of the projector: given the size of the scene and the distance from the projector, it was not possible to have the projected pattern sharp on both close and distant objects of the scene. The choice of a per-pixel decoding scheme combined with the weakly correlated code was also motivated by that characteristic.
The calibration of the camera-projector system was done using the Zhang technique from the Intel OpenCV library. A study was conducted on two carrot varieties, Nerac F1 and Namur F1, without distinction between them. Approximately 200 linear metres of rows were mechanically sown at a density of 10 to 15 seeds per area 100 mm long by 50 mm wide, which is a common commercial planting density (i.e. a mean average of 2000 to 3000 seeds per m²). Several species of weeds were naturally present in the field and others were manually introduced. The main species at the time of data acquisition were the following: Sonchus asper L., Chenopodium sp., Cirsium sp., Mercurialis perennis, Brassica sp. and Matricaria maritima. Other species might have been present. Weeds were considered as a single class in the discrimination approach since they appeared in fields in unpredictable species and quantities. Table 2 gives a summary of the acquired data. Images were acquired at an early growth stage of both carrots and weeds (from one week after crop emergence to 19 days later, which is the usual period for manual weed removal). Indeed, early weed detection can increase yields, and weed elimination becomes increasingly difficult with plant growth. A total of 28 multispectral stereoscopic images were acquired at random locations in the parcel.
Summary of acquired data.
The method used plant height as a discriminating parameter between crop and weed. The raw data acquired by the stereoscopic device was not plant height but the distance of the plants relative to the measurement device. This distance does not accurately represent plant height if the position of the device relative to the ground varies or if the ground is irregular. We thus computed a new parameter, called corrected plant height, which is independent of those problems by using plant and ground data. This parameter is the distance between plant pixels and the actual ground level under them, obtained by fitting a surface and seen from a reconstructed point of view corresponding to a camera optical axis perpendicular to the ridge plane. The crop/weed discrimination process is illustrated in Figure 4. The camera 13 was used to acquire a stereoscopic image 41 of the plants growing on the soil. First we segmented the stereoscopic image 41 into ground pixels 42 and plant pixels 43 using only the multispectral data.
This operation was done on two spectral bands by quadratic discriminant analysis. We fitted a plane surface 44 representing the average plane of the soil through the soil pixels; this was adjusted using a RANSAC (RANdom Sample Consensus) algorithm. In addition, we used the griddata function of Matlab to fit a triangle-based cubic interpolated surface 45 through the soil pixels to represent the profile or surface roughness of the soil. Because this function produced a surface that passed through all specified points, it was very sensitive to spurious pixels resulting from any imperfect segmentation between plants and ground. Furthermore, any spurious pixels were frequently present at the limit between plants and ground, which was the border of the regions that were of interest for the plant height determination.
To avoid this problem, the borders of those regions were eroded by a round structuring element of diameter 3 pixels. The plant pixels 43 and soil pixels 45 (the latter with the interpolated pixels obtained from modelling since the ground under the plants was not visible from the camera, and not all points seen by the camera were illuminated by the projector) were then put back together to create a corrected image 46 for which the orientation of the fitted plane 44 was used to rotate the data in space so as to align the plane normal with a virtual camera (X) optical axis perpendicular to the calculated plane of the soil.
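A condensed Python sketch, not taken from the specification, of the corrected-plant-height computation: RANSAC plane fitting through the soil points and interpolation of the soil profile with scipy's griddata (a close equivalent of the Matlab function mentioned above). It assumes point clouds already expressed with z as elevation after re-orientation, and it omits the border erosion and the multispectral segmentation steps.

```python
import numpy as np
from scipy.interpolate import griddata

def fit_plane_ransac(points, n_iter=500, threshold=2.0, rng=None):
    """Fit z = a*x + b*y + c through soil points (N x 3 array) with a simple
    RANSAC loop; the inlier threshold is in the same units as z."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers, best_model = 0, None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        A = np.c_[sample[:, :2], np.ones(3)]
        try:
            model = np.linalg.solve(A, sample[:, 2])          # [a, b, c]
        except np.linalg.LinAlgError:
            continue                                          # degenerate sample
        residuals = np.abs(points[:, :2] @ model[:2] + model[2] - points[:, 2])
        inliers = int((residuals < threshold).sum())
        if inliers > best_inliers:
            best_inliers, best_model = inliers, model
    return best_model

def corrected_plant_height(plant_pts, soil_pts):
    """Corrected plant height: elevation of each plant point above the
    modelled soil surface directly underneath it."""
    # Triangle-based cubic surface through the visible soil points
    soil_z = griddata(soil_pts[:, :2], soil_pts[:, 2],
                      plant_pts[:, :2], method='cubic')
    # Fall back to the RANSAC plane where the interpolation is undefined
    a, b, c = fit_plane_ransac(soil_pts)
    plane_z = plant_pts[:, 0] * a + plant_pts[:, 1] * b + c
    soil_z = np.where(np.isnan(soil_z), plane_z, soil_z)
    return plant_pts[:, 2] - soil_z
```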
For the classification between crops and weeds (by means of a quadratic discriminant analysis), we used two parameters. The first parameter is, for each plant pixel, the distance between the plant pixel and the reconstructed soil underneath (the corrected plant height). The second is the expected crop height which, in this case, was estimated from the number of days after sowing and previously determined empirical data giving the average height of the crops in question as a function of the number of days after sowing in similar growing conditions. For classification between crops and weeds, it was found that the measurement device position and ground irregularities greatly influenced the classification accuracies and that using the corrected plant height, which took into account a correction for those effects, improved the classification results (Table 3).
Classification results:
Parameter    Non-corrected plant height    Corrected plant height
Overall      66%                           83%
Carrots      75%                           85%
Weeds        57%                           80%
The overall classification accuracy without correction was 66%. For the corrected height parameter, the overall classification accuracy was 83%.
For the carrot class alone, there was a smaller improvement when going from the non-corrected height parameter to the corrected plant height than for the weed class. This can be explained by the central position of the carrot plants on the ridge and the better surface state of the soil in that area of the ridges, due to the sowing apparatus. The expected crop height may be determined automatically, for example by determining an average height of the plants or the crops from the captured images. Preferably, the corrected plant height is used for such a determination. Particularly in the case of carrots, the carrots are typically sown in a band 5 cm wide with 10 to 15 carrots per 10 cm length of the band. The position of the band may be estimated from the captured images and the image divided into a zone in which there are only weeds and a zone in which there are weeds and crops. Comparing plant height data from these two zones may be used in estimating the height of the crops.
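A minimal scikit-learn sketch of the two-parameter quadratic discriminant classification described above; the training values below are invented for illustration only and are not data from the study.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Each row is one plant pixel: [corrected plant height (mm),
# expected crop height for that image (mm)]; labels are hypothetical.
X_train = np.array([[12, 15], [14, 15], [22, 25], [26, 25],   # crop-like pixels
                    [34, 15], [6, 15], [48, 25], [55, 25]])   # weed-like pixels
y_train = np.array(['crop'] * 4 + ['weed'] * 4)

qda = QuadraticDiscriminantAnalysis()
qda.fit(X_train, y_train)

# A pixel close to the expected height classifies as crop, a tall outlier as weed.
print(qda.predict(np.array([[13, 15], [40, 15]])))
```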
Preferred stereoscopic imaging and analysis
Stereoscopic imaging aims to record three-dimensional information. There are mainly two kinds of acquisition methods, passive and active, and either may be used in the context of the invention. Passive methods usually rely on several views of the same scene to recover depth information (e.g. binocular stereoscopy, similar to human depth perception). Active methods are characterized by the projection onto the scene of some form of energy (commonly light) to help acquire depth information. Binocular stereoscopy is fairly common since it is simple to implement in hardware and is well suited to real-time acquisition; however, robust acquisition of dense stereoscopic data by this technique is not an easy task.
The imaging and analysis preferably uses structured coded light with:
• a time multiplexing code;
• a pseudo-random pattern; and
• per-pixel decoding.
The projected coded light may have at least 18, preferably at least 20 and more preferably at least 22 patterns. There may be at least 6, preferably at least 7 and more preferably at least 8 differences between each projected code.
Since the object of the prototype embodiment described above was not real-time acquisition of stereoscopic images, and several problems were encountered for passive stereoscopic data acquisition of the scenes (numerous occlusions, repetitive texture areas, texture-free areas, high dynamic range, reflective surfaces and thin objects), it was chosen to use an active system based on coded structured light. This technique is based on the projection of light onto the scene to make the correspondence problem easier. Its principle, as illustrated in Fig 5, is to project a single or multiple light patterns onto the scene, for example with a video projector 11, and to capture the image with a camera 13. In the pattern or group of patterns, the position of each element (line, column, pixel or group of pixels) is encoded. The information extracted by active stereovision is usually of better precision and higher density than that obtained using binocular stereoscopy techniques.
At least two defined spectral bands may be used in acquiring the image, for example a spectral band centred on 450 nm and having a band width of 80 nm and a spectral band centred on 700 nm and having a band width of 50 nm. The spectral bands may be implemented by the use of filters, for example placed in front of the camera. Alternatively, the camera may be adapted to have a frequency response in the desired spectral bands. A third spectral band may also be used, for example centred on 550 nm and having a band width of 80 nm. The use of selected spectral bands may increase the accuracy of differentiating between plants and soil.
The scene perceived using the coded structured light is analysed by the camera (active stereoscopy). Due to the presence of very fine objects, the existence of occlusions generating discontinuities (superimposed objects), the large dynamic range (presence of very light and very dark objects) and the existence of internal reflection, a specific light scheme was used having the following characteristics:
• Each light projection consisted of 768 fringes (corresponding to the resolution of the projector), each of these projections being called "a pattern".
• Each luminous fringe is successively lit or not lit 22 times, which corresponds to a 22-bit code; the codes are pseudo-random (without apparent structure) to avoid disturbances due to internal reflection within the scene, and are weakly correlated between themselves.
• A pattern is thus characterised by 768 luminous fringes, each being lit or not lit as a function of one of the 22 values of the code.
• For each pattern, four exposures of different duration (0.6, 0.3, 0.07 and 0.01 seconds) are used so that each part of the scene is correctly exposed, be it light or dark. The first image, with the longest exposure time, is used as the base. The overexposed part of the image is eliminated and replaced by the corresponding part of the image with a lower exposure time. This is repeated three times until a correctly exposed image is obtained.
• The image is analysed pixel by pixel by correlation between the signal emitted and the signal received.
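A minimal Python sketch of the exposure-merging step just described, assuming a linear sensor response and images already scaled to [0, 1]; the saturation threshold is an assumption.

```python
import numpy as np

def blend_exposures(images, exposure_times, saturation=0.98):
    """Merge a bracket of exposures of the same projected pattern. The longest
    exposure is the base; pixels clipped in the exposure currently in use are
    replaced by the radiance estimated from the next shorter exposure."""
    order = np.argsort(exposure_times)[::-1]             # longest exposure first
    base_t = exposure_times[order[0]]
    result = images[order[0]].astype(float)
    saturated = images[order[0]] >= saturation            # pixels clipped in the base
    for idx in order[1:]:
        img = images[idx].astype(float)
        rescaled = img * (base_t / exposure_times[idx])    # bring to base radiometric level
        result = np.where(saturated, rescaled, result)
        saturated = saturated & (img >= saturation)        # still clipped: try next exposure
    return result

# Usage with the four exposure times quoted above (image arrays are hypothetical):
# hdr = blend_exposures([img_060, img_030, img_007, img_001],
#                       np.array([0.6, 0.3, 0.07, 0.01]))
```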
Comparative examples
The accuracy of differentiation between carrots and weeds was assessed using techniques not in accordance with the invention and using different embodiments of the invention. The better accuracy in identifying carrots may be explained by the position of the plants:
1) towards the centre of the image, making them less sensitive to variations in height and irregularities of the soil level, and
2) towards the middle of the ridges or mounds on which they were sown, the ridges being more regular at this position.
These identification accuracies are averages over the whole growing period measured. The improvements in accuracy from using corrected plant height and/or compensating for the angle of the camera with respect to the plane of the soil are significantly greater in the later portions of the growing period measured; this may be due to increasing damage to the regularity of the growing ridges as the growing period advances, for example by erosion and the passage of machines and equipment.
Fig 6 illustrates crops 61, 62 growing in soil 63 having an irregular soil profile 22. In this embodiment, the variation in the soil height a (ie the difference between the highest and lowest point on the soil profile in an area being considered) is of the same order of magnitude as the average crop height b. For example, the variation in soil height a may be 40 mm and the crop height b may be 50mm so that the ratio a/b is 0.8. Fig 6 is similar save that the area of soil analysed is larger and covers substantially the entire width of a mound of soil 64 in which crops 61,62, for example carrots, are grown. In this case, the variation in the soil height a may be 15cm and the average crop height b may be 3 cm (early during the growing cycle) so that the ratio a/b is 0.2. Later in the growing cycle, the average crop height b may be 10 cm with the variation in soil height a 15 cm so that the ratio a/b is 0.67.
It will be appreciated that whilst the invention has been described, for example with reference to Fig 4, in terms of modelling planes and surfaces, alternative data treatment techniques may be used to estimate the corrected plant height and/or compensate for the camera or sensor angle. Where the crops are sown in bands, the method may comprise determining the position and/or boundaries of the sown band. This may be used to reduce the area of a field where high-precision weeding would be necessary, since in the area outside the sown band all plants may be destroyed without further characterization on the assumption that they are weeds. Alternatively, or additionally, this may be used in automatic determination of the expected crop height.
Derivation of the expected crop height from captured data of the plants growing on the soil may be determined as follows:
a) determining the boundaries of the sown band;
b) determining the corrected plant height of plants within the boundaries of the sown band;
c) determining the most frequently occurring corrected plant height and assuming this to indicate the height of the crops (i.e. the average expected crop height, on the basis that the crops are significantly more prevalent than weeds in the sown band) and/or using this to determine a range of expected crop heights.
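A minimal sketch, not from the specification, of step (c): estimating the expected crop height as the most frequently occurring corrected plant height within the sown band (the bin width is an assumed tuning value).

```python
import numpy as np

def expected_crop_height(corrected_heights_mm, bin_width=2.0):
    """Most frequently occurring corrected plant height, taken as the expected
    crop height on the assumption that crops dominate over weeds in the band."""
    heights = np.asarray(corrected_heights_mm, dtype=float)
    edges = np.arange(heights.min(), heights.max() + 2 * bin_width, bin_width)
    counts, edges = np.histogram(heights, bins=edges)
    mode_bin = int(np.argmax(counts))
    return 0.5 * (edges[mode_bin] + edges[mode_bin + 1])    # bin centre, in mm
```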
BRIEF DESCRIPTION OF THE DIAGRAM
FIG. 1: is a schematic block diagram of the present invention's novel smart recognition and classification system;
FIG. 2: is a schematic drawing of a sensor and its field of view and spray heads and their area being sprayed;
FIG. 3: is a schematic block diagram of the operation of plant segmentation performed in FIG. 1;
FIG. 4: is a schematic block drawing of a display of a memory map showing green vegetation after being enhanced and the background information removed;
FIG. 5: is a schematic drawing of a display of the memory map of FIG. 4 after edge detection is performed as shown in FIG. 3;
FIG. 6: is a schematic drawing used to explain a histogram slicing operation performed in FIG. 3;
FIG. 7: is a schematic block diagram used to explain a texture segmentation operation;
FIG. 8: is a schematic drawing of a memory map used to explain clustering and connecting of components into separate regions;
FIG. 9: is a schematic drawing of a memory map used to explain omni-directional spoke analysis.
DESCRIPTION OF THE INVENTION
Refer now to FIG. 1 showing a block diagram of the preferred embodiment of the present invention's novel smart weed recognition and classification system 10. Information used by the novel system is preferably supplied by a chlorophyll sensor 11 which is capable of generating red, blue and green television compatible signals on line 12 and a vegetation index ratio signal NIR/R on line 13. While a chlorophyll sensor such as that shown and described in our aforementioned copending U.S. application No. 08/254,630 is preferred, it is possible to modify other types of chlorophyll and green vegetation sensors to operate in conjunction with the weed recognition systems to be described herein.
The television compatible signals on line 12 are coupled to a TV signal converter 14 which generates TV signals on line 15 that are applied directly to a frame grabbing and storing memory 16 which is also provided with an input of the vegetation index ratio on line 13. It will be understood that the four color inputs to the frame grabbing and storing memory 16 are capable of being controlled so as to provide a desired predetermined frame map image on line 17. The frame map image is enhanced by improving the resolution and compensating for motion in block 18. The enhanced image map shown in block 18 is refined by making an estimate and subtracting out the background leaving a clean memory map of green vegetation or plants as shown in block 19. It will be understood that in order to perform the functions of enhancement explained with regards to blocks 18 and 19, the vegetation ratio NIR/R is a desired input as shown on line 13.
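A minimal sketch, not part of the specification, of how the NIR/R vegetation index ratio on line 13 could be used for the background-subtraction step of blocks 18 and 19: pixels whose ratio does not indicate chlorophyll are blanked, leaving a clean memory map of green vegetation. The threshold value is an assumption.

```python
import numpy as np

def vegetation_mask(nir, red, ratio_threshold=2.0):
    """Green vegetation reflects strongly in the near infrared and absorbs red,
    so a high NIR/R ratio flags vegetation; soil and background fall below the
    (assumed) threshold."""
    ratio = nir.astype(float) / np.maximum(red.astype(float), 1e-6)
    return ratio > ratio_threshold

def remove_background(rgb_frame, nir, red):
    """Produce the enhanced memory map of block 19 by zeroing every pixel whose
    NIR/R ratio does not indicate chlorophyll."""
    mask = vegetation_mask(nir, red)
    cleaned = rgb_frame.copy()
    cleaned[~mask] = 0
    return cleaned
```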
The information used in block 21 is shown labeled as segmenting plants into groups and labeling regions. This functional operation will be explained in greater detail hereinafter and includes other operations which are extremely desirable for the correct analysis of the memory map regions which identify the vegetation by plant size, etc. After the green vegetation is segmented into regions, the regional information is then analyzed to identify the plants by their attributes which are defined for each of the regions as shown in block 22.
The same information that is used to set up the plant identity and its attributes for each of the regions is also loaded via line 23 into an archive memory plant library which contains the data identifying the base models, their region, size, etc. During a teach operation, it is necessary either to use information previously stored in the plant library 24 or to load new information generated in blocks 21 and 22 into the memory library 24 for identifying plants to be treated using the plant identification stored in the plant library 24. After the plant library 24 has been sufficiently populated with the proper information for identifying plants, the information subsequently being examined, as the sensor 11 is moved on its platform through a field, is supplied via line 25 to the block 26 labeled process and match for regions, for size, for shape, for texture, etc.
During the teach operation, it is possible to use the manual control keyboard 27, which is coupled through block 26 to blocks 22 and 24, to call up information and run a series of test programs which will optimize the identification of the plants being subsequently examined during a run operation. The keyboard 27 is also shown coupled via line 28 to a command and control block 29 which will be explained hereinafter. Even though not shown, the control lines from the keyboard and manual control 27 of the processing means 26 are connected to other elements of FIG. 1 to complete manual and step control of each of the elements. Block 26 is shown coupled to the plant library 24 via parallel lines 31 which are used to perform the match or comparison operation to identify and label the types, size, shape, texture, etc. of the plants. The decision information is generated on line 32 and the functional results are shown in block 33 as identifying the region, types, plants, crops, etc.
In summary, the same information that is used in the comparison and match process has undergone the plurality of image processing operations that are used to identify the plants and also to store the identifying data in the plant library 24. Different plants under different growth conditions and against different backgrounds will require emphasis on different sets of image analysis operations. However, the image analysis operations selected at the keyboard 27 are more than capable of identifying the regions and the plants by shape, size, texture, etc.
This information is employed to generate signals to block 33 which is functionally contained within the processing means block 26. The output on line 34 is connected to the command and control block 29. The signals which identify the type of plant or weed and its size and texture and shape are signals which are employed in the command and control block 29 to determine the amount of and type of chemicals to be used to spray the previously identified vegetation. It will also be understood that the system is not necessarily limited to spraying herbicides and weed control chemicals but may also be used to fertilize areas containing mixed vegetation or isolated vegetation.
The command and control block 29 is shown having a sync input, a speed input and a location input on lines 35 to 37. This information is necessary for timing the dispensing of chemicals being sprayed from a moving platform which has servos as shown in block 38. As will be explained hereinafter, it is possible to have a cluster of spray heads in a single plural spray head, or to have a spray head with plural inputs which are operated simultaneously to mix the chemicals to be used with the present invention. Thus, the command and control 29 sends signals to the spray heads as well as to control valves to spray chemicals and to control the rate, time and type patterns as shown in block 39. The control line 41 from the command and control 29 which controls the spray heads 39 may also be supplied to the archival memory and plant library 24 and stored in such a form as to be recoverable to generate information which describes the amount and type of chemicals sprayed throughout an entire field. At a future date, this information may be extremely important in determining the effectiveness of the spray operation.
The command control unit 29 is shown coupled via line 42 to the chlorophyll sensor 11 and via line 43 to a monitor 44 which has a second input from the chlorophyll sensor via line 45 to enable the command and control unit to monitor the information being received by the chlorophyll sensor. In some embodiments of the present invention, the chlorophyll sensor may already be provided with a monitor or display for observing the sensor 11 inputs to the system 10.
In summary, the present invention preferably employs a chlorophyll sensor 11 which supplies not only the color information of the vegetation but also a vegetation index NIR/R. The information received from the sensor is processed either in a custom-designed frame grabber or in some type of computing system which will perform a frame grabbing operation. Such frame grabbers are available for insertion into standard personal computers, and such devices may be employed, which would be cheaper than designing a complete custom frame grabbing and storing operation. Once the information is stored in a memory as a memory image map, this information can then be further processed by a computing system to perform the novel image analysis operations to be defined in greater detail hereinafter.
Refer now to FIG. 2 showing a schematic drawing of a sensor 11 and its field of view 46 and a plurality of spray heads 39A and 39B and their overlapping spray areas 47. The sensor and spray heads are supported by a platform 38 which is preferably movable by servos (not shown). It will be understood that the sensor in FIG. 2 senses an area of vegetation prior to the time the spray heads 39 pass over the area previously viewed. In a moving and dynamic system, the image obtained by the frame grabbing operation is blurred and can be substantially enhanced by performing a motion compensation correction operation previously explained in reference to block 18 of FIG. 1. Further, green vegetation is supported by soil which contains various forms of moisture, texture and contours. This background information should be subtracted out to provide the ideal and enhanced map or frame that contains the vegetation to be analyzed.
Refer now to FIG. 3 showing a schematic block diagram of the operation of plant segmentation performed in block 21 of FIG. 1. The output from block 19 serves as an input on line 52 to three blocks labeled edge detection 53, sliced intensity histogram 54 and texture segmentation 55, which will be explained in greater detail hereinafter. The results of edge detection and sliced intensity histogram analysis are merged in block 56, and the results of this merger are included as an input into block 57, where the clustering and connecting of components as well as the separation into regions is performed. An additional input on line 67 from the texture segmentation block 55 is provided as an input to block 57.
The results of the separation into regions permit the processor of the system 10 to define each segmented region by its boundaries, edges, color and texture as shown in block 58. After defining the regions to be analyzed, it is now possible to find their centers or centroids as segmented regions as shown in block 59. Once the centroids of the segmented regions are established in block 59, it is possible to perform a functional analysis entitled omni-directional spoke analysis as shown in block 61. Information from block 58 is also employed in the block 61 analysis. After obtaining all of the low-level properties from the image analysis, these properties may be further analyzed to obtain and identify high-level attributes which will permit the identification of the plants of each of the regions as shown in block 22, as previously described with reference to FIG. 1.
Refer now to FIG. 4 showing a schematic drawing of a display of a memory map image of the aforementioned green vegetation after being enhanced and the background removed. The memory map image shown at area 48 is shown having a plant 49 and a plant 51 which appear different to the human eye. The plants 49 and 51, though decidedly different in shape, may nevertheless have some similar attributes. Thus, it is necessary to analyze numerous attributes which define the differences between the plants 49 and 51 and which will permit their identification as types of plants. The preferred analysis of attributes in the present invention includes the attributes of intensity, color, size, shape, texture, structural signature and structural relationship.
Refer now to FIG. 5 showing a schematic drawing of the memory map image shown in FIG. 4 after the operation of edge detection has been performed. Thus, the plants 49 and 51 are now defined by very precise lines or edges and are designated as plants 49' and 51' in image memory map 48'. The functional operation of edge detection was previously performed in block 53 of FIG. 3 and the information contained therein is necessary for performing some of the other functional operations in the block shown in FIG. 3.
Refer now to FIG. 6 showing a schematic drawing which will be used to explain the histogram slicing functional operation. The three peaks 62, 63 and 64 are the results of analyzing three separate and independent regions R1, R2 and R3 contained in an image memory map 48. Each of the regions to be analyzed contains a number of pixels inside the region having different values of intensity. The number of pixels which have the same or substantially the same intensity may be plotted as a number of pixels versus intensity. Thus, for the region defined by R1 there are 35 times a factor K pixels having an intensity value of 34. In a similar manner, a sub-region of higher intensity within a plant or region R2 may be defined by the waveform 63, which has a similar number of pixels at the higher intensity value 33 at the peak 63.
The third sub region is shown as R3 having a peak 64. In order to better explain this information which is used to analyze the plants 49 and 51 shown in FIG. 4, two of the dark leaves of plant 49 have been labeled R3 because of their darkness and similar intensity. These two sub regions R3 may be definable separately, but if located close enough together, they may be merged into a region. The light leaf is labeled R1 and the intermediate intensity leaf is labeled R2. Thus, having explained that the regions R1, R2 and R3 as sub regions of a plant of larger regions, it will be appreciated that the sub region R2, R3 and R1 may easily overlap each other as intensity values but have been shown separately for ease of explanation of histogram slicing as shown in block 54 of FIG. 3.
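By way of illustration only, the following is a minimal sketch of histogram slicing as described for block 54, assuming the background-free memory map is an 8-bit grayscale array in which intensity 0 marks removed background; the fixed slice width and the grouping of pixels into intensity bands are assumptions made for this sketch.

```python
import numpy as np

def slice_histogram(region: np.ndarray, slice_width: int = 16) -> dict[int, np.ndarray]:
    """Group pixels into fixed-width intensity slices and return a mask per populated slice."""
    pixels = region[region > 0]                          # intensity 0 = background already removed
    hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
    slices = {}
    for start in range(0, 256, slice_width):
        if hist[start:start + slice_width].sum() == 0:
            continue                                     # empty band, no sub-region here
        mask = (region >= start) & (region < start + slice_width) & (region > 0)
        slices[start] = mask                             # pixels belonging to this intensity band
    return slices

# Example: three flat patches mimic sub-regions R1 (light), R2 (intermediate) and R3 (dark).
demo = np.zeros((60, 60), dtype=np.uint8)
demo[5:20, 5:20] = 200                                   # R1-like light leaf
demo[25:40, 25:40] = 120                                 # R2-like intermediate leaf
demo[45:58, 45:58] = 40                                  # R3-like dark leaves
for start, mask in slice_histogram(demo).items():
    print(f"slice starting at {start}: {int(mask.sum())} pixels")
```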
Refer now to FIG. 7 showing a schematic block diagram which will be used to explain texture segmentation operations. The input on line 52 was previously shown in FIG. 3 as an input to block 55, in which the texture segmentation operation is performed. In order to perform the preferred texture segmentation operation, it is first necessary to take all of the information in an image map display and specify a much smaller window which may be defined by a number of pixels in the X and Y directions. This window is effectively placed over a small area within the image map, and the values of the pixels within the window are examined to obtain a texture measure. There are numerous acceptable measures of texture; however, a preferred measure of texture for purposes of this invention may be the average intensity and the standard deviation from the average intensity.
The window is moved one pixel at a time over the entire image map, and the output of this operation at each window position is used to provide another image on line 65. The new image generated on line 65 is then analyzed and undergoes the histogram slice operation previously explained with reference to FIG. 6. The output from the histogram slice operation in block 66 provides group intensity values on line 67 of the type shown and explained hereinbefore with reference to FIG. 6. The only difference is that the histogram slice is now a slice of texture measures rather than of intensity values. The output of the texture segmentation from block 66 is provided on line 67 as an input to the clustering and connecting operation performed in block 57 of FIG. 3.
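By way of illustration only, the following is a minimal sketch of the texture segmentation of blocks 55 and 66, assuming the texture measure at each window position combines the average intensity and the standard deviation from that average, as suggested above; the window size and the way the two values are combined into a single texture image are assumptions made for this sketch.

```python
import numpy as np

def texture_image(memory_map: np.ndarray, win: int = 5) -> np.ndarray:
    """Slide a win x win window one pixel at a time and emit one texture value per position."""
    h, w = memory_map.shape
    out = np.zeros((h - win + 1, w - win + 1), dtype=np.float32)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = memory_map[y:y + win, x:x + win].astype(np.float32)
            # combine average intensity and its standard deviation into a single measure
            out[y, x] = patch.mean() + patch.std()
    return out

# The resulting texture image on "line 65" would then be fed to the same
# histogram-slicing routine used for intensities, yielding texture slices on line 67.
demo = np.random.default_rng(0).integers(0, 255, size=(40, 40)).astype(np.uint8)
tex = texture_image(demo)
print(tex.shape, float(tex.min()), float(tex.max()))
```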
Refer now to FIG. 8 showing a schematic drawing of a memory map 68 used to explain clustering and connecting of components into separate regions. Memory map 68 is shown containing four identifiable clustered regions labeled respectively as clustered regions A to D. It is now possible to perform the operation in block 57 of FIG. 3 by employing the edge detection information, the slice intensity information and the texture segmentation information to define the boundaries and other attributes of the cluster regions A, B, C and D as separate and identifiable regions within the memory map 68. Now that these cluster regions have been segmented and identified, it is possible to add further attributes which identify these regions, as shown in block 58. For example, the aforementioned low level attributes may now be applied to the cluster regions to more particularly and specifically identify these regions.
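By way of illustration only, the following is a minimal sketch of clustering and connecting components into regions as in block 57, assuming the merged edge, slice and texture cues have already been reduced to a single binary vegetation mask; connected-component labelling and the minimum-area filter are illustrative stand-ins and are not prescribed by the description above.

```python
import cv2
import numpy as np

def cluster_regions(vegetation_mask: np.ndarray, min_area: int = 25) -> list[dict]:
    """Label connected vegetation pixels and keep clusters above a minimum area."""
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(vegetation_mask)
    regions = []
    for i in range(1, n):                                # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue                                     # discard speckle
        regions.append({
            "mask": labels == i,
            "area": int(stats[i, cv2.CC_STAT_AREA]),
            "centroid": tuple(centroids[i]),             # reused later for spoke analysis
        })
    return regions

# Example: two separated blobs become two clustered regions (A and B).
demo = np.zeros((80, 80), dtype=np.uint8)
demo[10:30, 10:30] = 255
demo[50:75, 40:70] = 255
print(len(cluster_regions(demo)), "regions found")
```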
Refer now to FIG. 9 showing a schematic drawing of a memory map used to explain omni-directional spoke analysis. The aforementioned regions A to D may now be functionally analyzed to identify their centroids, such as the centroid 69 shown at the center of the area of cluster A. Additional centroids are shown in cluster regions B to D but need not be defined further for purposes of this explanation. A plurality of vectors, shown as vectors 71 and 72, extends from the centroid to the maximum extent of the outside area of the clustered region A. This information is capable of identifying, for example, the length, width and number of leaves as well as their distribution. These operations are performed at blocks 59 and 61 of FIG. 3, respectively. Having performed each of the functional operations shown in FIG. 3 within the segmentation block 21 shown in FIG. 1, it will be appreciated that the information supplied as an output on line 73 to block 22 is a list of low level attributes which define each of the segmented or clustered regions of the type shown in FIGS. 8 and 9 within a memory map 68.
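By way of illustration only, the following is a minimal sketch of omni-directional spoke analysis for blocks 59 and 61, assuming each clustered region is given as a boolean mask; the number of spokes and the pixel-stepping search for the outermost region pixel along each spoke are assumptions made for this sketch.

```python
import numpy as np

def spoke_lengths(mask: np.ndarray, n_spokes: int = 36) -> np.ndarray:
    """Return, for each spoke cast from the region centroid, the furthest region pixel reached."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                        # centroid of the region (block 59)
    max_r = int(np.hypot(*mask.shape)) + 1               # no spoke can exceed the map diagonal
    lengths = np.zeros(n_spokes, dtype=np.float32)
    for i, angle in enumerate(np.linspace(0.0, 2.0 * np.pi, n_spokes, endpoint=False)):
        dy, dx = np.sin(angle), np.cos(angle)
        for r in range(max_r):                           # walk outward one pixel at a time
            y = int(round(cy + r * dy))
            x = int(round(cx + r * dx))
            if not (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]):
                break                                    # spoke has left the memory map
            if mask[y, x]:
                lengths[i] = r                           # furthest region pixel seen so far
    return lengths

# A simple elongated "leaf": long spokes along its axis and short ones across it
# hint at leaf length, width and orientation.
demo = np.zeros((60, 60), dtype=bool)
demo[20:40, 10:50] = True
print(spoke_lengths(demo).round(1))
```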
Having explained a preferred embodiment of the present invention, it will be understood that the recognition system defined and described hereinbefore may be employed with other types of sensors to perform different types of automatic target recognition. For example, a chlorophyll sensor may be used with the present recognition system to identify targets that are embedded or camouflaged in high density green vegetation.
Further, the present system may be used for surveillance and reconnaissance, wherein memory maps are taken from time to time to observe the changes that occur in the area being surveyed. As a further example, it is possible to replace the chlorophyll sensor with some of the latest types of infrared sensors and to observe targets which emit high amounts of infrared radiation in other bands. In this case, the region becomes the object to be surveyed; the background information is eliminated, and attributes are determined for each of the objects (regions) and stored in the memory library or archives 68 so that later identification of similar objects may be performed in the matching operation.

Claims (9)

WE CLAIM
1) Our invention "IMLS-Weed Recognition/Classification" is an intelligent weed recognition and identification system comprising a chlorophyll sensor for detecting green vegetation and memory map means for storing images which contain different forms of green vegetation. The memory maps stored in memory are processed to eliminate the background information and leave a memory map containing only green vegetation; the enhanced memory map is further processed by an operation of segmentation into identifiable regions, and the identifiable green vegetation regions are processed to identify unique attributes for each of the regions. The unique attributes for each of the regions are stored in a reference database library and are used as reference data for comparing other green vegetation with the data stored in the base model by a processor which matches green vegetation in other regions with the green vegetation stored in said reference database model and which further produces decision data signals used by a controller to control a plurality of spray nozzles covering the area sensed and to dispense a plurality of selectably controlled chemicals.
2) According to the preceding claims, the invention is an intelligent weed recognition and identification system comprising a chlorophyll sensor for detecting green vegetation and memory map means for storing images which contain different forms of green vegetation.
3) According to claims 1 and 2, the memory maps stored in memory are processed to eliminate the background information and leave a memory map containing only green vegetation; the enhanced memory map is further processed by an operation of segmentation into identifiable regions, and the identifiable green vegetation regions are processed to identify unique attributes for each of the regions.
4) According to the preceding claims, the unique attributes for each of the regions are stored in a reference database library and are used as reference data for comparing other green vegetation with the data stored in the base model by a processor which matches green vegetation in other regions with the green vegetation stored in said reference database model and which further produces decision data signals used by a controller to control a plurality of spray nozzles covering the area sensed and to dispense a plurality of selectably controlled chemicals.
5) According to claims 1, 2 and 4, the invention uses a camera to acquire a stereoscopic image of plants growing on soil.
6) According to claims 1, 2 and 3, the invention segments plant pixels and soil pixels.
7) According to claims 1, 2, 4 and 5, the invention creates a modelised height profile of the soil at positions at which soil pixels have not been captured by the stereoscopic image, based on information derived from adjacent soil pixels.
8) According to claims 1, 2 and 3, the invention determines a corrected plant height at plant pixels, representing the distance between each plant pixel and the modelised soil underneath.
9) According to claims 1, 2 and 4, the invention differentiates between weeds and crops by comparing the corrected plant height with an expected crop height; a minimal illustrative sketch of this comparison is given after the claims. Once the position of a weed has been determined, it may be destroyed, for example by heat applied at the identified position or by a robotic arm.
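By way of illustration only, the following is a minimal sketch of the height-based differentiation recited in claims 5 to 9, assuming the stereoscopic image has already been segmented into plant and soil pixels and converted into a per-pixel height map; the use of a global mean soil height as a crude stand-in for interpolation from adjacent soil pixels, and the tolerance around the expected crop height, are assumptions made for this sketch.

```python
import numpy as np

def weed_mask(height: np.ndarray, plant: np.ndarray,
              expected_crop_height: float, tol: float = 0.25) -> np.ndarray:
    """Flag plant pixels whose corrected height deviates from the expected crop height."""
    soil = np.where(plant, np.nan, height)               # heights observed at soil pixels only
    # modelised soil height: fill plant-covered gaps with the mean of the observed soil heights
    # (a crude stand-in for interpolation from adjacent soil pixels as recited in claim 7)
    soil_model = np.where(np.isnan(soil), np.nanmean(soil), soil)
    corrected = height - soil_model                      # claim 8: plant height above modelled soil
    deviation = np.abs(corrected - expected_crop_height)
    return plant & (deviation > tol * expected_crop_height)   # claim 9: weed if far from crop height

# Example: a tall crop plant and a short weed on nearly flat soil.
h = np.full((5, 5), 0.02)                                # soil surface at about 2 cm
p = np.zeros((5, 5), dtype=bool)
h[2, 1], p[2, 1] = 0.52, True                            # crop, about 50 cm above the soil
h[2, 3], p[2, 3] = 0.12, True                            # weed, about 10 cm above the soil
print(weed_mask(h, p, expected_crop_height=0.5))         # only the weed pixel is flagged
```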
FIG. 1: IS A SCHEMATIC BLOCK DIAGRAM OF THE PRESENT INVENTION'S NOVEL SMART RECOGNITION AND CLASSIFICATION SYSTEM;
FIG. 2: IS A SCHEMATIC DRAWING OF A SENSOR AND ITS FIELD OF VIEW AND SPRAY HEADS AND THEIR AREA BEING SPRAYED;
FIG. 3: IS A SCHEMATIC BLOCK DIAGRAM OF THE OPERATION OF PLANT SEGMENTATION PERFORMED IN FIG. 1;
FIG. 4: IS A SCHEMATIC BLOCK DRAWING OF A DISPLAY OF A MEMORY MAP SHOWING GREEN VEGETATION AFTER BEING ENHANCED AND THE BACKGROUND INFORMATION REMOVED;
FIG. 5: IS A SCHEMATIC DRAWING OF A DISPLAY OF THE MEMORY MAP OF FIG. 4 AFTER EDGE DETECTION IS PERFORMED AS SHOWN IN FIG. 3;
FIG. 6: IS A SCHEMATIC DRAWING USED TO EXPLAIN A HISTOGRAM SLICING OPERATION PERFORMED IN FIG. 3;
FIG. 7: IS A SCHEMATIC BLOCK DIAGRAM USED TO EXPLAIN A TEXTURE SEGMENTATION OPERATION;
FIG. 8: IS A SCHEMATIC DRAWING OF A MEMORY MAP USED TO EXPLAIN CLUSTERING AND CONNECTING OF COMPONENTS INTO SEPARATE REGIONS;
FIG. 9: IS A SCHEMATIC DRAWING OF A MEMORY MAP USED TO EXPLAIN OMNI DIRECTIONAL SPOKE ANALYSIS.
AU2020103332A 2020-11-09 2020-11-09 IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System Ceased AU2020103332A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020103332A AU2020103332A4 (en) 2020-11-09 2020-11-09 IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2020103332A AU2020103332A4 (en) 2020-11-09 2020-11-09 IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System

Publications (1)

Publication Number Publication Date
AU2020103332A4 true AU2020103332A4 (en) 2021-01-21

Family

ID=74341057

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020103332A Ceased AU2020103332A4 (en) 2020-11-09 2020-11-09 IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System

Country Status (1)

Country Link
AU (1) AU2020103332A4 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113369295A (en) * 2021-06-08 2021-09-10 姜云保 Soil ecological remediation system
CN115641331A (en) * 2022-11-18 2023-01-24 山东天意装配式建筑装备研究院有限公司 Intelligent detection method for spraying effect of wallboard film
CN116012733A (en) * 2022-12-14 2023-04-25 兰州大学 Method for repairing severe degradation alpine grassland bare spot by using species combination of native grass
CN116012733B (en) * 2022-12-14 2023-09-29 兰州大学 Method for repairing degenerated alpine grassland bare spot by utilizing species combination of native grass
CN116912702A (en) * 2023-09-14 2023-10-20 潍坊现代农业山东省实验室 Weed coverage determination method, system and device and electronic equipment
CN116912702B (en) * 2023-09-14 2024-01-26 潍坊现代农业山东省实验室 Weed coverage determination method, system and device and electronic equipment
CN117635719A (en) * 2024-01-26 2024-03-01 浙江托普云农科技股份有限公司 Weeding robot positioning method, system and device based on multi-sensor fusion
CN117635719B (en) * 2024-01-26 2024-04-16 浙江托普云农科技股份有限公司 Weeding robot positioning method, system and device based on multi-sensor fusion

Similar Documents

Publication Publication Date Title
AU2020103332A4 (en) IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System
US5606821A (en) Smart weed recognition/classification system
US11510355B2 (en) Method and apparatus for automated plant necrosis
Guerrero et al. Crop rows and weeds detection in maize fields applying a computer vision system based on geometry
Montalvo et al. Automatic detection of crop rows in maize fields with high weeds pressure
Bah et al. Weeds detection in UAV imagery using SLIC and the hough transform
Guerrero et al. Automatic expert system based on images for accuracy crop row detection in maize fields
CN112839511B (en) Method for applying a spray to a field
EP2327039A1 (en) Weed detection and/or destruction
Berenstein et al. Grape clusters and foliage detection algorithms for autonomous selective vineyard sprayer
McCarthy et al. Applied machine vision of plants: a review with implications for field deployment in automated farming operations
Jiménez et al. Automatic fruit recognition: a survey and new results using range/attenuation images
CA2764135A1 (en) Device and method for detecting a plant
Lavania et al. Novel method for weed classification in maize field using Otsu and PCA implementation
Liu et al. Development of a machine vision system for weed detection during both of off-season and in-season in broadacre no-tillage cropping lands
Tillett et al. A field assessment of a potential method for weed and crop mapping on the basis of crop planting geometry
Steward et al. Real-time machine vision weed-sensing
Montalvo et al. Acquisition of agronomic images with sufficient quality by automatic exposure time control and histogram matching
Liu et al. Development of a proximal machine vision system for off-season weed mapping in broadacre no-tillage fallows
Weyrich et al. Quality assessment of row crop plants by using a machine vision system
Gong et al. Navigation line extraction based on root and stalk composite locating points
Raja et al. A novel weed and crop recognition technique for robotic weed control in a lettuce field with high weed densities
Qureshi et al. Seeing the fruit for the leaves: towards automated apple fruitlet thinning
US20230403964A1 (en) Method for Estimating a Course of Plant Rows
US12020465B2 (en) Method and apparatus for automated plant necrosis

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry