EP2327039A1 - Weed detection and/or destruction - Google Patents

Weed detection and/or destruction

Info

Publication number
EP2327039A1
Authority
EP
European Patent Office
Prior art keywords
soil
height
crops
plant
data
Prior art date
Legal status
Withdrawn
Application number
EP09765883A
Other languages
German (de)
French (fr)
Inventor
Alexis Piron
Marie-France Destain
Current Assignee
Gembloux Agro-Bio Tech - Université de Liège
Original Assignee
Gembloux Agro-Bio Tech - Université de Liège
Priority date
Filing date
Publication date
Application filed by Gembloux Agro-Bio Tech - Université de Liège

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01BSOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B39/00Other machines specially adapted for working soil on which crops are growing
    • A01B39/12Other machines specially adapted for working soil on which crops are growing for special purposes, e.g. for special culture
    • A01B39/18Other machines specially adapted for working soil on which crops are growing for special purposes, e.g. for special culture for weeding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Definitions

  • the weeding equipment may be partially or entirely solar powered; it may comprise one or more solar panels. It may use solar energy to supplement and/or replenish an electrical energy storage device, for example a battery or accumulator. Such arrangements may facilitate the autonomy of the weeding device.
  • the weeding equipment may comprise a guidance system to direct its movement, for example, along a line of crops.
  • the stereoscopic data of plants growing on soil may be acquired in the form of an image, for example using a camera as a sensor.
  • The captured plant data points are data points representing positions at which the presence of a plant has been detected.
  • The captured soil data points are data points representing positions at which the presence of soil has been detected.
  • the data analysis and/or weed detection is preferably implemented using a computer, for example using computer software.
  • Where structured light is projected, this may be coded structured light.
  • Alternatively, non-coded structured light may be used and may allow faster data processing.
  • A line scanner system may be used to acquire stereoscopic information, for example using common structured light. When using a line scanner, segmentation of each image may be used to find the scanner line, but this may result in reliability problems due to occlusions, discontinuities and large differences of reflectance between soil and plants. Alternatively, temporal analysis of image sequences may be used to find the lines and may actually give better results than using coded structured light. Passive stereoscopy may also be used to acquire stereoscopic information.
  • The weeds to be detected or differentiated from crops may be selected from the group consisting of: Sonchus asper L., Chenopodium sp., Cirsium sp., Mercurialis perennis, Brassica sp. and Matricaria maritima.
  • Fig 1 is a perspective view of an experimental apparatus for determining the position of weeds
  • Fig 2 and Fig 3 which are schematic side views illustrating plant height
  • Fig 4 which is a schematic representation of steps in the method of determining the position of weeds
  • Fig 5 which is a schematic representation of a time multiplexed coded structured light system
  • Fig 6a and 6b which are schematic cross sections of plants growing in soil; Fig 7 which is a representation of determination of expected plant height.
  • The experimental apparatus of Fig 1 comprises a video projector 11 arranged to illuminate a portion of a band of soil 12 which contains both crops and weeds, and a camera 13 arranged to capture an image, in this case a top-down image approximately 200 mm by 250 mm.
  • The video projector 11 and camera 13 are mounted on a carriage 10 adapted to move along the band of soil 12. Additional lighting 14 and/or a reflector 15 may also be provided on the carriage 10.
  • Shrouds (not shown) to occlude natural light and shield the scene to be analysed from external light were fitted to the carriage 10.
  • the system uses a projected image and a camera to capture the image as reflected from the scene to be analysed; analysis can thus be based on the deformation between the projected image and the captured image.
  • One aspect of the invention is based upon improving the accuracy of the estimation of plant height h when differentiating between crops and weeds. Whilst the distance between the top of a plant and an overlying camera 13 can be determined, for example by image analysis, this may not provide a particularly accurate indication of plant height h. As illustrated in Fig 2, one difficulty arises from the soil profile 22, which gives a variation in soil height such that the distances Z1 and Z2 between the top of a plant and the camera 13 are a poor approximation of the plant height h. Another problem, illustrated in Fig 3, arises from any inclination of the camera 13 with respect to the plane of the soil 23, which exacerbates this. Each of these factors may be significant when assessing plant height in the real conditions of an agricultural field.
  • the embodiment used a coded structured light stereoscopic imaging system with a multispectral camera to allow registration of height information over multispectral images.
  • The multispectral camera was based on a black and white camera (C-Cam BCi5, 1.3 megapixels); the projector was a DLP video projector (OPTOMA EP719 with a 1024x768 resolution). Acquisition-speed and mechanical-vibration concerns due to the filter wheel dictated the use of monochromatic patterns acquired without a filter in front of the camera.
  • the coded structured light technique had to take into account the specificities of the small scale agricultural scenes, namely occlusion and thin objects, internal reflections and scene high dynamic range. It was also necessary to obtain robust results and take into account the specificities of the video projector.
  • Table 1 summarizes the choice of the codification and strategies used to overcome those problems. As fast acquisition was not a concern in this embodiment and a black and white camera was used (for the multispectral part of the acquisition device), we decided to use a time multiplexing approach with a binary codeword basis (black or white illumination).
  • the Hamming distance between two codes is the number of corresponding bits that are different.
  • the length of the code used was 22 bits, which allowed for the minimum Hamming distance requirement and gave good decoding results.
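The codeword requirement can be illustrated at small scale as follows. This is a sketch, not the patent's actual code construction: a greedy random search (an assumed strategy) for 22-bit pseudorandom codewords with a guaranteed minimum pairwise Hamming distance of 8.

```python
import random

def hamming(a, b, nbits=22):
    # number of corresponding bits that differ between two codewords
    return bin((a ^ b) & ((1 << nbits) - 1)).count("1")

def generate_codes(n_codes, nbits=22, min_dist=8, seed=0):
    # greedily keep pseudorandom candidates that stay at least min_dist away
    # (in Hamming distance) from every previously accepted codeword
    rng = random.Random(seed)
    codes = []
    while len(codes) < n_codes:
        cand = rng.getrandbits(nbits)
        if all(hamming(cand, c, nbits) >= min_dist for c in codes):
            codes.append(cand)
    return codes
```

Note that greedy random search only scales to small code sets; supplying all 768 fringes with distance-8 codewords in a 22-bit space requires a structured code design, so this sketch illustrates the distance constraint rather than the full construction.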
  • the codes were decoded by correlation: the signal received by a single camera pixel over time was compared with all possible signals. As correlation also gives a measure of the reliability of the decoding, it was used to remove spurious measurements by applying a threshold.
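The per-pixel correlation decoding can be sketched as below. The use of normalised cross-correlation and the 0.8 reliability threshold are assumptions for illustration; the patent only states that correlation against all possible signals was used and that a threshold removed spurious measurements.

```python
import numpy as np

def decode_pixel(samples, codebook, threshold=0.8):
    # samples: (T,) intensities of one camera pixel over the T patterns
    # codebook: (n_fringes, T) binary codewords, one row per projector fringe
    x = samples - samples.mean()
    c = codebook - codebook.mean(axis=1, keepdims=True)
    num = c @ x                                  # unnormalised correlations
    den = np.linalg.norm(x) * np.linalg.norm(c, axis=1)
    corr = num / np.where(den == 0.0, 1.0, den)  # normalised correlation
    best = int(np.argmax(corr))
    if corr[best] < threshold:
        return None, float(corr[best])           # unreliable: discard pixel
    return best, float(corr[best])
```

Normalised correlation is insensitive to the gain and offset of the pixel response, which matters given the large reflectance differences between soil and plants.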
  • The projected images consist of black and white bands of large then finer width.
  • The wider bands cause problems when the scene is prone to internal reflections (illuminated parts of the scene will illuminate other parts of the scene).
  • the code used here also happened to be pseudorandom (i.e. no apparent structure) which resulted in a more uniform illumination.
  • the high dynamic range acquisition allowed us to have a strong signal to noise ratio for all pixels of the image.
  • An equipment-related problem encountered was the shallow depth of field of the projector: given the size of the scene and the distance from the projector, it was not possible to have the projected pattern sharp on both close and distant objects of the scene.
  • the choice of a per-pixel decoding scheme combined with the weakly correlated code was also motivated by that characteristic.
  • The calibration of the camera-projector system was done using Zhang's technique as implemented in the Intel OpenCV library.
  • the method used plant height as a discriminating parameter between crop and weed.
  • The raw data acquired by the stereoscopic device was not plant height but the distance of the plants relative to the measurement device. This distance does not accurately represent plant height if the position of the device relative to the ground varies or if the ground is irregular.
  • This parameter is the distance between plant pixels and the actual ground level under them obtained by fitting a surface and seen from a reconstructed point of view corresponding to a camera's optical axis perpendicular to the ridge plane.
  • The crop/weed discrimination process is illustrated in Figure 4.
  • the camera 13 was used to acquire a stereoscopic image 41 of the plants growing on the soil.
  • Spurious pixels were frequently present at the limit between plants and ground, which was the border of the regions of interest for the plant height determination. To avoid this problem, the borders of those regions were eroded by a round structuring element of diameter 3 pixels.
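The border erosion described above can be sketched as follows. This is an illustrative reconstruction, not the patent's code: a diameter-3 round structuring element is taken here to be the 3x3 cross (centre pixel plus its four direct neighbours), implemented with plain NumPy shifts.

```python
import numpy as np

def erode_disk3(mask):
    # binary erosion by a round structuring element of diameter 3 pixels
    # (a 3x3 cross: a pixel survives only if it and its 4-neighbours are set)
    m = np.pad(np.asarray(mask, dtype=bool), 1, constant_values=False)
    return (m[1:-1, 1:-1]
            & m[:-2, 1:-1] & m[2:, 1:-1]    # vertical neighbours
            & m[1:-1, :-2] & m[1:-1, 2:])   # horizontal neighbours
```

Applied to both the plant-region and soil-region masks, this peels one pixel off every region border, discarding exactly the boundary pixels where spurious stereo measurements concentrate.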
  • The plant pixels 43 and soil pixels 45 (the latter including interpolated pixels obtained from modelling, since the ground under the plants was not visible from the camera and not all points seen by the camera were illuminated by the projector) were then put back together to create a corrected image 46. The orientation of the fitted plane 44 was used to rotate the data in space so as to align the plane normal with the optical axis of a virtual camera, that axis being perpendicular to the calculated plane of the soil.
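The plane fit and re-orientation step can be sketched as follows. The SVD-based least-squares fit and the Rodrigues rotation are assumed implementation choices (the patent does not specify them); the function names are hypothetical.

```python
import numpy as np

def fit_plane(points):
    # least-squares plane through 3-D soil points of shape (N, 3): the
    # singular vector with the smallest singular value is the plane normal
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    return centroid, n / np.linalg.norm(n)

def rotation_aligning(normal):
    # Rodrigues rotation sending the (upward-facing) soil-plane normal onto
    # +z, i.e. onto the optical axis of the virtual camera
    z = np.array([0.0, 0.0, 1.0])
    n = normal if normal[2] >= 0 else -normal
    v = np.cross(n, z)
    s, c = np.linalg.norm(v), float(n @ z)
    if s < 1e-12:                      # already aligned with the axis
        return np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)
```

After applying the rotation to the whole point cloud, the fitted soil plane becomes horizontal, so plant height can be read off directly as the vertical distance above the reconstructed soil.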
  • The first parameter is, for each plant pixel, the distance between the plant pixel and the reconstructed soil underneath (the corrected plant height).
  • The second is the expected crop height which, in this case, was estimated from the number of days after sowing and previously determined empirical data giving the average height of the crops in question as a function of the number of days after sowing in similar growing conditions.
  • Using the uncorrected distance data, the overall classification accuracy was 66%.
  • Using the corrected plant height, the overall classification accuracy was 83%.
  • the expected crop height may be determined automatically, for example by determining an average height of the plants or the crops from the captured images.
  • the corrected plant height is used for such a determination.
  • the carrots typically are sown in a band 5 cm wide with 10 to 15 carrots per 10 cm length of the band.
  • the position of the band may be estimated from the captured images and the image divided into a zone in which there are only weeds, and a zone in which there are weeds and crops. Comparing data of plant height from these two zones may be used in estimating the height of the crops.
  • Preferred stereoscopic imaging and analysis: stereoscopic imaging aims to record three-dimensional information.
  • There are two classes of acquisition methods, passive and active, and either may be used in the context of the invention.
  • Passive methods usually rely on several views of the same scene to recover depth information (e.g. binocular stereoscopy, similar to human depth perception).
  • Active methods are characterized by the projection on the scene of some form of energy (commonly light) to help acquire depth information.
  • Binocular stereoscopy is fairly common since it is simple to implement in hardware and is well suited to real-time acquisition. Robust acquisition of dense stereoscopic data by this technique is not an easy task.
  • The imaging and analysis preferably uses coded structured light with a time-multiplexing code.
  • The projected coded light may have at least 18, preferably at least 20 and more preferably at least 22 patterns. There may be at least 6, preferably at least 7 and more preferably at least 8 bit differences between each pair of projected codes.
  • At least two defined spectral bands may be used in acquiring the image, for example a spectral band centred on 450 nm and having a band width of 80 nm and a spectral band centred on 700 nm and having a band width of 50 nm.
  • the spectral bands may be implemented by the use of filters, for example placed in front of the camera. Alternatively, the camera may be adapted to have a frequency response in the desired spectral bands.
  • a third spectral band may also be used, for example centred on 550 nm and having a band width of 80 nm. The use of selected spectral bands may increase the accuracy of differentiating between plants and soil.
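One plausible way to exploit two such spectral bands for plant/soil differentiation is a normalised-difference index, sketched below. This is an assumption, not the patent's stated method: the band roles (vegetation reflecting strongly near 700 nm and weakly at 450 nm, where chlorophyll absorbs) and the 0.2 threshold are illustrative.

```python
import numpy as np

def plant_mask(band_700, band_450, thresh=0.2):
    # vegetation reflects much more strongly near 700 nm than at 450 nm,
    # while soil reflectance differs far less between the two bands, so a
    # normalised-difference index tends to separate plant from soil pixels
    b7 = np.asarray(band_700, dtype=float)
    b4 = np.asarray(band_450, dtype=float)
    ndi = (b7 - b4) / np.maximum(b7 + b4, 1e-9)  # guard against division by 0
    return ndi > thresh
```

The resulting boolean mask would feed the plant/soil segmentation step of the discrimination pipeline; in practice the threshold would need tuning to the soil type and illumination.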
  • Each light projection consisted of 768 fringes (corresponding to the resolution of the projector), each of these projections being called a "pattern". Each luminous fringe is successively lit or not lit 22 times, which corresponds to a 22-bit code; the codes are pseudo-random (without apparent structure), to avoid disturbances due to internal reflections in the scene, and are weakly correlated between themselves.
  • A pattern is thus characterised by 768 luminous fringes, each being lit or not lit as a function of one of the 22 values of its code. For each pattern, four exposures of different duration (0.6, 0.3, 0.07 and 0.01 seconds) are used so that each part of the scene is correctly exposed, be it light or dark.
  • the first image with the longest exposure time is used as the base.
  • The overexposed part of the image is eliminated and replaced by the corresponding part of the image with a lower exposure time. This is repeated three times until a correctly exposed image is obtained.
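The exposure-fusion steps above can be sketched as follows. This is a reconstruction under stated assumptions: the saturation threshold of 250 and the rescaling of shorter exposures onto the base exposure's radiometric scale are assumptions, not values given in the text; images are ordered longest exposure first.

```python
import numpy as np

def fuse_exposures(images, times, saturation=250):
    # images: arrays ordered from longest to shortest exposure (times likewise).
    # Start from the longest exposure as the base image, then replace its
    # overexposed pixels with the corresponding pixels of successively shorter
    # exposures, rescaled to the base exposure's scale.
    base = images[0].astype(float)
    clipped = images[0] >= saturation            # overexposed in the base
    for img, t in zip(images[1:], times[1:]):
        scaled = img.astype(float) * (times[0] / t)
        base[clipped] = scaled[clipped]
        clipped = clipped & (img >= saturation)  # still clipped: try shorter
    return base
```

With the four exposures of the embodiment (0.6, 0.3, 0.07 and 0.01 s), the loop runs three times, matching the "repeated three times" of the text.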
  • the image is analysed pixel by pixel by correlation between the signal emitted and the signal received.
  • Fig 6a illustrates crops 61, 62 growing in soil 63 having an irregular soil profile 22.
  • The variation in the soil height a (i.e. the difference between the highest and lowest points of the soil profile in the area being considered) is of the same order of magnitude as the average crop height b.
  • For example, the variation in soil height a may be 40 mm and the crop height b may be 50 mm, so that the ratio a/b is 0.8.
  • Fig 6b is similar save that the area of soil analysed is larger and covers substantially the entire width of a mound of soil 64 in which crops 61, 62, for example carrots, are grown.
  • Here the variation in the soil height a may be 15 cm and the average crop height b may be 3 cm (early in the growing cycle), so that the ratio a/b is 0.2. Later in the growing cycle, the average crop height b may be 10 cm with the variation in soil height a still 15 cm, so that the ratio a/b is 0.67.
  • the method may comprise determining the position and/or boundaries of the sown band. This may be used to reduce the area of a field where high precision weeding would be necessary as in the area outside the sown band, all plants may be destroyed without further characterisation on the assumption that they are weeds. Alternatively or additionally, this may be used in automatic determination of the expected crop height.
  • Derivation of the expected crop height from captured data of the plants growing on the soil may proceed as follows: a) determining the boundaries of the sown band; b) determining the corrected plant height of plants within the boundaries of the sown band; c) determining the most frequently occurring corrected plant height and taking this to indicate the height of the crops (i.e. the average expected crop height, on the basis that the crops are significantly more prevalent than weeds in the sown band) and/or using this to determine a range of expected crop heights.
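Steps a) to c) above can be sketched as a simple histogram-mode estimate. The bin width and the +/-30 % spread used to derive a range are assumed parameters for illustration; the patent only specifies taking the most frequently occurring corrected plant height.

```python
import numpy as np

def expected_crop_height(heights, bin_width=2.0, spread=0.3):
    # histogram the corrected plant heights measured inside the sown band and
    # take the modal bin centre as the crop height, on the basis that crops
    # are significantly more prevalent than weeds within the band; an assumed
    # +/-30 % spread around the mode gives a range of expected crop heights
    h = np.asarray(heights, dtype=float)
    edges = np.arange(h.min(), h.max() + bin_width, bin_width)
    counts, edges = np.histogram(h, bins=edges)
    i = int(np.argmax(counts))
    mode = 0.5 * (edges[i] + edges[i + 1])
    return mode, (mode * (1.0 - spread), mode * (1.0 + spread))
```

Because weeds are a minority within the band, their heights populate only the histogram tails and do not shift the modal bin.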
  • Line 71 represents the corrected heights of plants detected within the sown band plotted against their probability density (which is representative of their frequency of occurrence), 21 days after sowing, for a row of carrots.
  • The peak of line 71 occurs at a corrected plant height of about 17 cm. This is shown as the average expected plant height 72.
  • Line 74 represents the probability density function of carrots and line 73 the probability density function of weeds.
  • the range of expected crop height for the carrots, as illustrated by line 72 is about 7 to 17 cm.
  • the expected crop height is preferably used as a range so that any plant determined to have a corrected plant height outside this range is classified as a weed.
  • a range may be determined from empirical data or may be derived from captured data.
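The range-based classification described above reduces to a simple interval test; the sketch below is illustrative (the function name and the returned labels are assumptions), using a height range such as the 7 to 17 cm of the carrot example.

```python
def classify_plant(corrected_height, expected_range):
    # a plant whose corrected height falls outside the expected crop-height
    # range is classified as a weed; within the range it is presumed a crop
    low, high = expected_range
    return "crop" if low <= corrected_height <= high else "weed"
```

Plants flagged as weeds would then be passed, with their positions, to the destruction stage (micro-sprayer, thermal probe or robotic arm).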

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Soil Working Implements (AREA)

Abstract

A method of determining the position of weeds growing in soil amongst crops, comprising the steps of: • using a camera to acquire a stereoscopic image of plants growing on soil; • segmenting plant pixels and soil pixels; • creating a modelised height profile of the soil at positions at which soil pixels have not been captured by the stereoscopic image, based on information derived from adjacent soil pixels; • determining a corrected plant height at plant pixels, representing the distance between the plant pixel and the modelised soil underneath; and • differentiating between weeds and crops by comparing the corrected plant height with an expected crop height. Once the position of a weed has been determined, it may be destroyed, for example by heat applied to the identified position or by a robotic arm.

Description

Weed detection and/or destruction
The present invention relates to automated weed detection and/or destruction.
Computer vision systems have been successfully used for automatically guiding the mechanical or chemical destruction of weeds between rows of crops; knowledge of the position of the rows where crops should be growing and the assumption that plants growing outside such positions are weeds may be used in such systems. However for many horticultural crops the removal of weeds from inside a row or band of crops in which the weeds are mixed in with plants in a random manner remains a long and costly manual operation.
The use of stereoscopic images to assess the distance between a plant and a camera has been used with the aim of differentiating between crops and weeds and thus allowing the position of weeds to be determined. This is based on the assumption that the weeds and the crops grow at different speeds, so that the distance between the plant and the camera, as determined from the image, may be used to differentiate between crops and weeds.
One aim of this invention is to improve the accuracy of the identification of weeds or of the differentiation between crops and weeds, particularly in the case of crops sown in bands in a substantially random distribution within the band with weeds present within the band of crops. Other aims will be clear from the following description.
According to one aspect, the present invention provides a method of differentiating between weeds and crops as defined in claim 1, particularly with a view to determining the position of weeds growing in soil amongst crops.
According to another aspect, the present invention provides a method of selectively destroying weeds, preferably individually, using such a method to determine the position of weeds.
The invention also extends to equipment adapted to identify and/or destroy weeds using such methods.
The dependent claims define preferred or alternative embodiments of the invention.
The development of an automated in-row weeding device requires identifying the precise position of weeds. Once the position of a weed has been determined, an automated system may be used to destroy the weed, for example, by delivering or spraying a chemical at the precise location of the weed (for example by controlling an array of micro-sprayers rather than blanket spraying of chemicals), by thermal destruction for example using a flame or heated probe, or by using a robotic arm (for example using a mechanical cutter having a width of perhaps 1-2 cm). The possibility of thermal or non-chemical destruction of individual weeds has particular application for organic agriculture. The position of weeds may be generated in the form of a weed map or spray map.
The invention is based on the premise that improved accuracy in the automated identification of weeds and/or in the differentiation between weeds and crops and/or in the identification of their position can be achieved by estimating the height of plants protruding from the soil rather than the distance between the top of a plant and a fixed measurement device. This is particularly the case when the level of the soil is not even or planar and/or the soil has a non-planar profile or surface roughness. It is even more the case in such conditions when the plants are young and therefore of small size compared with the irregularities of the ground.
The invention is particularly applicable where the difference in height between weeds and crops is of the same order of magnitude as the surface roughness of the soil, or where the ratio a/b between (a) the variation in the soil height, for example within the area of a captured image, and (b) the average crop height is greater than 1/8, 1/5, 1/4 or 1/3. This may be the case during the early stages of growth of the crops. The height of the crops being analysed may be greater than 2 mm, 5 mm, 10 mm or 20 mm; it may be less than 200 mm, 150 mm, 100 mm, 80 mm or 50 mm.
The invention is of particular use in relation to (i) crops sown in a line or band along a ridge or raised strip of soil, for example a ridge of a ploughed ridge and furrow, and/or (ii) crops sown in rows or bands having a width of less than 100 cm, particularly less than 70 cm, less than 60 cm or less than 50 cm. The average density of crops within a band may be greater than 10 crop plants per m2; it may be greater than 20, 50, 100 or 150 crop plants per m2. This average density may be assessed, for example for carrots, by considering ten separate areas 30 cm long by 5 cm wide at which crops have been sown along the centre line of the band, counting the number of crop plants in each of these areas and calculating the mean average density of crops per m2.
The average crop density, particularly when seeds are sown and/or in the initial part of their growth cycle, may be greater, for example greater than 500 or 1000 per m2. Early weeding is particularly beneficial for horticultural crops, for example carrots: some common annual weeds have their peak of germination around the same time as crop sowing and affect the crop growth. It has been shown that there is a significant effect of weed removal timing on the yield of e.g. carrots: plots weeded 3 and 5 weeks after sowing have a significantly greater yield than those given the 7-week treatment. Furthermore, carrots are sown in a relatively dense irregular pattern. Consequently, the invention is particularly applicable to differentiation between plants comprising mixed weeds and crops early in the crop growing cycle. This may be within the time period starting 7 days, 10 days or 14 days after sowing of the crops and/or ending 60 days, 90 days or 100 days after sowing of the crops.
The invention is particularly applicable to identification of weeds and/or determination of weed position where the weeds are mixed with crops, particularly where the weeds are within a crop line or band. This is often the case with horticultural crops. The invention may be of particular application for use with one or more of the following crops: one or more apiaceae (also called umbellifers); carrots; celery; fennel; medicinal plants; cumin; parsley; coriander; parsnip; common beans. The Apiaceae family is intended to include: Anethum graveolens - Dill, Anthriscus cerefolium - Chervil, Angelica spp. - Angelica, Apium graveolens - Celery, Arracacia xanthorrhiza - Arracacha, Carum carvi - Caraway, Centella asiatica - Gotu Kola (pennywort), Coriandrum sativum - Coriander, Cuminum cyminum - Cumin, Daucus carota - Carrot, Eryngium spp. - Sea holly, Foeniculum vulgare - Fennel, Myrrhis odorata - Cicely, Ferula gummosa - Galbanum, Pastinaca sativa - Parsnip, Petroselinum crispum - Parsley, Pimpinella anisum - Anise, Levisticum officinale - Lovage. The invention may also be used in relation to: beans; potatoes; chicory; beetroot.
Automated weeding equipment according to the invention may be arranged to work its way along a line or band of crops; it may be self-propelling. It may operate on a step-by-step basis so that it remains stationary at a first position along a line of crops whilst detecting the position of weeds and dealing with them before moving to a second position further along the line of crops and repeating this procedure. Alternatively, the weeding equipment may function whilst it is moving.
The weeding equipment may be arranged to travel along a line of crops several times during a critical weeding period. Thus, for example, if a weed is not correctly identified as such during a first passage, it may be correctly identified during a second passage, for example 2, 3, 4, 5 or more days later. Such a system may be used to reduce the risk of inadvertently destroying crops through incorrect identification: the first time the weeding equipment passes, a weed having a corrected plant height similar to the expected crop height may not be identified as a weed. Nevertheless, at a subsequent passage of the weeding equipment, given the difference in growth rates of the weeds and the crops, the difference between the corrected plant height and the expected crop height will be greater, thus facilitating correct identification of that particular weed.
The weeding equipment may be partially or entirely solar powered; it may comprise one or more solar panels. It may use solar energy to supplement and/or replenish an electrical energy storage device, for example a battery or accumulator. Such arrangements may facilitate the autonomy of the weeding device. The weeding equipment may comprise a guidance system to direct its movement, for example, along a line of crops.
The stereoscopic data of plants growing on soil may be acquired in the form of an image, for example using a camera as a sensor. In this case, the captured plant data points (ie data points representing a position at which the presence of a plant has been detected) and the captured soil data points (ie data points representing a position at which the presence of soil has been detected) may be represented by pixels of the image.
The data analysis and/or weed detection is preferably implemented using a computer, for example using computer software. Where structured light is projected, this may be coded structured light. Nevertheless, non-coded structured light may be used and may allow faster data processing. A line scanner system may be used to acquire stereoscopic information, for example using common structured light. When using a line scanner, segmentation of each image may be used to find the scanner line, but this may result in reliability problems due to occlusions and discontinuities and large differences of reflectance between soil and plants. Alternatively, temporal analysis of image sequences may be used to find the lines and may actually give better results than using coded structured light. Passive stereoscopy may also be used to acquire stereoscopic information.
The weeds to be detected or differentiated from crops may be selected from the group consisting of: Sonchus asper L., Chenopodium sp., Cirsium sp., Mercurialis perennis, Brassica sp. and Matricaria maritima.
Embodiments of the invention will now be described by way of non-limiting examples with reference to: Fig 1 which is a perspective view of an experimental apparatus for determining the position of weeds;
Fig 2 and Fig 3 which are schematic side views illustrating plant height;
Fig 4 which is a schematic representation of steps in the method of determining the position of weeds; Fig 5 which is a schematic representation of a time multiplexed coded structured light system;
Fig 6a and 6b which are schematic cross sections of plants growing in soil; Fig 7 which is a representation of the determination of expected plant height.

The experimental apparatus of Fig 1 comprises a video projector 11 arranged to illuminate a portion of a band of soil 12 which contains both crops and weeds and a camera 13 arranged to capture an image, in this case a top-down image of approximately 200 mm by 250 mm. The video projector 11 and camera 13 are mounted on a carriage 10 adapted to move along the band of soil 12. Additional lighting 14 and/or a reflector 15 may also be provided on the carriage 10. A filter wheel 16 holding 22 interference filters covering the VIS-NIR spectral domain, positioned in front of the camera, allowed the effect of different filters to be assessed. Shrouds (not shown) to occlude natural light and shield the scene to be analysed from external light were fitted to the carriage 10. The system uses a projected image and a camera to capture the image as reflected from the scene to be analysed; analysis can thus be based on the deformation between the projected image and the captured image.
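By way of illustration, the depth recovery underlying such a projector-camera arrangement may be sketched as follows. This is a minimal sketch assuming a rectified, pinhole camera-projector pair; the focal length and baseline values are illustrative assumptions, not parameters of the apparatus described above.

```python
def depth_from_disparity(x_cam, x_proj, f=1200.0, baseline=0.3):
    """Triangulated distance to a scene point for one camera pixel.

    x_cam    : pixel column at which the camera sees the point
    x_proj   : projector column decoded at that pixel from the pattern
    f        : focal length in pixels (assumed equal for both devices)
    baseline : camera-projector separation in metres (assumed value)
    """
    disparity = x_cam - x_proj
    if disparity <= 0:
        return None  # decoding failure or point out of range
    return f * baseline / disparity

# the nearer a surface, the more the pattern is shifted (larger disparity)
z_near = depth_from_disparity(400, 340)  # disparity of 60 pixels
z_far = depth_from_disparity(400, 370)   # disparity of 30 pixels
```

A nearer surface deforms the projected pattern more, so `z_near` is smaller than `z_far`; the deformation between the projected and captured images thus encodes the scene depth.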
One aspect of the invention is based upon improving the accuracy of the estimation of plant height h when differentiating between crops and weeds. Whilst the distance between the top of a plant and an overlying camera 13 can be determined, for example by image analysis, this may not provide a particularly accurate indication of plant height h. As illustrated in Fig 2, one difficulty arises from the soil profile 22, which gives a variation in soil height such that the distances Z1 and Z2 between the tops of the plants and the camera 13 are poor approximations of the plant height h. Another problem, illustrated in Fig 3, arises from any inclination of the camera 13 with respect to the plane of the soil 23, which exacerbates this error. Each of these factors may be significant when assessing plant height in the real conditions of an agricultural field.
The embodiment used a coded structured light stereoscopic imaging system with a multispectral camera to allow registration of height information over multispectral images. The multispectral camera was based on a black and white camera (C-cam BCI5, 1.3 megapixels); the projector was a DLP video projector (OPTOMA EP719 with a 1024x768 resolution). Acquisition speed and mechanical vibration concerns due to the filter wheel dictated the use of monochromatic patterns acquired without a filter in front of the camera.
The coded structured light technique had to take into account the specificities of small-scale agricultural scenes, namely occlusions and thin objects, internal reflections and the high dynamic range of the scene. It was also necessary to obtain robust results and to take into account the specificities of the video projector.
Table 1 summarizes the choice of the codification and strategies used to overcome those problems. As fast acquisition was not a concern in this embodiment and a black and white camera was used (for the multispectral part of the acquisition device), we decided to use a time multiplexing approach with a binary codeword basis (black or white illumination).
Table 1. Summary of the difficulties for stereoscopic acquisition of small-scale in-field plants

Problem | Solution
Presence of occlusions, thin objects | Per-pixel decoding
Shallow projector depth of field | Per-pixel decoding, weakly correlated codes
High dynamic range | High dynamic range acquisition, correlation-based decoding
Internal reflections | Pseudorandom pattern
Because of the large number of occlusions and thin objects (such as plant bracts), we chose to use a per-pixel decoding scheme, in which the code is decoded at every camera pixel independently rather than using neighbouring pixels.
The nature of the code was chosen to give robust results in the presence of high dynamic range scenes. We used weakly correlated codes with a minimum Hamming distance of 8 (empirically determined); the Hamming distance between two codes is the number of corresponding bits that are different. The length of the code used was 22 bits, which allowed the minimum Hamming distance requirement to be met and gave good decoding results. The codes were decoded by correlation: the signal received by a single camera pixel over time was compared with all possible signals. As the correlation also gives a measure of the reliability of the decoding, it was used to remove spurious measurements by applying a threshold.
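The code generation and per-pixel decoding described above may be sketched as follows. This is a simplified illustration: only 32 codes are generated here rather than one per projector fringe, a greedy random draw stands in for whatever construction the embodiment used, and Hamming distance stands in for the correlation measure.

```python
import random

CODE_LENGTH = 22  # bits per code, as in the text
MIN_HAMMING = 8   # minimum pairwise Hamming distance, as in the text

def hamming(a, b):
    """Number of corresponding bits that differ between two codes."""
    return sum(x != y for x, y in zip(a, b))

def generate_codes(n, seed=0):
    """Greedily draw pseudo-random binary codes that remain weakly
    correlated, i.e. pairwise Hamming distance >= MIN_HAMMING.
    (A real system needs one code per fringe, e.g. 768, and would use a
    structured construction; 32 suffice for illustration.)"""
    rng = random.Random(seed)
    codes = []
    while len(codes) < n:
        candidate = [rng.randint(0, 1) for _ in range(CODE_LENGTH)]
        if all(hamming(candidate, c) >= MIN_HAMMING for c in codes):
            codes.append(candidate)
    return codes

def decode(signal, codes):
    """Per-pixel decoding: compare the temporal signal received at one
    camera pixel with every possible code. The match quality doubles as
    a reliability measure for thresholding out spurious pixels."""
    distance, index = min((hamming(signal, c), i) for i, c in enumerate(codes))
    return index, distance

codes = generate_codes(32)
index, distance = decode(codes[5], codes)
```

Because the codes are at least 8 bits apart, a signal corrupted in up to 3 bits still decodes to the correct fringe, which is the point of the weak correlation requirement.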
Usually, in time multiplexing binary or Gray code techniques, the projected images consist of black and white bands of progressively finer width. The wider bands cause problems when the scene is prone to internal reflections (the illuminated parts of the scene illuminate other parts of the scene). The code used here also happened to be pseudorandom (i.e. with no apparent structure), which resulted in a more uniform illumination.
The scenes presented a high dynamic range since the reflectance of soil can vary greatly with its moisture content and certain plant species had a highly reflective surface. We thus chose to acquire high dynamic range images using multiple-exposure blending. Four exposures of each pattern were taken at different exposure times and linearly blended. The number of exposures and the exposure times were determined empirically on potted plants. The high dynamic range acquisition allowed us to have a strong signal-to-noise ratio for all pixels of the image. An equipment-related problem encountered was the shallow depth of field of the projector: given the size of the scene and the distance from the projector, it was not possible to have the projected pattern sharp on both close and distant objects of the scene. The choice of a per-pixel decoding scheme combined with the weakly correlated code was also motivated by that characteristic.
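A minimal sketch of multiple-exposure blending along these lines is given below, assuming an 8-bit sensor with a linear response; the pixel and exposure values are illustrative, not measured data.

```python
SATURATED = 255  # assumed 8-bit sensor ceiling

def blend_exposures(exposures):
    """Linearly blend several exposures of the same projected pattern.

    exposures : list of (exposure_time_s, pixel_values) pairs.
    Assuming a linear sensor, each unsaturated sample is normalised by its
    exposure time and the results are averaged, giving a usable signal for
    both the dark soil and the highly reflective leaves."""
    n_pixels = len(exposures[0][1])
    radiance = []
    for i in range(n_pixels):
        total, count = 0.0, 0
        for time_s, pixels in exposures:
            if pixels[i] < SATURATED:  # discard clipped samples
                total += pixels[i] / time_s
                count += 1
        if count:
            radiance.append(total / count)
        else:  # clipped even at the shortest exposure
            shortest = min(t for t, _ in exposures)
            radiance.append(SATURATED / shortest)
    return radiance

# two exposures of a two-pixel scene; the second pixel clips at 0.6 s
blended = blend_exposures([(0.6, [60, 255]), (0.3, [30, 180])])
```

The first pixel is consistent across both exposures and blends to the same radiance; the second is saturated in the long exposure and is recovered from the short one alone.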
The calibration of the camera-projector system was done using Zhang's technique from the Intel OpenCV library.
A study was conducted concerning two carrot varieties, Nerac F1 and Namur F1, without distinction. Approximately 200 linear metres of rows were mechanically sown at a density of 10 to 15 seeds per area 100 mm long by 50 mm wide, which is a common commercial planting density (ie a mean average of 2000 to 3000 seeds per m2). Several species of weeds were naturally present in the field and others were manually introduced. The main species present at the time of data acquisition were the following: Sonchus asper L., Chenopodium sp., Cirsium sp., Mercurialis perennis, Brassica sp. and Matricaria maritima. Other species might have been present. Weeds were considered as a single class in the discrimination approach since they appear in fields in unpredictable species and quantities. Table 2 gives a summary of the acquired data. Images were acquired at an early growth stage of both carrots and weeds (from one week after crop emergence to 19 days later, which is the usual period for manual weed removal). Indeed, early weed detection can increase yields, and weed elimination becomes increasingly difficult with plant growth. A total of 28 multispectral stereoscopic images were acquired at random locations in the parcel.
Table 2. Summary of acquired data.
The method used plant height as a discriminating parameter between crop and weed. The raw data acquired by the stereoscopic device was not plant height but the distance of the plants from the measurement device. This distance does not accurately represent plant height if the position of the device relative to the ground varies or if the ground is irregular. We thus computed a new parameter, called corrected plant height, which is independent of those problems, by using plant and ground data together. This parameter is the distance between plant pixels and the actual ground level under them, obtained by fitting a surface, and seen from a reconstructed point of view corresponding to a camera optical axis perpendicular to the ridge plane.

The crop/weed discrimination process is illustrated in Figure 4. The camera 13 was used to acquire a stereoscopic image 41 of the plants growing on the soil. First, we segmented the stereoscopic image 41 into ground pixels 42 and plant pixels 43 using only the multispectral data. This operation was done on two spectral bands by quadratic discriminant analysis. We fitted a plane surface 44 representing the average plane of the soil through the soil pixels; this was adjusted using a RANSAC (RANdom SAmple Consensus) algorithm. In addition, we used the griddata function of Matlab to fit a triangle-based cubic interpolated surface 45 through the soil pixels to represent the profile or surface roughness of the soil. Because this function produced a surface that passed through all specified points, it was very sensitive to spurious pixels resulting from any imperfect segmentation between plants and ground. Furthermore, such spurious pixels were frequently present at the limit between plants and ground, which was the border of the regions of interest for the plant height determination. To avoid this problem, the borders of those regions were eroded by a round structuring element of diameter 3 pixels.
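The RANSAC plane fitting step may be sketched as follows. This is a minimal illustration on synthetic points, not the actual implementation; the tolerance, iteration count and coordinate values are assumptions.

```python
import random

def plane_from_points(p, q, r):
    """Plane through three 3-D points as (unit_normal, d) with n . x = d,
    or None if the points are (nearly) collinear."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm < 1e-12:
        return None
    n = [c / norm for c in n]
    return n, sum(n[i] * p[i] for i in range(3))

def ransac_plane(points, n_iter=200, tol=0.5, seed=1):
    """RANSAC: repeatedly fit a plane to three random points and keep the
    candidate supported by the most inliers (points within tol of it).
    Spurious points, e.g. imperfectly segmented plant pixels, are thereby
    excluded from the fitted average soil plane."""
    rng = random.Random(seed)
    best_plane, best_count = None, -1
    for _ in range(n_iter):
        candidate = plane_from_points(*rng.sample(points, 3))
        if candidate is None:
            continue
        n, d = candidate
        count = sum(abs(sum(n[i] * pt[i] for i in range(3)) - d) <= tol
                    for pt in points)
        if count > best_count:
            best_plane, best_count = candidate, count
    return best_plane, best_count

# synthetic soil pixels near z = 0 (mm) plus two mis-segmented plant pixels
soil = [(x, y, 0.001 * ((x + y) % 3)) for x in range(10) for y in range(10)]
spurious = [(2.0, 2.0, 30.0), (5.0, 7.0, 25.0)]
(normal, d), inliers = ransac_plane(soil + spurious)
```

The fitted plane is supported by all 100 soil points while the two spurious plant pixels fall outside the tolerance and do not distort the result, which is the property an ordinary least-squares fit would lack.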
The plant pixels 43 and soil pixels 45 (the latter including the interpolated pixels obtained from modelling, since the ground under the plants was not visible from the camera and not all points seen by the camera were illuminated by the projector) were then put back together to create a corrected image 46, for which the orientation of the fitted plane 44 was used to rotate the data in space so as to align the plane normal with the optical axis of a virtual camera perpendicular to the calculated plane of the soil.
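The rotation aligning the fitted plane normal with the virtual camera's optical axis may be sketched with Rodrigues' rotation formula, as below; the 10 degree tilt is an illustrative assumption.

```python
import math

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_to_z(n):
    """Rotation matrix taking the unit soil-plane normal n onto the z axis
    (the virtual camera's optical axis), via Rodrigues' formula about the
    axis n x z."""
    ax = (n[1], -n[0], 0.0)        # n x z
    s = math.hypot(ax[0], ax[1])   # sine of the angle between n and z
    c = n[2]                       # cosine of that angle
    if s < 1e-12:                  # already (anti)parallel to z
        f = 1.0 if c > 0 else -1.0
        return [[1.0, 0.0, 0.0], [0.0, f, 0.0], [0.0, 0.0, f]]
    k = (ax[0] / s, ax[1] / s, 0.0)  # unit rotation axis
    K = [[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]]
    K2 = matmul(K, K)
    # Rodrigues: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return [[(1.0 if i == j else 0.0) + s * K[i][j] + (1 - c) * K2[i][j]
             for j in range(3)] for i in range(3)]

def rotate(R, v):
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

# soil plane tilted 10 degrees about the x axis
tilt = math.radians(10)
n = (0.0, math.sin(tilt), math.cos(tilt))
R = rotation_to_z(n)
```

After this rotation, heights measured along the z axis are heights perpendicular to the average soil plane, so the corrected plant height reduces to a z difference between a plant pixel and the interpolated soil beneath it.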
For the classification between crops and weeds (by means of a quadratic discriminant analysis), we used two parameters. The first parameter is, for each plant pixel, the distance between the plant pixel and the reconstructed soil underneath it (the corrected plant height). The second is the expected crop height which, in this case, was estimated from the number of days after sowing and previously determined empirical data giving the average height of the crops in question as a function of the number of days after sowing in similar growing conditions.
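A simplified illustration of such a discriminant classification is sketched below. It reduces the two parameters to their difference (corrected plant height minus expected crop height) and applies a one-dimensional quadratic discriminant with hypothetical training values; the actual embodiment used a two-parameter quadratic discriminant analysis.

```python
import math

def fit_gaussian(values):
    """Mean and variance of a training sample for one class."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, var

def log_likelihood(x, mean, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def classify(height_deviation, crop_model, weed_model):
    """Quadratic discriminant with equal priors: assign the class whose
    Gaussian gives the higher likelihood. Crops cluster tightly around
    the expected crop height; weeds spread much more widely."""
    crop_ll = log_likelihood(height_deviation, *crop_model)
    weed_ll = log_likelihood(height_deviation, *weed_model)
    return "crop" if crop_ll >= weed_ll else "weed"

# hypothetical training deviations (corrected height - expected height, mm)
crop_model = fit_gaussian([-3, -1, 0, 1, 2, 3, -2])
weed_model = fit_gaussian([-25, -20, -18, 14, 20, 30, -15])
```

Because the discriminant is quadratic rather than a single threshold, plants can be classified as weeds for being either much shorter or much taller than the expected crop height.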
For classification between crops and weeds, it was found that the position of the measurement device and ground irregularities greatly influenced the classification accuracies, and that using the corrected plant height, which incorporates a correction for those effects, improved the classification results (Table 3).
Table 3 - Classification results (% correctly classified)

Parameter | Non-corrected plant height | Corrected plant height
Overall | 66 | 83
Carrots | 75 | 85
Weeds | 57 | 80
The overall classification accuracy without correction was 66%. For the corrected height parameter, the overall classification accuracy was 83%.
For the carrot class alone, there was a smaller improvement when going from the non-corrected height parameter to the corrected plant height than for the weed class. This can be explained by the central position of the carrot plants on the ridge and the better surface state of the soil in that area of the ridges, due to the sowing apparatus.

The expected crop height may be determined automatically, for example by determining an average height of the plants or the crops from the captured images. Preferably, the corrected plant height is used for such a determination. In the case of carrots particularly, the carrots are typically sown in a band 5 cm wide with 10 to 15 carrots per 10 cm length of the band. The position of the band may be estimated from the captured images and the image divided into a zone in which there are only weeds and a zone in which there are both weeds and crops. Comparing plant height data from these two zones may be used in estimating the height of the crops.
Preferred stereoscopic imaging and analysis

Stereoscopic imaging aims to record three-dimensional information. There are mainly two kinds of acquisition methods, passive and active, and either may be used in the context of the invention. Passive methods usually rely on several views of the same scene to recover depth information (e.g. binocular stereoscopy, similar to human depth perception). Active methods are characterized by the projection onto the scene of some form of energy (commonly light) to help acquire depth information. Binocular stereoscopy is fairly common since it is simple to implement in hardware and is well suited to real-time acquisition; robust acquisition of dense stereoscopic data by this technique is, however, not an easy task.
The imaging and analysis preferably uses coded structured light with: • A time multiplexing code; and
• A pseudo-random pattern; and
• Per pixel decoding.
The projected coded light may have at least 18, preferably at least 20 and more preferably at least 22 patterns. There may be at least 6, preferably at least 7 and more preferably at least 8 bit differences (ie a minimum Hamming distance of 6, 7 or 8) between each pair of projected codes.
Since the object of the prototype embodiment described above was not real-time acquisition of stereoscopic images, and several problems were encountered for passive stereoscopic data acquisition of the scenes (numerous occlusions, repetitive texture areas, texture-free areas, high dynamic range, reflective surfaces and thin objects), it was chosen to use an active system based on coded structured light. This technique is based on the projection of light onto the scene to make the correspondence problem easier. Its principle, as illustrated in Fig 5, is to project a single light pattern or multiple light patterns on the scene, for example with a video projector 11, and to capture the image with a camera 13. In the pattern or group of patterns, the position of each element (line, column, pixel or group of pixels) is encoded. The information extracted by active stereovision is usually of better precision and higher density than that obtained using binocular stereoscopy techniques.
At least two defined spectral bands may be used in acquiring the image, for example a spectral band centred on 450 nm and having a band width of 80 nm and a spectral band centred on 700 nm and having a band width of 50 nm. The spectral bands may be implemented by the use of filters, for example placed in front of the camera. Alternatively, the camera may be adapted to have a frequency response in the desired spectral bands. A third spectral band may also be used, for example centred on 550 nm and having a band width of 80 nm. The use of selected spectral bands may increase the accuracy of differentiating between plants and soil.
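A crude sketch of how two such bands might separate plant from soil pixels is given below, using a normalised difference index in place of the quadratic discriminant analysis used in the embodiment; the reflectance values and the threshold are illustrative assumptions.

```python
def band_index(b450, b700):
    """Normalised difference of reflectances in the two bands. Vegetation
    reflects much more strongly near 700 nm (the red edge) than around
    450 nm, whereas soil has a much flatter spectrum."""
    return (b700 - b450) / (b700 + b450 + 1e-9)

def is_plant_pixel(b450, b700, threshold=0.2):
    """Classify one pixel from its two band reflectances; the 0.2
    threshold is illustrative only."""
    return band_index(b450, b700) > threshold

leaf_like = is_plant_pixel(0.10, 0.45)  # strong 700 nm reflectance
soil_like = is_plant_pixel(0.25, 0.30)  # nearly flat spectrum
```

In practice the discriminant would be trained on labelled pixels rather than thresholded by hand, but the separation rests on the same spectral contrast.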
The scene perceived using the coded structured light is analysed by the camera (active stereoscopy). Due to the presence of very fine objects, the existence of occlusions generating discontinuities (superimposed objects), the large dynamic range (presence of very light and very dark objects) and the existence of internal reflection, a specific light scheme was used having the following characteristics:
Each light projection consisted of 768 fringes (corresponding to the resolution of the projector), each of these projections being called "a pattern". Each luminous fringe is successively lit or not lit 22 times, which corresponds to a 22-bit code; the codes are pseudo-random (without apparent structure), to avoid disturbances due to internal reflection within the scene, and are weakly correlated between themselves;
A pattern is thus characterised by 768 luminous fringes, each being lit or not lit as a function of one of the 22 values of the code. For each pattern, four exposures of different duration (0.6, 0.3, 0.07 and 0.01 seconds) are used so that each part of the scene is correctly exposed, be it light or dark. The image with the longest exposure time is used as the base; the overexposed parts of that image are eliminated and replaced by the corresponding parts of the image with the next lower exposure time, and this is repeated three times until a correctly exposed image is obtained. The image is then analysed pixel by pixel by correlation between the signal emitted and the signal received.
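The exposure replacement scheme just described may be sketched as follows, assuming an 8-bit sensor with a linear response; the pixel and timing values are illustrative.

```python
SATURATED = 255  # assumed 8-bit sensor ceiling

def merge_exposures(images):
    """Merge exposures as described above: take the longest exposure as
    the base and replace each overexposed pixel with the value, rescaled
    by exposure time, from the next shorter exposure at which it is not
    clipped. `images` is a list of (exposure_time_s, pixel_values) pairs,
    longest exposure first; a linear sensor is assumed."""
    base_time, base_pixels = images[0]
    merged = [v / base_time for v in base_pixels]
    for i, v in enumerate(base_pixels):
        if v < SATURATED:
            continue                        # correctly exposed in the base
        for time_s, pixels in images[1:]:   # fall back to shorter exposures
            if pixels[i] < SATURATED:
                merged[i] = pixels[i] / time_s
                break
    return merged

# three exposures of a three-pixel scene, from darkest pixel to brightest
merged = merge_exposures([(0.6, [60, 255, 255]),
                          (0.3, [30, 255, 90]),
                          (0.07, [7, 14, 21])])
```

Each pixel ends up expressed in the same radiance units regardless of which exposure supplied it, so light and dark parts of the scene can be decoded together.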
Comparative examples
The accuracy of differentiation between carrots and weeds was assessed using techniques not in accordance with the invention and using different embodiments of the invention:
The better accuracy in identifying carrots may be explained by the position of the plants:
• towards the centre of the image making them less sensitive to variations in height and irregularities of the soil level, and
• towards the middle of the ridges or mounds on which they were sown, the ridges being more regular at this position.
These identification accuracies are averages over the whole growing period measured. The improvements in accuracy from using corrected plant height and/or compensating for the angle of the camera with respect to the plane of the soil are significantly greater in the later portions of the growing period measured; this may be due to increasing damage to the regularity of the growing ridges as the growing period advances, for example by erosion and the passage of machines and equipment.
Fig 6a illustrates crops 61, 62 growing in soil 63 having an irregular soil profile 22. In this embodiment, the variation in the soil height a (ie the difference between the highest and lowest point on the soil profile in the area being considered) is of the same order of magnitude as the average crop height b. For example, the variation in soil height a may be 40 mm and the crop height b may be 50 mm, so that the ratio a/b is 0.8. Fig 6b is similar save that the area of soil analysed is larger and covers substantially the entire width of a mound of soil 64 in which crops 61, 62, for example carrots, are grown. In this case, the variation in the soil height a may be 15 cm and the average crop height b may be 3 cm (early in the growing cycle), so that the ratio a/b is 5. Later in the growing cycle, the average crop height b may be 10 cm with the variation in soil height a still 15 cm, so that the ratio a/b is 1.5.
It will be appreciated that whilst the invention has been described, for example with reference to Fig 4, in terms of modelling planes and surfaces, alternative data treatment techniques may be used to estimate the corrected plant height and/or compensate for the camera or sensor angle. Where the crops are sown in bands, the method may comprise determining the position and/or boundaries of the sown band. This may be used to reduce the area of a field where high precision weeding is necessary since, in the area outside the sown band, all plants may be destroyed without further characterisation on the assumption that they are weeds. Alternatively or additionally, this may be used in the automatic determination of the expected crop height.
Derivation of the expected crop height from captured data of the plants growing on the soil may be performed as follows: a) determining the boundaries of the sown band; b) determining the corrected plant height of plants within the boundaries of the sown band; c) determining the most frequently occurring corrected plant height and assuming this to indicate the height of the crops (ie the average expected crop height, on the basis that the crops are significantly more prevalent than weeds in the sown band) and/or using this to determine a range of expected crop heights.
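Step c) may be sketched as a simple histogram mode, as below; the bin width and the height values are illustrative assumptions.

```python
def expected_crop_height(corrected_heights, bin_mm=5):
    """Estimate the expected crop height as the most frequently occurring
    corrected plant height inside the sown band, on the basis that crop
    plants greatly outnumber weeds there. A plain histogram mode;
    bin_mm is an assumed bin width."""
    counts = {}
    for h in corrected_heights:
        b = int(h // bin_mm)
        counts[b] = counts.get(b, 0) + 1
    best_bin = max(counts, key=counts.get)
    return (best_bin + 0.5) * bin_mm  # centre of the modal bin

# hypothetical corrected heights (mm): mostly crops near 50 mm, a few weeds
heights = [48, 52, 50, 47, 53, 51, 49, 12, 15, 90]
estimate = expected_crop_height(heights)
```

The sparser, shorter or taller weed heights fall in thinly populated bins, so the modal bin tracks the crop population; a range of expected crop heights can then be taken as an interval around this estimate.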
This is illustrated in Fig 7 in which line 71 represents the corrected heights of plants detected within the sown band against their probability density (which is representative of their frequency of occurrence) 21 days after sowing for a row of carrots. The peak of line 71 occurs at a corrected plant height of about 17 cm. This is shown as the average of the expected plant height 72. Line 74 represents the probability density function of carrots and line 73 the probability density function of weeds. The range of expected crop height for the carrots, as illustrated by line 72 is about 7 to 17 cm.
The expected crop height is preferably used as a range so that any plant determined to have a corrected plant height outside this range is classified as a weed. Such a range may be determined from empirical data or may be derived from captured data.

Claims

1 A method of differentiating between weeds and crops comprising the steps of:
• acquiring stereoscopic data of plants growing on soil;
• segmenting plant data points and soil data points;
• creating a modelised height profile of the soil at positions at which soil data points have not been acquired based on information derived from adjacent soil data points;
• determining a corrected plant height at plant data points representing the distance between the plant data point and the modelised soil underneath; and
• differentiating between weeds and crops by comparing the corrected plant height with an expected crop height.
2 A method in accordance with claim 1, in which the data is acquired in the form of an image, the pixels of the image representing the data points.
3 A method in accordance with claim 2, in which the image is acquired by a sensor and the method comprises a step of determining a captured data point height and calculating a corresponding corrected data point height which adjusts the captured data point height so as to compensate for the relative orientation between the sensor and the plane of the soil.
4 A method in accordance with any preceding claim, in which the stereoscopic data is obtained by projecting structured light and capturing the reflected image.
5 A method in accordance with any preceding claim, in which the stereoscopic data is acquired at at least two spectral bands.
6 A method in accordance with any preceding claim, in which the plant data points and the soil data points are segregated using spectral data.
7 A method in accordance with any preceding claim, in which the modelised height profile of the soil is created by fitting at least one surface through the soil data points.
8 A method in accordance with any preceding claim, in which the expected crop height is determined from empirical data of the expected growth of the crop.
9 A method in accordance with any of claims 1 to 7, in which the expected crop height is derived from the stereoscopic data of the plants growing on the soil.
10 A method in accordance with any preceding claim, in which the ratio a/b between (a) the variation in the soil height within the area of a captured image and (b) the average crop height is greater than 1/5.
11 A method in accordance with any preceding claim, in which the crops are sown in a band and the weeds are mixed with the crops.
12 A method in accordance with any preceding claim, in which the crops are selected from the group consisting of carrots; celery; fennel; medicinal plants; cumin; parsley; coriander; parsnip; common beans.
13 A method in accordance with any preceding claim, comprising the step of directing a weed treatment at a position at which a weed has been determined to be present.
14 A weeding device which determines the position of weeds growing in soil amongst crops comprising:
• A sensor which acquires stereoscopic data of plants growing on soil; and
• A processor programmed to o segment the data into plant data points and soil data points; o create a modelised height profile of the soil at positions at which soil data points have not been captured based on information derived from adjacent soil data points; o determine a corrected plant height at plant data points representing the distance between the plant data point and the modelised soil underneath; and o differentiate between weeds and crops by comparing the corrected plant height with an expected crop height.
15 Weed control equipment comprising:
• an apparatus for determining the position of weeds growing in soil amongst crops arranged to function according to any of claims 1 to 13; and • a weed destroying apparatus using the position of weeds thus determined to effect their destruction.
EP09765883A 2008-06-20 2009-06-18 Weed detection and/or destruction Withdrawn EP2327039A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP08011245 2008-06-20
PCT/EP2009/057588 WO2009153304A1 (en) 2008-06-20 2009-06-18 Weed detection and/or destruction
EP09765883A EP2327039A1 (en) 2008-06-20 2009-06-18 Weed detection and/or destruction

Publications (1)

Publication Number Publication Date
EP2327039A1 true EP2327039A1 (en) 2011-06-01


Also Published As

Publication number Publication date
WO2009153304A1 (en) 2009-12-23

Similar Documents

Publication Publication Date Title
WO2009153304A1 (en) Weed detection and/or destruction
AU2020103332A4 (en) IMLS-Weed Recognition/Classification: Intelligent Weed Recognition/Classification using Machine Learning System
US11197409B2 (en) System and method for automated odometry calibration for precision agriculture systems
US20230189691A1 (en) Agricultural trench depth sensing systems, methods, and apparatus
US8090194B2 (en) 3D geometric modeling and motion capture using both single and dual imaging
Nakarmi et al. Automatic inter-plant spacing sensing at early growth stages using a 3D vision sensor
Guerrero et al. Crop rows and weeds detection in maize fields applying a computer vision system based on geometry
CA2764135C (en) Device and method for detecting a plant
US9699968B2 (en) Calibration of a distance sensor on an agricultural vehicle
US9314150B2 (en) System and method for detecting tooth cracks via surface contour imaging
Montalvo et al. Automatic detection of crop rows in maize fields with high weeds pressure
EP1788525B1 (en) Method of radiographic imaging for three-dimensional reconstruction, device and computer program for carrying out said method
US11297768B2 (en) Vision based stalk sensors and associated systems and methods
Jin et al. Corn plant sensing using real‐time stereo vision
US20150139524A1 (en) Method and apparatus for providing panorama image data
BR102014027364B1 (en) cutting height measurement and control system for a basic harvester, method, and harvester
CN107154050A (en) A kind of automatic obtaining method of the stone material geometric parameter based on machine vision
KR20160030509A (en) Video-based auto-capture for dental surface imaging apparatus
Chen et al. Intra-row weed recognition using plant spacing information in stereo images
Andrén et al. Recording, processing and analysis of grass root images from a rhizotron
Terawaki et al. Distinction between sugar beet and weeds for development of automatic thinner and weeding machine of sugar beet
CN112614193A (en) Intelligent calibration method for wheat green turning period spray interested region based on machine vision
Piron et al. Determination of plant height for weed detection in stereoscopic images
Kondo et al. Robust arch detection and tooth segmentation in 3D images of dental plaster models
Humburg et al. Field performance of machine vision for the selective harvest of asparagus

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110120

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA RS

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20130103