WO2021062459A1 - Weed mapping - Google Patents
Weed mapping
- Publication number: WO2021062459A1 (PCT/AU2019/051079)
- Authority: WIPO (PCT)
- Prior art keywords
- weeds
- aerial vehicle
- list
- sensing unit
- cameras
- Prior art date
Classifications
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
- G01C11/04—Interpretation of pictures
- G01N21/359—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry, using near infrared light
- G01S17/933—Lidar systems specially adapted for anti-collision purposes of aircraft or spacecraft
- G01S19/421—Determining position by combining or switching between position solutions or signals derived from different satellite radio beacon positioning systems, or from different modes of operation in a single system
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement, the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
- G01S7/4808—Evaluating distance, position or velocity data
- G03B15/006—Apparatus mounted on flying objects
- G03B17/561—Support related camera accessories
- G06T7/0008—Industrial image inspection checking presence/absence
- G06T7/0012—Biomedical image inspection
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
- G06V20/188—Vegetation
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- G01N2021/3155—Measuring in two spectral ranges, e.g. UV and visible
- G01N2201/0214—Airborne
- G01N33/0098—Plants or trees
- G01S17/89—Lidar systems specially adapted for mapping or imaging
- G01S19/41—Differential correction, e.g. DGPS [differential GPS]
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S5/0054—Transmission from base station to mobile station of actual mobile position, i.e. position calculation on base station
- G06T2207/10024—Color image
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/20104—Interactive definition of region of interest [ROI]
- G06T2207/30188—Vegetation; Agriculture
Definitions
- The present invention relates to weed mapping.
- Embodiments of the present invention relate to systems, methods and apparatus for weed mapping, although mapping of other features is also envisaged.
- One known approach is a boom-mounted selective spray system comprising a plurality of sensors and a herbicide spraying system that can be mounted to a ground vehicle, such as a tractor, or to a dedicated guided vehicle.
- The sensors endeavour to distinguish weeds from desired plants, crops or stubble and to selectively spray the weeds and not the desired plants, crops or stubble.
- This approach aims to target the weeds, avoid unnecessary treatment of the desired plants, crops or stubble, and reduce the use and associated cost of herbicides and the like.
- However, the added weight, cost and power requirements, as well as the dust created by the vehicle, compromise and limit what can be achieved by such systems.
- Images can also be captured from an aerial vehicle, such as a fixed wing, multicopter (e.g. quadcopter or octocopter), helicopter or a hybrid of these aerial vehicles, including unmanned, remotely controlled aerial vehicles (UAVs).
- Such aerial vehicles comprise an array of separate infra-red (IR), red (R) and green (G) digital cameras to capture image data, and a location measuring device, e.g. GPS, to produce georeferenced images.
- Photogrammetry, via software, is used to combine the image data with the locational data to generate 3D images.
- A file is generated detailing the location of the weeds, and the file is used by a ground-based vehicle or aerial vehicle to spray the weeds.
- Digital cameras used for capturing images from an aerial vehicle vary greatly in capacity, arrangement and features. Typically, they are either RGB cameras based on consumer, mass-produced variants, e.g. the Sony A6000, or specifically designed aerial vehicle cameras comprising an array of ~5 monochrome cameras capturing images in different light spectra, paired with a light sensor.
- GPS data is typically captured via a GPS device only.
- The GPS accuracy can be increased using real-time kinematic (RTK) or post-processing kinematic (PPK) methods, which require a base station.
- Accuracy can be further enhanced by recording the yaw, pitch and roll of the vehicle via an inertial measurement unit (IMU) and the heading of the vehicle via a digital compass.
- The height above ground can also be captured via a LiDAR or radar system.
- Surveyed ground control points (GCPs) can also be used to further increase locational accuracy.
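The yaw, pitch, roll and height measurements above can be used to estimate where a nominally nadir-pointing camera is actually looking. A minimal flat-ground sketch of that correction (an illustrative small-angle model, not the method claimed in this disclosure; `boresight_ground_offset` is a hypothetical helper name):

```python
import math

def boresight_ground_offset(height_m, pitch_deg, roll_deg, yaw_deg):
    """Approximate ground offset of a nadir camera's boresight.

    Flat-ground model (an illustrative assumption): pitch tilts the
    view forward/back, roll tilts it left/right, and yaw rotates the
    offset into north/east components.
    """
    forward = height_m * math.tan(math.radians(pitch_deg))
    right = height_m * math.tan(math.radians(roll_deg))
    yaw = math.radians(yaw_deg)
    north = forward * math.cos(yaw) - right * math.sin(yaw)
    east = forward * math.sin(yaw) + right * math.cos(yaw)
    return north, east

# e.g. 3 degrees of pitch at 50 m above ground shifts the footprint
# centre by roughly 2.6 m
north, east = boresight_ground_offset(50.0, 3.0, 0.0, 0.0)
```

Without such a correction, even small attitude errors translate into metre-scale positional errors at typical survey altitudes, which is why the IMU and compass data matter.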
- One known apparatus comprises an array of five individual cameras covering blue, green, red, red-edge and near infrared spectra, a light sensor and a GPS unit mounted on an aerial vehicle.
- The apparatus processes the captured data via a methodology called photogrammetry.
- This process includes capturing overlapping images with a record of the estimated position from the on-board GPS. Features are identified in overlapping images and are meshed together. The position, orientation and location of the images are determined via software using triangulation and trigonometry.
- Ground control points (GCPs) with known locations are placed throughout the surveyed area to increase positional accuracy to ~1 m or better.
- The output of the process is a georeferenced image which captures the locational data of each pixel in the image.
- The georeferenced images are processed to identify the weeds and convert the data to weed maps.
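As a sketch of how pixels of a georeferenced image might be classified as vegetation, the excess-green index (ExG = 2G − R − B) is one common heuristic; the index and the threshold below are illustrative assumptions, not the specific classifier used by the known apparatus:

```python
def is_vegetation(r, g, b, threshold=20):
    """Classify a pixel as vegetation using the excess-green index
    ExG = 2G - R - B (a common heuristic; the threshold is an
    illustrative assumption, not a value from this disclosure)."""
    return 2 * g - r - b > threshold

# Green foliage scores high on ExG; brown soil scores low.
assert is_vegetation(60, 120, 50)       # leafy green pixel
assert not is_vegetation(120, 100, 80)  # bare-soil pixel
```

Real systems typically combine such an index with near-infrared bands and machine learning to separate weeds from crop, but the per-pixel thresholding idea is the same.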
- The primary focus of this system is plant health in crops rather than the identification of individual plants or plant types. It is typically applied to comparatively small areas (e.g. under 100 ha) and only requires low-resolution images.
- The aforementioned apparatus and other known apparatus involve a large overlap of images, with large computational processing requirements.
- There are three main issues with the aforementioned apparatus, relating to photogrammetry, the cameras and the creation of shapefiles.
- In the photogrammetry process, the images must substantially overlap both laterally and longitudinally. Typically, this is a 70% overlap. At 70% overlap, the coverage rate efficiency is 9%, i.e. only 9% of each image covers an additional area. Processing is extremely computationally demanding and is typically undertaken in server "farms", i.e. in the cloud. This requires the data to be uploaded to the internet, which is problematic even with a dedicated, full-speed internet connection. Ideally, weed maps should be available the same day, or within 24 hours, of surveying the area. This issue is further exacerbated in rural areas (i.e. where this system operates), which typically use 4G mobile or satellite internet connections, both of which are poorly placed to upload this quantity of data. If standard GPS (L1 band) is being used, GCPs must be placed throughout the survey area to ensure positional accuracy. Each GCP must be surveyed in and be identifiable from the air, which requires the GCPs to be maintained.
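The 9% figure follows directly from the overlap arithmetic: with 70% overlap both along-track and across-track, only 30% × 30% of each image covers new ground. A one-line sketch of this calculation:

```python
def coverage_rate_efficiency(forward_overlap, side_overlap):
    """Fraction of each image covering new ground when consecutive
    images overlap by `forward_overlap` along-track and neighbouring
    strips overlap by `side_overlap` across-track (fractions in [0, 1))."""
    return (1 - forward_overlap) * (1 - side_overlap)

# At 70% overlap in both directions, only 9% of each image is new
# area, matching the figure quoted above.
assert round(coverage_rate_efficiency(0.7, 0.7), 2) == 0.09
```

This is why reducing the required overlap directly multiplies the area an aircraft can survey per flight.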
- The outcome of the above issues relating to photogrammetry and the cameras is that, at resolutions of 1 cm GSD, the aforementioned apparatus can only capture ~10 ha per hour, excluding non-productive time such as changing batteries.
- Another issue is that the creation of the shapefiles used by spraying equipment is not typically an output of the aforementioned apparatus and other known apparatus; the shapefiles must instead be generated via third-party software.
- The shapefiles should cover large areas, typically 200 ha+, which results in shapefiles whose size often exceeds the memory of the spray systems, and hence the shapefiles are not usable by the spray systems.
- A preferred object of the present invention is to provide a system and/or a method and/or an apparatus that addresses, or at least ameliorates, one or more of the aforementioned problems and/or provides a useful commercial alternative.
- One aspect of the present invention is directed to an apparatus in the form of a sensing unit mountable to an aerial vehicle to capture images of the ground to detect ground features, in particular weeds.
- Another aspect of the present invention is directed to an apparatus in the form of an aerial vehicle comprising the sensing unit.
- A further aspect of the present invention is directed to methods of processing images and locational data to identify locations and sizes of ground features, in particular weeds.
- A yet further aspect of the present invention is directed to methods of generating shapefiles for use by weed treatment equipment.
- The present invention is directed to a sensing unit mountable to an aerial vehicle, the sensing unit comprising: a frame to mount the sensing unit to the aerial vehicle; a gimbal coupled to the frame allowing pitch and roll relative to the frame; a body mounted to the gimbal; and one or more cameras, in particular a pair of cameras mounted to the body at an angle to the vertical and angled away from each other to minimise the overlap of images captured by the cameras.
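The effect of angling the pair of cameras away from the vertical can be illustrated with simple trigonometry over flat ground; the altitude, tilt and field-of-view values below are arbitrary examples, not dimensions taken from this disclosure:

```python
import math

def footprint_edges(height_m, tilt_deg, half_fov_deg):
    """Across-track footprint edges, in metres from nadir, of a camera
    tilted `tilt_deg` from the vertical with across-track half-FOV
    `half_fov_deg`, over flat ground (illustrative pinhole model)."""
    inner = height_m * math.tan(math.radians(tilt_deg - half_fov_deg))
    outer = height_m * math.tan(math.radians(tilt_deg + half_fov_deg))
    return inner, outer

# Example values (assumptions for illustration only):
h, tilt, half_fov = 50.0, 15.0, 20.0
inner, outer = footprint_edges(h, tilt, half_fov)  # right-hand camera
# The left-hand camera mirrors it, covering [-outer, -inner], so the
# two footprints overlap only where the right camera crosses nadir:
overlap_m = max(0.0, -2.0 * inner)
swath_m = 2.0 * outer
```

With these numbers the pair covers a ~70 m swath with under 9 m of overlap near nadir, which is the motivation for the outward tilt: a wider combined swath for the same two sensors, with minimal duplicated pixels.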
- The sensing unit comprises a light detection and ranging (LiDAR) unit mounted to the body, in particular between the cameras.
- The sensing unit comprises a processor mounted to, or housed within, the body, wherein the processor is in communication with: the gimbal, to control the orientation of the gimbal; the cameras, to receive image data from the cameras; and the LiDAR unit, to receive ranging data from the LiDAR unit.
- The processor is in communication with an inertial measurement unit (IMU), a digital compass and a GPS unit, optionally provided in a combined unit, and mounted to the sensing unit or to the aerial vehicle.
- The processor is in communication with a light sensor mounted to the sensing unit or to the aerial vehicle.
- The processor is in communication with a memory, such as a USB SSD, optionally accommodated in a housing mounted to, or housed within, the body.
- The sensing unit comprises a power distribution board, optionally mounted to, or housed within, the body, and in communication with one or more of the gimbal, cameras, LiDAR unit, processor and light sensor.
- The sensing unit comprises a timing board, optionally mounted to, or housed within, the body, and in communication with one or more of the cameras, LiDAR unit, processor, IMU, digital compass, GPS unit and light sensor.
- The present invention is also directed to an aerial vehicle comprising the sensing unit.
- The aerial vehicle may be an unmanned, remotely controlled aerial vehicle (UAV), such as a fixed wing, multicopter, helicopter or hybrid thereof, or a manned aerial vehicle.
- The present invention is also directed to a method of processing image data and locational data captured by an aerial vehicle, by a processor, to identify a location and size of a ground feature, in particular weeds, the method comprising the processor: combining data captured by a GPS unit, an inertial measurement unit (IMU) and a digital compass mounted to the aerial vehicle with data captured by a ground-based GPS base station to generate combined locational data; matching the combined locational data with image capture times for images captured by a pair of cameras mounted to the aerial vehicle and height data captured by the aerial vehicle; analysing the image data to determine whether or not each pixel represents the ground feature, in particular weeds; performing area analysis on the image data to remove false positives; calculating a centroid and radius of each remaining area representing the ground feature, in particular weeds; combining a list of centroids and radii representing the ground feature, in particular weeds, with the combined locational data; and generating a list of longitudes, latitudes and radii representing the ground feature, in particular weeds.
- The method further includes converting the list of longitudes, latitudes and radii representing the ground feature, in particular weeds, to a format usable by selected treatment equipment, such as a sprayer control system.
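The area-analysis and centroid/radius steps can be sketched as a connected-components pass over a boolean weed mask; the 4-connectivity, minimum-area threshold and area-equivalent radius below are assumptions for illustration, not the disclosure's specific algorithm:

```python
from collections import deque

def weed_blobs(mask, min_area=3):
    """Group weed pixels (True cells in `mask`) into connected blobs,
    discard blobs smaller than `min_area` cells as false positives,
    and return (centroid_row, centroid_col, radius) per blob."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                queue, cells = deque([(r, c)]), []
                seen[r][c] = True
                while queue:  # flood fill one blob (4-connectivity)
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(cells) >= min_area:  # area analysis
                    cy = sum(y for y, _ in cells) / len(cells)
                    cx = sum(x for _, x in cells) / len(cells)
                    radius = (len(cells) / 3.14159) ** 0.5  # area-equivalent circle
                    blobs.append((cy, cx, radius))
    return blobs

mask = [
    [False, False, False, False],
    [False, True,  True,  False],
    [False, True,  True,  False],
    [False, False, False, True],
]
blobs = weed_blobs(mask)  # the lone pixel is dropped as a false positive
```

In the method described above, the resulting pixel centroids and radii would then be scaled by the ground sample distance and shifted by the combined locational data to obtain longitudes, latitudes and radii.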
- the present invention is directed to a method of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment, the method comprising a processor: creating a background array based at least on a bounding box representing an area to be treated; setting elements representing weeds in the background array to zero; creating a polygon array equal to the background array; analysing the background array for elements representing weeds and elements representing non-weeds until all elements in the background array are set to zero and modifying the polygon array correspondingly until all non-weed elements in the polygon array are equal to a value of 1 and individual polygon elements have the same value; creating polygons comprising points based on the elements of the modified polygon array; and converting the points of the polygons to longitudes and latitude
- the method may include writing a shapefile based on the longitudes and latitudes converted from the points of the polygons.
- Creating polygons may comprise the following steps for each different set of values in the polygon array: a) creating a list of the coordinates of the maximum column values (e.g. from left to right) and a list of the minimum coordinates; b) creating a new list; c) working through the top list from left to right, appending the first set of coordinates to the new list, and then, if the row is not equal to the previous row, appending one set of coordinates at the current column and old row and then appending the current coordinates to the new list, continuing until the last value in the list is reached and appending that coordinate to the new list; d) repeating step c) for the bottom list, but processing from right to left (i.e. maximum column to minimum column), adding one to the y (column) value (i.e. moving it down one square) and appending these coordinates to the same list as the top list.
- the method may comprise analysing the array using a connected elements analysis comprising grouping blobs of weeds together.
- the method may comprise calculating a list of the centroids of each blob of weeds based on the connected elements analysis to determine the shortest distance between each centroid.
- the method may comprise generating a spray path independently for each blob of weeds to ensure each area is sprayed.
- the method may comprise combining the spray path for each blob of weeds into a list of all the spray paths in an order to be sprayed based on a shortest distance between each blob.
- FIG 1 is a front view of a sensing unit according to an embodiment of the present invention.
- FIG 2 is a side view of the sensing unit shown in FIG 1 ;
- FIG 3 is a front view of the sensing unit shown in FIG 1 mounted to an aerial vehicle;
- FIG 4 is a side view of the sensing unit mounted to the aerial vehicle shown in FIG 3;
- FIG 5 is a block diagram of elements of the sensing unit shown in FIG 1 ;
- FIG 6 shows the sensing unit mounted to the aerial vehicle shown in FIG 3 flying above an area of ground being imaged;
- FIG 7 is a general flow diagram illustrating methods of processing image data and locational data to identify a location and size of a ground feature, in particular weeds, according to embodiments of the present invention;
- FIG 8 is an example image in which plant pixels have been identified by methods of the present invention and highlighted in a specific colour;
- FIG 9 shows the result of using thresholds across each image channel in the L*a*b* colour space for the same source image used in the example shown in FIG 8;
- FIG 10 shows the thresholds used for each image channel for the image in FIG 9;
- FIG 11 shows the result of using thresholds in different colour spaces - RGB, HSV, YCbCr and L*a*b* for the same source image used in the example shown in FIG 8;
- FIG 12 shows the result of using the normalised difference vegetation index (NDVI) for the same source image used in the example shown in FIG 8;
- FIG 13 is a general flow diagram illustrating methods of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment.
- some embodiments of the present invention are directed to a sensing unit mountable to an aerial vehicle to capture images of the ground to detect ground features, and in particular to detect weeds.
- Other embodiments of the present invention are directed to an aerial vehicle comprising the sensing unit.
- Further embodiments of the present invention are directed to methods of processing images and locational data to identify locations and sizes of ground features, in particular weeds. Yet further embodiments of the present invention are directed to methods of generating shapefiles for use by weed treatment equipment, such as manned or autonomous spraying systems.
- FIGS 1 and 2 show a sensing unit 100 according to embodiments of the present invention, which is mountable to an aerial vehicle 200, as shown in FIGS 3 and 4.
- the sensing unit 100 comprises a frame 102 to mount the sensing unit 100 to the aerial vehicle 200.
- the frame 102 comprises a pair of substantially vertical, spaced apart arms 104, a pair of inclined arms 106 extending from the arms 104 and a substantially horizontal cross member 108 extending between the inclined arms 106.
- the frame 102 can be made of any suitably strong, lightweight material, such as aluminium or plastic and can be formed as a single element or multiple elements joined together by any suitable means known in the art.
- the frame 102 comprises one or more fasteners 110 for coupling the sensing unit 100 to the aerial vehicle 200.
- the one or more fasteners 110 can be in the form of hooks, clips, zip ties, clamps, bolts or the like, or a combination thereof to securely attach the sensing unit 100 to the aerial vehicle 200.
- the frame 102 can comprise a gusset or brace 112 attached between the pair of inclined arms 106 to provide additional strength and rigidity.
- the sensing unit 100 comprises a gimbal 114 rotatably coupled to the arms 104 of the frame 102 and a body 116 mounted to the gimbal 114 allowing pitch and roll of the body 116 relative to the frame 102.
- the sensing unit 100 comprises a pair of cameras 118 mounted to the body 116 at an angle to the vertical and angled away from each other to minimise an overlap of images captured by the cameras 118.
- the body 116 can comprise a plurality of surfaces 120A, 120B, 120C angled relative to each other to facilitate mounting the cameras 118 at an angle to the vertical and angled away from each other.
- the sensing unit 100 comprises a light detection and ranging (LiDAR) unit 122 mounted to the body 116 between the cameras 118.
- the body 116 comprises angled surfaces 120A and 120B, to, or through which cameras 118 are mounted, separated by substantially horizontal surface 120C, to, or through which LiDAR unit 122 is mounted.
- the sensing unit 100 comprises one or more cameras 118 mounted to the body 116. In some embodiments, the sensing unit 100 comprises a single camera 118 mounted to the body 116. In other embodiments, the sensing unit 100 comprises three or more cameras 118 mounted to the body 116.
- the sensing unit 100 comprises one or more processors 124 mounted to, or housed within the body 116.
- the one or more processors 124 can be in the form of an industrial single board computer (SBC) and/or a microcontroller.
- the processor 124 is in communication with the gimbal 114 to provide data thereto and control the orientation of the gimbal 114.
- the processor 124 is in communication with the cameras 118 to receive image data from the cameras.
- the processor 124 is in communication with the LiDAR unit 122 to receive ranging data from the LiDAR unit.
- the processor 124 is in communication with an inertial measurement unit (IMU), digital compass and GPS unit, optionally provided in a combined unit 126, and mounted to the sensing unit 100 or to the aerial vehicle 200.
- a GPS antenna 138 mounted to the sensing unit 100 or to the aerial vehicle 200 is coupled to the unit 126.
- the processor 124 is in communication with a light sensor 128 mounted to the sensing unit 100 or to the aerial vehicle 200.
- the processor 124 is in communication with a memory 130, such as a USB SSD, optionally accommodated in a housing 132 mounted to, or housed within the body 116.
- the sensing unit 100 comprises a power distribution board 134, optionally mounted to, or housed within the body 116, and in communication with one or more of the gimbal 114, cameras 118, LiDAR unit 122, processor 124 and light sensor 128 to provide power thereto.
- the sensing unit 100 comprises a timing board 136, optionally mounted to, or housed within the body 116, and in communication with one or more of the cameras 118, LiDAR unit 122, processor 124, IMU, digital compass, GPS unit 126 and light sensor 128 to provide timing data thereto.
- the aerial vehicle 200 can be an unmanned, remotely controlled aerial vehicle (UAV), such as the multicopter shown in FIGS 3 and 4, a fixed wing aerial vehicle or a hybrid thereof.
- the aerial vehicle 200 can be a manned aerial vehicle, such as a helicopter.
- modifications to the sensing unit 100 may be made, such as to the fasteners 110 and/or the shape of the frame 102, depending on the type of aerial vehicle 200 to which the sensing unit 100 is to be attached.
- FIG 6 shows the sensing unit 100 mounted to the aerial vehicle 200 flying above an area of ground 140 being imaged comprising a feature to be detected, such as a plant, such as a weed 142.
- the aerial vehicle 200 is approximately 60-80m above the ground 140.
- the dotted lines in FIG 6 indicate the field of view (FOV) of the cameras 118 of the sensing unit 100 illustrating the minimal overlap of the FOVs due to the angled mounting of the cameras 118.
- Standard cameras are designed to mimic human eyesight and therefore only detect light in the visible spectrum.
- Regular cameras typically comprise three different light detecting diodes, each diode representing a single pixel.
- Blue diodes detect light of wavelength roughly 400 to 500nm, green diodes at 500nm to 580nm and red at 580nm to 650nm. This is achieved via coloured dyes applied directly to the diodes to differentiate between colours and then the use of cut filters. Glass effectively filters out UV light below 400nm and the cut filter eliminates light above 650nm in the NIR to infra-red (IR) spectrum. However, without the cut filter, the diodes will detect NIR light.
- the sensor unit 100 comprises two 12MP industrial cameras 118, 4000 pixels wide, which have been modified to sense NIR light. Modification of the cameras 118 involves replacing the IR cut filter with another filter. Light is blocked from 580nm to 680nm (red light), light is allowed through from 680 to 750nm and light higher than 750nm is blocked allowing the red diodes of the camera to capture NIR. This results in a camera which captures blue, green and NIR light allowing for comparison between these three spectra.
- the one or more cameras 118 employ a global shutter, which captures the entire image simultaneously. This is in contrast to a rolling shutter, which captures the image line by line over a period of time.
- the one or more cameras 118 are continuously moving forward. With a rolling shutter system, each line would be exposed for a period of time, e.g. 1/2000th of a second, but not at the same time as the other lines in the image because the camera moves between capturing each line.
- a typical image capture time is 10ms for a rolling shutter, over which time the camera would have moved ~14cm forward at 50km/h. Therefore, a global shutter is the preferred shutter type.
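The forward-travel figure above follows from basic kinematics. A minimal sketch of this check (function name is illustrative, not from the specification):

```python
def movement_during_capture(speed_kmh: float, capture_time_s: float) -> float:
    """Ground distance (m) travelled by the camera while an image is captured."""
    return speed_kmh / 3.6 * capture_time_s

# A 10 ms rolling-shutter capture at 50 km/h spans roughly 14 cm of forward travel.
print(round(movement_during_capture(50, 0.010) * 100, 1))
```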
- the sensor unit 100 comprises the light sensor 128 to measure ambient light conditions to facilitate continual calibration of the cameras 118 as ambient lighting changes.
- the sensor unit 100 comprises the GPS, IMU (inertial measurement unit) and digital compass unit 126 which allow the position of the sensor unit to be determined extremely accurately.
- the sensor unit 100 comprises the LiDAR (light detection and ranging) device 122 which measures height of the sensor unit 100 above the ground (as opposed to the GPS which measures height above sea level).
- the sensor unit 100 comprises the on-board computer 124, including several custom built electronic daughter boards, timing microcontrollers and USB sticks to collate and store data.
- the sensor unit 100 has a mass of approximately 3kg and can generate and store over 2TB of data in a full day of flying.
- the apparatus and methods according to embodiments of the present invention combine location information and image data captured from the sensor unit 100 mounted to the aerial vehicle 200 to generate a map of ground features, and in particular a weed map for broadacre farming.
- the aerial vehicle 200 comprising the sensor unit 100 flies over a defined area of ground in swaths or sweeps and captures images of the ground via cameras 118.
- the pair of cameras 118 point downwards, are positioned side by side and are angled slightly away from vertical to capture images with minimal overlap and to increase the width and area of ground captured per swath or sweep.
- the swath width is the ground sampling distance (GSD) multiplied by the number of pixels in the width of the image.
- the GSD is 1cm and, with two cameras each capturing images 4000 pixels wide and allowing for the ~2% image overlap, this equates to a swath width of 78.4m
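The 78.4m figure can be sketched as follows, assuming two cameras and the ~2% overlap described for direct georeferencing (the function name and defaults are illustrative):

```python
def swath_width_m(gsd_m: float, px_width: int, n_cameras: int = 2,
                  overlap: float = 0.02) -> float:
    """Swath width: GSD x image width in pixels across all cameras,
    less the ~2% image overlap used for direct georeferencing."""
    return gsd_m * px_width * n_cameras * (1 - overlap)

# two 12MP cameras, 4000 px wide, at 1 cm GSD
print(round(swath_width_m(0.01, 4000), 1))  # 78.4 m
```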
- the light sensor 128 positioned, for example, on top of the aerial vehicle 200 measures ambient light conditions to facilitate continual calibration of the cameras 118 as ambient lighting changes.
- the combined GPS, IMU and digital compass unit 126 of the sensor unit 100 enables the position, direction and angle of the sensor unit 100 to be known accurately in 3D space.
- the single point LiDAR unit 122 measures the height of the sensor unit 100 above the ground 140 enabling direct calculation of GPS coordinates of locations within the captured images via simple triangulation.
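As an illustrative sketch of the triangulation just described, the across-track ground offset of a pixel can be computed from the LiDAR height, assuming flat ground and a nadir-pointing pinhole camera (the real system also folds in the IMU attitude and camera mounting angles; the function and `fov_deg` parameter are assumptions, not from the specification):

```python
import math

def pixel_ground_offset_m(height_m: float, pixel_col: int,
                          px_width: int, fov_deg: float) -> float:
    """Across-track ground offset (m) of a pixel column from the point
    directly below the camera, for flat ground and a nadir view."""
    half_fov = math.radians(fov_deg) / 2
    frac = (pixel_col - px_width / 2) / (px_width / 2)  # -1 .. 1 across the image
    return height_m * math.tan(frac * half_fov)
```

Adding this offset to the camera's GPS position (via the heading from the digital compass) then yields the pixel's coordinates.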
- One or more processors 124 tie the elements of the sensing unit 100 together.
- the sensing unit 100 can comprise a custom microcontroller, which synchronises the capture of data between the different elements of the sensing unit 100 in the nanosecond scale.
- the sensing unit 100 can comprise the industrial single board computer (SBC) coupled to the high-speed USB memory device 130 to capture all the required data.
- Separate from the sensor unit 100 is a GPS base station 144, which is positioned within, for example, 15km of the area in which the aerial vehicle 200 comprising the sensor unit 100 is operating.
- the GPS base station 144 logs raw GPS data (RINEX) for use in post-processing of the GPS data captured by the sensor unit 100.
- direct georeferencing is employed wherein the measurements of the height of the aerial vehicle above ground via the LiDAR device 122, the location and orientation of the one or more cameras 118 on the aerial vehicle (via the high accuracy GPS and IMU) allow the location of any individual pixel on the ground in an image to be calculated.
- the benefits of direct georeferencing compared to aerial triangulation include a highly efficient area coverage rate because direct georeferencing only requires an image overlap of ~2% to ensure complete coverage of the designated area. Also, the location of a pixel only needs to be calculated if the feature of interest, in particular a weed, is positively identified, thus significantly reducing processing requirements.
- the reduced processing requirements allow, in particular, weed maps to be produced on the same day of flying with a standard laptop in the field without the need to transfer large amounts of data over the internet.
- the images produced by the cameras 118 need to be high contrast and low blur.
- An image with high contrast enhances the distinction between background and weeds, which makes the detection of weeds more reliable.
- To enhance the contrast of an image, more light needs to be captured by the camera for each pixel. Increasing contrast can be achieved by using a bigger image sensor because more light can be captured by the camera, but this increases mass (linearly) and cost (exponentially). Contrast can also be increased with a lower pixel count.
- with a lower pixel count, the portion of the sensor for each pixel is bigger, but the coverage rate is directly decreased if the ground sampling distance (GSD) or area per pixel stays the same. Contrast can also be increased with longer exposure times. Capturing the images over a longer period allows more light to be captured, but because the camera is moving, this increases the area each pixel is exposed to, i.e. increases blur.
- Blur refers to the area on the ground each pixel is exposed to during the capture of an image with a moving camera. Blur increases the more the camera moves during the exposure time. As blur increases, the distinction between background and weed decreases. A standard camera with an exposure time of 1/50th of a second travelling at 36km/h will move 20cm during the exposure time. If the camera was operating at a GSD of 1cm, each pixel would be exposed to a 21x1cm area and be represented in the image as a 1x1cm area. This would result in a significantly blurred image that would not be suitable for detecting a weed. Blur can be reduced by increasing the shutter speed.
- a camera with a shutter speed of 1/2000th of a second travelling at 36km/h would travel 0.5cm during the exposure time, significantly reducing blur.
- however, with a faster shutter speed less light is captured per pixel, so contrast decreases.
- Blur can be reduced by decreasing camera velocity with the result of lower coverage rates.
- Coverage rate is inversely related to the size of the smallest detectable ground feature, in particular weeds, and to the cost per area, e.g. cost/ha, of the system. Coverage rate is dependent on the Ground Sampling Distance (GSD), the swath width, the forward velocity of the aerial vehicle and an efficiency factor.
- the GSD is the size of each pixel on the ground with the camera stationary. 1cm GSD refers to each pixel being 1x1cm in a square array on the ground. The size of the smallest weed consistently detectable is determined by the GSD. As a base standard, to consistently detect a certain size weed, at least one pixel must completely view the weed.
- the GSD needs to be 1cm to ensure at least one pixel is filled by the image of the weed.
- the swath width is the GSD (e.g. 1cm) multiplied by the image width in the number of pixels, e.g. 4000. Additionally, the overlap between the images must be taken into account, which for direct georeferencing is ~2%. As stated herein, in embodiments where two 12MP cameras are used having image widths of 4000 pixels each, this equates to a swath width of 78.4m at 1cm GSD.
- the forward velocity of the aerial vehicle has three potential limitations. The aerial vehicle should be chosen so that the speed of the aerial vehicle is not the limit.
- the image capture rate also needs to be considered. Where 12MP cameras are used in the sensor unit 100 with images 3000 pixels in length, capturing one image per second with a GSD of 1cm and 2% overlap, a maximum forward velocity of 29.4m/s (~106km/h) is achieved. This is close to the top speed of most fixed wing aircraft and is not the limitation in a direct georeferencing system. The shutter speed and blur limitations of the cameras also need to be considered. This is a balance between image quality and coverage rate. However, this is expected to be 50 to 60km/h, which becomes the limit in a direct georeferencing system.
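The 29.4m/s figure is the ground length of one frame, less overlap, per capture interval. A minimal sketch of that arithmetic (function name is illustrative):

```python
def max_forward_velocity_ms(gsd_m: float, px_length: int,
                            capture_rate_hz: float, overlap: float = 0.02) -> float:
    """Fastest forward speed at which successive frames still overlap:
    ground length of one frame, less the overlap fraction, per capture interval."""
    return gsd_m * px_length * (1 - overlap) * capture_rate_hz

v = max_forward_velocity_ms(0.01, 3000, 1.0)  # 1 cm GSD, 3000 px, 1 frame/s
print(round(v, 1), round(v * 3.6))  # 29.4 m/s, ~106 km/h
```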
- the efficiency factor covers the time not productively flying and covers set up time, flight to and from the area, turning at each end of a swath and changing batteries and swapping memory storage, such as USBs.
- the efficiency factor is expected to be ~80% based on 30 minutes endurance for the aerial vehicle.
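Putting the factors above together, coverage rate can be sketched as swath width times ground speed times the efficiency factor. The 55km/h input below is an assumed speed within the expected 50 to 60km/h range, and the function name is illustrative:

```python
def coverage_rate_ha_per_h(swath_m: float, velocity_kmh: float,
                           efficiency: float = 0.8) -> float:
    """Area covered per hour: swath width x ground speed x efficiency,
    converted from m^2/h to hectares/h."""
    return swath_m * velocity_kmh * 1000 * efficiency / 10_000

# e.g. 78.4 m swath at an assumed 55 km/h with 80% efficiency -> ~345 ha/h
print(round(coverage_rate_ha_per_h(78.4, 55)))
```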
- the data captured by the cameras 118 of the sensor unit 100 and the GPS base station 144 is combined using custom built software.
- the first step is to upload the data from the GPS base station 144 and the GPS, IMU and digital compass unit 126 through 3rd party software to increase the accuracy of the data from the cameras 118 and the GPS, IMU and digital compass unit 126.
- the next step is to identify the features of interest, e.g. weeds 142 within the images and ultimately determine an area centred on each weed, such as the radius and the centre of a circle centred on the weed.
- this process uses blurring, normalised difference vegetation index (NDVI) thresholding and value thresholding from hue, saturation and value (HSV) thresholding to eliminate non-weed pixels.
- the last stage of the processing method is to convert the list of weed centres to a file, such as a shapefile or weed map, compatible with the selected treatment equipment, such as a sprayer control system with which the file is to be used.
- the area occupied by the weed is designated “spray” and the areas without weeds are designated “do not spray”.
- the file is then transferred to the sprayer’s controller where the onboard equipment sprays the designated “spray” areas only.
- the sensor unit 100, with its flyover data and processing methods, generates an accurate reporting tool for land owners and agronomists that greatly assists with weed detection and recognition.
- post-processing of the locational data in the method 300 comprises at 302 combining data captured by the GPS unit, inertial measurement unit (IMU) and digital compass unit 126 mounted to the aerial vehicle 200 with data captured by the ground based GPS base station 144 to generate combined locational data to increase the accuracy of the locational data.
- this can be done by known 3rd party software.
- the method 300 comprises matching the combined locational data with image capture times for images captured by the pair of cameras 118 mounted to the aerial vehicle 200 and height data captured by the aerial vehicle 200, in particular height data captured by the LiDAR unit 122.
- Processing of the image data is performed in parallel with the processing of the locational data.
- the output required from image processing is the location and size, in terms of the number of pixels, for each ground feature of interest, in particular weeds.
- the method 300 comprises analysing the image data to determine whether or not each pixel represents the ground feature, in particular weeds. There are a multitude of different ways that this can be achieved by combining multiple different image processing techniques.
- the method 300 comprises at 306A image pre-processing which typically includes one or more of sharpness adjustment, contrast adjustment and one or more noise reduction processes.
- the method 300 comprises one or more thresholding processes.
- the primary calculation is normalised difference vegetation index (NDVI) with a minimum value.
- Other processes that can be employed use thresholds based on different colour spaces, e.g. minimum and maximum red, green and/or blue values, or converting to HSV colour space and using set thresholds. More complex calculations can be employed by modifying thresholds based on light readings from the light sensor 128, or based on image wide averages, e.g. brightness levels.
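A minimal per-pixel sketch of the primary NDVI calculation with a minimum value described above (the 0.3 threshold and function names are illustrative assumptions; real thresholds would be tuned, e.g. from the light sensor readings):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised difference vegetation index for one pixel."""
    total = nir + red
    return (nir - red) / total if total else 0.0

def ndvi_mask(nir_band, red_band, min_ndvi=0.3):
    """Boolean mask: True where NDVI exceeds the minimum threshold."""
    return [[ndvi(n, r) > min_ndvi for n, r in zip(nir_row, red_row)]
            for nir_row, red_row in zip(nir_band, red_band)]

# vegetation reflects NIR strongly, so the first pixel passes the threshold
print(ndvi_mask([[200, 60]], [[50, 55]]))  # [[True, False]]
```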
- FIG 8 is an example image in which plants in the form of weeds are shown in a specific colour that contrasts well with non-plant (non-weed) areas.
- the specific colour is blue (highlighted with arrows), but other colours can be used.
- the weeds are shown in blue as a result of using custom light filters on the cameras 118 so the cameras detect primarily red and near infrared light. Blue light is strongly blocked by the filter, but as the blue receivers on the camera also see near infrared, which plants highly reflect, plants appear blue.
- FIG 9 shows a thresholded version of the same source image used in the example shown in FIG 8 using thresholds across each image channel in the L*a*b* colour space.
- the thresholds used for each image channel are shown in the graphs in FIG 10.
- FIG 11 shows the same source image used in the example shown in FIG 8 using some different colour spaces, namely RGB, HSV, YCbCr and L*a*b*.
- FIG 12 shows the same source image used in the example shown in FIG 8 using the NDVI method with a minimum NDVI value. This image demonstrates how effective the NDVI method can be.
- White is a high NDVI value, representing the vegetation of interest, i.e. weeds
- black is a low NDVI value, i.e. non-weeds.
- the method 300 comprises generating a black and white image, with black indicating that a pixel is not a plant (weed) and white indicating that a pixel is a plant (weed).
- the method 300 comprises performing area analysis on the image data to remove false positives.
- the removal of false positive pixels can be based on groups of pixels and/or the proximity of pixels and some different options can be used.
- an erosion process is used comprising removing pixels around the perimeter of an area or blob of pixels to a defined depth.
- a connected elements process is used comprising calculating the number of pixels that are connected allowing for a minimum threshold to be set.
- a blurring process is used comprising analysing an area around a pixel. For example, if 50% of pixels within a 3-pixel radius are positive, the centre pixel remains a positive pixel. Multiple different options for the blurring process can be used.
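One way the blurring process could be sketched is a majority filter over each positive pixel's neighbourhood. This is an illustrative implementation, not the patented one; it uses a square window as a stand-in for the circular 3-pixel radius described:

```python
def majority_filter(mask, radius=3, min_frac=0.5):
    """Drop a positive pixel unless at least min_frac of the pixels in a
    square neighbourhood of the given radius are also positive."""
    h, w = len(mask), len(mask[0])
    out = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            total = positive = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += 1
                        positive += mask[ny][nx]
            out[y][x] = positive / total >= min_frac
    return out
```

An isolated positive pixel (a likely false positive) is removed, while a pixel inside a dense blob of positives survives.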
- the method 300 comprises calculating a centroid and radius of each remaining area, or blob representing the ground feature, in particular a weed, and in preferred embodiments is output in a list.
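The centroid and radius of a blob can be sketched as the mean pixel position and the distance to the blob's farthest pixel (an enclosing-circle approximation; the function name is illustrative):

```python
def blob_centroid_and_radius(pixels):
    """Centroid (mean position) of a blob's pixels and the radius of the
    smallest circle centred there that still covers every pixel."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    radius = max(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in pixels)
    return cx, cy, radius

# a 3x3 pixel blob: centroid at its middle, corners sqrt(2) pixels away
pixels = [(x, y) for x in range(3) for y in range(3)]
print(blob_centroid_and_radius(pixels))
```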
- the method 300 comprises combining the list of centroids and radii representing the ground feature, in particular weeds, with the combined locational data.
- the method 300 comprises generating a list of longitudes, latitudes and radii representing the ground feature, in particular weeds.
- the list of plant centroids and radii is combined with the locational data and converted to longitude and latitude for the centroid and radius in metres via 3D trigonometry.
- the output is a list of the longitude, latitude and radius of the detected plants (weeds).
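The conversion from a metre-scale ground offset to longitude and latitude can be sketched with a flat-earth approximation, which is adequate over a single camera footprint (a simplification of the 3D trigonometry described; the constant and function name are illustrative):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; fine for metre-scale offsets

def offset_to_lat_lon(lat_deg, lon_deg, north_m, east_m):
    """Shift a reference GPS position by small ground offsets in metres."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

Applying this to each centroid's north/east offset from the camera position yields the listed longitude and latitude.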
- the method 300 comprises converting the list of longitudes, latitudes and radii representing the ground feature, in particular weeds, to a format usable by selected treatment equipment, such as a sprayer control system to treat weeds.
- this involves conversion to a shapefile format.
- For some equipment guidance systems there are significant memory limitations that need to be overcome to allow for the use of shapefiles. The most basic method is to create a circle for each plant to be treated, but this creates too many polygons and exceeds the memory limitations of the guidance systems.
- Another aspect of the present invention is a method for generating shapefiles to stay within the memory limit of the equipment guidance systems.
- another aspect of the present invention is a method of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment.
- the method requires a number of inputs.
- One input is the aforementioned list of weeds comprising the coordinates (longitudes, latitudes) and size (radius) of the weeds.
- the list can be obtained from one or more flights of the aerial vehicle 200.
- Another input relates to accuracy, i.e. how much extra around each weed is required to allow for inaccuracies of the entire system.
- a further input is a square size (SS), which can typically be 1 m.
- the square size is effectively the resolution. The smaller the SS, the less chemical is wasted, but the more polygons are required.
- Another input is a bounding box for the entire area under analysis.
- the bounding box can be derived from current shapefiles, or from bounding an area in a map program, such as Google Maps, or the like.
- the bounding box is a saveable variable because it is likely to be re-used, e.g. particular areas of relevance, such as paddocks or fields on a property, can be selected from a drop-down list.
- the method 400 comprises at 402 a processor creating a background array based at least on the bounding box representing an area to be treated.
- creating the background array comprises determining a width and height (m) of the bounding box for the treatment area and optionally adding a safety margin around the whole area, which can be, for example, 20m.
- creating the background array comprises assigning a value of one to all elements of an array whose dimensions are the width and height (inclusive of the safety margin) divided by the square size SS, with point 0,0 at, for example, the north-western point of the bounding box.
- the method 400 comprises setting elements representing weeds in the background array to zero. This effectively removes weeds from the background array.
- this step can comprise converting the coordinates representing weeds into the Universal Transverse Mercator (UTM) coordinate system. Based on the weed coordinates, the weed radius and the accuracy input, elements representing squares to be treated are set to zero.
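The two steps above (building the background array, then zeroing weed squares) can be sketched as follows. This is an illustrative implementation; the 0.5m accuracy buffer is an assumed value and the weed coordinates are taken in local metres rather than UTM for simplicity:

```python
def build_background_array(width_m, height_m, weeds,
                           square_size_m=1.0, margin_m=20.0, accuracy_m=0.5):
    """Background array of 1s over the bounding box plus a safety margin;
    squares whose centres fall within a weed's buffered radius are set to 0.
    `weeds` is a list of (x_m, y_m, radius_m) from the north-west corner."""
    cols = int((width_m + 2 * margin_m) / square_size_m)
    rows = int((height_m + 2 * margin_m) / square_size_m)
    grid = [[1] * cols for _ in range(rows)]
    for wx, wy, r in weeds:
        reach = r + accuracy_m
        for row in range(rows):
            for col in range(cols):
                # centre of this grid square in local metres
                sx = (col + 0.5) * square_size_m - margin_m
                sy = (row + 0.5) * square_size_m - margin_m
                if (sx - wx) ** 2 + (sy - wy) ** 2 <= reach ** 2:
                    grid[row][col] = 0
    return grid
```

A 10m x 10m box with one weed yields a 50x50 grid (with the 20m margin) in which only the squares around the weed are zeroed.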
- the method 400 comprises creating a polygon array equal to the background array.
- the method 400 comprises analysing the background array for elements representing weeds and elements representing non-weeds until all elements in the background array are set to zero and modifying the polygon array correspondingly until all non-weed elements in the polygon array are equal to a value of 1 and individual polygon elements in the polygon array have the same value.
- analysing the background array for elements representing weeds and elements representing non-weeds comprises analysing the background array in sections. For example, a top section of the background array can be analysed first. Processing can comprise starting at 0,0 and working down the columns in the background array until a weed, i.e. a 0 value is found, or one less than the length of the column is reached. All elements in the polygon array covered in this process are set to 1 and all elements in the background array covered in this process are set to 0. Processing comprises moving to the top of the next column and repeating the previous processing steps. A bottom section of the background array can be analysed next. Processing can comprise starting at the bottom left of the background array, i.e.
- Processing comprises setting all elements covered in this process to 2 and setting all elements covered in the background array to 0. Processing comprises moving to the bottom of the next column and repeating the previous step. The remainder of the background array can be analysed next. Processing can comprise setting a counter starting at 3. Starting at 0,0 and working down columns in the background array until a non-weed element, i.e. a value of 1, is found. If the bottom of the column is reached without finding a weed, processing comprises moving to the top of the second column, 0,1 and continuing down that column etc.
- processing comprises setting this element as the global x and y coordinate and setting this position to the local x,y position. If this value is a 1 in the background array, processing comprises working back up the column (towards row 0) until a zero is found, setting this column position to the local y variable and working down the column setting elements to zero and the polygon array equal to the counter. Once a zero is encountered processing comprises moving to the next column and repeating this process. If this value is a 0 in the background array, processing comprises moving down the column until a 1 is found and checking that the value in the polygon array PA in the previous column, same row is equal to the counter value.
- processing comprises setting this column position to the local y variable, moving down this column, setting the background array to 0 and polygon array to the counter value until a zero in the background array is encountered and the process is repeated. If a 1 cannot be found in the background array with the previous column, same row equal to the counter value, then this polygon is finished.
- the counter value is increased by 1 and processing is repeated as before, but starting from the global x,y position rather than 0,0. This is repeated until all elements of the background array are 0. At this stage, all non-weed locations in the polygon array equal 1 or greater, with individual polygons having the same counter value.
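The border-clearing and region-labelling procedure above can be sketched as follows. This is a simplified illustration, assuming a NumPy background array in which 1 represents non-weed and 0 represents weed, and using a generic flood fill in place of the exact column-walk described; the function and variable names are illustrative only:

```python
import numpy as np

def label_non_weed_regions(background):
    """Label non-weed regions (1s) in a binary array where 0 = weed.

    Regions reachable from the top of a column get label 1, regions
    reachable from the bottom get label 2, and each remaining enclosed
    non-weed region gets its own counter value starting at 3.
    """
    bg = background.copy()
    rows, cols = bg.shape
    pa = np.zeros_like(bg)

    # Top section: walk down each column until a weed (0) is hit.
    for c in range(cols):
        for r in range(rows):
            if bg[r, c] == 0:
                break
            pa[r, c] = 1
            bg[r, c] = 0

    # Bottom section: walk up each column until a weed (0) is hit.
    for c in range(cols):
        for r in range(rows - 1, -1, -1):
            if bg[r, c] == 0:
                break
            pa[r, c] = 2
            bg[r, c] = 0

    # Remainder: give each leftover non-weed region its own counter
    # value (a flood fill stands in for the column-walk in the text).
    counter = 3
    for c in range(cols):
        for r in range(rows):
            if bg[r, c] == 1:
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and bg[y, x] == 1:
                        bg[y, x] = 0
                        pa[y, x] = counter
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
                counter += 1
    return pa
```

After this runs, weed elements remain 0 in the polygon array, matching the end state the text describes.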
- the method 400 comprises creating polygons comprising points based on the elements of the modified polygon array.
- creating polygons comprises the following steps for each different set of values in the polygon array (i.e. the polygon):
- repeating step 3 (which built the top list) for the bottom list, but processing from right to left (i.e. maximum column to minimum column), adding one to the y (column) value (i.e. moving it down one square), and appending these points to the same list as the top section.
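A sketch of the top-list/bottom-list construction described above, under the assumption that each point is an (x = column, y = row) pair and that the polygon array is a NumPy array; the function name is illustrative:

```python
import numpy as np

def region_points(pa, label):
    """Trace a labelled region: the top list takes the first (minimum)
    row of the label in each column, left to right; the bottom list
    takes the last (maximum) row plus one (one square down), appended
    right to left so the points form a closed loop."""
    cols = np.unique(np.nonzero(pa == label)[1])
    top = [(int(c), int(np.nonzero(pa[:, c] == label)[0].min()))
           for c in cols]
    bottom = [(int(c), int(np.nonzero(pa[:, c] == label)[0].max()) + 1)
              for c in cols[::-1]]
    return top + bottom
```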
- the method 400 comprises converting the points of the polygons to longitudes and latitudes.
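As one possible sketch of this conversion, local metre offsets from a known reference point can be turned into longitudes and latitudes with a small-area equirectangular approximation. The exact conversion used is not specified in the text, so the names and the approximation here are assumptions:

```python
import math

EARTH_RADIUS_M = 6371000.0

def local_to_lonlat(points_m, origin_lon, origin_lat):
    """Convert (east, north) offsets in metres to (lon, lat) pairs,
    using a small-area equirectangular approximation around the
    reference point (origin_lon, origin_lat)."""
    out = []
    for east, north in points_m:
        # One metre of northing is a fixed angular step; one metre of
        # easting shrinks with the cosine of latitude.
        dlat = math.degrees(north / EARTH_RADIUS_M)
        dlon = math.degrees(east / (EARTH_RADIUS_M *
                                    math.cos(math.radians(origin_lat))))
        out.append((origin_lon + dlon, origin_lat + dlat))
    return out
```

For paddock-scale distances this approximation is accurate to well under the ~1 cm GSD the system works at.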
- the method includes writing a shapefile based on the longitudes and latitudes converted from the points of the polygons.
- the shapefile can then be used to treat the weeds by the treatment equipment selected, whether that be spraying, mechanical removal or steam treatment or other method.
- the method 400 comprises analysing the array using a connected elements analysis grouping blobs of weeds together.
- a list of the centroid of each blob is calculated from the connected elements analysis which is then used to create the shortest distance between each centroid.
- a spray path is then generated independently for each blob to ensure each area is sprayed.
- the spray path for each blob is then combined into a list of all the spray paths in the order dictated by the solution to the shortest distance between each blob.
- the output is a list of waypoints that a vehicle can follow to spray all the weeds in the area in the shortest distance possible. Reductions in travel distance of 80% to 90% are achievable with this methodology, compared to covering the whole area as is typically the case with ground-based sprayers.
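The centroid-ordering step can be illustrated with a greedy nearest-neighbour heuristic — a simple stand-in for whichever shortest-distance solver is actually used, which the text does not specify:

```python
import math

def order_blobs(centroids):
    """Order blob centroids by repeatedly visiting the nearest
    unvisited centroid. A greedy stand-in for solving the shortest
    path between blobs; centroids are (x, y) tuples."""
    remaining = list(centroids[1:])
    path = [centroids[0]]
    while remaining:
        last = path[-1]
        nxt = min(remaining, key=lambda p: math.dist(last, p))
        path.append(nxt)
        remaining.remove(nxt)
    return path
```

The per-blob spray paths would then be concatenated in this order to produce the final waypoint list.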
- Embodiments of the present invention address or at least ameliorate the aforementioned problems of the prior art.
- Embodiments of the present invention are able to detect small plants, e.g. under 5 cm in diameter, at resolutions of around 1 cm GSD (ground sample distance), and are able to outperform existing boom-mounted technology in weed identification while identifying fewer false positives.
- Some embodiments can achieve over 200ha per hour including non-productive time.
- a coverage rate of over 300 ha per hour can be achieved based on an aerial vehicle velocity of 50km/h and 1 cm GSD.
- the existing prototype will cover an area approximately 2.5 times faster than 36 m boom-mounted spot sprayers, at around one third of the capital cost of such sprayers. This is achievable, at least in part, due to the minimal overlap of the fields of view of the inclined cameras of the sensing unit, which allows larger areas to be covered with each sweep.
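The stated coverage figures are consistent with a simple rate calculation, area rate = speed × swath width. A rate of 300 ha per hour at 50 km/h implies an effective swath of about 60 m; the swath value is inferred from the stated figures, not given in the text:

```python
def coverage_ha_per_hour(speed_kmh, swath_m):
    """Area coverage rate in hectares per hour.

    A strip 1 km long and 1 m wide is 1000 m^2 = 0.1 ha,
    so ha/h = speed (km/h) * swath (m) * 0.1.
    """
    return speed_kmh * swath_m * 0.1
```

For example, 50 km/h with a 60 m swath gives 300 ha/h.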
- Embodiments of the present invention enable easier quality control: after a spot spray application, fields can simply be scouted for escaped weeds, especially given that these weeds may be herbicide resistant.
- a double knockdown application can be applied to browned out weeds after an initial spot spray application by using the same weed map.
- Embodiments of the present invention have the ability to target weeds by size: where a blanket application is required, larger weeds could receive a heavier rate of herbicide or a different herbicide. Note, however, that individual weeds growing in clumps would be seen as a single large weed.
- a radius around that weed could be built into the weed map which would reflect where seeds have fallen from that plant, which would allow very small undetectable weeds to be sprayed in the process of spraying the larger weed and/or pre-emergent herbicides to be selectively applied.
- Weed maps over a season can be accumulated together to provide a seasonal map, which could then be used to selectively apply pre-emergent herbicides before the next season’s weeds germinate.
- Fields can be scouted for isolated weeds that may have escaped a blanket application, especially as it is probable that these weeds are herbicide resistant and can be controlled before they become significant problems. In row crops, weeds growing out of place can be identified and treated.
- Weed maps produced in accordance with some embodiments of the present invention can include a digital elevation model, which provides a map of obstacles in a field from which autonomous vehicles could be guided on an efficient path from weed to weed, using small, light equipment to spot spray fields autonomously.
- machine learning or leaf shape could be incorporated to identify some weeds in crop (green on green). After an autumn flush, winter weed patches such as black oats could be mapped and those patches treated in crop or pre-emergent chemistry applied to those areas. Weed maps could be used to trigger alternative selective weed control, for example mechanical treatment, where tines selectively engage the ground, or microwave weed control.
- Flights of the aerial vehicle comprising the sensor unit are pre-planned, so the operator only needs to manage the power supply, e.g. batteries, launch the aerial vehicle and comply with CASA regulations.
- With developments in power supply, e.g. petrol-powered drones, and CASA exemptions from line-of-sight requirements, extended flight times make set-and-forget flights possible.
Abstract
A sensing unit mountable on an aerial vehicle comprises a frame for mounting the sensing unit to the aerial vehicle, a gimbal coupled to the frame allowing pitch and roll of a body mounted on the gimbal relative to the frame, and one or more cameras mounted on the body. The sensing unit comprises a pair of cameras mounted on the body at an angle to the vertical and inclined relative to each other to minimise overlap of the images captured by the cameras. Combined location data is matched with image capture times and height data captured by the aerial vehicle. The image data is analysed to determine pixels representing weeds, remove false positives, calculate a centroid and radius of each remaining area representing weeds, and combine a list of centroids and radii representing weeds. The list is converted into a format, a shapefile, usable by selected treatment equipment, such as a sprayer control system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/AU2019/051079 WO2021062459A1 (fr) | 2019-10-04 | 2019-10-04 | Weed mapping |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/AU2019/051079 WO2021062459A1 (fr) | 2019-10-04 | 2019-10-04 | Weed mapping |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021062459A1 true WO2021062459A1 (fr) | 2021-04-08 |
Family
ID=75336285
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2019/051079 WO2021062459A1 (fr) | 2019-10-04 | 2019-10-04 | Cartographie des mauvaises herbes |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021062459A1 (fr) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113329210A (zh) * | 2021-05-28 | 2021-08-31 | 成都励精图信息技术工程有限公司 | 一种智能土地举证***及方法 |
CN114565863A (zh) * | 2022-02-18 | 2022-05-31 | 广州市城市规划勘测设计研究院 | Real-time orthoimage generation method, apparatus, medium and device for unmanned aerial vehicle images |
CN115251024A (zh) * | 2022-08-29 | 2022-11-01 | 北京大学现代农业研究院 | 除草方式的确定方法、装置、电子设备及除草*** |
WO2022241504A1 (fr) * | 2021-05-17 | 2022-11-24 | Agtecnic Pty Ltd | Controlling spray heads |
WO2023079063A1 (fr) * | 2021-11-08 | 2023-05-11 | Bayer Aktiengesellschaft | Method and system for collecting data on a field used for agriculture |
WO2023230730A1 (fr) * | 2022-06-03 | 2023-12-07 | Daniel Mccann | System and method for precise application of residual herbicide through inference |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020022929A1 (en) * | 2000-06-05 | 2002-02-21 | Agco | System and method for creating field attribute maps for site-specific farming |
US20160280397A1 (en) * | 2015-03-27 | 2016-09-29 | Konica Minolta Laboratory U.S.A., Inc. | Method and system to avoid plant shadows for vegetation and soil imaging |
WO2017077543A1 (fr) * | 2015-11-08 | 2017-05-11 | Agrowing Ltd | Method for aerial imagery acquisition and analysis |
US20180129210A1 (en) * | 2016-11-04 | 2018-05-10 | Intel Corporation | Unmanned aerial vehicle-based systems and methods for generating landscape models |
US10028426B2 (en) * | 2015-04-17 | 2018-07-24 | 360 Yield Center, Llc | Agronomic systems, methods and apparatuses |
US20180259496A1 (en) * | 2012-06-01 | 2018-09-13 | Agerpoint, Inc. | Systems and methods for monitoring agricultural products |
US20190205610A1 (en) * | 2018-01-04 | 2019-07-04 | Andrew Muehlfeld | Ground control point extraction from planting data |
US20190304120A1 (en) * | 2018-04-03 | 2019-10-03 | Altumview Systems Inc. | Obstacle avoidance system based on embedded stereo vision for unmanned aerial vehicles |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021062459A1 (fr) | Weed mapping | |
Fernández‐Quintanilla et al. | Is the current state of the art of weed monitoring suitable for site‐specific weed management in arable crops? | |
US20200264154A1 (en) | System for monitoring crops and soil conditions | |
CN107426958B (zh) | 农业监控***和方法 | |
US20190150357A1 (en) | Monitoring and control implement for crop improvement | |
US20150245565A1 (en) | Device and Method for Applying Chemicals to Specific Locations on Plants | |
US11690368B2 (en) | Agricultural plant detection and control system | |
CN109471434B (zh) | 一种新型的变量喷雾路径规划自主导航***及方法 | |
CN112839511A (zh) | 用于将喷洒剂施布到田地上的方法 | |
Herrero-Huerta et al. | Vicarious radiometric calibration of a multispectral sensor from an aerial trike applied to precision agriculture | |
US20220366605A1 (en) | Accurate geolocation in remote-sensing imaging | |
US20180348760A1 (en) | Automatic Change Detection System | |
da Costa Lima et al. | Variable rate application of herbicides for weed management in pre-and postemergence | |
Belcore et al. | Raspberry PI 3 multispectral low-cost sensor for UAV based remote sensing. Case study in south-west Niger | |
Karatzinis et al. | Towards an integrated low-cost agricultural monitoring system with unmanned aircraft system | |
Vanegas et al. | Multi and hyperspectral UAV remote sensing: Grapevine phylloxera detection in vineyards | |
Fasiolo et al. | Towards autonomous mapping in agriculture: A review of supportive technologies for ground robotics | |
Feng et al. | Cotton yield estimation based on plant height from UAV-based imagery data | |
Einzmann et al. | Method analysis for collecting and processing in-situ hyperspectral needle reflectance data for monitoring Norway Spruce | |
Tian | Sensor-based precision chemical application systems | |
EP4014735A1 (fr) | Agricultural machine and method for controlling same | |
Gan et al. | A prototype of an immature citrus fruit yield mapping system | |
do Amaral et al. | Application of drones in agriculture | |
Da Silva et al. | Unimodal and Multimodal Perception for Forest Management: Review and Dataset. Computation 2021, 9, 127 | |
Benet et al. | Fusion between a color camera and a TOF camera to improve traversability of agricultural vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19948075 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19948075 Country of ref document: EP Kind code of ref document: A1 |