WO2024133848A1 - Method for providing control data, a crop failure map and/or a replanting map - Google Patents


Info

Publication number
WO2024133848A1
WO2024133848A1 (PCT/EP2023/087533)
Authority
WO
WIPO (PCT)
Prior art keywords
crop
failure
image
data
providing
Prior art date
Application number
PCT/EP2023/087533
Other languages
French (fr)
Inventor
Sandra SELINGER
Vagner Pasolius Veksel
Jules Jacques Jean PONSAILLE
Mauricio Lopes Agnese
Original Assignee
Basf Agro Trademarks Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Basf Agro Trademarks Gmbh filed Critical Basf Agro Trademarks Gmbh
Publication of WO2024133848A1 publication Critical patent/WO2024133848A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C 21/00 Methods of fertilising, sowing or planting
    • A01C 21/005 Following a specific plan, e.g. pattern
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/02 Agriculture; Fishing; Forestry; Mining
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B 76/00 Parts, details or accessories of agricultural machines or implements, not provided for in groups A01B51/00 - A01B75/00
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/188 Vegetation

Definitions

  • the present disclosure relates to a computer-implemented method for providing control data, a crop failure map and/or a replanting map, a system for providing control data, a crop failure map and/or a replanting map, an apparatus for providing control data, a crop failure map and/or a replanting map, an agricultural device controlled by such control data, a respective computer program element and a use of different data in such a computer-implemented method.
  • the general background of this invention is the treatment of plantations in an agricultural field.
  • crops like sugar cane
  • a computer-implemented method for providing crop line failure data comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model, configured to identify at least two identification patterns selected from the group of the patterns consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern; providing a relationship between the at least two identification patterns for which the image classification model is configured to identify; identifying the at least two identification patterns with the image classification model, generating the crop line failure data based on the at least two identification patterns and based on the relationship; wherein the crop line failure data indicate crop failures in a crop line; providing the crop line failure data.
  • control data for an agricultural device may be provided, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or a crop failure map may be provided indicating the crop failures in the agricultural field; and/or a replanting map may be provided indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
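The thresholding step behind the replanting map can be sketched as follows. This is a minimal illustration only: the per-cell data layout, the cell identifiers and the threshold value are assumptions for the sketch, not taken from the disclosure.

```python
# Hypothetical sketch: derive a replanting map from per-cell crop
# failure fractions. Cell names and the 0.3 threshold are invented.

def replanting_map(crop_failure_by_cell, threshold=0.3):
    """Return the cells whose crop failure fraction is above the
    predetermined crop failure threshold."""
    return {
        cell: failure
        for cell, failure in crop_failure_by_cell.items()
        if failure > threshold
    }

cells = {"A1": 0.05, "A2": 0.42, "B1": 0.31, "B2": 0.10}
print(replanting_map(cells))  # only A2 and B1 exceed the 0.3 threshold
```

In practice the cell values would come from the identified crop failure patterns; here they are hard-coded to keep the sketch self-contained.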
  • the number of image classification models may be different from the number of relationships.
  • the relationship may also comprise a relationship between the at least two identification patterns and at least one identification pattern which the image classification model is not configured to identify.
  • the non-identified identification pattern may have a relation to the identification patterns that are identified by the image classification model.
  • e.g. a crop line pattern and a failure pattern may be detected by the model, while a crop line failure pattern may be detected via the relationship it has to the crop line pattern and the failure pattern.
  • the number of relationships may be different from the number of identified identification patterns.
  • the crop failure map and/or the crop line failure map may be provided as a vector image.
  • a vector image may comprise lines which overlap failures and crop line areas.
  • a vector image may in addition and/or as an alternative indicate a crop line and crops or any other combination.
  • a vector image may also comprise areas in addition and/or as an alternative to lines.
  • identifying the crop line pattern with the image classification model may comprise providing information about the spacing of the crop rows and/or seeding lines and/or crop lines.
  • Crop rows may comprise crop lines.
  • crop rows may comprise at least one crop line.
  • crop rows and crop lines may be used interchangeably; however, crop rows may express the seeding aspect whereas crop lines may express the emerging aspect.
  • the information about the spacing of the crop rows and/or seeding lines may be pre-processed, may be manually created, may be retrieved from a database and/or may be provided by a machine or an agricultural device such as a seeder for the same agricultural area from which the image data are generated.
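The database retrieval mentioned above might, purely as an illustration, look like the following sketch. The in-memory database, the field identifier and the spacing values are invented for the example; the point is only that known row spacing can replace a second recognition pass.

```python
# Illustrative only: by-passing a pattern recognition engine by
# retrieving known row spacing from a (here: in-memory) database.

ROW_SPACING_DB = {"field-17": 1.5}  # metres between seeding rows (assumed)

def expected_crop_lines(field_id, field_width_m):
    """Derive expected crop line positions across the field width from
    stored row spacing instead of recognising them in the image."""
    spacing = ROW_SPACING_DB[field_id]
    positions = []
    x = spacing / 2  # first row offset assumed to be half the spacing
    while x < field_width_m:
        positions.append(round(x, 3))
        x += spacing
    return positions

print(expected_crop_lines("field-17", 6.0))  # [0.75, 2.25, 3.75, 5.25]
```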
  • an identification processing engine for pattern recognition and/or segmentation may be by-passed; in this way computation power may be reduced as, for example, instead of using at least two pattern recognition engines of the image classification model, only a single engine and a database request may be used to identify the at least two identification patterns.
  • each of the sub-models of the image classification model may either use a pattern recognition and/or segmentation algorithm or a data retrieval algorithm, wherein data retrieval may consume less power than pattern recognition.
  • the by-passing of the pattern recognition engine may be controlled.
  • the control of by-passing may be realized by allocating computational power to a pattern recognition algorithm, e.g. allocating a thread in a computer architecture.
  • the determining unit may comprise a scheduling unit.
  • each sub-model may be trained for a specific pattern.
  • the classification model may be trained for recognizing and/or segmenting at least two patterns.
  • a computer-implemented method for providing control data and/or a map comprising providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model; wherein the image classification model comprises at least one image classification algorithm, wherein the at least one image classification algorithm is configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern; determining in the at least one image of the at least one part of the agricultural field a third pattern from the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern by utilizing the two determined patterns; providing, based on the determined third pattern, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined third patterns; and/or providing a map indicating the position of the third pattern.
  • the image classification model is adapted to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern.
  • the map may be a crop failure map, a replanting map, a crop line map, a crop line failure map and/or a crop map.
  • the map may also be a combination of a crop failure map, a replanting map and/or a crop map.
  • the map may include information about the location and/or size of a crop failure pattern, a replanting pattern, crop line pattern, a crop line failure pattern and/or a crop pattern.
  • crop line pattern may be understood broadly and may comprise a crop line, a crop failure, a crop line failure and a crop.
  • a crop line failure pattern may comprise information about a crop line and failures on this line.
  • a computer-implemented method for providing control data, a crop failure map and/or a replanting map comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing, based on the determined crop failures, control data for an agricultural device, wherein the control data at least comprising position data of one or more of the determined crop failures; and/or providing, based on the determined crop failures, a crop failure map indicating the crop failures in the agricultural field; and/or providing, based on the determined crop failures, a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
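A minimal sketch of the claimed pipeline follows, with the image classification model and the pattern relationship stubbed out by toy stand-ins. All names, interfaces and data in this sketch are assumptions for illustration; the disclosure does not prescribe any particular model interface.

```python
# Hedged sketch of the claimed pipeline: identify two patterns with a
# model, then derive crop line failure data from their relationship.
# The model is replaced by a toy lookup and the "relationship" by a
# set difference; both are invented stand-ins.

def provide_crop_line_failure_data(image, classify, relationship):
    """Identify two patterns in the image, then generate crop line
    failure data from the relationship between them."""
    crop_line = classify(image, "crop line")
    crop = classify(image, "crop")
    return relationship(crop_line, crop)

# toy stand-ins: positions along one crop line, indexed 0..3
image = None  # a real pipeline would pass pixel data here
patterns = {"crop line": {0, 1, 2, 3}, "crop": {0, 2}}
classify = lambda img, pattern: patterns[pattern]
relationship = lambda line, crop: line - crop  # failure = line \ crop

print(provide_crop_line_failure_data(image, classify, relationship))  # {1, 3}
```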
  • a computer-implemented method for providing control data and/or a map comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model, configured to identify at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern; providing, based on the results of the models and the relationships between the patterns, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or providing a crop failure map indicating the crop failures in the agricultural field; and/or providing a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
  • a further aspect of the present disclosure relates to a system for providing crop line failure data, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify at least two identification patterns selected from the group of the following patterns consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern; and the providing unit further configured for providing a relationship between the at least two identification patterns which the image classification model is configured to identify; the system further comprising a determining unit configured to identify the at least two identification patterns with the image classification model; and the determining unit further configured to generate the crop line failure data based on the at least two identification patterns and based on the relationship; wherein the crop line failure data indicate crop failures in a crop line; a further providing unit configured to provide the crop line failure data.
  • a system for providing control data and a map comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model; wherein the image classification model comprises at least one image classification algorithm, wherein the at least one image classification algorithm is configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern; a determining unit configured to determine in the at least one image of the at least one part of the agricultural field a third pattern from the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern by utilizing the two determined patterns; a further providing unit configured to provide, based on the third pattern, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined third patterns.
  • a system for providing control data, a crop failure map and/or a replanting map comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; a further providing unit configured to provide, based on the determined crop failures, control data for the agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or configured to provide, based on the determined crop failures, a crop failure map indicating the crop failures in the agricultural field; and/or configured to provide, based on the determined crop failures, a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
  • a further aspect of the present disclosure relates to an apparatus for providing crop line failure data, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the computer-implemented method.
  • an apparatus for providing control data, a crop failure map and/or a replanting map comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing, based on the determined crop failures, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or providing, based on the determined crop failures, a crop failure map indicating the crop failures in the agricultural field; and/or providing, based on the determined crop failures, a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
  • a further aspect of the present disclosure relates to an agricultural device, e.g. a planting device for planting crops and/or a transportation unit for transporting stems, seedlings and/or seeds, wherein control data for the agricultural device are at least partially provided according to the disclosed computer-implemented method for providing control data.
  • the agricultural device may use the crop line failure data, e.g. the replanting map, in order to replant crop failures above a predefined threshold and/or size.
  • a further aspect of the present disclosure relates to a computer program element with instructions, which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method for providing control data for an agricultural device in an apparatus and/or system for providing control data for an agricultural device.
  • the above may be achieved by using a computer-implemented method for providing crop failure data of an agricultural field, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing crop failure data based on the determined crop failures.
  • a further aspect of the present disclosure relates to a system for providing crop failure data of an agricultural field, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; a further providing unit configured to provide crop failure data based on the determined crop failures.
  • a further aspect of the present disclosure relates to an apparatus for providing crop failure data of an agricultural field, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing crop failure data based on the determined crop failures.
  • a further aspect of the present disclosure relates to a computer program element with instructions, which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method for providing crop failure data of an agricultural field in an apparatus and/or system for providing crop failure data of an agricultural field. The same may be true for the computer-implemented method and/or system for providing crop line failure data.
  • a further aspect of the present disclosure relates to a navigation device configured to navigate an agricultural device to crop failures in the agricultural field based on the crop failure data and/or crop line failure data, wherein the crop failure data are provided according to the disclosed computer-implemented method for providing crop failure data and/or crop line failure data.
  • a mobile device comprising at least one display unit, wherein the mobile device is configured to display the position of the mobile device and the position of the crop failures in the agricultural field based on the crop failure data and/or crop line failure data, wherein the crop failure data are provided according to the disclosed computer-implemented method for providing crop failure data and/or the computer-implemented method for providing crop line failure data.
  • a further example relates to computer-implemented method for providing a crop failure map of an agricultural field, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing a crop failure map of the agricultural field based on the determined crop failures.
  • a further example relates to a system for providing a crop failure map of an agricultural field, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; a further providing unit configured to provide a crop failure map of the agricultural field based on the determined crop failures.
  • a further example relates to an apparatus for providing a crop failure map of an agricultural field, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing a crop failure map of the agricultural field based on the determined crop failures.
  • a further aspect of the present disclosure relates to a computer program element with instructions, which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method for providing a crop failure map of an agricultural field in an apparatus and/or system for providing a crop failure map of an agricultural field and/or the steps of the computer-implemented method for providing crop line failure data and/or a system for providing crop line failure data.
  • a further aspect of the present disclosure relates to a use of image data of at least a part of an agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged and/or a use of an image classification model configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns, e.g. identification patterns: a crop line pattern, a crop failure pattern and/or a crop pattern in one of the computer-implemented methods/systems/apparatuses disclosed herein.
  • a use of image data of at least a part of an agricultural field wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged and/or a use of an image classification model configured to identify crop failures in a crop line in one of the computer-implemented methods/systems/apparatuses disclosed herein.
  • the use of the corresponding data in a disclosed method/system/apparatus means that these data are used as input data.
  • a further aspect of the present disclosure relates to a use of image data, e.g. provided by a drone, of at least a part of an agricultural field as input data for a weed classification algorithm.
  • the weed classification algorithm is configured to identify weeds in the provided image data.
  • the identified weeds may be considered when deciding whether or not a replanting should be performed.
  • a further example relates to a method for providing a crop failure and weed map, comprising the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing one or more image classification models configured to identify crop failures in a crop line, crop line failures and/or configured to identify weeds; determining crop failures and weeds in the at least one image of the at least one part of the agricultural field utilizing the one or more image classification algorithms; providing a crop failure and weed map of the agricultural field based on the determined crop failures and weeds.
  • the crop is a sugar cane. Due to the comparably large amounts of active ingredients allowed and used in sugar cane (compared, e.g., with soybean), the crop protection period is longer and crop protection is often applied around sowing time. The crop protection end date is then also close to the time it takes for the sugar cane crop to establish and go into the tillering phase, where it can be assessed for potential replanting. Therefore, both weed management and crop emergence management may be assessed at the same time. In this way, efficiency in image collection and preprocessing operations may be gained by executing the process just once instead of twice.
  • “determining” also includes “estimating, calculating, initiating or causing to determine”.
  • “generating” also includes “initiating or causing to generate”.
  • “providing” also includes “initiating or causing to determine, generate, select, send, query or receive”.
  • It is an object of the present invention to allow a more simplified replanting of crop failures and to assist a farmer in carrying out such replanting of crop failures.
  • corresponding apparatuses, systems, devices and/or use cases provide a significant gain in time, higher yields and a more economically viable crop production.
  • “crop failure” and/or “crop failure pattern” as used herein is to be understood as gaps in an otherwise continuous crop line, e.g. a line of sugar cane or forest plants.
  • crop failure data and/or “crop line failure data” refers to a corresponding data set comprising information about the identified crop failures and/or crop failures in a crop line.
  • crop failure map and/or “crop line failure map” used herein is to be understood broadly and includes any visual representation of the determined crop failures, i.e. in the crop failure map crop failures are specified.
  • the visual representation is a pattern for crop failure and/or a pattern for a gap, in particular a gap in a crop line.
  • crop line failure map may be used.
  • the crop failure map and/or crop line failure map may be divided in cells, preferably in polygon-shaped cells, wherein for each cell, a crop failure value may be determined.
  • a crop failure map and/or crop line failure map may be a vector image comprised of lines of crop failures and crop lines, but it may also indicate crop lines and crops or any other combination.
  • a crop failure map and/or crop line failure map can also be a raster map comprising crop failure areas and crop line areas, but it may also indicate crop lines and crops or any other combination of at least two components of the set of crop lines, crops and crop failures. It is also not limited to lines and images but may also indicate areas, and is to be understood broadly; it also includes corresponding data sets with position coordinates that are not represented in visual form.
  • the crop failure map and/or crop line failure map may also be accompanied or overlapped by a second map defining areas for which the amount of crop failure is defined.
  • the amount of failure may be indicated in percentage but any metric even arbitrary may be used. This may inform the farmer about the severity and may help having a better return on investment.
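A possible percentage metric can be sketched as follows. Only the percentage itself is named in the text; representing crop lines and failures as sets of positions is an assumption made for the sketch.

```python
# Sketch of a per-cell failure metric. The position-set representation
# is an illustrative assumption; only "percentage" appears in the text.

def failure_percentage(crop_line_positions, crop_failure_positions):
    """Crop failure as a percentage of the whole crop line."""
    if not crop_line_positions:
        return 0.0
    failed = crop_failure_positions & crop_line_positions
    return 100.0 * len(failed) / len(crop_line_positions)

line = {(x, 0) for x in range(10)}        # 10 positions on one line
failures = {(2, 0), (5, 0), (7, 0)}       # 3 of them failed
print(failure_percentage(line, failures))  # 30.0
```

As the text notes, any other (even arbitrary) metric could be substituted here; the percentage is simply one option that is easy for a farmer to interpret.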
  • the term “agricultural field” as used herein is to be understood broadly in the present case and denotes any area, i.e. surface and subsurface, of a soil to be treated.
  • the agricultural field may be any plant or crop cultivation area, such as a farming field, a greenhouse, or the like.
  • the agricultural field may be identified through its geographical location or geo-referenced location data. A reference coordinate, a size and/or a shape may be used to further specify the agricultural field.
  • image classification model as used herein is to be understood broadly in the present case.
  • the image classification model according to the present disclosure is configured to identify crop failures in a crop line.
  • the image classification model may comprise at least one image classification algorithm configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following: a crop line, a crop failure and/or a crop.
  • the image classification model may be a trained machine learning model and may be executed on a processor.
  • the image classification model may be adapted to recognize a pattern for a crop line, a pattern for crop and/or a pattern for a crop failure.
  • the classification algorithm may comprise identifying at least two sets, at least two patterns and/or at least two categories in the image selected from the group of patterns consisting of a crop line, a crop failure and a crop.
  • the image classification model may comprise a plurality of sub-models, each of which is focused on a specific task, e.g. crop line pattern recognition, crop failure pattern recognition and/or crop pattern recognition.
  • [Table: relations among the crop line, crop and crop failure patterns, illustrating the triangular relation described below]
  • a “crop line” and/or a “crop line pattern” is composed of the “crop failure” and the “crop”; the “crop failure” may be determined by taking the “crop” from the “crop line”; the “crop” may be determined by taking the “crop failure” from the “crop line”.
  • the triangular relation may follow the rules of set theory.
  • the triangular relation may allow for detecting at least two patterns of the group of patterns and determining the other pattern or the other category from the at least two patterns. At least two sets of patterns are allocated to disjunct groups, e.g.
  • the set of crop failure patterns and the set of crop patterns form two disjunct groups.
  • another set of patterns may be derived by detecting at least two different sets of patterns.
  • the at least two different sets of patterns may be disjunct.
  • the set of a crop line patterns is formed as a union of the set of crop patterns and the set of crop failure patterns, the set of crop failure patterns is the set difference of the set of crop line patterns and the set of crop patterns, the set of crop patterns is the set difference of the set of crop line patterns and the set of crop failure patterns.
  • the set of crop line patterns may comprise crop line patterns for a plurality of single crop line patterns that are related to the physical single crop lines.
  • a set of a single crop line pattern may be a subset of the set of crop line patterns.
  • the set of crop patterns belonging to the set of the single crop line may be a subset of the set of crop patterns and the set of crop failure patterns belonging to the set of the single crop line may be a subset of the set of crop failure patterns.
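The set relations above may be illustrated by the following sketch; the pixel coordinates are hypothetical and serve only to demonstrate the set operations:

```python
# Sketch of the triangular set relation between crop line, crop and
# crop failure patterns (hypothetical pixel coordinates as elements).
crop = {(0, 0), (0, 1), (0, 4)}          # pixels classified as crop
crop_failure = {(0, 2), (0, 3)}          # pixels classified as crop failure

# The crop line is the union of the two disjunct sets.
crop_line = crop | crop_failure

# Each missing set can be derived as a set difference of the other two.
assert crop_failure == crop_line - crop
assert crop == crop_line - crop_failure
assert crop & crop_failure == set()      # the two groups are disjunct

print(sorted(crop_line))
```

Any one of the three sets can thus be reconstructed from the other two, which is the basis for detecting only two patterns and deriving the third.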
  • the image classification algorithm is based on the results of a machine learning algorithm, wherein the term “machine learning algorithm” may comprise decision trees, naive Bayes classifications, nearest neighbors, neural networks, convolutional or recurrent neural networks, transformers, generative adversarial networks, support vector machines, linear regression, logistic regression, random forest and/or gradient boosting algorithms.
  • the result of a machine learning algorithm is used to adjust the application rate decision logic.
  • the machine learning algorithm is organized to process an input having a high dimensionality into an output of a much lower dimensionality.
  • Such a machine learning algorithm is termed “intelligent” because it is capable of being “trained.”
  • the algorithm may be trained using records of training data.
  • a record of training data comprises training input data and corresponding training output data.
  • the training output data of a record of training data is the result that is expected to be produced by the machine learning algorithm when being given the training input data of the same record of training data as input.
  • the deviation between this expected result and the actual result produced by the algorithm is observed and rated by means of a “loss function”.
  • This loss function is used as a feedback for adjusting the parameters of the internal processing chain of the machine learning algorithm. For example, the parameters may be adjusted with the optimization goal of minimizing the values of the loss function that result when all training input data is fed into the machine learning algorithm and the outcome is compared with the corresponding training output data.
  • the result of this training is that, given a relatively small number of records of training data as “ground truth”, the machine learning algorithm is enabled to perform its job well for a number of records of input data that is higher by many orders of magnitude.
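For illustration only, the training principle described above may be sketched with a one-parameter model and a squared-error loss function; the training records and learning rate are hypothetical and do not reflect an actual image classification model:

```python
# Minimal illustration of training with a loss function: a one-parameter
# model y = w * x is fitted to records of (training input, training output).
records = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # hypothetical ground truth

w = 0.0                      # parameter of the internal processing chain
lr = 0.05                    # learning rate

for _ in range(200):
    for x, y_expected in records:
        y_actual = w * x
        # The squared-error loss rates the deviation between the expected
        # result and the actual result produced by the model.
        grad = 2 * (y_actual - y_expected) * x
        w -= lr * grad       # adjust the parameter to minimize the loss

print(round(w, 3))
```

After training, the adjusted parameter generalizes to inputs that were not part of the training records, e.g. `w * 10.0` approximates `20.0`.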
  • An image classification algorithm may be trained to identify, differentiate and/or segment a plurality of patterns.
  • the classification algorithm may comprise a labeling algorithm, wherein the labeling algorithm may be adapted to provide and/or mark a recognized pattern with a corresponding label, e.g. a recognized crop line pattern with a crop line label, a recognized crop pattern with a crop label, e.g. the type of crop, and a crop failure pattern with a crop failure label and/or a gap label.
  • the labelling algorithm may use a segmentation algorithm and link a recognized pattern to a meaning such as crop, crop failure, crop line failure and/or failure.
  • the classification algorithm may be trained to detect at least two patterns substantially simultaneously.
  • an image is provided to a combined image classification model, wherein the combined image classification model may be trained for recognizing at least two different patterns, e.g. crop patterns and crop failure patterns.
  • a combined image classification model may comprise a single input for providing the image to the combined image classification model and may have at least two outputs, one for each recognized pattern.
  • the pattern recognition may be executed independently from one another.
  • the results of different classification operations may not depend on one another.
  • the image classification algorithm is based on the results of a classical computer vision algorithm and/or an image recognition algorithm, wherein the term “classical computer vision algorithm” may comprise Hough transforms, Fourier transforms, filters, kernel convolutions, object based image analysis and/or statistical shape analysis.
  • a classical computer vision algorithm exploits statistical traits of the difference in color and/or other non-visible wavelengths, intensity, position and/or grouping between areas of interest.
  • the output of the image classification model and/or the output of the image classification algorithm may be further processed in order to prepare the output for convenient computer handling.
  • such handling may be providing a shape of a pattern by a shape analysis algorithm.
  • a shape analysis algorithm may be used. Based on a shape analysis a pattern recognized during classification may be associated with a computer readable structure. In this way a physical structure like a crop line, a crop and/or a crop failure may be converted into a corresponding computer model.
  • a “skeleton” or a “topological skeleton” of a shape is a thin version of that shape that is equidistant to its boundaries. Skeletonization is the process of transforming a blob of pixels into one or multiple lines with 1px width.
  • a polygonal chain is a connected series of line segments. More formally, a polygonal chain P is a curve specified by a sequence of points called its vertices.
  • the curve itself consists of the line segments connecting the consecutive vertices.
  • a polygonal chain may also be called a polygonal curve, polygonal path, polyline, piecewise linear curve, broken line or, in geographic information systems, a linestring or linear ring.
  • Such polylines may be used in the present disclosure, since the pixel output may be cumbersome to work with as the final form is complex.
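For illustration, a polygonal chain and the total length of its connecting line segments may be sketched as follows; the vertex coordinates are hypothetical:

```python
import math

# A polygonal chain (polyline) specified by a sequence of vertices;
# the coordinates are hypothetical field positions in metres.
vertices = [(0.0, 0.0), (3.0, 4.0), (3.0, 10.0)]

def polyline_length(points):
    """Sum of the line segments connecting the consecutive vertices."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

print(polyline_length(vertices))  # 5.0 + 6.0 = 11.0
```

Such a compact vertex representation is far easier to work with than the raw pixel output of the classification step.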
  • a crop failure threshold may be set to a length above 20 cm, 50 cm or above 1 m. This means that below this threshold a crop failure is not considered relevant, for example, since a crop failure below such a threshold value is overgrown by the neighboring crop anyway.
  • the image classification may comprise detecting the size of a pattern and ignoring patterns with a size below a predefined threshold.
  • the size may be different for crop patterns, crop failure patterns and/or for crop line patterns. The size may be measured in meters, centimeters and/or pixels.
  • the predefined threshold may correspond to a predefined resolution for the pattern. For detecting the size, a size-detecting sub-method, sub-module and/or size-detecting device may be provided.
  • the threshold value may also be expressed as a percentage of area and/or of a line length, e.g. 15% of a square meter.
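For illustration, discarding crop failures below such a predefined threshold may be sketched as follows; the detected gap lengths and the threshold value are hypothetical:

```python
# Sketch: discard crop failure segments below a predefined threshold,
# since short gaps are overgrown by the neighboring crop anyway.
CROP_FAILURE_THRESHOLD_M = 0.5   # e.g. 50 cm

# Hypothetical detected gap lengths (in metres) along one crop line.
detected_gaps = [0.12, 0.65, 0.30, 1.40]

relevant_gaps = [g for g in detected_gaps if g >= CROP_FAILURE_THRESHOLD_M]
print(relevant_gaps)
```

Only the remaining gaps would then be carried forward into the crop failure data and the replanting map.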
  • the information gathered for the replanting map may be used in order to generate a hard copy of the replanting map, e.g. by sending control data and/or a file comprising replanting information to a printing device, e.g. a printer and/or plotter.
  • the replanting map may be sent to a mobile device, e.g. a smart phone, and may use the navigation facility of the mobile device to guide a user for replanting.
  • the information gathered for the replanting map and/or the control data may be used to control a planting operation of a planting device and/or of a seeding device, e.g. a smart seeder.
  • control data as used herein is to be understood broadly in the present case and represents any data being configured to operate and control an agricultural device.
  • the control data may be provided by a control unit and may be configured to control one or more technical means of the agricultural device, e.g. the drive control, the steering, etc., but is not limited thereto.
  • the control data comprise at least position data of one or more of the determined crop failures.
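For illustration only, control data carrying such position data may be structured as in the following sketch; the class and field names are hypothetical and not part of the disclosure:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of control data carrying the position data of the
# determined crop failures for an agricultural device.
@dataclass
class CropFailure:
    lat: float            # geo-referenced latitude of the crop failure
    lon: float            # geo-referenced longitude of the crop failure
    gap_length_m: float   # length of the gap in the crop line

@dataclass
class ControlData:
    field_id: str                       # identifier of the agricultural field
    failures: list = field(default_factory=list)

control = ControlData(
    field_id="field-001",               # hypothetical identifier
    failures=[CropFailure(52.51, 13.40, 0.8)],
)
print(len(control.failures))
```

Such a structure could then be serialized and sent to the control unit of the agricultural device.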
  • the term “agricultural device” used herein is to be understood broadly in the present case and represents any device being configured for replanting crops and/or transporting stems, seedlings and/or seeds to a crop failure position.
  • the agricultural device may be a ground or an air vehicle, e.g. a tractor, a transporter, a rail vehicle, a robot, an aircraft, an unmanned aerial vehicle (UAV), a drone, planting device, etc.
  • the agricultural device may be an autonomous or a non-autonomous device.
  • navigation device as used herein is to be understood broadly in the present case and represents any device configured/usable to navigate a vehicle and/or a person to one or more crop failures.
  • the navigation device is a GPS device.
  • mobile device as used herein is to be understood broadly in the present case and represents any portable device, e.g. smart phone, handheld, laptop, tablet, etc., comprising at least one display unit, with which the position of the mobile device and the position of the crop failures in the agricultural field can be shown.
  • providing is to be understood broadly in the present case and represents any providing, receiving, querying, measuring, calculating, determining, transmitting of data, but is not limited thereto.
  • Data may be provided by a user via a user interface, depicted/shown to a user by a display, and/or received from other devices, queried from other devices, measured by other devices, calculated by other devices, determined by other devices and/or transmitted by other devices.
  • data as used herein is to be understood broadly in the present case and represents any kind of data.
  • Data may be single numbers/numerical values, a plurality of a numbers/numerical values, a plurality of a numbers/numerical values being arranged within a list, 2 dimensional maps or 3 dimensional maps, but are not limited thereto.
  • the agricultural device may be a planting device for planting crops on and/or a transportation unit for transporting stems, seedlings and/or seeds.
  • the image classification model may comprise at least one image classification algorithm, wherein the at least one image classification algorithm is configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following: a crop line, a crop failure and/or a crop.
  • the image classification model may comprise at least two of the following image classification algorithms: an image classification algorithm configured to identify a crop line in the at least one image of the at least one part of the agricultural field, an image classification algorithm configured to identify a crop failure in the at least one image of the at least one part of the agricultural field.
  • the image classification model comprises at least two of the following image classification algorithms: an image classification algorithm configured to identify the crop line pattern in the at least one image of the at least one part of the agricultural field, an image classification algorithm configured to identify the crop failure pattern in the at least one image of the at least one part of the agricultural field and an image classification algorithm configured to identify a crop pattern in the at least one image of the at least one part of the agricultural field.
  • the crop line failure data are provided in form of control data for an agricultural device, wherein the control data comprise position data of one or more of the crop failures in the crop line; and/or a crop line failure map indicating the crop failures in the agricultural field; and/or a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
  • the image classification model comprises a single model configured to identify each of the at least two identification patterns independently from another and/or wherein the image classification model comprises separate sub-models for each of the at least two identification patterns.
  • each of the separate sub-models may operate independently from one another and may substantially be adapted for identifying one of the two identification patterns.
  • a crop failure and/or crop failure pattern may be at a position in a crop line where a sown crop has not emerged and/or a position in a crop line where a crop is smaller than a predetermined size.
  • the method may further comprise: determining whether a crop failure and/or crop failure relating to a crop failure pattern is above a predetermined crop failure threshold; wherein the control data for the agricultural device are only provided in case a crop failure is above the predetermined crop failure threshold.
  • the method may further comprise: providing a crop failure map based on the determined crop failures, in particular wherein the crop line failure map comprises cells, preferably polygon-shaped cells, wherein for each cell, a crop failure value is determined.
  • the failure map, in particular the crop line failure map, may comprise crop lines, crop areas and/or crop failure areas.
  • the failure map comprises crop lines, crop failures and/or crops.
  • the crop failure map, in particular the crop line failure map, comprises cells, preferably polygon-shaped cells, wherein for each cell, a crop failure value is determined.
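For illustration, determining a crop failure value per cell may be sketched as follows; the cell identifiers and lengths are hypothetical, and the failure value is expressed here as the failed share of the crop line length per cell:

```python
# Sketch: aggregate detected crop failures into map cells and compute a
# crop failure value per cell (hypothetical cell IDs and lengths).
cells = {
    "A1": {"line_length_m": 40.0, "failure_length_m": 6.0},
    "A2": {"line_length_m": 40.0, "failure_length_m": 0.0},
}

# Crop failure value per cell as a percentage of the crop line length.
failure_values = {
    cell_id: round(c["failure_length_m"] / c["line_length_m"] * 100, 1)
    for cell_id, c in cells.items()
}
print(failure_values)
```

The per-cell values could then be rendered as the polygon-shaped cells of the crop failure map.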
  • the method is further comprising: providing a replanting map indicating where in the agricultural field a crop failure is above the predetermined crop failure threshold.
  • the threshold in an example may relate to the size of the pattern in the image.
  • an additional method and/or device may be provided adapted to measure the current size of the emerged crop in the field.
  • the method is further comprising: providing a weed classification algorithm configured to identify weeds; determining weeds in the at least one image of the at least one part of the agricultural field utilizing the weed image classification algorithm; providing a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold and the weed infestation is below a predetermined weed infestation threshold.
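For illustration, combining the crop failure threshold and the weed infestation threshold into a replanting decision per map cell may be sketched as follows; all per-cell values and thresholds are hypothetical:

```python
# Sketch of the replanting decision per map cell: replant only where the
# crop failure is above its threshold AND the weed infestation is below
# its threshold (all values hypothetical, in percent).
FAILURE_THRESHOLD = 10.0
WEED_THRESHOLD = 5.0

cells = [
    {"id": "A1", "failure": 15.0, "weeds": 2.0},
    {"id": "A2", "failure": 12.0, "weeds": 8.0},   # too weedy to replant
    {"id": "A3", "failure": 4.0,  "weeds": 1.0},   # failure below threshold
]

replant = [c["id"] for c in cells
           if c["failure"] > FAILURE_THRESHOLD and c["weeds"] < WEED_THRESHOLD]
print(replant)
```

The resulting cell list would form the basis of the replanting map.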
  • the image of the at least one part of the agricultural field may be provided at a time when the crops are in the tillering stage and/or have dimensions three times larger than the resolution of the image sensors used for capturing the image.
  • the method may further comprise: providing several images of the agricultural field and/or of parts of the agricultural field, stitching the images together by means of a stitching algorithm for providing the image data of the agricultural field.
  • the images of the agricultural field are RGB images with a resolution of around 2.8 cm/pixel.
  • the images may be high-resolution images sufficient for a respective image analysis. If necessary, the agricultural field may be divided into suitable segments so that an accordingly high resolution can be provided suitable for the analysis by an image classification algorithm.
  • the images may be provided by at least one image collection device, wherein the image collection device may be an aircraft device, e.g. a drone.
  • the present disclosure is not limited to a specific method for providing image data of the agricultural field and also not to a specific image collection device.
  • the method may further comprise: providing boundary data of the agricultural field and generating image collection path data for the at least one image collection device, wherein the image collection path data preferably comprises data with respect to path locations, position marks, flight heights, landing zones and/or image locations.
  • An analysis algorithm may be used for optimizing a collection path, e.g. maximum coverage in minimal time with minimal number of breaks, landings etc.
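For illustration, a simple boustrophedon (“lawnmower”) collection path over a rectangular field boundary may be sketched as follows; the field dimensions and swath width are hypothetical, and a real path optimization would additionally account for flight heights, battery breaks and landing zones:

```python
# Sketch of a simple boustrophedon collection path over a rectangular
# field boundary (hypothetical dimensions, in metres).
def collection_path(width_m, height_m, swath_m):
    """Waypoints covering the field line by line, alternating direction."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height_m:
        xs = (0.0, width_m) if left_to_right else (width_m, 0.0)
        waypoints.extend((x, y) for x in xs)
        left_to_right = not left_to_right
        y += swath_m
    return waypoints

path = collection_path(100.0, 20.0, 10.0)
print(path)
```

Each pair of consecutive waypoints forms one pass of the image collection device across the field.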
  • the image collection device may comprise a communication interface configured to directly or indirectly send the collected images to a computer device, wherein the computer device may be configured to execute the image classification algorithm(s) and to generate the crop failure data.
  • the term computer device is broadly understood and includes all appropriate means on which the image classification algorithm may be executed, for example, cloud computing solutions, a centralized or decentralized computer system, a computer center, etc.
  • the images may be automatically transferred from the image collection device to the computer device, e.g. via an upload center or a cloud connectivity during collection using an appropriate wireless communication interface, e.g. a mobile interface, long range WLAN etc.
  • the image collecting device comprises an on-site data transfer interface, e.g. a USB interface, from which the collected images may be received via a manual transfer and then transferred to a respective computer device for further processing.
  • an automatic workflow may be triggered comprising the following steps: stitching the images and providing the image data of the agricultural field, running the image classification algorithm(s) and determining the crop failures, providing the crop failure data and the control data based on the determined crop failures.
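For illustration, the triggered workflow may be sketched as follows; the stitching and classification steps are stand-in stubs, not the actual algorithms of the disclosure:

```python
# Sketch of the automatic workflow triggered after image transfer.
def stitch(images):
    return " ".join(images)                 # stand-in for a stitching algorithm

def classify(field_image):
    # Stand-in for the image classification algorithm(s); returns
    # hypothetical crop failure positions and gap lengths.
    return [{"position": (52.51, 13.40), "gap_m": 0.8}]

def run_workflow(images):
    field_image = stitch(images)            # 1. stitch into field image data
    failures = classify(field_image)        # 2. determine the crop failures
    control_data = {"failures": failures}   # 3. provide crop failure/control data
    return control_data

print(run_workflow(["img_001", "img_002"]))
```

In a deployment, each stub would be replaced by the respective stitching algorithm and trained image classification model.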
  • Figure 1 illustrates an example embodiment of a centralized computing environment with computing nodes
  • Figure 2 illustrates an example embodiment of a decentralized computing environment with computing nodes
  • Figure 3 illustrates an example embodiment of a distributed computing environment
  • Figure 4 illustrates a flow diagram of a computer-implemented method for providing control data for an agricultural device
  • Figure 5 illustrates a system for providing control data for an agricultural device
  • Figures 6 to 14 illustrate the steps for determining crop failures in the provided image data of the agricultural field
  • Figure 15 illustrates examples for a crop failure map comprising polygon-shaped cells
  • Figure 16 illustrates an enlarged section of the crop failure map of figure 15.
  • Figure 17 illustrates exemplarily the different possibilities to receive and process field data.
  • Figures 1 to 3 illustrate different computing environments, central, decentral and distributed.
  • the methods, apparatuses, computer elements of this disclosure may be implemented in decentral or at least partially decentral computing environments.
  • Data sovereignty may be viewed as a core challenge. It can be defined as a natural person’s or corporate entity’s capability of being entirely self-determined with regard to its data.
  • To enable this particular capability related aspects, including requirements for secure and trusted data exchange in business ecosystems, may be implemented across the chemical value chain.
  • the chemical industry requires tailored solutions to deliver chemical products in a more sustainable way by using digital ecosystems.
  • Providing, determining or processing of data may be realized by different computing nodes, which may be implemented in a centralized, a decentralized or a distributed computing environment.
  • Figure 1 illustrates an example embodiment of a centralized computing system 20 comprising a central computing node 21 (filled circle in the middle) and several peripheral computing nodes 21.1 to 21.n (denoted as filled circles in the periphery).
  • the term “computing system” is defined herein broadly as including one or more computing nodes, a system of nodes or combinations thereof.
  • the term “computing node” is defined herein broadly and may refer to any device or system that includes at least one physical and tangible processor, and/or a physical and tangible memory capable of having thereon computer-executable instructions that are executed by a processor.
  • Computing nodes are now increasingly taking a wide variety of forms.
  • Computing nodes may, for example, be handheld devices, production facilities, sensors, monitoring systems, control systems, appliances, laptop computers, desktop computers, mainframes, data centers, or even devices that have not conventionally been considered a computing node, such as wearables (e.g., glasses, watches or the like).
  • the memory may take any form and depends on the nature and form of the computing node.
  • the peripheral computing nodes 21.1 to 21.n may be connected to one central computing system (or server). In another example, the peripheral computing nodes 21.1 to 21.n may be attached to the central computing node via e.g. a terminal server (not shown). The majority of functions may be carried out by or obtained from the central computing node (also called remote centralized location).
  • One peripheral computing node 21.n has been expanded to provide an overview of the components present in the peripheral computing node.
  • the central computing node 21 may comprise the same components as described in relation to the peripheral computing node 21.n.
  • Each computing node 21, 21.1 to 21.n may include at least one hardware processor 22 and memory 24.
  • the term “processor” may refer to an arbitrary logic circuitry configured to perform basic operations of a computer or system, and/or, generally, to a device which is configured for performing calculations or logic operations.
  • the processor, or computer processor may be configured for processing basic instructions that drive the computer or system. It may be a semi-conductor based processor, a quantum processor, or any other type of processor configured for processing instructions.
  • the processor may comprise at least one arithmetic logic unit (“ALU”), at least one floating-point unit (“FPU”), such as a math coprocessor or a numeric coprocessor, a plurality of registers, specifically registers configured for supplying operands to the ALU and storing results of operations, and a memory, such as an L1 and L2 cache memory.
  • the processor may be a multicore processor.
  • the processor may be or may comprise a Central Processing Unit (“CPU").
  • the processor may be a graphics processing unit (“GPU”), a tensor processing unit (“TPU”), a Complex Instruction Set Computing (“CISC”) microprocessor, a Reduced Instruction Set Computing (“RISC”) microprocessor, a Very Long Instruction Word (“VLIW”) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets.
  • the processing means may also be one or more special-purpose processing devices such as an Application- Specific Integrated Circuit (“ASIC”), a Field Programmable Gate Array (“FPGA”), a Complex Programmable Logic Device (“CPLD”), a Digital Signal Processor (“DSP”), a network processor, or the like.
  • processor may also refer to one or more processing devices, such as a distributed system of processing devices located across multiple computer systems (e.g., cloud computing), and is not limited to a single device unless otherwise specified.
  • the memory 24 may refer to a physical system memory, which may be volatile, non-volatile, or a combination thereof.
  • the memory may include non-volatile mass storage such as physical storage media.
  • the memory may be a computer-readable storage media such as RAM, ROM, EEPROM, CD-ROM, or other optical disk storage, magnetic disk storage, or other magnetic storage devices, non-magnetic disk storage such as solid-state disk or any other physical and tangible storage medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by the computing system.
  • the memory may be a computer-readable medium that carries computer-executable instructions (also called transmission media).
  • program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to storage media (or vice versa).
  • computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computing system RAM and/or to less volatile storage media at a computing system.
  • storage media can be included in computing components that also (or even primarily) utilize transmission media.
  • the computing nodes 21, 21.1 to 21.n may include multiple structures 26 often referred to as an “executable component, executable instructions, computer-executable instructions or instructions”.
  • memory 24 of the computing nodes 21, 21.1 to 21.n may be illustrated as including executable component 26.
  • executable component or any equivalent thereof may be the name for a structure that is well understood to one of ordinary skill in the art in the field of computing as being a structure that can be software, hardware, or a combination thereof, or which can be implemented in software, hardware, or a combination thereof.
  • when an executable component is implemented in software, one of ordinary skill in the art would understand that the structure of the executable component includes software objects, routines, methods, and so forth, that are executed on the computing nodes 21, 21.1 to 21.n, whether such an executable component exists in the heap of a computing node 21, 21.1 to 21.n, or whether the executable component exists on computer-readable storage media.
  • the structure of the executable component exists on a computer-readable medium such that, when interpreted by one or more processors of a computing node 21, 21.1 to 21.n (e.g., by a processor thread), the computing node 21, 21.1 to 21.n is caused to perform a function.
  • Such a structure may be computer-readable directly by the processors (as is the case if the executable component were binary). Alternatively, the structure may be structured to be interpretable and/or compiled (whether in a single stage or in multiple stages) so as to generate such binary that is directly interpretable by the processors.
  • Such an understanding of example structures of an executable component is well within the understanding of one of ordinary skill in the art of computing when using the term “executable component”.
  • Examples of executable components implemented in hardware include hardcoded or hard-wired logic gates that are implemented exclusively or near-exclusively in hardware, such as within a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or any other specialized circuit.
  • the terms “component”, “agent”, “manager”, “service”, “engine”, “module”, “virtual machine” or the like are used synonymously with the term “executable component”.
  • the processor 22 of each computing node 21, 21.1 to 21.n may direct the operation of each computing node 21, 21.1 to 21.n in response to having executed computer-executable instructions that constitute an executable component.
  • computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product.
  • the computer-executable instructions may be stored in the memory 24 of each computing node 21, 21.1 to 21.n.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor 22, cause a general purpose computing node 21, 21.1 to 21.n, special purpose computing node 21, 21.1 to 21.n, or special purpose processing device to perform a certain function or group of functions.
  • the computer-executable instructions may configure the computing node 21, 21.1 to 21.n to perform a certain function or group of functions.
  • the computer-executable instructions may be, for example, binaries or even instructions that undergo some translation (such as compilation) before direct execution by the processors, such as intermediate format instructions such as assembly language, or even source code.
  • Each computing node 21, 21.1 to 21.n may contain communication channels 28 that allow each computing node 21.1 to 21.n to communicate with the central computing node 21, for example, a network (depicted as solid line between peripheral computing nodes and the central computing node in Figure 1).
  • a “network” may be defined as one or more data links that enable the transport of electronic data between computing nodes 21, 21.1 to 21.n and/or modules and/or other electronic devices.
  • Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by general-purpose or special-purpose computing nodes 21, 21.1 to 21.n. Combinations of the above may also be included within the scope of computer-readable media.
  • the computing node(s) 21, 21.1 to 21.n may further comprise a user interface system 25 for use in interfacing with a user.
  • the user interface system 25 may include output mechanisms 25A as well as input mechanisms 25B.
  • output mechanisms 25A might include, for instance, displays, speakers, tactile output, holograms and so forth.
  • Examples of input mechanisms 25B might include, for instance, microphones, touchscreens, holograms, cameras, keyboards, mouse or other pointer input, sensors of any type, and so forth.
  • Figure 2 illustrates an example embodiment of a decentralized computing environment 30 with several computing nodes 21.1 to 21.n denoted as filled circles.
  • the computing nodes 21.1 to 21.n of the decentralized computing environment are not connected to a central computing node 21 and are thus not under control of a central computing node. Instead, resources, both hardware and software, may be allocated to each individual computing node 21.1 to 21.n (local or remote computing system) and data may be distributed among various computing nodes 21.1 to 21.n to perform the tasks.
  • program modules may be located in both local and remote memory storage devices.
  • One computing node 21 has been expanded to provide an overview of the components present in the computing node 21.
  • the computing node 21 comprises the same components as described in relation to Figure 1.
  • Figure 3 illustrates an example embodiment of a distributed computing environment 40.
  • distributed computing may refer to any computing that utilizes multiple computing resources. Such use may be realized through virtualization of physical computing resources.
  • cloud computing may refer to a model for enabling on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services).
  • cloud computing environments may be distributed internationally within an organization and/or across multiple organizations.
  • the distributed cloud computing environment 40 may contain the following computing resources: mobile device(s) 42, applications 43, databases 44, data storage and server(s) 46.
  • the cloud computing environment 40 may be deployed as public cloud 47, private cloud 48 or hybrid cloud 49.
  • a private cloud 48 may be owned by an organization and only the members of the organization with proper access can use the private cloud 48, rendering the data in the private cloud at least confidential.
  • data stored in a public cloud 47 may be open to anyone over the internet.
  • the hybrid cloud 49 may be a combination of both public and private clouds 47, 48 and may allow keeping some of the data confidential while other data may be publicly available.
  • Figure 4 illustrates a flow diagram of a computer-implemented method for providing control data, a crop failure map and/or a replanting map.
  • image data of at least a part of the agricultural field are provided, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged.
  • an image classification model configured to identify crop failures in a crop line is provided.
  • crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm are determined.
  • control data for the agricultural device are provided, wherein the control data are at least comprising position data of one or more of the determined crop failures.
  • failure data, a failure map and/or a replanting map may be provided.
  • image data of at least a part of the agricultural field are provided, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged.
  • an image classification model is provided, configured to identify at least two identification patterns selected from the group consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern.
  • a relationship between the at least two identification patterns is provided. The relationship expresses how the patterns, for whose identification the image classification model is configured, relate to one another.
  • the method provides for identifying the at least two identification patterns with the image classification model and generating the crop line failure data based on the at least two identification patterns and based on the relationship and in this way determining crop failures and/or the crop line failure data.
  • the crop line failure data indicate crop failures in a crop line.
  • the method ends with providing the crop line failure data in a desired format, e.g. as control data, a crop failure map, a crop line failure map and/or as a replanting map.
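The disclosure leaves the concrete output format open; as a minimal illustrative sketch (all field names, coordinates and the dictionary layout below are assumptions, not part of the disclosure), determined crop failures might be packaged as control data carrying position data:

```python
# Hypothetical sketch: packaging determined crop failures as control
# data for an agricultural device. The structure is an assumption.

def to_control_data(failures):
    """Convert detected crop failures into control data: a list of
    positions plus the length of each failure gap."""
    return {
        "type": "control_data",
        "positions": [
            {"lat": f["lat"], "lon": f["lon"], "length_m": f["length_m"]}
            for f in failures
        ],
    }

# Illustrative failures with made-up coordinates.
failures = [
    {"lat": -21.1734, "lon": -47.8103, "length_m": 1.8},
    {"lat": -21.1736, "lon": -47.8101, "length_m": 0.6},
]
control = to_control_data(failures)
print(len(control["positions"]))  # one entry per determined crop failure
```

A crop failure map or replanting map could be produced from the same failure list by rendering each position into a georeferenced layer instead.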
  • Figure 5 illustrates a system 10 for providing control data, a crop failure map, a crop line failure map and/or a replanting map, comprising a providing unit 11 configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit 12 configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit 13 configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; and a further providing unit 14 configured to provide control data for the agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures.
  • failure data, a failure map and/or a replanting map may be provided.
  • the system 10 for providing crop line failure data comprises a providing unit 11 configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged.
  • the system 10 further comprises a further providing unit 12 configured to provide an image classification model configured to identify at least two identification patterns selected from the group consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern; and for providing a relationship between the at least two identification patterns which the image classification model is configured to identify.
  • the system 10 further comprises a determining unit 13 configured to identify the at least two identification patterns with the image classification model; and to generate the crop line failure data based on the at least two identification patterns and based on the relationship, wherein the crop line failure data indicate crop failures in a crop line.
  • the system 10 further comprises a further providing unit 14 configured to provide the crop line failure data.
  • Figures 6 to 14 illustrate an example for determining crop failures in the provided image data of the agricultural field.
  • an image classification model configured to identify crop failures in a crop line may be applied.
  • the image classification model may comprise at least one image classification algorithm configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following: a crop line, a crop failure and/or a crop.
  • i.e. two of the mentioned categories: crop line, crop failure, crop
  • a “crop line” is composed of the “crop failure” and the “crop”; the “crop failure” may be determined by taking the “crop” from the “crop line”; the “crop” may be determined by taking the “crop failure” from the “crop line”.
  • a pattern corresponding to a crop failure and/or to a crop may be recognized.
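The relations above (the crop line is composed of crop and crop failure, and each category may be derived from the other two) can be illustrated with per-pixel boolean masks; the mask representation and the toy values below are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

# Per-pixel masks for one strip of a crop line (True = pattern present).
# Values are illustrative; a real model would produce these masks.
crop_line = np.array([1, 1, 1, 1, 1, 1, 1, 1], dtype=bool)
crop      = np.array([1, 1, 0, 0, 1, 1, 0, 1], dtype=bool)

# "crop failure" = "crop line" minus "crop"
crop_failure = crop_line & ~crop

# Conversely, "crop" can be recovered from the other two:
crop_recovered = crop_line & ~crop_failure

print(crop_failure.astype(int))  # [0 0 1 1 0 0 1 0]
```

The same set-difference relationship is what allows a third pattern to be determined from two identified patterns later in the description.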
  • Figure 6 shows a section of an agricultural field with several parallel crop lines 100 (only two of the crop lines are provided with a reference sign for better clarity), e.g. sugar cane lines.
  • a grayscale image of an “Excess Green” index is shown to illustrate the difference between crop 110 in the lighter color and background/soil 120 in the darker parts.
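The “Excess Green” index is commonly defined as ExG = 2·G − R − B, which brightens vegetation relative to soil. A minimal sketch of how such an index image might be computed (the toy pixel values are illustrative):

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index, commonly defined as ExG = 2*G - R - B.
    rgb: float array of shape (H, W, 3) with channels R, G, B."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b

# Toy 1x2 image: a green "crop" pixel and a brownish "soil" pixel.
img = np.array([[[0.2, 0.8, 0.1],    # crop: high ExG
                 [0.4, 0.3, 0.2]]])  # soil: low ExG
exg = excess_green(img)
print(exg)  # crop pixel clearly brighter than soil pixel
```

Thresholding such an index image is one simple way to separate crop 110 from background/soil 120 before any line or failure detection.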
  • An image as shown in Figure 6 may be recorded by an optical sensor or camera aboard a UAV or a satellite.
  • the image may comprise a collection of pixels having different grayscale values and/or having different colors.
  • the image of Figure 6 does not comprise any detected objects or labels.
  • Crops 110 form continuous lines which can be interrupted by a failure.
  • the combination of line and failure may form a crop line failure.
  • crop lines 100 shown in Figure 6 are provided by the image classification model.
  • the image of Figure 7 may be provided as output of the image classification model.
  • the pattern recognized in Figure 7 may comprise objects identified, labeled and/or annotated by a segmentation algorithm.
  • crop line areas 130 are converted into skeletons 140, thereby representing crop lines 100 or crop line patterns 100 with a simple structure of a width of 1 pixel instead of the larger area.
  • such a structure may be handled well by a computer, e.g. for measuring the size of the structure.
  • such a structure may be obtained by applying a skeletonizing algorithm to the output of the classification model.
  • the skeletonizing algorithm may substantially find a balancing point and/or the core area of the classified pattern, e.g. of crop line pattern 100 and in particular of the classified pattern 100 of Figure 7 or of the crop line area 130. Skeletonizing may help to generate a vector map.
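A minimal sketch of the “balancing point” idea for a roughly horizontal crop line area, assuming a binary mask as input; production pipelines would typically use a morphological thinning algorithm (e.g. Zhang-Suen) rather than this simplified per-column version:

```python
import numpy as np

def column_skeleton(mask):
    """Reduce a roughly horizontal crop-line area to a 1-pixel-wide
    skeleton by taking, per column, the mean row of the mask pixels
    (the "balancing point" of the classified pattern)."""
    h, w = mask.shape
    skel = np.zeros_like(mask)
    rows = np.arange(h)
    for col in range(w):
        column = mask[:, col]
        if column.any():
            center = int(round(rows[column].mean()))
            skel[center, col] = True
    return skel

# A thick crop-line area, 3 pixels wide, spanning 4 columns.
mask = np.zeros((5, 4), dtype=bool)
mask[1:4, :] = True
skel = column_skeleton(mask)
print(skel.sum())  # 4: exactly one skeleton pixel per column
```

The resulting 1-pixel-wide structure is what the description converts into polylines for the vector map.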
  • the skeleton 140 (not shown in Figure 9) is converted into polyline skeletons 145, thereby moving from a raster image format and/or a pixel-based image format into polylines, which are in vector format.
  • A polyline is a chain of line segments forming a continuous line, which here represents the crop line. Using the vector format aids the simplification process described in more detail for Figure 12.
  • the vector format may be handled very well by a computer and may help to describe a digital twin of the crop line, the crop, the crop line failure and/or the crop failure.
  • a polyline may be derived from a pixel skeleton and may allow for a good representation of crop lines. It may also support measuring the size of a corresponding object represented as a polyline. Polylines may be simplified by reducing the number of points and/or pixels forming the polyline. It may be possible to go over each point of the lines and, if a point matches and/or crosses a failure pattern, to replace the corresponding part of the line by a failure line.
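The point walk described above might be sketched as follows; the interval representation of failure patterns and all names are illustrative assumptions, not the disclosed implementation:

```python
# Sketch of the point walk: each polyline point is tested against the
# detected failure pattern; runs of failure points become "failure
# line" segments, the rest remain "crop line" segments.

def split_polyline(points, failure_intervals):
    """points: list of (x, y); failure_intervals: list of (x_min, x_max)
    ranges where a failure pattern was detected along the line."""
    def in_failure(p):
        return any(lo <= p[0] <= hi for lo, hi in failure_intervals)

    segments = []
    for p in points:
        label = "failure" if in_failure(p) else "crop"
        if segments and segments[-1][0] == label:
            segments[-1][1].append(p)   # extend the current run
        else:
            segments.append((label, [p]))  # start a new segment
    return segments

line = [(x, 0) for x in range(8)]
segments = split_polyline(line, failure_intervals=[(3, 5)])
print([(label, len(pts)) for label, pts in segments])
# [('crop', 3), ('failure', 3), ('crop', 2)]
```

Keeping crop and failure runs as separate labeled segments is what later allows the length of each segment type to be measured correctly.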
  • a failure line may replace a corresponding segment of a crop line and/or a row line.
  • a crop line may be a part of the polyline matching crop patterns.
  • a failure line and a crop line may be indicated by different colors, and replacing parts of the crop line and/or row line by a failure line may ensure that the size of the line segment allocated to a crop or a failure is measured correctly.
  • Figure 10 shows an example of crop failure areas 130 and/or of a crop failure pattern in a classified image. Failure areas 130 and/or gaps 130 are the interruptions of the crop lines 100. Failures may also be detected next to a crop line 100, but are not considered, as crop failures may substantially only appear on crop lines 100.
  • Figure 11 shows polyline segments 150 of the crop line which, in addition to being a crop line, are also identified as crop failures 160 by using the crop failure areas from Figure 10.
  • Figure 12 shows simplified polylines which eliminate smaller inconsistencies inside the crop lines, making them straighter. This also makes downstream pixel-based computations more accurate, as crop lines are usually intended by farmers to be as close to perfect lines as possible.
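One common way to straighten such polylines is Ramer-Douglas-Peucker simplification; whether the disclosed method uses this particular algorithm is not stated, so the following is only an illustrative sketch:

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker simplification: drop points whose
    perpendicular distance to the end-to-end chord is below epsilon."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0
    # Perpendicular distance of each interior point to the chord.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / norm
             for x, y in points[1:-1]]
    i_max = max(range(len(dists)), key=dists.__getitem__)
    if dists[i_max] <= epsilon:
        return [points[0], points[-1]]
    split = i_max + 1
    return rdp(points[: split + 1], epsilon)[:-1] + rdp(points[split:], epsilon)

# A nearly straight crop-line polyline with small jitter collapses
# to its endpoints, eliminating the smaller inconsistencies.
line = [(0, 0), (1, 0.1), (2, -0.1), (3, 0.05), (4, 0)]
print(rdp(line, epsilon=0.2))  # [(0, 0), (4, 0)]
```

A genuine bend larger than epsilon is preserved, so only noise-scale deviations are removed.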
  • Figure 13 shows the notation of crop lines in Figures 9 to 12 as dashes perpendicular to the line. This notation may be used to mark recognized crop lines 100.
  • The notation shown in Figure 14 may be used to mark recognized crop failure patterns.
  • Figure 15 illustrates an example of a crop failure map, wherein the crop failure map is provided with polygon-shaped cells, wherein for each cell a crop failure value may be determined.
  • Using polygon-shaped cells and vectorized symbols for detected patterns may allow a size to be provided for a detected pattern. In this way, the size of a crop failure may be measured in relation to a crop line length. Based on such a measurement, a replanting decision may be made.
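Such a cell-wise measurement and replanting decision might be sketched as follows; the ratio-based failure value and the 30% threshold are assumptions for illustration, not values from the disclosure:

```python
# Hedged sketch: per-cell crop failure value as the ratio of failure
# line length to total crop line length, compared against a
# replanting threshold.

def replant_decision(cells, threshold=0.3):
    """cells: mapping of cell id -> (failure_length_m, line_length_m).
    Returns the cell ids whose failure ratio exceeds the threshold."""
    flagged = []
    for cell_id, (failure_m, line_m) in cells.items():
        ratio = failure_m / line_m if line_m else 0.0
        if ratio > threshold:
            flagged.append(cell_id)
    return flagged

cells = {
    "A1": (12.0, 100.0),  # 12% failure -> keep
    "A2": (45.0, 100.0),  # 45% failure -> replant
    "B1": (30.0, 60.0),   # 50% failure -> replant
}
print(replant_decision(cells))  # ['A2', 'B1']
```

The flagged cells correspond to the areas a replanting map would mark as above the predetermined crop failure threshold.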
  • Figure 16 shows an enlarged section of the crop failure map of Figure 15.
  • Figure 17 illustrates exemplarily the different possibilities to receive and process field data (e.g. image data, control data, etc.).
  • field data can be obtained by all kinds of agricultural equipment 300 (e.g. a tractor 300) as so-called as-applied maps by recording the application rate at the time of application.
  • agricultural equipment comprises sensors (e.g. optical sensors, cameras, infrared sensors, soil sensors, etc.) to provide, for example, a weed distribution map or the yield, e.g. in the form of biomass.
  • corresponding maps/data can be provided by land-based and/or airborne drones 320 by taking images of the field or a part of it.
  • a geo-referenced visual assessment 330 is performed and this field data is also processed.
  • Field data collected in this way can then be merged in a computing device 340, where the data can be transmitted and computed, for example, via any wireless link, cloud applications 350 and/or working platforms 360, wherein the field data may also be processed in whole or in part in the cloud application 350 and/or in the working platform 360 (e.g., by cloud computing).
  • the computer program element might therefore be stored on a computing unit of a computing device, which might also be part of an embodiment.
  • This computing unit may be configured to perform or induce performing of the steps of the method described above. Moreover, it may be configured to operate the components of the above described system.
  • the computing unit can be configured to operate automatically and/or to execute the orders of a user.
  • the computing unit may include a data processor.
  • a computer program may be loaded into a working memory of a data processor.
  • the data processor may thus be equipped to carry out the method according to one of the preceding embodiments.
  • This exemplary embodiment of the present disclosure covers both a computer program that uses the present disclosure right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the present disclosure.
  • the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above.
  • a computer readable medium, such as a CD-ROM, a USB stick, a downloadable executable or the like, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described by the preceding section.
  • a computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
  • the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network.
  • a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the present disclosure.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Soil Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Husbandry (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Mining & Mineral Resources (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Agronomy & Crop Science (AREA)
  • Environmental Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Computer-implemented method for providing control data, a crop failure map and/or a replanting map, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing, based on the determined crop failures, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or providing a crop failure map indicating the crop failures in the agricultural field; and/or providing a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.

Description

METHOD FOR PROVIDING CONTROL DATA, A CROP FAILURE MAP AND/OR A
REPLANTING MAP
TECHNICAL FIELD
The present disclosure relates to a computer-implemented method for providing control data, a crop failure map and/or a replanting map, a system for providing control data, a crop failure map and/or a replanting map, an apparatus for providing control data, a crop failure map and/or a replanting map, an agricultural device controlled by such control data, a respective computer program element and a use of different data in such a computer-implemented method.
TECHNICAL BACKGROUND
The general background of this invention is the treatment of plantations in an agricultural field. For some crops, like sugar cane, it is common practice for farmers to replant crops where a crop failure is present, wherein a crop failure is a gap in a continuous crop line.
It has become apparent that there is a need to simplify such replanting of crop failures and to assist a farmer in carrying out such replanting of crop failures.
SUMMARY OF THE INVENTION
In one aspect of the present disclosure, a computer-implemented method for providing crop line failure data is provided, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify at least two identification patterns selected from the group consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern; providing a relationship between the at least two identification patterns which the image classification model is configured to identify; identifying the at least two identification patterns with the image classification model; generating the crop line failure data based on the at least two identification patterns and based on the relationship, wherein the crop line failure data indicate crop failures in a crop line; and providing the crop line failure data.
Thus, in an example, based on the results of the models and the relationships between the patterns, control data for an agricultural device may be provided, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or a crop failure map may be provided indicating the crop failures in the agricultural field; and/or a replanting map may be provided indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
In an example where at least one of the identification patterns may not be identified by the classification model, the number of image classification models may be different from the number of relationships. The relationship may also comprise a relationship between the at least two identification patterns and at least one identification pattern which the image classification model is not configured to identify.
For example, if an identification pattern is derived from a different source than the image classification model, e.g. from a database, the non-identified identification pattern may have a relation to the identification patterns that are identified by the image classification model. In another case, for example, a crop line pattern and a failure pattern may be detected by the model, and a crop pattern and/or a crop line failure pattern may be detected via the relationship it has to the crop line pattern and the failure pattern.
Consequently, the number of relationships may be different from the number of identified identification patterns.
In an example the crop failure map and/or the crop line failure map may be provided as a vector image. A vector image may comprise lines which overlap failures and crop line areas. A vector image may in addition and/or as an alternative indicate a crop line and crops or any other combination. A vector image may also comprise areas in addition and/or as an alternative to lines.
In an example, identifying the crop line pattern with the image classification model may comprise providing information about the spacing of the crop rows and/or seeding lines and/or crop lines. Crop rows may comprise crop lines; in particular, crop rows may comprise at least one crop line. In an example, crop rows and crop lines may be used interchangeably; however, crop rows may express the seeding aspect whereas crop lines may express the emerging aspect. For example, the information about the spacing of the crop rows and/or seeding lines may be pre-processed, may be manually created, may be retrieved from a database and/or may be provided by a machine or an agricultural device, such as a seeder, for the same agricultural area from which the image data are generated.
For any pre-processed information, an identification processing engine for pattern recognition and/or segmentation may be by-passed, and in this way computation power may be reduced, as, for example, instead of using at least two pattern recognition engines of the image classification model, only a single engine and a database request may be used to identify the at least two identification patterns.
In other words, each of the sub-models of the image classification model may either use a pattern recognition and/or segmentation algorithm or a data retrieval algorithm, wherein data retrieval may consume less power than pattern recognition. Depending on the information already available from a data source, the by-passing of the pattern recognition engine may be controlled. The control of by-passing may be realized by allocating computational power to a pattern recognition algorithm, e.g. by allocating a thread in a computer architecture. For such a power-optimizing control, the determining unit may comprise a scheduling unit.
In another example, each sub-model may be trained for a specific pattern. In yet another example the classification model may be trained for recognizing and/or segmenting at least two patterns.
In an example, a computer-implemented method for providing control data and/or a map may be provided, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model, wherein the image classification model comprises at least one image classification algorithm, wherein the at least one image classification algorithm is configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern; determining in the at least one image of the at least one part of the agricultural field a third pattern from the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern by utilizing the two determined patterns; providing, based on the determined third pattern, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined third patterns; and/or providing a map indicating the position of the third pattern in the agricultural field.
By way of the at least one image classification algorithm the image classification model is adapted to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern.
In an example the map may be a crop failure map, a replanting map, a crop line map, a crop line failure map and/or a crop map. The map may also be a combination of a crop failure map, a replanting map and/or a crop map. In other words, the map may include information about the location and/or size of a crop failure pattern, a replanting pattern, crop line pattern, a crop line failure pattern and/or a crop pattern.
The terms “crop line pattern”, “crop failure pattern”, “crop line failure pattern” and “crop pattern” may be understood broadly and may comprise a crop line, a crop failure, a crop line failure and a crop, respectively. A crop line failure pattern may comprise information about a crop line and failures on this line.
In an example, a computer-implemented method for providing control data, a crop failure map and/or a replanting map is provided, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing, based on the determined crop failures, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or providing, based on the determined crop failures, a crop failure map indicating the crop failures in the agricultural field; and/or providing, based on the determined crop failures, a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
In a further example, a computer-implemented method for providing control data and/or a map may be provided, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern; providing, based on the results of the models and the relationships between the patterns, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or providing a crop failure map indicating the crop failures in the agricultural field; and/or providing a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
A further aspect of the present disclosure relates to a system for providing crop line failure data, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify at least two identification patterns selected from the group consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern; the providing unit further configured to provide a relationship between the at least two identification patterns which the image classification model is configured to identify; the system further comprising a determining unit configured to identify the at least two identification patterns with the image classification model, the determining unit further configured to generate the crop line failure data based on the at least two identification patterns and based on the relationship, wherein the crop line failure data indicate crop failures in a crop line; and a further providing unit configured to provide the crop line failure data.
In an example, a system for providing control data and a map may be provided, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model; wherein the image classification model comprises at least one image classification algorithm, wherein the at least one image classification algorithm is configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern; a determining unit configured to determine in the at least one image of the at least one part of the agricultural field a third pattern from the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern by utilizing the two determined patterns; a further providing unit configured to provide, based on the third pattern, control data for an agricultural device, wherein the control data at least comprise position data of one or more of the determined third patterns; and/or configured to provide a map indicating the position of the third pattern in the agricultural field.
In an example, a system for providing control data, a crop failure map and/or a replanting map is described, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; a further providing unit configured to provide, based on the determined crop failures, control data for the agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures; and/or configured to provide, based on the determined crop failures, a crop failure map indicating the crop failures in the agricultural field; and/or configured to provide, based on the determined crop failures, a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
A further aspect of the present disclosure relates to an apparatus for providing crop line failure data, the apparatus comprising: one or more computing nodes; and one or more computer- readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the computer-implemented method.
In an example an apparatus for providing control data, a crop failure map and/or a replanting map may be provided, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing, based on the determined crop failures, control data for an agricultural device, wherein the control data at least comprising position data of one or more of the determined crop failures; and/or providing, based on the determined crop failures, a crop failure map indicating the crop failures in the agricultural field; and/or providing, based on the determined crop failures, a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
A further aspect of the present disclosure relates to an agricultural device, e.g. a planting device for planting crops and/or a transportation unit for transporting stems, seedlings and/or seeds, wherein control data for the agricultural device are at least partially provided according to the disclosed computer-implemented method for providing control data.
The agricultural device may use the crop line failure data, e.g. the replanting map, in order to replant crop failures above a predefined threshold and/or size.
A further aspect of the present disclosure relates to a computer program element with instructions, which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method for providing control data for an agricultural device in an apparatus and/or system for providing control data for an agricultural device.
The above may be achieved by using a computer-implemented method for providing crop failure data of an agricultural field, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing crop failure data based on the determined crop failures.
A further aspect of the present disclosure relates to a system for providing crop failure data of an agricultural field, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; a further providing unit configured to provide crop failure data based on the determined crop failures.
A further aspect of the present disclosure relates to an apparatus for providing crop failure data of an agricultural field, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, cause the apparatus to perform the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing crop failure data based on the determined crop failures.
A further aspect of the present disclosure relates to a computer program element with instructions, which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method for providing crop failure data of an agricultural field in an apparatus and/or system for providing crop failure data of an agricultural field. The same may be true for the computer-implemented method and/or system for providing crop line failure data.
A further aspect of the present disclosure relates to a navigation device configured to navigate an agricultural device to crop failures in the agricultural field based on the crop failure data and/or crop line failure data, wherein the crop failure data are provided according to the disclosed computer-implemented method for providing crop failure data and/or crop line failure data.
In an example a mobile device may be provided comprising at least one display unit, wherein the mobile device is configured to display the position of the mobile device and the position of the crop failures in the agricultural field based on the crop failure data and/or crop line failure data, wherein the crop failure data are provided according to the disclosed computer-implemented method for providing crop failure data and/or the computer-implemented method for providing crop line failure data.
A further example relates to a computer-implemented method for providing a crop failure map of an agricultural field, comprising: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing a crop failure map of the agricultural field based on the determined crop failures.
A further example relates to a system for providing a crop failure map of an agricultural field, comprising: a providing unit configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; a further providing unit configured to provide a crop failure map of the agricultural field based on the determined crop failures.

A further example relates to an apparatus for providing a crop failure map of an agricultural field, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, they cause the apparatus to perform the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing an image classification model configured to identify crop failures in a crop line; determining crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; providing a crop failure map of the agricultural field based on the determined crop failures.
A further aspect of the present disclosure relates to a computer program element with instructions, which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method for providing a crop failure map of an agricultural field in an apparatus and/or system for providing a crop failure map of an agricultural field and/or the steps of the computer-implemented method for providing crop line failure data and/or a system for providing crop line failure data.
A further aspect of the present disclosure relates to a use of image data of at least a part of an agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged and/or a use of an image classification model configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns, e.g. identification patterns: a crop line pattern, a crop failure pattern and/or a crop pattern in one of the computer-implemented methods/systems/apparatuses disclosed herein.
In an example a use of image data of at least a part of an agricultural field is provided, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged and/or a use of an image classification model configured to identify crop failures in a crop line in one of the computer-implemented methods/systems/apparatuses disclosed herein. In this context, the use of the corresponding data in a disclosed method/system/apparatus means that these data are used as input data. A further aspect of the present disclosure relates to a use of image data, e.g. provided by a drone, of at least a part of an agricultural field as input data for a weed classification algorithm. The weed classification algorithm is configured to identify weeds in the provided image data. In an embodiment, the identified weeds may be considered when deciding whether or not a replanting should be performed.
A further example relates to a method for providing a crop failure and weed map, comprising the following steps: providing image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; providing one or more image classification models configured to identify crop failures in a crop line, crop line failures and/or configured to identify weeds; determining crop failures and weeds in the at least one image of the at least one part of the agricultural field utilizing the one or more image classification algorithms; providing a crop failure and weed map of the agricultural field based on the determined crop failures and weeds.
In an embodiment of the method for providing a crop failure and weed map, the crop is sugar cane. Due to the comparably large amounts of active ingredients allowed and used in sugar cane, e.g. compared to soybean, the crop protection period is longer and crop protection is often applied around sowing time. The crop protection end date is then also close to the time it takes for the sugar cane crop to establish and go into the tillering phase, where it can be assessed for potential replanting. Therefore, both weed management and crop emergence management may be assessed at the same time. In this way, efficiency in image collection and preprocessing operations may be gained by executing the process just once instead of twice.
This and the embodiments described herein relate to the method, the system, the agricultural device, the use and the computer program element outlined above, and vice versa.
Advantageously, the benefits provided by any of the embodiments and examples equally apply to all other embodiments and examples and vice versa.
As used herein, “determining” also includes “estimating, calculating, initiating or causing to determine”, “generating” also includes “initiating or causing to generate”, and “providing” also includes “initiating or causing to determine, generate, select, send, query or receive”.

It is an object of the present invention to allow a more simplified replanting of crop failures and to assist a farmer in carrying out such replanting of crop failures. In particular, it is an object of the present invention to disclose a method for providing control data for an agricultural device, crop failure data, a crop failure map and/or a replanting map. Moreover, it is a further object of the present invention to disclose corresponding apparatuses, systems, devices and/or use cases. Finally, it is a further object of the present invention to provide a significant gain in time, higher yields and a more economically viable crop production.
These and other objects, which become apparent upon reading the following description, are solved by the subject matters of the independent claims. The dependent claims refer to preferred embodiments of the invention.
The term “crop failure” and/or “crop failure pattern” as used herein is to be understood as a gap in an otherwise continuous crop line, e.g. a line of sugar cane or forest plants.
The term “crop failure data” and/or “crop line failure data” refers to a corresponding data set comprising information about the identified crop failures and/or crop failures in a crop line.
The term “crop failure map” and/or “crop line failure map” as used herein is to be understood broadly and includes any visual representation of the determined crop failures, i.e. in the crop failure map crop failures are specified. In an example the visual representation is a pattern for a crop failure and/or a pattern for a gap, in particular a gap in a crop line. If the crop failure relates to a crop line, the term crop line failure map may be used. The crop failure map and/or crop line failure map may be divided into cells, preferably polygon-shaped cells, wherein for each cell a crop failure value may be determined. A crop failure map and/or crop line failure map may be a vector image composed of lines of crop failures and crop lines, but it can also indicate crop lines and crops or any other combination. A crop failure map and/or crop line failure map can also be a raster map comprising crop failure areas and crop line areas, but it can also indicate crop lines and crops or any other combination of at least two components of the set of crop lines, crops and crop failures. It is also not limited to lines and images but may also indicate areas, and is to be understood broadly to also include corresponding data sets with position coordinates that are not represented in visual form. The crop failure map and/or crop line failure map may also be accompanied or overlapped by a second map defining areas for which the amount of crop failure is defined. The amount of failure may be indicated as a percentage, but any metric, even an arbitrary one, may be used. This may inform the farmer about the severity and may help achieve a better return on investment.

The term “agricultural field” as used herein is to be understood broadly in the present case and represents any area, i.e. surface and subsurface, of a soil to be treated. The agricultural field may be any plant or crop cultivation area, such as a farming field, a greenhouse, or the like.
The agricultural field may be identified through its geographical location or geo-referenced location data. A reference coordinate, a size and/or a shape may be used to further specify the agricultural field.
The term “image classification model” as used herein is to be understood broadly in the present case. The image classification model according to the present disclosure is configured to identify crop failures in a crop line. In an embodiment, the image classification model may comprise at least one image classification algorithm configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following: a crop line, a crop failure and/or a crop. In an example the image classification model may be a trained machine learning model and may be executed on a processor. The image classification model may be adapted to recognize a pattern for a crop line, a pattern for a crop and/or a pattern for a crop failure. In other words, the classification algorithm may comprise identifying at least two sets, at least two patterns and/or at least two categories in the image selected from the group of patterns consisting of a crop line, a crop failure and a crop. The image classification model may comprise a plurality of sub-models, each of which is focused on a specific task, e.g. crop line pattern recognition, crop failure pattern recognition and/or crop pattern recognition.
Notably, two of the mentioned categories, i.e. crop line, crop failure and crop, are sufficient to derive all necessary information, since the relationship between these categories can be represented with any of the following equations:

crop line = crop failure ∪ crop

crop failure = crop line \ crop

crop = crop line \ crop failure
This means that a “crop line” and/or a “crop line pattern” is composed of the “crop failure” and the “crop”; the “crop failure” may be determined by taking the “crop” from the “crop line”; the “crop” may be determined by taking the “crop failure” from the “crop line”. As the sets, patterns and/or categories crop line, crop failure and crop are interrelated with one another, they may form a triangular relation with one another. The triangular relation may follow the rules of set theory. The triangular relation may allow for detecting at least two patterns of the group of patterns and determining the other pattern or the other category from the at least two patterns. At least two sets of patterns are allocated to disjoint groups. E.g. the set of crop failure patterns and the set of crop patterns form two disjoint groups. By way of the rules of set theory, another set of patterns may be derived by detecting at least two different sets of patterns. In an example the at least two different sets of patterns may be disjoint.
In this way the above-mentioned equations may be expressed as a rule of set theory.
The set of crop line patterns is formed as the union of the set of crop patterns and the set of crop failure patterns; the set of crop failure patterns is the set difference of the set of crop line patterns and the set of crop patterns; the set of crop patterns is the set difference of the set of crop line patterns and the set of crop failure patterns.
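The triangular set relation above can be sketched in code. The following is a minimal illustration only: the pixel coordinates are hypothetical stand-ins for detected patterns, not output of any actual classification model.

```python
# Triangular relation between crop line, crop and crop failure patterns,
# with each pattern represented as a set of (row, column) pixel coordinates.
crop = {(0, 0), (0, 1), (0, 4)}      # pixels where a crop was detected
crop_failure = {(0, 2), (0, 3)}      # gap pixels in the same line

# crop line = crop failure ∪ crop
crop_line = crop | crop_failure

# each remaining category is the set difference of the other two:
assert crop_line - crop == crop_failure      # crop failure = crop line \ crop
assert crop_line - crop_failure == crop      # crop = crop line \ crop failure
```

Detecting any two of the three categories thus suffices; the third follows from a single set operation.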
The set of crop line patterns may comprise crop line patterns for a plurality of single crop line patterns that are related to the physical single crop lines. A set of a single crop line pattern may be a subset of the set of crop line patterns. The set of crop patterns belonging to the set of the single crop line may be a subset of the set of crop patterns and the set of crop failure patterns belonging to the set of the single crop line may be a subset of the set of crop failure patterns.
If, in an example, a crop failure needs to be provided and the image classification model comprises crop line pattern identification and crop pattern identification, the crop line failure data may be generated by applying the relationship crop failure = crop line \ crop, and the crop failure is provided as crop line failure data.
It is preferred that the image classification algorithm is based on the results of a machine-learning algorithm, wherein the term “machine learning algorithm” may comprise decision trees, naive Bayes classifications, nearest neighbors, neural networks, convolutional or recurrent neural networks, transformers, generative adversarial networks, support vector machines, linear regression, logistic regression, random forest and/or gradient boosting algorithms. Preferably, the result of a machine learning algorithm is used to adjust the application rate decision logic. Preferably, the machine learning algorithm is organized to process an input having a high dimensionality into an output of a much lower dimensionality. Such a machine learning algorithm is termed “intelligent” because it is capable of being “trained”. The algorithm may be trained using records of training data. A record of training data comprises training input data and corresponding training output data. The training output data of a record of training data is the result that is expected to be produced by the machine learning algorithm when being given the training input data of the same record of training data as input. The deviation between this expected result and the actual result produced by the algorithm is observed and rated by means of a “loss function”. This loss function is used as a feedback for adjusting the parameters of the internal processing chain of the machine learning algorithm. For example, the parameters may be adjusted with the optimization goal of minimizing the values of the loss function that result when all training input data is fed into the machine learning algorithm and the outcome is compared with the corresponding training output data.
The result of this training is that, given a relatively small number of records of training data as “ground truth”, the machine learning algorithm is enabled to perform its job well for a number of records of input data that is higher by many orders of magnitude.
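The training procedure described above can be sketched with a deliberately tiny example. This is a one-parameter linear model with a squared-error loss standing in for the image classification model; the training records and learning rate are hypothetical and chosen only to illustrate the loop of forward pass, loss evaluation and parameter adjustment.

```python
# Minimal training loop: a parameter is adjusted to minimise the loss
# over records of (training input, expected training output).
records = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # hypothetical ground truth

w = 0.0       # internal parameter of the "model"
lr = 0.05     # learning rate
for _ in range(200):
    for x, y_expected in records:
        y_actual = w * x                  # forward pass
        error = y_actual - y_expected     # deviation rated by the loss
        w -= lr * 2 * error * x           # gradient step on squared loss

# training recovers the underlying relation y = 2x from the records
assert abs(w - 2.0) < 1e-3
```

In a real image classification model the single parameter is replaced by millions of weights and the loss is evaluated over labeled image patches, but the feedback loop is the same.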
An image classification algorithm may be trained to identify, differentiate and/or segment a plurality of patterns.
The classification algorithm may comprise a labeling algorithm, wherein the labeling algorithm may be adapted to provide and/or mark a recognized pattern with a corresponding label, e.g. a recognized crop line pattern with a crop line label, a recognized crop pattern with a crop label, e.g. the type of crop, and a crop failure pattern with a crop failure label and/or a gap label.
The labeling algorithm may use a segmentation algorithm and link a recognized pattern to a meaning such as crop, crop failure, crop line failure and/or failure.
In an example the classification algorithm may be trained to detect at least two patterns substantially simultaneously. In this case an image is provided to a combined image classification model, wherein the combined image classification model may be trained for recognizing at least two different patterns, e.g. crop patterns and crop failure patterns. Such a combined image classification model may comprise a single input for providing the image to the combined image classification model and may have at least two outputs, one for each recognized pattern.
In another example, at least two specialized image classification models may be provided. The image may be forked and/or copied to each of the specialized image classification models, and each specialized image classification model may process the image independently of the others. In an example a first specialized image classification model may be adapted to recognize a crop pattern and a second specialized image classification model may be adapted to recognize a crop failure pattern.
The pattern recognition operations may be executed independently of one another. In an example the results of different classification operations do not depend on one another. In addition or alternatively, it is possible that the image classification algorithm is based on the results of a classical computer vision algorithm and/or an image recognition algorithm, wherein the term “classical computer vision algorithm” may comprise Hough transforms, Fourier transforms, filters, kernel convolutions, object based image analysis and/or statistical shape analysis. Such a classical computer vision algorithm exploits statistical traits of the difference in color and/or other non-visible wavelengths, intensity, position and/or grouping between areas of interest.
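The two architectures described above (one combined model with two outputs versus two specialized models fed copies of the image) can be sketched as follows. All functions here are hypothetical stand-ins for trained models, and the "image" is a toy list of labels rather than real pixel data.

```python
# Stand-ins for two specialized image classification models.
def detect_crop(image):
    return {i for i, px in enumerate(image) if px == "crop"}

def detect_crop_failure(image):
    return {i for i, px in enumerate(image) if px == "gap"}

# A combined model: single input, two outputs (one per recognized pattern).
def combined_model(image):
    return detect_crop(image), detect_crop_failure(image)

image = ["crop", "gap", "crop", "gap"]

# Specialized route: the image is copied and each model runs independently.
crop_a = detect_crop(list(image))
gap_a = detect_crop_failure(list(image))

# Combined route: both patterns from a single input.
crop_b, gap_b = combined_model(image)
assert (crop_a, gap_a) == (crop_b, gap_b)
```

Either route yields the same two pattern sets; the choice affects training and deployment, not the derivable information.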
The output of the image classification model and/or the output of the image classification algorithm may be further processed in order to prepare the output for convenient computer handling. One example of such handling may be providing the shape of a pattern by a shape analysis algorithm.
For shape analysis a shape analysis algorithm may be used. Based on a shape analysis a pattern recognized during classification may be associated with a computer readable structure. In this way a physical structure like a crop line, a crop and/or a crop failure may be converted into a corresponding computer model. In shape analysis, a “skeleton” or a “topological skeleton” of a shape is a thin version of that shape that is equidistant to its boundaries. Skeletonization is the process of transforming a blob of pixels into one or multiple lines with 1px width. In geometry, a polygonal chain is a connected series of line segments. More formally, a polygonal chain P is a curve specified by a sequence of points called its vertices. The curve itself consists of the line segments connecting the consecutive vertices. A polygonal chain may also be called a polygonal curve, polygonal path, polyline, piecewise linear curve, broken line or, in geographic information systems, a linestring or linear ring. Such polylines may be used in the present disclosure, since the pixel output may be cumbersome to work with as the final form is complex.
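Reducing skeletonized pixel output to polylines can be sketched as follows. This is a simplified, hypothetical helper, not a full topological skeletonization: it collapses runs of "on" pixels in a single 1-pixel-wide raster row into polygonal chains given by their start and end vertices.

```python
def row_to_polyline(row_y, mask):
    """Collapse consecutive 'on' pixels in one raster row into line segments,
    each segment given as [(x_start, y), (x_end, y)]."""
    segments, start = [], None
    for x, on in enumerate(mask + [False]):   # sentinel closes a final run
        if on and start is None:
            start = x
        elif not on and start is not None:
            segments.append([(start, row_y), (x - 1, row_y)])
            start = None
    return segments

# A crop line with a gap: pixels 0-2 and 5-7 are crop, 3-4 is the failure.
mask = [True, True, True, False, False, True, True, True]
assert row_to_polyline(0, mask) == [[(0, 0), (2, 0)], [(5, 0), (7, 0)]]
```

The resulting vertex lists are far more compact than the raw pixel masks and map naturally onto the linestring representations used in geographic information systems.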
The term “threshold value” or “crop failure threshold value” as used herein is to be understood broadly in the present case and may be directed to a length of a crop failure in the crop line. For example, a crop failure threshold may be set to a length above 20 cm, 50 cm or above 1 m. This means that below this threshold a crop failure is not considered relevant, for example because a crop failure below such a threshold value is overgrown by the neighboring crop anyway. In other words, the image classification may comprise detecting the size of a pattern and ignoring patterns with a size below a predefined threshold. The size may be different for crop patterns, crop failure patterns and/or for crop line patterns. The size may be measured in meters, centimeters and/or pixels. The predefined threshold may correspond to a predefined resolution for the pattern. For detecting the size, a size-detecting sub-method, sub-module and/or size-detecting device may be provided.
The threshold value may also be expressed as a percentage of area and/or of a line length, e.g. 15% of a square meter.
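The length-based thresholding described above can be sketched in a few lines. The gap lengths and the 50 cm threshold below are hypothetical examples, not values prescribed by the method.

```python
# Gaps shorter than the crop failure threshold are ignored, since
# neighboring crop overgrows them anyway.
CROP_FAILURE_THRESHOLD_M = 0.5            # e.g. 50 cm

gap_lengths_m = [0.2, 0.8, 0.4, 1.3]      # detected gaps in a crop line
relevant = [g for g in gap_lengths_m if g > CROP_FAILURE_THRESHOLD_M]
assert relevant == [0.8, 1.3]
```

An area- or percentage-based threshold would replace the length comparison with a comparison of the failed fraction of each cell or square meter.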
The term “replanting map” used herein is to be understood broadly and includes any visual representation of the determined crop failures which should be replanted and/or indicating where in the agricultural field a crop failure is above the predetermined crop failure threshold. Such a replanting map may be accompanied by a recommendation to replant based on the threshold value (e.g. on the area percentage of crop failures and/or on the crop failure length) or a more complex method involving additional data, such as age of field, weed distribution, etc., and particularities of the farmer. The replanting map may comprise a type of crop, a seeding rate, a route for replanting through the field and/or a geographical location. The information of the replanting map may be provided in a JSON, CSV, ISO-XML, geoTIFF and/or Shape format.
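As one of the mentioned formats, a JSON serialization of a replanting map entry can be sketched as follows. The field names, coordinates and values here are hypothetical; formats such as ISO-XML, geoTIFF or Shape would require dedicated libraries.

```python
import json

# Hypothetical replanting map entry with crop type, seeding rate and
# geographical locations of crop failures above the threshold.
replanting_map = {
    "crop": "sugar cane",
    "seeding_rate": 15,                      # e.g. seeds per meter
    "replant_zones": [
        {"lat": -21.1734, "lon": -47.8103, "gap_length_m": 1.2},
    ],
}
payload = json.dumps(replanting_map)

# The serialized map round-trips losslessly.
assert json.loads(payload)["replant_zones"][0]["gap_length_m"] == 1.2
```

Such a payload could then be sent to a mobile device, a printing device or a planting device, as described below for the respective embodiments.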
The information gathered for the replanting map may be used in order to generate a hard copy of the replanting map, e.g. by sending control data and/or a file comprising replanting information to a printing device, e.g. a printer and/or plotter. The replanting map may be sent to a mobile device, e.g. a smart phone, and may use the navigation facility of the mobile device to guide a user for replanting.
The information gathered for the replanting map and/or the control data may be used to control a planting operation of a planting device and/or of a seeding device, e.g. a smart seeder.
The term “control data” as used herein is to be understood broadly in the present case and represents any data configured to operate and control an agricultural device. The control data may be provided by a control unit and may be configured to control one or more technical means of the agricultural device, e.g. the drive control, the steering, etc., but are not limited thereto. The control data comprise at least position data of one or more of the determined crop failures.
The term “agricultural device” used herein is to be understood broadly in the present case and represents any device being configured for replanting crops and/or transporting stems, seedlings and/or seeds to a crop failure position. The agricultural device may be a ground or an air vehicle, e.g. a tractor, a transporter, a rail vehicle, a robot, an aircraft, an unmanned aerial vehicle (UAV), a drone, a planting device, etc. The agricultural device may be an autonomous or a non-autonomous device.
The term “navigation device” as used herein is to be understood broadly in the present case and represents any device configured/usable to navigate a vehicle and/or a person to one or more crop failures. In a simpler embodiment, the navigation device is a GPS device.
The term “mobile device” as used herein is to be understood broadly in the present case and represents any portable device, e.g. smart phone, handheld, laptop, tablet, etc., comprising at least one display unit, with which the position of the mobile device and the position of the crop failures in the agricultural field can be shown.
The term “providing” as used herein is to be understood broadly in the present case and represents any providing, receiving, querying, measuring, calculating, determining or transmitting of data, but is not limited thereto. Data may be provided by a user via a user interface, depicted/shown to a user by a display, and/or received from other devices, queried from other devices, measured by other devices, calculated by other devices, determined by other devices and/or transmitted by other devices.
The term “data” as used herein is to be understood broadly in the present case and represents any kind of data. Data may be single numbers/numerical values, a plurality of numbers/numerical values, a plurality of numbers/numerical values arranged within a list, 2-dimensional maps or 3-dimensional maps, but are not limited thereto.
In the following, particularly preferred embodiments are disclosed, which may be combined with the above-disclosed methods, systems, apparatuses, devices and/or use cases.
In an embodiment, the agricultural device may be a planting device for planting crops and/or a transportation unit for transporting stems, seedlings and/or seeds.
In an embodiment, the image classification model may comprise at least one image classification algorithm, wherein the at least one image classification algorithm is configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following: a crop line, a crop failure and/or a crop. In this respect, the image classification model may comprise at least two of the following image classification algorithms: an image classification algorithm configured to identify a crop line in the at least one image of the at least one part of the agricultural field, an image classification algorithm configured to identify a crop failure in the at least one image of the at least one part of the agricultural field.
In a further embodiment, the image classification model comprises at least two of the following image classification algorithms: an image classification algorithm configured to identify the crop line pattern in the at least one image of the at least one part of the agricultural field, an image classification algorithm configured to identify the crop failure pattern in the at least one image of the at least one part of the agricultural field and an image classification algorithm configured to identify a crop pattern in the at least one image of the at least one part of the agricultural field.
In yet another embodiment the crop line failure data are provided in form of control data for an agricultural device, wherein the control data comprise position data of one or more of the crop failures in the crop line; and/or a crop line failure map indicating the crop failures in the agricultural field; and/or a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold.
In a further embodiment the image classification model comprises a single model configured to identify each of the at least two identification patterns independently of one another, and/or the image classification model comprises separate sub-models for each of the at least two identification patterns.
In other words, each of the separate sub-models may operate independently of the others and may substantially be adapted for identifying one of the at least two identification patterns.
In an embodiment, a crop failure and/or crop failure pattern may be at a position in a crop line where a sown crop has not emerged and/or a position in a crop line where a crop is smaller than a predetermined size.
In an embodiment, the method may further comprise: determining whether a crop failure, and/or a crop failure relating to a crop failure pattern, is above a predetermined crop failure threshold; wherein the control data for the agricultural device are only provided in case a crop failure is above the predetermined crop failure threshold.
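A minimal sketch of this threshold gating (data layout, names and the threshold value are assumptions, not part of the disclosure):

```python
# Sketch: emit control data only for crop failures above a predetermined
# threshold. All names, the data layout and the threshold are assumptions.
from dataclasses import dataclass

@dataclass
class CropFailure:
    lat: float
    lon: float
    length_m: float  # measured gap length along the crop line

FAILURE_THRESHOLD_M = 0.5  # assumed predetermined crop failure threshold

def control_data(failures):
    """Return position data only for failures above the threshold."""
    return [(f.lat, f.lon) for f in failures if f.length_m > FAILURE_THRESHOLD_M]

failures = [CropFailure(-21.1, -47.8, 0.3), CropFailure(-21.2, -47.9, 1.2)]
print(control_data(failures))  # [(-21.2, -47.9)]
```

The control data for the agricultural device would then only contain positions where replanting action is actually warranted.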
In an embodiment, the method may further comprise: providing a crop failure map based on the determined crop failures, in particular wherein the crop line failure map comprises cells, preferably polygon-shaped cells, wherein for each cell, a crop failure value is determined.
In an embodiment, the failure map, in particular the crop line failure map, may comprise crop lines, crop areas and/or crop failure areas.
In an embodiment, the failure map comprises crop lines, crop failures and/or crop.
In an embodiment, the crop failure map, in particular the crop line failure map, comprises cells, preferably polygon-shaped cells, wherein for each cell, a crop failure value is determined.
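A minimal sketch of such a per-cell crop failure value (defining the value as the failure line length divided by the total crop line length inside the cell is an assumption, as are all names):

```python
# Sketch: compute a per-cell crop failure value as the failure line length
# divided by the total crop line length inside the cell (assumed definition).
def cell_failure_value(crop_line_len_m, failure_len_m):
    if crop_line_len_m == 0:
        return 0.0
    return failure_len_m / crop_line_len_m

# hypothetical polygon-shaped cells: (cell id, crop line length, failure length) in metres
cells = [("P1", 40.0, 6.0), ("P2", 40.0, 0.0)]
values = {cid: cell_failure_value(line, fail) for cid, line, fail in cells}
print(values)  # {'P1': 0.15, 'P2': 0.0}
```

Each cell of the crop failure map would then carry one such value, which can later be compared against the predetermined crop failure threshold.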
In an embodiment, the method further comprises: providing a replanting map indicating where in the agricultural field a crop failure is above the predetermined crop failure threshold. The threshold in an example may relate to the size of the pattern in the image. In another example, an additional method and/or device may be provided adapted for measuring the current size of the emerged crop in the field.
In an embodiment, the method further comprises: providing a weed classification algorithm configured to identify weeds; determining weeds in the at least one image of the at least one part of the agricultural field utilizing the weed classification algorithm; and providing a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold and the weed infestation is below a predetermined weed infestation threshold.
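A minimal sketch of combining the two thresholds when building the replanting map (cell layout, names and threshold values are assumptions):

```python
# Sketch: a cell goes into the replanting map only if crop failure is above
# the failure threshold AND weed infestation is below the weed threshold.
def replant_cells(cells, failure_thresh=0.3, weed_thresh=0.2):
    # cells: list of (cell_id, crop_failure_fraction, weed_fraction)
    return [cid for cid, fail, weed in cells
            if fail > failure_thresh and weed < weed_thresh]

cells = [("A1", 0.5, 0.1),   # high failure, low weeds -> replant
         ("A2", 0.5, 0.4),   # high failure but weedy -> do not replant
         ("A3", 0.1, 0.0)]   # low failure -> do not replant
print(replant_cells(cells))  # ['A1']
```

The weed condition reflects the idea that replanting into a heavily weed-infested area is not worthwhile before weed control.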
In an embodiment, the image of the at least one part of the agricultural field may be provided at a time when the crops are in the tillering stage and/or have a dimension at least three times larger than the resolution of the image sensors used for capturing the image.
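The resolution criterion can be sketched as a simple check (the factor of three follows the embodiment above; function and parameter names are assumptions):

```python
# Sketch: check whether emerged crops are large enough to be resolved,
# i.e. their dimension is at least three times the sensor resolution.
def resolvable(crop_size_cm, resolution_cm_per_px, factor=3):
    return crop_size_cm >= factor * resolution_cm_per_px

print(resolvable(10.0, 2.8))  # True  (10 cm >= 8.4 cm)
print(resolvable(5.0, 2.8))   # False (5 cm < 8.4 cm)
```

With the 2.8 cm/pixel resolution mentioned below, crops smaller than roughly 8.4 cm would thus not yet be reliably detectable.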
In an embodiment, the method may further comprise: providing several images of the agricultural field and/or of parts of the agricultural field, stitching the images together by means of a stitching algorithm for providing the image data of the agricultural field. In an embodiment, the images of the agricultural field are RGB images, with a resolution of around 2.8cm/pixel. The images may be high-resolution images sufficient for a respective image analysis. If necessary, the agricultural field may be divided into suitable segments so that an accordingly high resolution can be provided suitable for the analysis by an image classification algorithm. In this respect, the images may be provided by at least one image collection device, wherein the image collection device may be an aircraft device, e.g. a drone. However, the present disclosure is not limited to a specific method for providing image data of the agricultural field and also not to a specific image collection device.
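A strongly simplified sketch of the stitching step, placing image tiles into one field mosaic given assumed pixel offsets (a real stitching algorithm would also georeference, align and blend the tiles):

```python
# Sketch: place image tiles into one field mosaic, assuming each tile
# comes with its pixel offset in the field raster (a strong simplification
# of a real stitching algorithm).
def stitch(tiles, height, width):
    mosaic = [[0.0] * width for _ in range(height)]
    for (row, col), tile in tiles:
        for r, tile_row in enumerate(tile):
            for c, value in enumerate(tile_row):
                mosaic[row + r][col + c] = value
    return mosaic

t1 = [[1.0, 1.0], [1.0, 1.0]]  # hypothetical tile values
t2 = [[2.0, 2.0], [2.0, 2.0]]
print(stitch([((0, 0), t1), ((0, 2), t2)], 2, 4))
# [[1.0, 1.0, 2.0, 2.0], [1.0, 1.0, 2.0, 2.0]]
```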
In an embodiment, the method may further comprise: providing boundary data of the agricultural field and generating image collection path data for the at least one image collection device, wherein the image collection path data preferably comprises data with respect to path locations, position marks, flight heights, landing zones and/or image locations. An analysis algorithm may be used for optimizing a collection path, e.g. maximum coverage in minimal time with minimal number of breaks, landings etc. In this respect, the image collection device may comprise a communication interface configured to directly or indirectly send the collected images to a computer device, wherein the computer device may be configured to execute the image classification algorithm(s) and to generate the crop failure data. The term computer device is broadly understood and includes all appropriate means on which the image classification algorithm may be executed, for example, cloud computing solutions, a centralized or decentralized computer system, a computer center, etc. The images may be automatically transferred from the image collection device to the computer device, e.g. via an upload center or a cloud connectivity during collection using an appropriate wireless communication interface, e.g. a mobile interface, long range WLAN etc. Even if the collected images may be transferred via a wireless communication interface, it is also possible that the image collecting device comprises an on-site data transfer interface, e.g. a USB-interface, from which the collected images may be received via a manual transfer and which are then transferred to a respective computer device for further processing. 
After receipt of the images by the computer device(s), an automatic workflow may be triggered comprising the following steps: stitching the images and providing the image data of the agricultural field, running the image classification algorithm(s) and determining the crop failures, and providing the crop failure data and the control data based on the determined crop failures.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following, the present disclosure is further described with reference to the enclosed figures: Figure 1 illustrates an example embodiment of a centralized computing environment with computing nodes;
Figure 2 illustrates an example embodiment of a decentralized computing environment with computing nodes;
Figure 3 illustrates an example embodiment of a distributed computing environment;
Figure 4 illustrates a flow diagram of a computer-implemented method for providing control data for an agricultural device;
Figure 5 illustrates a system for providing control data for an agricultural device;
Figures 6 to 14 illustrate the steps for determining crop failures in the provided image data of the agricultural field;
Figure 15 illustrates an example of a crop failure map comprising polygon-shaped cells;
Figure 16 illustrates an enlarged section of the crop failure map of figure 15; and
Figure 17 illustrates exemplarily the different possibilities to receive and process field data.
DETAILED DESCRIPTION OF AN EMBODIMENT
The following embodiments are mere examples for implementing the method, the system, the apparatus, or application device disclosed herein and shall not be considered limiting.
Figures 1 to 3 illustrate different computing environments: central, decentral and distributed. The methods, apparatuses and computer elements of this disclosure may be implemented in decentral or at least partially decentral computing environments. In particular, for data sharing or exchange in ecosystems of multiple players, different challenges exist. Data sovereignty may be viewed as a core challenge. It can be defined as a natural person’s or corporate entity’s capability of being entirely self-determined with regard to its data. To enable this capability, related aspects, including requirements for secure and trusted data exchange in business ecosystems, may be implemented across the chemical value chain. In particular, the chemical industry requires tailored solutions to deliver chemical products in a more sustainable way by using digital ecosystems. Providing, determining or processing of data may be realized by different computing nodes, which may be implemented in a centralized, a decentralized or a distributed computing environment.
Figure 1 illustrates an example embodiment of a centralized computing system 20 comprising a central computing node 21 (filled circle in the middle) and several peripheral computing nodes 21.1 to 21.n (denoted as filled circles in the periphery). The term “computing system” is defined herein broadly as including one or more computing nodes, a system of nodes or combinations thereof. The term “computing node” is defined herein broadly and may refer to any device or system that includes at least one physical and tangible processor, and/or a physical and tangible memory capable of having thereon computer-executable instructions that are executed by a processor. Computing nodes are now increasingly taking a wide variety of forms. Computing nodes may, for example, be handheld devices, production facilities, sensors, monitoring systems, control systems, appliances, laptop computers, desktop computers, mainframes, data centers, or even devices that have not conventionally been considered a computing node, such as wearables (e.g., glasses, watches or the like). The memory may take any form and depends on the nature and form of the computing node.
In this example, the peripheral computing nodes 21.1 to 21.n may be connected to one central computing system (or server). In another example, the peripheral computing nodes 21.1 to 21.n may be attached to the central computing node via e.g. a terminal server (not shown). The majority of functions may be carried out by or obtained from the central computing node (also called remote centralized location). One peripheral computing node 21.n has been expanded to provide an overview of the components present in the peripheral computing node. The central computing node 21 may comprise the same components as described in relation to the peripheral computing node 21.n.
Each computing node 21, 21.1 to 21.n may include at least one hardware processor 22 and memory 24. The term “processor” may refer to an arbitrary logic circuitry configured to perform basic operations of a computer or system, and/or, generally, to a device which is configured for performing calculations or logic operations. In particular, the processor, or computer processor, may be configured for processing basic instructions that drive the computer or system. It may be a semiconductor-based processor, a quantum processor, or any other type of processor configured for processing instructions. As an example, the processor may comprise at least one arithmetic logic unit (“ALU”), at least one floating-point unit (“FPU”), such as a math coprocessor or a numeric coprocessor, a plurality of registers, specifically registers configured for supplying operands to the ALU and storing results of operations, and a memory, such as an L1 and L2 cache memory. In particular, the processor may be a multicore processor. Specifically, the processor may be or may comprise a Central Processing Unit (“CPU”). The processor may be a graphics processing unit (“GPU”), a tensor processing unit (“TPU”), a Complex Instruction Set Computing (“CISC”) microprocessor, a Reduced Instruction Set Computing (“RISC”) microprocessor, a Very Long Instruction Word (“VLIW”) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing means may also be one or more special-purpose processing devices such as an Application-Specific Integrated Circuit (“ASIC”), a Field Programmable Gate Array (“FPGA”), a Complex Programmable Logic Device (“CPLD”), a Digital Signal Processor (“DSP”), a network processor, or the like. The methods, systems and devices described herein may be implemented as software in a DSP, in a micro-controller, or in any other side-processor or as hardware circuit within an ASIC, CPLD, or FPGA.
It is to be understood that the term processor may also refer to one or more processing devices, such as a distributed system of processing devices located across multiple computer systems (e.g., cloud computing), and is not limited to a single device unless otherwise specified.
The memory 24 may refer to a physical system memory, which may be volatile, non-volatile, or a combination thereof. The memory may include non-volatile mass storage such as physical storage media. The memory may be a computer-readable storage medium such as RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, non-magnetic disk storage such as solid-state disk or any other physical and tangible storage medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by the computing system. Moreover, the memory may be a computer-readable medium that carries computer-executable instructions (also called transmission media). Further, upon reaching various computing system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computing system RAM and/or to less volatile storage media at a computing system. Thus, it should be understood that storage media can be included in computing components that also (or even primarily) utilize transmission media.
The computing nodes 21, 21.1 to 21.n may include multiple structures 26 often referred to as an “executable component, executable instructions, computer-executable instructions or instructions”. For instance, memory 24 of the computing nodes 21, 21.1 to 21.n may be illustrated as including executable component 26. The term “executable component” or any equivalent thereof may be the name for a structure that is well understood to one of ordinary skill in the art in the field of computing as being a structure that can be software, hardware, or a combination thereof, or which can be implemented in software, hardware, or a combination. For instance, when implemented in software, one of ordinary skill in the art would understand that the structure of an executable component includes software objects, routines, methods, and so forth, that are executed on the computing nodes 21, 21.1 to 21.n, whether such an executable component exists in the heap of a computing node 21, 21.1 to 21.n, or whether the executable component exists on computer-readable storage media. In such a case, one of ordinary skill in the art will recognize that the structure of the executable component exists on a computer-readable medium such that, when interpreted by one or more processors of a computing node 21, 21.1 to 21.n (e.g., by a processor thread), the computing node 21, 21.1 to 21.n is caused to perform a function. Such a structure may be computer-readable directly by the processors (as is the case if the executable component were binary). Alternatively, the structure may be structured to be interpretable and/or compiled (whether in a single stage or in multiple stages) so as to generate such binary that is directly interpretable by the processors. Such an understanding of example structures of an executable component is well within the understanding of one of ordinary skill in the art of computing when using the term “executable component”.
Examples of executable components implemented in hardware include hardcoded or hard-wired logic gates that are implemented exclusively or near-exclusively in hardware, such as within a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or any other specialized circuit. In this description, the terms “component”, “agent”, “manager”, “service”, “engine”, “module”, “virtual machine” or the like are used synonymously with the term “executable component”.
The processor 22 of each computing node 21, 21.1 to 21.n may direct the operation of each computing node 21, 21.1 to 21.n in response to having executed computer-executable instructions that constitute an executable component. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. The computer-executable instructions may be stored in the memory 24 of each computing node 21, 21.1 to 21.n. Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor 22, cause a general-purpose computing node 21, 21.1 to 21.n, special-purpose computing node 21, 21.1 to 21.n, or special-purpose processing device to perform a certain function or group of functions. Alternatively or in addition, the computer-executable instructions may configure the computing node 21, 21.1 to 21.n to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries or even instructions that undergo some translation (such as compilation) before direct execution by the processors, such as intermediate format instructions such as assembly language, or even source code.
Each computing node 21, 21.1 to 21.n may contain communication channels 28 that allow each computing node 21.1 to 21.n to communicate with the central computing node 21, for example, a network (depicted as solid line between peripheral computing nodes and the central computing node in Figure 1). A “network” may be defined as one or more data links that enable the transport of electronic data between computing nodes 21, 21.1 to 21.n and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computing node 21, 21.1 to 21.n, the computing node 21, 21.1 to 21.n properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general-purpose or special-purpose computing node 21, 21.1 to 21.n. Combinations of the above may also be included within the scope of computer-readable media.
The computing node(s) 21, 21.1 to 21.n may further comprise a user interface system 25 for use in interfacing with a user. The user interface system 25 may include output mechanisms 25A as well as input mechanisms 25B. The principles described herein are not limited to the precise output mechanisms 25A or input mechanisms 25B as such will depend on the nature of the device. However, output mechanisms 25A might include, for instance, displays, speakers, tactile output, holograms and so forth. Examples of input mechanisms 25B might include, for instance, microphones, touchscreens, holograms, cameras, keyboards, mouse or other pointer input, sensors of any type, and so forth.
Figure 2 illustrates an example embodiment of a decentralized computing environment 30 with several computing nodes 21.1 to 21.n denoted as filled circles. In contrast to the centralized computing environment 20 illustrated in Figure 1, the computing nodes 21.1 to 21.n of the decentralized computing environment are not connected to a central computing node 21 and are thus not under control of a central computing node. Instead, resources, both hardware and software, may be allocated to each individual computing node 21.1 to 21.n (local or remote computing system) and data may be distributed among various computing nodes 21.1 to 21.n to perform the tasks. Thus, in a decentral system environment, program modules may be located in both local and remote memory storage devices. One computing node 21 has been expanded to provide an overview of the components present in the computing node 21. In this example, the computing node 21 comprises the same components as described in relation to Figure 1. Figure 3 illustrates an example embodiment of a distributed computing environment 40. In this description, “distributed computing” may refer to any computing that utilizes multiple computing resources. Such use may be realized through virtualization of physical computing resources. One example of distributed computing is cloud computing. “Cloud computing” may refer to a model for enabling on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services). When distributed, cloud computing environments may be distributed internationally within an organization and/or across multiple organizations. In this example, the distributed cloud computing environment 40 may contain the following computing resources: mobile device(s) 42, applications 43, databases 44, data storage and server(s) 46. The cloud computing environment 40 may be deployed as public cloud 47, private cloud 48 or hybrid cloud 49.
A private cloud 48 may be owned by an organization and only the members of the organization with proper access can use the private cloud 48, rendering the data in the private cloud at least confidential. In contrast, data stored in a public cloud 47 may be open to anyone over the internet. The hybrid cloud 49 may be a combination of both private and public clouds 47, 48 and may allow keeping some of the data confidential while other data may be publicly available.
Figure 4 illustrates a flow diagram of a computer-implemented method for providing control data, a crop failure map and/or a replanting map. In a first step, image data of at least a part of the agricultural field are provided, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged. In a second step, an image classification model configured to identify crop failures in a crop line is provided. In a further step, crop failures in the at least one image of the at least one part of the agricultural field are determined utilizing the image classification algorithm. In a further step, control data for the agricultural device are provided, wherein the control data at least comprise position data of one or more of the determined crop failures. Alternatively or in addition to the control data, failure data, a failure map and/or a replanting map may be provided.
In other words, image data of at least a part of the agricultural field are provided, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged.
Furthermore, an image classification model is provided, configured to identify at least two identification patterns selected from the group of the patterns consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern. In addition, a relationship between the at least two identification patterns is provided. The relation expresses the relationship between the patterns for whose identification the image classification model is configured.
Furthermore, the method provides for identifying the at least two identification patterns with the image classification model and generating the crop line failure data based on the at least two identification patterns and based on the relationship and in this way determining crop failures and/or the crop line failure data.
The crop line failure data indicate crop failures in a crop line.
The method ends with providing the crop line failure data in a desired format, e.g. as control data, a crop failure map, a crop line failure map and/or as a replanting map.
Figure 5 illustrates a system 10 for providing control data, a crop failure map, a crop line failure map and/or a replanting map, comprising a providing unit 11 configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit 12 configured to provide an image classification model configured to identify crop failures in a crop line; a determining unit 13 configured to determine crop failures in the at least one image of the at least one part of the agricultural field utilizing the image classification algorithm; and a further providing unit 14 configured to provide control data for the agricultural device, wherein the control data at least comprise position data of one or more of the determined crop failures. Alternatively or in addition to the control data, failure data, a failure map and/or a replanting map may be provided.
The system 10 for providing crop line failure data comprises a providing unit 11 configured to provide image data of at least a part of the agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged.
The system 10 further comprises a further providing unit 12 configured to provide an image classification model configured to identify at least two identification patterns selected from the group of patterns consisting of: a crop line pattern, a crop failure pattern and/or a crop pattern; and to provide a relationship between the at least two identification patterns which the image classification model is configured to identify. The system 10 further comprises a determining unit 13 configured to identify the at least two identification patterns with the image classification model; and to generate the crop line failure data based on the at least two identification patterns and based on the relationship, wherein the crop line failure data indicate crop failures in a crop line.
The system 10 further comprises a further providing unit 14 configured to provide the crop line failure data.
Figures 6 to 14 illustrate an example for determining crop failures in the provided image data of the agricultural field. In this respect, an image classification model configured to identify crop failures in a crop line may be applied. For example, the image classification model may comprise at least one image classification algorithm configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following: a crop line, a crop failure and/or a crop. Notably, two of the mentioned categories (i.e. crop line, crop failure, crop) are sufficient to derive all necessary information, since the relation between these categories can be represented with any of the following equations:
crop line = crop failure ∪ crop
crop failure = crop line \ crop
crop = crop line \ crop failure
This means that a “crop line” is composed of the “crop failure” and the “crop”; the “crop failure” may be determined by removing the “crop” from the “crop line”; and the “crop” may be determined by removing the “crop failure” from the “crop line”. Thus, the steps for determining crop failures in the provided image data of the agricultural field shown in Figures 6 to 14 are only exemplary and different approaches may be utilized here.
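The set relations above can be sketched directly in code, using pixel coordinates as set elements (the example data are purely illustrative):

```python
# Sketch of the set relations between crop line, crop and crop failure,
# using pixel coordinates as set elements (hypothetical example data).
crop_line = {(0, 0), (0, 1), (0, 2), (0, 3), (0, 4)}  # full sown line
crop = {(0, 0), (0, 1), (0, 4)}                        # emerged plants
crop_failure = crop_line - crop                        # crop failure = crop line \ crop

assert crop_line == crop_failure | crop                # crop line = crop failure ∪ crop
assert crop == crop_line - crop_failure                # crop = crop line \ crop failure
print(sorted(crop_failure))  # [(0, 2), (0, 3)]
```

Any two of the three categories thus suffice; the third follows from simple set arithmetic.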
In order to determine a crop line, a crop failure and/or a crop, a corresponding pattern may be recognized.
Figure 6 shows a section of an agricultural field with several parallel crop lines 100 (only two of the crop lines are provided with a reference sign for better clarity), e.g. sugar cane lines. In this case, a grayscale image of an “Excess Green” index is shown to illustrate the difference between crop 110 in the lighter color and background/soil 120 in the darker parts. An image as shown in Figure 6 may be recorded by an optical sensor or camera aboard a UAV or a satellite. The image may comprise a collection of pixels having different grayscale values and/or having different colors. The image of Figure 6 does not comprise any detected objects or labels.
Crops 110 form continuous lines which can be interrupted by failures. The combination of line and failure may form a crop line failure.
In Figure 7, the crop lines 100 shown in Figure 6 are identified by the image classification model. The image of Figure 7 may be provided as output of the image classification model. In an example, the patterns recognized in Figure 7 may comprise recognized objects identified, labeled and/or annotated by a segmentation algorithm.
In Figure 8, crop line areas 130 are converted into skeletons 140, thereby representing crop lines 100 or crop line patterns 100 with a simple structure of a width of one pixel instead of the larger area. Such a structure may be handled by a computer, e.g. for measuring the size of the structure. A structure may be detected by applying a skeletonizing algorithm to the output of the classification model. The skeletonizing algorithm may substantially find a balancing point and/or the core area of the classified pattern, e.g. of crop line pattern 100 and in particular of the classified pattern 100 of Figure 7 or of the crop line area 130. Skeletonizing may help to generate a vector map.
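A heavily simplified sketch of the skeletonizing idea, reducing each column of a binary crop line mask to its centre pixel (a real skeletonizing algorithm, e.g. morphological thinning, is considerably more involved):

```python
# Sketch: a very simplified "skeleton" of a crop line mask, reducing each
# column of the binary mask to the centre pixel of the line's thickness.
def skeletonize_columns(mask):
    # mask: list of rows of 0/1; returns set of (row, col) skeleton pixels
    rows, cols = len(mask), len(mask[0])
    skel = set()
    for c in range(cols):
        ys = [r for r in range(rows) if mask[r][c]]
        if ys:
            skel.add((ys[len(ys) // 2], c))  # centre of the line's thickness
    return skel

mask = [[0, 1, 1],
        [1, 1, 1],
        [0, 1, 0]]
print(sorted(skeletonize_columns(mask)))  # [(1, 0), (1, 1), (1, 2)]
```

The one-pixel-wide result can then serve as input for polyline extraction, as described for Figure 9.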
In Figure 9, the skeleton 140 (not shown in Figure 9) is converted into polyline skeletons 145, thereby moving from a raster image format and/or a pixel-based image format into polylines, which are in vector format. A polyline is a chain of line segments forming a continuous line, which here represents the crop line. Usage of the vector format will aid in the simplification process described in more detail with reference to Figure 12. The vector format may be handled very well by a computer and may help to describe a digital twin of the crop line, the crop, the crop line failure and/or of the crop failure.
A polyline may be derived from a pixel skeleton and may allow for a good representation of crop lines. It may also support measuring the size of a corresponding object represented as a polyline. Polylines may be simplified by reducing the number of points and/or pixels forming the polyline. It may be possible to go over each point of the lines and, if a point matches and/or crosses a failure pattern, the corresponding part of the line may be replaced by a failure line. A failure line may replace a corresponding segment of a crop line and/or a row line. A crop line may be a part of the polyline matching crop patterns. A failure line and a crop line may be indicated by different colors, and replacing parts of the crop line and/or row line by a failure line may ensure that the size of the line segment allocated to a crop and a failure is measured correctly.
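Going over each polyline point and replacing parts that cross a failure pattern can be sketched as follows (the failure pattern is approximated as a set of pixels; all names are assumptions):

```python
# Sketch: walk over polyline points and split the crop line into "crop"
# and "failure" segments, depending on whether each point falls inside a
# failure pattern (here approximated by a set of failure pixels).
def split_polyline(points, failure_pixels):
    segments, current, kind = [], [], None
    for p in points:
        k = "failure" if p in failure_pixels else "crop"
        if k != kind and current:
            segments.append((kind, current))
            current = []
        current.append(p)
        kind = k
    if current:
        segments.append((kind, current))
    return segments

line = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]
failures = {(2, 0), (3, 0)}
print(split_polyline(line, failures))
# [('crop', [(0, 0), (1, 0)]), ('failure', [(2, 0), (3, 0)]), ('crop', [(4, 0)])]
```

Summing the lengths of each segment kind would then give the per-line crop and failure sizes.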
Figure 10 shows an example of crop failure areas 130 and/or of a crop failure pattern in a classified image. Failure areas 130 and/or gaps 130 are the interruptions of the crop lines 100. Failures may also be detected next to a crop line 100, but are not considered, as crop failures may substantially only appear on crop lines 100.
Figure 11 shows polyline segments 150 of the crop line which, in addition to being a crop line, are also identified as crop failures 160 by using the crop failure areas from Figure 10.
Figure 12 then shows simplified polylines which eliminate smaller inconsistencies inside the crop lines, making them straighter. This also makes downstream pixel-based computations more accurate, as crop lines are usually intended by farmers to be as close to perfect lines as possible.
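The simplification may, for example, follow a Ramer-Douglas-Peucker style scheme (a sketch under the assumption of a distance tolerance; the disclosure does not prescribe a specific algorithm):

```python
# Sketch: Ramer-Douglas-Peucker style simplification to straighten small
# inconsistencies in a crop line polyline (tolerance eps is an assumption).
def simplify(points, eps=1.0):
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]

    def dist(p):
        # perpendicular distance of point p to the chord (x1,y1)-(x2,y2)
        x0, y0 = p
        num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
        den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5
        return num / den if den else 0.0

    i, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                  key=lambda t: t[1])
    if dmax <= eps:
        return [points[0], points[-1]]
    # keep the farthest point and recurse on both halves
    return simplify(points[:i + 1], eps)[:-1] + simplify(points[i:], eps)

wiggly = [(0, 0), (1, 0.4), (2, -0.3), (3, 0.2), (4, 0)]
print(simplify(wiggly))  # [(0, 0), (4, 0)]
```

All wiggles lie within the tolerance here, so the line collapses to a single straight segment, matching the intent of Figure 12.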
Figure 13 shows the notation of crop lines in Figures 9 to 12 as dashes perpendicular to the line. This notation may be used to mark recognized crop lines 100. Figure 14 shows a crop failure notation in Figures 11 and 12 as dots next to the line. Further, in this case, by having crop line and crop failure, there exists enough information to compute the crop as per the equations described earlier and/or by applying a rule of set theory. Therefore, by using the equation “crop = crop line \ crop failure”, it can be concluded that the crop is found in Figure 12 by searching for an area which has a crop line notation but does not have a crop failure notation.
The notation of Figure 14 may be used to mark recognized crop failure patterns.
Figure 15 illustrates an example of a crop failure map, wherein the crop failure map is provided with polygon-shaped cells, wherein for each cell a crop failure value may be determined.
The use of polygon-shaped cells and vectorized symbols for detected patterns may allow providing a size for a detected pattern. In this way, a measurement of the size of a crop failure in relation to a crop line length may be provided. Based on such a measurement, a replanting decision may be made.
Figure 16 shows an enlarged section of the crop failure map of Figure 15.
Figure 17 illustrates exemplarily the different possibilities to receive and process field data (e.g. image data, control data, etc.). For example, field data can be obtained by all kinds of agricultural equipment 300 (e.g. a tractor 300) in the form of so-called as-applied maps by recording the application rate at the time of application. It is also possible that such agricultural equipment comprises sensors (e.g. optical sensors, cameras, infrared sensors, soil sensors, etc.) to provide, for example, a weed distribution map. It is also possible that during harvesting the yield (e.g. in the form of biomass) is recorded by a harvesting vehicle 310. Furthermore, corresponding maps/data can be provided by land-based and/or airborne drones 320 by taking images of the field or a part of it. Finally, it is also possible that a geo-referenced visual assessment 330 is performed and that this field data is also processed. Field data collected in this way can then be merged in a computing device 340, to which the data can be transmitted, for example, via any wireless link, cloud applications 350 and/or working platforms 360, wherein the field data may also be processed in whole or in part in the cloud application 350 and/or in the working platform 360 (e.g., by cloud computing).
Aspects of the present disclosure relate to computer program elements configured to carry out steps of the methods described above. The computer program element might therefore be stored on a computing unit of a computing device, which might also be part of an embodiment. This computing unit may be configured to perform or induce performance of the steps of the method described above. Moreover, it may be configured to operate the components of the above-described system. The computing unit can be configured to operate automatically and/or to execute the orders of a user. The computing unit may include a data processor. A computer program may be loaded into a working memory of a data processor. The data processor may thus be equipped to carry out the method according to one of the preceding embodiments. This exemplary embodiment of the present disclosure covers both a computer program that uses the present disclosure right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the present disclosure. Moreover, the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above. According to a further exemplary embodiment of the present disclosure, a computer readable medium, such as a CD-ROM, USB stick, a downloadable executable or the like, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described by the preceding section. A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network. According to a further exemplary embodiment of the present disclosure, a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the present disclosure.
The present disclosure has been described in conjunction with preferred embodiments as examples. However, other variations can be understood and effected by persons skilled in the art in practicing the claimed invention, from a study of the drawings, this disclosure and the claims. In particular, the steps presented can be performed in any order, i.e. the present invention is not limited to a specific order of these steps. Moreover, it is also not required that the different steps be performed at a certain place or at one node of a distributed system, i.e. each of the steps may be performed at different nodes using different equipment/data processing units.
In the claims as well as in the description, the word "comprising" does not exclude other elements or steps and the indefinite article "a" or "an" does not exclude a plurality. A single element or other unit may fulfill the functions of several entities or items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used in an advantageous implementation.

Claims

1. Computer-implemented method for providing crop line failure data, comprising:
providing image data of at least a part of an agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged;
providing an image classification model configured to identify at least two identification patterns selected from the group consisting of: a crop line pattern (100), a crop failure pattern (160) and/or a crop pattern (110);
providing a relationship between the at least two identification patterns which the image classification model is configured to identify;
identifying the at least two identification patterns with the image classification model;
generating the crop line failure data based on the at least two identification patterns and based on the relationship, wherein the crop line failure data indicate crop failures (160) in a crop line (100); and
providing the crop line failure data.
2. Computer-implemented method according to claim 1, wherein the crop line failure data are provided in the form of: control data for an agricultural device (300), wherein the control data comprise position data of one or more of the crop failures (160) in the crop line (100); and/or a crop line failure map indicating the crop failures (160) in the agricultural field; and/or a replanting map indicating where in the agricultural field a crop failure (160) is above a predetermined crop failure threshold.
3. Computer-implemented method according to claim 1 or 2, wherein the image classification model comprises a single model configured to identify each of the at least two identification patterns independently from another; and/or wherein the image classification model comprises separate sub-models for each of the at least two identification patterns.
4. Computer-implemented method according to any one of the preceding claims, wherein the crop failure pattern (160) is located at a position in a crop line (100) where a sown crop has not emerged and/or is located at a position in a crop line (100) where a crop (110) is smaller than a predetermined size.
5. Computer-implemented method according to any one of the preceding claims, wherein the method further comprises: determining whether a crop failure is above a predetermined crop failure threshold; and wherein the control data for the agricultural device (300) are only provided in case a crop failure is above the predetermined crop failure threshold.
6. Computer-implemented method according to any one of the preceding claims, wherein the crop line failure data, preferably the crop line failure map, comprises at least one of crop line areas (100), crop areas (110) and/or crop failure areas (160).
7. Computer-implemented method according to any one of the preceding claims, wherein the crop line failure map comprises crop lines (100), crop failures (160) and/or crop (110).
8. Computer-implemented method according to any one of the preceding claims, wherein the crop line failure map comprises cells, preferably polygon-shaped cells, wherein for each cell, a crop failure value is determined.
9. Computer-implemented method according to any one of the preceding claims, wherein the method further comprises: providing a weed classification algorithm configured to identify weeds; determining weeds in the at least one image of the at least one part of the agricultural field utilizing the weed classification algorithm; and providing a replanting map indicating where in the agricultural field a crop failure is above a predetermined crop failure threshold and the weed infestation is below a predetermined weed infestation threshold.
10. Computer-implemented method according to any one of the preceding claims, wherein the image of the at least one part of the agricultural field is provided at a time when the crops are in the tillering stage.
11. System (10) for providing crop line failure data, comprising: a providing unit (11) configured to provide image data of at least a part of an agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged; a further providing unit (12) configured to provide an image classification model configured to identify at least two identification patterns selected from the group consisting of: a crop line pattern (100), a crop failure pattern (160) and/or a crop pattern (110), and to provide a relationship between the at least two identification patterns which the image classification model is configured to identify; a determining unit (13) configured to identify the at least two identification patterns with the image classification model and to generate the crop line failure data based on the at least two identification patterns and based on the relationship, wherein the crop line failure data indicate crop failures (160) in a crop line (100); and a further providing unit (14) configured to provide the crop line failure data.
12. An apparatus for providing crop line failure data, the apparatus comprising: one or more computing nodes; and one or more computer-readable media having thereon computer-executable instructions that are structured such that, when executed by the one or more computing nodes, they cause the apparatus to perform the computer-implemented method of any one of claims 1 to 10.
13. Agricultural device (300) and/or planting device for planting a crop on an agricultural field, wherein control data for the planting device are at least partially provided according to any one of claims 1 to 10.
14. Computer program element with instructions which, when executed on computing devices of a computing environment, is configured to carry out the steps of the computer-implemented method according to any one of claims 1 to 10 in a system according to claim 11 and/or in an apparatus according to claim 12.
15. Use of image data of at least a part of an agricultural field, wherein the image data comprise at least one image of the at least one part of the agricultural field at a time when sown crops have emerged, and/or use of an image classification model configured to identify in the at least one image of the at least one part of the agricultural field at least two of the following patterns: a crop line pattern, a crop failure pattern and/or a crop pattern, in a computer-implemented method according to any one of claims 1 to 10, in a system according to claim 11 and/or in an apparatus according to claim 12; and/or use of image data of at least a part of an agricultural field as input data for a weed classification algorithm.
PCT/EP2023/087533 2022-12-23 2023-12-22 Method for providing control data, a crop failure map and/or a replanting map WO2024133848A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP22216560 2022-12-23
EP22216560.7 2022-12-23

Publications (1)

Publication Number Publication Date
WO2024133848A1 true WO2024133848A1 (en) 2024-06-27

Family

ID=84602665

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/087533 WO2024133848A1 (en) 2022-12-23 2023-12-22 Method for providing control data, a crop failure map and/or a replanting map

Country Status (1)

Country Link
WO (1) WO2024133848A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109240304A (en) * 2018-10-15 2019-01-18 南京林业大学 A kind of precision planting system and method
EP3503025A1 (en) * 2017-12-19 2019-06-26 Accenture Global Solutions Limited Utilizing artificial intelligence with captured images to detect agricultural failure
US20210204467A1 (en) * 2020-01-06 2021-07-08 Deere & Company Replant routing and control of a seed planting machine
US20220245381A1 (en) * 2021-01-29 2022-08-04 Iunu, Inc. Pest infestation detection for horticultural grow operations
DE102021114996A1 (en) * 2021-06-10 2022-12-15 Eto Magnetic Gmbh Device for detecting sprouting of seeds, agricultural sensor device and agricultural monitoring and/or agricultural control method and system
US20220405863A1 (en) * 2019-11-18 2022-12-22 Sony Group Corporation Information processing device, information processing method, and program



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23837365

Country of ref document: EP

Kind code of ref document: A1