WO2020157878A1 - Computer system, crop growth support method, and program - Google Patents

Computer system, crop growth support method, and program

Info

Publication number
WO2020157878A1
WO2020157878A1 (PCT/JP2019/003256)
Authority
WO
WIPO (PCT)
Prior art keywords
position information
point
plant
herbicide
image
Prior art date
Application number
PCT/JP2019/003256
Other languages
French (fr)
Japanese (ja)
Inventor
俊二 菅谷 (Shunji Sugaya)
Original Assignee
株式会社オプティム (Optim Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社オプティム (Optim Corporation)
Priority to JP2020569247A (granted as JP7068747B2)
Priority to PCT/JP2019/003256
Publication of WO2020157878A1

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G: HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G 7/00: Botany in general
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M: CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 7/00: Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass

Definitions

  • The present invention relates to a computer system, a herbicide spraying support method, and a program for supporting the spraying of a herbicide.
  • Conventionally, weeds that have grown in a field have been removed by drones.
  • As one weeding method, there is a configuration in which the position of a weed growing in a field is specified and a drone is moved to that position to perform weeding.
  • In another configuration, a drone travels along a ridge provided in the field and captures images of the ridge. Image analysis of the captured images determines whether anything other than crops (for example, weeds) appears in them; if so, weeding is performed using a weeding claw provided on the drone (see Patent Document 1).
  • However, Patent Document 1 targets only plants growing on the ridges for weeding, and is therefore not suitable for weeding plants located elsewhere.
  • An object of the present invention is to provide a computer system, a herbicide spraying support method, and a program with which a herbicide is sprayed at an arbitrary point and weeds can be easily removed.
  • To achieve this, the present invention provides the following solutions.
  • The present invention provides a computer system that supports the spraying of a herbicide, comprising: first acquisition means for acquiring position information of a seeding point of a crop; storage means for storing the position information of the seeding point; second acquisition means for acquiring a captured image of the field and position information of the capturing point; detection means for analyzing the captured image and detecting a plant; specifying means for specifying position information of the plant based on the position information of the capturing point; and spraying means for spraying a herbicide on plants other than at the seeding point, based on the stored position information of the seeding point and the specified position information of the plant.
  • According to the present invention, the computer system that supports the spraying of the herbicide acquires the position information of the seeding point of the crop, stores it, acquires a captured image of the field and the position information of the capturing point, analyzes the captured image to detect a plant, specifies the position information of the plant based on the position information of the capturing point, and sprays the herbicide on plants other than at the seeding point based on the stored position information of the seeding point and the specified position information of the plant.
  • Although the present invention falls in the system category, other categories such as a method and a program exhibit the same actions and effects corresponding to their category.
  • The present invention also provides a computer system that supports the spraying of a herbicide, comprising:
  • first acquisition means for acquiring a first captured image of the field and position information of the capturing point; first detection means for analyzing the first captured image and detecting sprouts of the crop; first specifying means for specifying position information of the sprout point where a sprout is present, based on the position information of the capturing point; and storage means for storing the specified position information of the sprout point;
  • second acquisition means for acquiring a second captured image of the field, captured at a time different from the first captured image, and position information of the capturing point;
  • second detection means for analyzing the second captured image and detecting a plant; second specifying means for specifying position information of the plant based on the position information of the capturing point; and spraying means for spraying a herbicide on plants other than at the sprout point, based on the stored position information of the sprout point and the specified position information of the plant.
  • According to the present invention, the computer system that supports the spraying of the herbicide acquires the first captured image of the field and the position information of the capturing point, analyzes the first captured image to detect sprouts of the crop, specifies the position information of the sprout point where a sprout is present based on the position information of the capturing point, and stores the specified position information of the sprout point.
  • The system then acquires a second captured image of the field, captured at a time different from the first captured image, together with the position information of its capturing point, analyzes the second captured image to detect a plant, and specifies the position information of the plant based on the position information of the capturing point.
  • Finally, the herbicide is sprayed on plants other than at the sprout point, based on the stored position information of the sprout point and the specified position information of the plant.
  • Although the present invention falls in the system category, other categories such as a method and a program exhibit the same actions and effects corresponding to their category.
  • According to the present invention, it is possible to provide a computer system, a herbicide spraying support method, and a program with which a herbicide is sprayed at an arbitrary point and weeds are easily removed.
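The core decision in the solution above, spraying only on plants that do not coincide with a stored seeding point, can be sketched as follows. This is an illustrative sketch and not part of the patent: the function name, coordinate representation, and matching tolerance are all assumptions.

```python
from math import hypot

TOLERANCE_M = 0.10  # assumed GPS matching tolerance in metres (not from the patent)

def should_spray(plant_pos, seeding_points, tolerance=TOLERANCE_M):
    """Return True if the plant is not at any seeding point (i.e. treated as a weed)."""
    px, py = plant_pos
    for sx, sy in seeding_points:
        if hypot(px - sx, py - sy) <= tolerance:
            return False  # plant coincides with a seeding point: the sown crop
    return True  # no seeding point matches: treated as a weed, spray herbicide

seeding_points = [(0.0, 0.0), (0.5, 0.0)]
print(should_spray((0.02, 0.01), seeding_points))  # crop -> False
print(should_spray((0.25, 0.10), seeding_points))  # weed -> True
```

The same matching rule serves the second aspect of the invention if sprout points are substituted for seeding points.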
  • FIG. 1 is a diagram showing an outline of a herbicide spraying support system 1.
  • FIG. 2 is an overall configuration diagram of the herbicide spraying support system 1.
  • FIG. 3 is a diagram showing a flowchart of the seeding point storage processing executed by the computer 10.
  • FIG. 4 is a diagram showing a flowchart of the first herbicide spraying support process executed by the computer 10.
  • FIG. 5 is a diagram showing a flowchart of the first learning process executed by the computer 10.
  • FIG. 6 is a diagram showing a flowchart of the sprout point storage process executed by the computer 10.
  • FIG. 7 is a diagram showing a flowchart of the second herbicide application support process executed by the computer 10.
  • FIG. 8 is a diagram showing a flowchart of the second learning process executed by the computer 10.
  • FIG. 1 is a diagram for explaining the outline of a herbicide spraying support system 1 which is a preferred embodiment of the present invention.
  • the herbicide spraying support system 1 is a computer system including a computer 10 and supporting spraying of a herbicide.
  • The herbicide spraying support system 1 may also include a drone, agricultural machinery, high-performance agricultural machinery, a worker terminal owned by a worker who cultivates the crop (for example, a smartphone, a tablet terminal, or a personal computer), and other terminals and devices such as other computers. Further, the herbicide spraying support system 1 may be realized by a single computer such as the computer 10, or by a plurality of computers, such as cloud computers.
  • The computer 10 is connected to the drone, agricultural machinery, high-performance agricultural machinery, worker terminals, other computers, and the like so as to be able to perform data communication via a public network or the like, and executes the necessary data transmission and reception.
  • First, the computer 10 acquires the position information of the seeding point of the crop when the crop is sown.
  • The computer 10 acquires the position information of the seeding point from, for example, the agricultural machinery or high-performance agricultural machine that performed the seeding, or from the worker terminal possessed by the worker who performed the seeding.
  • The computer 10 stores the position information of this seeding point.
  • The computer 10 then acquires a captured image of the field and the position information of the capturing point.
  • The computer 10 acquires, for example, captured images obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the capturing point at which the drone took the captured image.
  • The computer 10 analyzes the captured image and detects the plants shown in it.
  • As the image analysis, the computer 10 extracts, for example, feature points (for example, shape, contour, and hue) and feature amounts (for example, statistical values such as the average, variance, and histogram of pixel values) of the captured image.
  • The computer 10 detects the plants shown in the captured image based on the extracted feature points and feature amounts.
  • The plants here include both the sown crop and weeds to be removed.
  • The computer 10 specifies the position information of the detected plant based on the acquired position information of the capturing point.
  • The computer 10 specifies the position information of the plant by regarding the position information of the detected plant as matching the position information of the capturing point. As a result, the computer 10 specifies the position of the detected plant in the field.
  • The computer 10 sprays the herbicide on plants existing at positions other than the seeding point, based on the stored position information of the seeding point and the specified position information of the plant.
  • If the position of a plant matches a seeding point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If it does not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • The computer 10 moves the drone to the position of this weed based on the position information of the plant determined to be a weed, and transmits to the drone a command to spray the herbicide using the drone's herbicide spraying device.
  • The drone receives this command and, based on it, sprays the herbicide on the plants existing at positions other than the seeding point.
  • In this way, the computer 10 sprays the herbicide on plants other than at the seeding point.
  • The computer 10 can also learn from the detected plant images.
  • In that case, the computer 10 detects plants by taking this learning result into consideration when performing image analysis on the acquired captured image.
  • In the second aspect, the computer 10 acquires a first captured image of the field and the position information of the capturing point.
  • The computer 10 acquires, for example, first captured images obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the capturing point at which the drone took the first captured image.
  • The computer 10 analyzes the first captured image and detects the sprouts of the crop shown in it.
  • As the image analysis, the computer 10 extracts, for example, feature points and feature amounts of the first captured image.
  • The computer 10 detects the sprouts of the crop shown in the first captured image based on the extracted feature points and feature amounts.
  • The computer 10 specifies the position information of the detected sprouts of the crop based on the acquired position information of the capturing point.
  • The computer 10 specifies the position of a sprout of the crop as the position information of the sprout point by regarding the position information of the detected sprout as matching the position information of the capturing point. As a result, the computer 10 identifies the position of the detected sprout of the crop in the field as the sprout point.
  • The computer 10 stores the position information of this sprout point.
  • The computer 10 acquires a second captured image of the field, captured at a time different from the first captured image, and the position information of the capturing point.
  • The computer 10 acquires, for example, second captured images obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the capturing point at which the drone took the second captured image.
  • The computer 10 analyzes the second captured image and detects the plants shown in it.
  • As the image analysis, the computer 10 extracts, for example, feature points and feature amounts of the second captured image.
  • The computer 10 detects the plants in the second captured image based on the extracted feature points and feature amounts.
  • The plants in this case include both crops grown from the sprouts and weeds to be removed.
  • The computer 10 specifies the position information of the detected plant based on the acquired position information of the capturing point.
  • The computer 10 specifies the position information of the plant by regarding the position information of the detected plant as matching the position information of the capturing point. As a result, the computer 10 specifies the position of the detected plant in the field.
  • The computer 10 sprays the herbicide on plants existing at positions other than the sprout point, based on the stored position information of the sprout point and the specified position information of the plant.
  • If the position of a plant matches a sprout point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If it does not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • The computer 10 moves the drone to the position of this weed based on the position information of the plant determined to be a weed, and transmits to the drone a command to spray the herbicide using the drone's herbicide spraying device.
  • The drone receives this command and, based on it, sprays the herbicide on the plants existing at positions other than the sprout point.
  • In this way, the computer 10 sprays the herbicide on plants other than at the sprout point.
  • The computer 10 can also learn from the detected plant images.
  • In that case, the computer 10 detects plants by taking this learning result into consideration when performing image analysis on the acquired second captured image.
  • First, the computer 10 acquires the position information of the seeding point of the crop when the crop is sown (step S01).
  • The computer 10 acquires the position information of the seeding point from the agricultural machinery or high-performance agricultural machine that performed the seeding, or from the worker terminal possessed by the worker who performed the seeding.
  • The computer 10 stores the position information of this seeding point (step S02).
  • The computer 10 acquires a captured image of the field and the position information of the capturing point as imaging data (step S03).
  • The computer 10 acquires, for example, captured images obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the capturing point at which the drone took the captured image.
  • The computer 10 acquires these captured images and the position information of the capturing points as imaging data.
  • The computer 10 analyzes the captured image to detect the plants shown in it (step S04).
  • As the image analysis, the computer 10 extracts, for example, feature points and feature amounts of the captured image.
  • The computer 10 detects the plants shown in the captured image based on the extracted feature points and feature amounts.
  • The plants here include both the sown crop and weeds to be removed.
  • The computer 10 specifies the position information of the detected plant based on the acquired position information of the capturing point (step S05).
  • The computer 10 specifies the position information of the plant by regarding the position information of the detected plant as matching the position information of the capturing point. As a result, the computer 10 specifies the position of the detected plant in the field.
  • The computer 10 sprays the herbicide on plants existing at positions other than the seeding point, based on the stored position information of the seeding point and the specified position information of the plant (step S06).
  • If the position of a plant matches a seeding point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If it does not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • The computer 10 moves the drone to the position of this weed based on the position information of the plant determined to be a weed, and transmits to the drone a command to spray the herbicide using the drone's herbicide spraying device.
  • The drone receives this command and, based on it, sprays the herbicide on the plants existing at positions other than the seeding point.
  • In this way, the computer 10 sprays the herbicide on plants other than at the seeding point.
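The flow of steps S01 through S06 can be summarized as a short pipeline. All names here are hypothetical stand-ins for the modules described in the specification; plant detection and spraying are passed in as callables so the sketch stays self-contained.

```python
def herbicide_support(seeding_points, shots, detect_plants, spray):
    """shots: iterable of (image, capture_position) pairs acquired in step S03."""
    for image, shot_pos in shots:
        for dx, dy in detect_plants(image):                   # step S04: image analysis
            plant_pos = (shot_pos[0] + dx, shot_pos[1] + dy)  # step S05: position of plant
            if plant_pos not in seeding_points:               # step S06: not a seeding point
                spray(plant_pos)                              # -> treated as weed, spray
```

For example, with a detector that reports offsets `(0, 0)` and `(2, 3)` from a capture position `(1, 1)` and the seeding point set `{(1, 1)}`, only the plant at `(3, 4)` is sprayed.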
  • First, the computer 10 acquires a first captured image of the field and the position information of the capturing point as first imaging data (step S10).
  • The computer 10 acquires, for example, first captured images obtained by photographing each point in the field with a drone.
  • The computer 10 acquires from the drone, for example, the position information of the capturing point at which the drone took the first captured image.
  • The computer 10 analyzes the first captured image and detects the sprouts of the crop shown in it (step S11).
  • As the image analysis, the computer 10 extracts, for example, feature points and feature amounts of the first captured image.
  • The computer 10 detects the sprouts of the crop shown in the first captured image based on the extracted feature points and feature amounts.
  • The computer 10 specifies the position information of the detected sprouts of the crop based on the acquired position information of the capturing point (step S12).
  • The computer 10 specifies the position of a sprout of the crop as the position information of the sprout point by regarding the position information of the detected sprout as matching the position information of the capturing point. As a result, the computer 10 identifies the position of the detected sprout of the crop in the field as the sprout point.
  • The computer 10 stores the position information of this sprout point (step S13).
  • The computer 10 acquires a second captured image of the field, captured at a time different from the first captured image, and the position information of the capturing point as second imaging data (step S14).
  • The computer 10 acquires, for example, second captured images obtained by photographing each point in the field with a drone at a time different from the first captured image.
  • The computer 10 acquires from the drone, for example, the position information of the capturing point at which the drone took the second captured image.
  • The computer 10 analyzes the second captured image and detects the plants shown in it (step S15).
  • As the image analysis, the computer 10 extracts, for example, feature points and feature amounts of the second captured image.
  • The computer 10 detects the plants in the second captured image based on the extracted feature points and feature amounts.
  • The plants in this case include both crops grown from the sprouts and weeds to be removed.
  • The computer 10 specifies the position information of the detected plant based on the acquired position information of the capturing point (step S16).
  • The computer 10 specifies the position information of the plant by regarding the position information of the detected plant as matching the position information of the capturing point. As a result, the computer 10 specifies the position of the detected plant in the field.
  • The computer 10 sprays the herbicide on plants existing at positions other than the sprout point, based on the stored position information of the sprout point and the specified position information of the plant (step S17).
  • If the position of a plant matches a sprout point, the computer 10 determines that this plant is a crop and does not spray the herbicide.
  • If it does not match, the computer 10 determines that this plant is a weed and sprays the herbicide.
  • The computer 10 moves the drone to the position of this weed based on the position information of the plant determined to be a weed, and transmits to the drone a command to spray the herbicide using the drone's herbicide spraying device.
  • The drone receives this command and, based on it, sprays the herbicide on the plants existing at positions other than the sprout point.
  • In this way, the computer 10 sprays the herbicide on plants other than at the sprout point.
  • FIG. 2 is a diagram showing a system configuration of a herbicide spraying support system 1 which is a preferred embodiment of the present invention.
  • the herbicide spraying support system 1 is a computer system which is composed of a computer 10 and supports spraying of the herbicide.
  • the computer 10 is connected to a drone, an agricultural machine tool, a high-performance agricultural machine, a worker terminal, and other computers so as to be able to perform data communication via a public line network, etc., and executes necessary data transmission/reception.
  • The herbicide spraying support system 1 may include a drone, agricultural machinery, high-performance agricultural machinery, worker terminals, other computers, and other terminals and devices, which are not shown. Further, the herbicide spraying support system 1 may be realized by a single computer such as the computer 10, or by a plurality of computers, such as cloud computers.
  • The computer 10 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and, as a communication unit, a device for enabling communication with other terminals or devices, such as an IEEE 802.11-compliant Wi-Fi (Wireless Fidelity) device.
  • The computer 10 also includes, as a storage unit, a data storage device such as a hard disk, semiconductor memory, recording medium, or memory card. Further, the computer 10 includes, as a processing unit, various devices that execute the various processes.
  • In the computer 10, the control unit reads a predetermined program to realize, in cooperation with the communication unit, the seeding position acquisition module 20, the imaging data acquisition module 21, and the command transmission module 22. Further, in the computer 10, the control unit reads a predetermined program to realize, in cooperation with the storage unit, the storage module 30. Further, in the computer 10, the control unit reads a predetermined program to realize, in cooperation with the processing unit, the image analysis module 40, the position identification module 41, the crop identification module 42, the command creation module 43, and the learning module 44.
  • FIG. 3 is a diagram showing a flowchart of the seeding point storage processing executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • First, the seeding position acquisition module 20 acquires the position information of the seeding point of the crop (step S20).
  • The seeding position acquisition module 20 acquires the position information of the seeding point from the machinery or device that performed the seeding, or from the worker terminal possessed by the worker who performed the seeding.
  • For example, the worker terminal acquires its own position information from GPS (Global Positioning System) or the like at the seeded position.
  • The worker terminal transmits its acquired position information to the computer 10 as the position information of the seeding point.
  • The seeding position acquisition module 20 acquires the position information of the seeding point of the crop by receiving the position information of the seeding point transmitted by the worker terminal.
  • The storage module 30 stores the position information of the seeding point of this crop (step S21).
  • The storage module 30 may store only the position information of the seeding point of this crop, or may store it in association with the worker, the user, the identifier of the field, and the like.
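A minimal sketch of the storage described in step S21, assuming a simple in-memory record. The field names (worker, user, field) mirror the associations the text says may be stored, but the record layout itself is an assumption for illustration.

```python
# Hypothetical in-memory store for seeding points (step S21).
seeding_store = []

def store_seeding_point(position, worker=None, user=None, field=None):
    """Store a seeding point, optionally associated with worker/user/field identifiers."""
    seeding_store.append({
        "position": position,  # (latitude, longitude) reported by GPS
        "worker": worker,
        "user": user,
        "field": field,
    })

# A seeding point reported by a worker terminal at the seeded position.
store_seeding_point((35.68, 139.76), worker="w-01", field="field-A")
```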
  • FIG. 4 is a diagram showing a flowchart of the first herbicide spraying support process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • First, the imaging data acquisition module 21 acquires a captured image of the field and the position information of the capturing point as imaging data (step S30).
  • The imaging data acquisition module 21 acquires, as imaging data, for example, captured images taken by the drone at a plurality of preset capturing points in the field and the position information of the capturing points at which the drone took those images.
  • The drone photographs the area directly below itself as the captured image. That is, the drone takes the captured image from a position vertically above the field.
  • The drone takes a captured image at the capturing point and acquires its own position information from GPS or the like.
  • The drone treats its acquired position information as the position information of the capturing point.
  • The drone transmits the captured image and the position information of the capturing point to the computer 10 as imaging data.
  • The imaging data acquisition module 21 acquires the imaging data by receiving the imaging data transmitted by the drone.
  • In this way, the computer 10 acquires the captured image of the field and the position information of the capturing point.
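The imaging data exchanged in step S30 might be modeled as a small record pairing the captured image with the capturing point's GPS fix. The class and function names are illustrative assumptions, not part of the specification.

```python
from dataclasses import dataclass

@dataclass
class ImagingData:
    """Hypothetical shape of one imaging-data record sent by the drone."""
    image: bytes     # captured image, taken vertically downward over the field
    position: tuple  # (latitude, longitude) of the capturing point from GPS

def receive_imaging_data(image, gps_position):
    """Mimics the imaging data acquisition module receiving one record."""
    return ImagingData(image=image, position=gps_position)
```

Each record then feeds the image analysis of step S31 together with its capturing-point position.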
  • The image analysis module 40 analyzes the captured image based on the imaging data (step S31). In step S31, the image analysis module 40 extracts feature points and feature amounts in this captured image. As the image analysis, the image analysis module 40 extracts, for example, the shape, hue, and the like of an object present in the captured image.
  • The image analysis module 40 determines whether or not a plant can be detected based on the result of the image analysis (step S32).
  • In step S32, the image analysis module 40 compares the extracted feature points and feature amounts with a plant database in which feature points and feature amounts of plants and the identifiers of plants are registered in advance, and determines whether a plant is present in the captured image.
  • The image analysis module 40 determines that a plant has been detected when the feature points or feature amounts extracted this time match feature points or feature amounts registered in the plant database, and determines that a plant could not be detected when they do not match.
  • The image analysis module 40 also identifies the identifier of the plant based on the feature points and feature amounts of the plant and the plant database.
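The plant-database comparison of step S32 can be sketched as a lookup over pre-registered feature entries. The feature representation (shape and hue labels) and the exact-match rule are simplifying assumptions; a real implementation would compare numeric feature amounts with some tolerance.

```python
# Hypothetical plant database: plant identifier -> registered features.
PLANT_DB = {
    "rice": {"shape": "linear", "hue": "green"},
    "barnyard_grass": {"shape": "linear", "hue": "light-green"},
}

def identify_plant(features, db=PLANT_DB):
    """Return the identifier of the matching plant, or None if no entry matches."""
    for plant_id, registered in db.items():
        if registered == features:
            return plant_id  # plant detected (step S32: YES)
    return None              # plant could not be detected (step S32: NO)
```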
  • In step S32, when the image analysis module 40 determines that a plant could not be detected as a result of the image analysis (step S32: NO), the computer 10 ends this processing.
  • Alternatively, the computer 10 may be configured to execute the above-described process of step S30 again and acquire imaging data different from the imaging data subjected to image analysis this time.
  • In step S32, when the image analysis module 40 determines that a plant could be detected as a result of the image analysis (step S32: YES), the position specifying module 41 specifies the position information of the detected plant based on the position information in the imaging data (step S33).
  • In step S33, the position specifying module 41 specifies, based on the imaging data, the position information of the capturing point corresponding to the captured image subjected to image analysis this time.
  • The position specifying module 41 specifies the position information of this capturing point as the position information of this plant.
  • The position specifying module 41 then specifies the detailed position of this plant based on the position information of this plant and coordinates in the captured image.
  • The position specifying module 41 sets an orthogonal coordinate system on the captured image and specifies the position at which this plant was detected as X and Y coordinates in this captured image.
  • The position specifying module 41 specifies the position information of the plant in the actual field based on the position information of the capturing point and these X and Y coordinates.
  • For example, the position specifying module 41 regards the center position of the captured image as the position information of the capturing point, and specifies the X and Y coordinates as positions relative to this center position.
  • In this way, the position specifying module 41 specifies the position of this plant in the field.
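The conversion in step S33, from the capturing point's position (taken as the image center) plus X and Y image coordinates to a position in the actual field, can be sketched as follows. The ground resolution (metres per pixel) is an assumed parameter not given in the specification.

```python
def plant_field_position(center_pos, pixel_xy, image_size, metres_per_pixel=0.01):
    """Map a plant's pixel coordinates to a field position.

    center_pos: (x, y) field position of the capturing point (image center).
    pixel_xy:   (x, y) pixel coordinates at which the plant was detected.
    image_size: (width, height) of the captured image in pixels.
    """
    cx, cy = image_size[0] / 2, image_size[1] / 2
    dx = (pixel_xy[0] - cx) * metres_per_pixel  # offset east/west of the center
    dy = (pixel_xy[1] - cy) * metres_per_pixel  # offset north/south of the center
    return (center_pos[0] + dx, center_pos[1] + dy)

# A plant at the exact image center maps onto the capturing point itself.
print(plant_field_position((10.0, 20.0), (320, 240), (640, 480)))  # (10.0, 20.0)
```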
  • the crop specifying module 42 determines whether or not this plant is a seed sown crop based on the position information of the seeding point stored by the process of step S21 described above and the position information of the specified plant (step). S34).
  • step S34 the crop identification module 42 compares the position information of this seeding point with the position information of the specified plant, and makes this determination based on whether or not they match. If the crop identification module 42 determines that they match, it determines that this plant is a crop (the sown crop has grown) (step S34 YES), and executes the process described below without executing the process.
  • the computer 10 may be configured to execute the process of step S30 described above and acquire the image data different from the image data subjected to the image analysis this time, similarly to the process of step S32 described above.
  • In this case, the computer 10 does not create or transmit the commands necessary for spraying the herbicide, and therefore the herbicide is not sprayed on the crop.
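A minimal sketch of the seeding-point comparison: positions are expressed here in a local metre-based field frame, and a small matching tolerance is applied. The specification only says the positions are compared for a match; the tolerance value and coordinate frame are illustrative assumptions.

```python
# Hypothetical comparison of a detected plant's position against the stored
# seeding points. GPS fixes rarely match exactly, so a tolerance is assumed.

def is_sown_crop(plant_pos, seeding_points, tolerance_m=0.3):
    """Return True if plant_pos falls within tolerance of any seeding point.

    Positions are (east_m, north_m) offsets in a local field frame;
    both the frame and the tolerance are assumptions for illustration.
    """
    px, py = plant_pos
    for sx, sy in seeding_points:
        if ((px - sx) ** 2 + (py - sy) ** 2) ** 0.5 <= tolerance_m:
            return True
    return False
```

A plant that matches no seeding point would then be treated as a weed and become a spraying target.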
  • In step S34, if the crop identification module 42 determines that they do not match, it determines that this plant is not a crop (that is, not the grown sown crop but a weed) (step S34: NO), and the command creation module 43 creates commands for causing a drone to spray a herbicide to weed this plant (step S35).
  • In step S35, the command creation module 43 creates a flight command to the position in the field corresponding to the position information of this plant, and a drive command for driving the herbicide spraying device of the drone.
  • the commands created by the command creation module 43 will be described.
  • The command creation module 43 creates, as the flight command, the commands necessary to fly to the position indicated by the position information of this plant. Further, the command creation module 43 refers to a herbicide database, in which plant identifiers and the identifiers of effective herbicides are registered in advance, and identifies the herbicide effective against the plant detected this time. Furthermore, the command creation module 43 determines the required spraying amount of the herbicide based on the extracted feature points and feature amounts; for example, according to the size and shape of the extracted plant. The command creation module 43 creates, as the drive command, the commands necessary for the herbicide spraying device to spray the identified herbicide in the determined amount.
  • In a case where the herbicide to be sprayed is fixed, the command creation module 43 only needs to determine the spraying amount of the herbicide and create, as the drive command, the commands necessary for the herbicide spraying device to spray this amount.
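One way the command-creation step could look, as a hedged sketch: the herbicide database contents, the plant identifiers, and the area-to-dose rule below are invented for illustration. The specification states only that the effective herbicide and its spraying amount are chosen from the plant's features.

```python
# Hypothetical herbicide database: plant identifier -> effective herbicide.
# Both identifiers and the dosing rule are illustrative assumptions.
HERBICIDE_DB = {
    "barnyard_grass": "herbicide_A",
    "crabgrass": "herbicide_B",
}

def create_spray_command(plant_id, plant_lat, plant_lon, plant_area_cm2):
    """Build a flight command and a drive command for one weed."""
    herbicide = HERBICIDE_DB[plant_id]
    # Assumed rule: dose scales with plant area, clamped to a safe range.
    amount_ml = min(max(plant_area_cm2 * 0.05, 5.0), 50.0)
    flight_cmd = {"type": "fly_to", "lat": plant_lat, "lon": plant_lon}
    drive_cmd = {"type": "spray",
                 "herbicide": herbicide,
                 "amount_ml": round(amount_ml, 1)}
    return flight_cmd, drive_cmd
```

The two dictionaries stand in for whatever wire format the drone actually accepts; the point is only the split into a flight command and a drive command, as described above.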
  • The computer 10 executes the above-described processing on a preset area of the field or on the entire field. For example, the computer 10 acquires the shooting data for the entire field, determines the presence or absence of a plant at each shooting point, determines whether each plant is the crop, and creates or does not create commands accordingly. The computer 10 executes the processing described later once this has been completed for all of the shooting data.
  • Note that the computer 10 may be configured to execute the processing described below each time it determines, for an individual piece of shooting data, that a plant is not a crop, rather than after the processing has been completed for all of the shooting data.
  • The command transmission module 22 transmits the created commands to the drone and causes the drone to spray the herbicide (step S36).
  • In step S36, the command transmission module 22 transmits the flight command and the drive command described above.
  • The drone receives the flight command and the drive command. Based on the flight command, the drone flies to the position of the target plant, and based on the drive command, it sprays the specified type of herbicide in the specified amount.
  • As a result, the computer 10 causes the herbicide to be sprayed on plants other than those at the seeding points.
  • Note that the target of the commands created and transmitted by the computer 10 is not limited to a drone, and may be other agricultural machinery, high-performance agricultural machines, or the like.
  • In this case, instead of the flight command, the computer 10 may create a command suitable for that instrument or machine, such as a travel command.
  • FIG. 5 is a diagram showing a flowchart of the first learning process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • The learning module 44 learns the photographed images of the plants on which the herbicide was sprayed (step S40). In step S40, the learning module 44 learns the captured images of the plants determined by the crop identifying module 42, in the process of step S34 described above, not to be the sown crop. The learning module 44 learns the feature points and feature amounts of each such plant, together with the fact that the plant is not a sown crop.
  • the storage module 30 stores the learning result (step S41).
  • the computer 10 executes the processes of steps S32 and S34 described above in consideration of the stored learning result.
  • The image analysis module 40 determines whether or not a plant has been detected by adding the learning result to the image analysis result. For example, the image analysis module 40 can make this determination based on both the feature points and feature amounts extracted as a result of the image analysis and the feature points and feature amounts in the learning result.
  • The crop identifying module 42 determines whether or not the plant is a crop based on the feature points and feature amounts in the learning result, in addition to the comparison of the position information. For example, even when the comparison of the position information indicates a match, the crop identification module 42 can determine that the plant is not a crop if the image of the plant matches or resembles a plant that was learned not to be a crop.
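A sketch of how the learning result could be folded into the crop determination. The feature vectors, the cosine-similarity measure, and the threshold are all illustrative assumptions; the specification says only that the learned feature points and feature amounts are considered in addition to the position comparison.

```python
# Hypothetical combined decision: a position match is necessary, but can be
# overridden when the plant's features resemble a learned "not a crop"
# example. Feature representation and threshold are assumptions.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def is_crop(position_matches, plant_features, learned_weed_features,
            similarity_threshold=0.9):
    """Return True only if the position matches and no learned weed example
    resembles the plant's features."""
    if not position_matches:
        return False
    for weed in learned_weed_features:
        if cosine_similarity(plant_features, weed) >= similarity_threshold:
            return False  # matches a learned non-crop example
    return True
```

This mirrors the behaviour described above: agreement of the position information alone no longer guarantees a "crop" decision once learning results exist.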
  • the above is the first learning process.
  • FIG. 6 is a diagram showing a flowchart of the sprout point storage process executed by the computer 10. The processing executed by each module described above will be described together with this process. Detailed description of the same processes as those described above will be omitted. This process is executed at the time when the sown crop has grown and new sprouts have emerged.
  • the photographing data acquisition module 21 acquires the photographed image of the field and the position information of the photographing point as the first photographing data (step S50).
  • the process of step S50 is similar to the process of step S30 described above.
  • the image analysis module 40 performs image analysis on the captured image based on the first captured data (step S51).
  • the process of step S51 is similar to the process of step S31 described above.
  • the image analysis module 40 determines whether or not the sprout of the crop can be detected based on the result of the image analysis (step S52).
  • In step S52, the image analysis module 40 compares the extracted feature points and feature amounts with a sprout database, in which the feature points and feature amounts of crop sprouts and crop identifiers are registered in advance, and determines whether a crop sprout appears in the captured image.
  • The image analysis module 40 determines that a crop sprout has been detected when the feature points and feature amounts extracted this time match those registered in the sprout database, and determines that no crop sprout has been detected when they do not match.
  • The image analysis module 40 also identifies the identifier of the crop based on the feature points and feature amounts of the detected sprout and the sprout database.
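The sprout-database lookup could be sketched as follows. The database contents, the feature vectors, and the similarity threshold are illustrative assumptions, not from the specification, which only describes matching extracted features against pre-registered ones and returning the crop identifier.

```python
# Hypothetical sprout database: crop identifier -> reference feature vector.
SPROUT_DB = {
    "rice": [0.9, 0.1, 0.3],
    "soybean": [0.2, 0.8, 0.5],
}

def detect_sprout(features, threshold=0.95):
    """Return the matching crop identifier, or None if no sprout is detected.

    Uses cosine similarity against each registered reference vector; the
    threshold models the "match / no match" decision in the text.
    """
    best_id, best_sim = None, 0.0
    for crop_id, ref in SPROUT_DB.items():
        dot = sum(x * y for x, y in zip(features, ref))
        norm = (sum(x * x for x in features) ** 0.5) * \
               (sum(y * y for y in ref) ** 0.5)
        sim = dot / norm if norm else 0.0
        if sim > best_sim:
            best_id, best_sim = crop_id, sim
    return best_id if best_sim >= threshold else None
```

Returning `None` corresponds to the step S52 NO branch; returning an identifier corresponds to the YES branch together with the identifier determination.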
  • step S52 when the image analysis module 40 determines that the new shoots of the crop could not be detected as a result of the image analysis (step S52: NO), the computer 10 ends this processing.
  • the computer 10 may be configured to execute the process of step S50 described above and acquire the first captured image data different from the first captured image data subjected to the image analysis this time.
  • In step S52, when the image analysis module 40 determines that a crop sprout has been detected as a result of the image analysis (step S52: YES), the position specifying module 41 specifies the position information of the detected crop sprout based on the position information in the first shooting data (step S53).
  • step S53 the position specifying module 41 specifies the position information of the shooting point corresponding to the shot image subjected to the image analysis this time based on the first shooting data.
  • the position specifying module 41 specifies the position information of the shooting point as the position information of the new shoots of this crop.
  • The position specifying module 41 specifies the detailed position of this crop sprout based on the position information of the sprout and the coordinates within the captured image. For example, the position specifying module 41 sets an orthogonal coordinate system on the captured image and specifies the position at which the sprout was detected as an X coordinate and a Y coordinate in this captured image. The position specifying module 41 specifies the position information of the crop sprout in the actual field based on the position information of the shooting point and these X and Y coordinates.
  • the position specifying module 41 specifies the center position of the captured image as position information at the shooting point, and specifies the X coordinate and the Y coordinate as positions with respect to the center position. As a result, the position specifying module 41 specifies the position of the sprout of this crop in the field.
  • the storage module 30 stores the position information of the sprout point where the sprout of this crop exists (step S54).
  • the storage module 30 may store only the position information of this sprout point, or may store it in association with the identifiers of the worker, the user, and the field.
  • FIG. 7 is a diagram showing a flowchart of the second herbicide application support process executed by the computer 10. The processing executed by each module described above will be described together with this processing. The detailed description of the same processes as those described above will be omitted.
  • the photographing data acquisition module 21 acquires, as the second photographing data, the photographed image of the field at a time different from the first photographing data and the position information of the photographing point (step S60).
  • the process of step S60 is similar to the process of step S30 described above.
  • the image analysis module 40 performs image analysis on the captured image based on the second captured data (step S61).
  • the process of step S61 is similar to the process of step S31 described above.
  • the image analysis module 40 determines whether or not the plant can be detected based on the result of the image analysis (step S62).
  • the process of step S62 is similar to the process of step S32 described above.
  • step S62 when the image analysis module 40 determines that the plant cannot be detected as a result of the image analysis (step S62 NO), the computer 10 ends this processing.
  • the computer 10 may be configured to execute the process of step S60 described above and acquire the second shooting data different from the second shooting data subjected to the image analysis this time.
  • In step S62, when the image analysis module 40 determines that a plant has been detected as a result of the image analysis (step S62: YES), the position specifying module 41 specifies the position information of the detected plant based on the position information in the second shooting data (step S63).
  • the process of step S63 is similar to the process of step S33 described above.
  • The crop specifying module 42 determines whether or not this plant is a crop, based on the position information of the sprout points stored by the process of step S54 described above and the position information of the specified plant (step S64).
  • the crop identification module 42 compares the position information of this sprout point with the position information of the specified plant, and makes this determination based on whether or not they match.
  • If the crop identification module 42 determines that they match, it determines that this plant is a crop (i.e., a grown sprout of the crop) (step S64: YES), and the computer 10 ends this processing without executing the processes described below.
  • In this case, the computer 10 may be configured to execute the process of step S60 described above again and acquire second shooting data different from the data subjected to image analysis this time, similarly to the case of step S62 described above.
  • the computer 10 does not create or transmit the command necessary for spraying the herbicide, and thus does not spray the crop with the herbicide.
  • In step S64, if the crop identification module 42 determines that they do not match, it determines that this plant is not a crop (that is, not a grown sprout of the crop but a weed) (step S64: NO), and the command creation module 43 creates commands for causing a drone to spray a herbicide to weed this plant (step S65).
  • the process of step S65 is similar to the process of step S35 described above.
  • The command transmission module 22 transmits the created commands to the drone and causes the drone to spray the herbicide (step S66).
  • the process of step S66 is similar to the process of step S36 described above.
  • As a result, the computer 10 causes the herbicide to be sprayed on plants other than those at the sprout points.
  • Note that the target of the commands created and transmitted by the computer 10 is not limited to a drone, and may be other agricultural machinery, high-performance agricultural machines, or the like.
  • In this case, instead of the flight command, the computer 10 may create a command suitable for that instrument or machine, such as a travel command.
  • FIG. 8 is a diagram showing a flowchart of the second learning process executed by the computer 10. The processing executed by each module described above will be described together with this process.
  • The learning module 44 learns the photographed images of the plants on which the herbicide was sprayed (step S70). In step S70, the learning module 44 learns the captured images of the plants that the crop identifying module 42 determined not to be crops in the process of step S64 described above. The learning module 44 learns the feature points and feature amounts of each such plant, together with the fact that the plant is not a crop.
  • the storage module 30 stores the learning result (step S71).
  • the computer 10 executes the processes of steps S62 and S64 described above, taking into consideration the stored learning result.
  • The image analysis module 40 determines whether or not a plant has been detected by adding the learning result to the image analysis result. For example, the image analysis module 40 can make this determination based on both the feature points and feature amounts extracted as a result of the image analysis and the feature points and feature amounts in the learning result.
  • The crop identifying module 42 determines whether or not the plant is a crop based on the feature points and feature amounts in the learning result, in addition to the comparison of the position information. For example, even when the comparison of the position information indicates a match, the crop identification module 42 can determine that the plant is not a crop if the image of the plant matches or resembles a plant that was learned not to be a crop.
  • the above means and functions are realized by a computer (including a CPU, an information processing device, various terminals) reading and executing a predetermined program.
  • The program is provided, for example, in a form supplied from a computer via a network (SaaS: Software as a Service).
  • the program is provided in a form recorded in a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), a DVD (DVD-ROM, DVD-RAM, etc.).
  • the computer reads the program from the recording medium, transfers the program to the internal recording device or the external recording device, records the program, and executes the program.
  • the program may be recorded in advance in a recording device (recording medium) such as a magnetic disk, an optical disk, a magneto-optical disk, and provided from the recording device to a computer via a communication line.


Abstract

[Problem] The purpose of the present invention is to provide a computer system, a herbicide spraying support method, and a program, whereby a herbicide can be sprayed and weeds removed, easily at any location. [Solution] A computer system supporting the spraying of a herbicide, that: obtains position information for a sowing location for a crop; stores the position information for the sowing location; obtains a captured image of a cultivated field and obtains position information for the imaging location; analyses the captured image; detects a plant; specifies position information for the plant on the basis of the position information for the imaging location; and causes a herbicide to be sprayed on plants that are not at the sowing location, on the basis of the stored position information for the sowing location and specified position information for the plant.

Description

Computer system, crop growth support method, and program
 The present invention relates to a computer system, a herbicide spraying support method, and a program for supporting the spraying of a herbicide.
 Conventionally, weeds growing in a field have been removed using a drone or the like. One such weeding method specifies the position of a weed growing in the field, moves a drone to that position, and performs weeding there.
 As an example of such technology, a drone travels along a ridge provided in the field and captures images of the ridge. By analyzing the captured images, it is determined whether anything other than the crop (for example, a weed) has been photographed; if so, the weed is removed using a weeding claw provided on the drone (see Patent Document 1).
Patent Document 1: JP-A-62-74204
 However, the configuration of Patent Document 1 targets only plants growing on the ridges for weeding, and is therefore not suitable for weeding plants existing elsewhere.
 An object of the present invention is to provide a computer system, a herbicide spraying support method, and a program that make it easy to spray a herbicide on an arbitrary point and remove weeds.
 The present invention provides the following solutions.
 The present invention provides a computer system that supports the spraying of a herbicide, comprising:
 a first acquisition means for acquiring position information of a seeding point of a crop;
 a storage means for storing the position information of the seeding point;
 a second acquisition means for acquiring a captured image of a field and position information of the shooting point;
 a detection means for analyzing the captured image and detecting a plant;
 a specifying means for specifying the position information of the plant based on the position information of the shooting point; and
 a spraying means for spraying a herbicide on plants other than those at the seeding point, based on the stored position information of the seeding point and the specified position information of the plant.
 According to the present invention, the computer system that supports the spraying of a herbicide acquires position information of a seeding point of a crop, stores the position information of the seeding point, acquires a captured image of the field and position information of the shooting point, analyzes the captured image to detect a plant, specifies the position information of the plant based on the position information of the shooting point, and sprays a herbicide on plants other than those at the seeding point based on the stored position information of the seeding point and the specified position information of the plant.
 The present invention belongs to the category of systems, but the same actions and effects are exhibited in other categories, such as methods and programs.
 The present invention also provides a computer system that supports the spraying of a herbicide, comprising:
 a first acquisition means for acquiring a first captured image of a field and position information of the shooting point;
 a first detection means for analyzing the first captured image and detecting a sprout of a crop;
 a first specifying means for specifying position information of the sprout point where the sprout exists, based on the position information of the shooting point;
 a storage means for storing the specified position information of the sprout point;
 a second acquisition means for acquiring a second captured image of the field, captured at a time different from the first captured image, and position information of the shooting point;
 a second detection means for analyzing the second captured image and detecting a plant;
 a second specifying means for specifying the position information of the plant based on the position information of the shooting point; and
 a spraying means for spraying a herbicide on plants other than those at the sprout point, based on the stored position information of the sprout point and the specified position information of the plant.
 According to the present invention, the computer system that supports the spraying of a herbicide acquires a first captured image of a field and position information of the shooting point, analyzes the first captured image to detect a sprout of a crop, specifies the position information of the sprout point where the sprout exists based on the position information of the shooting point, stores the specified position information of the sprout point, acquires a second captured image of the field captured at a time different from the first captured image together with position information of the shooting point, analyzes the second captured image to detect a plant, specifies the position information of the plant based on the position information of the shooting point, and sprays a herbicide on plants other than those at the sprout point based on the stored position information of the sprout point and the specified position information of the plant.
 The present invention belongs to the category of systems, but the same actions and effects are exhibited in other categories, such as methods and programs.
 According to the present invention, it is possible to provide a computer system, a herbicide spraying support method, and a program that make it easy to spray a herbicide on an arbitrary point and remove weeds.
FIG. 1 is a diagram showing an outline of a herbicide spraying support system 1.
FIG. 2 is an overall configuration diagram of the herbicide spraying support system 1.
FIG. 3 is a diagram showing a flowchart of the seeding point storage process executed by the computer 10.
FIG. 4 is a diagram showing a flowchart of the first herbicide spraying support process executed by the computer 10.
FIG. 5 is a diagram showing a flowchart of the first learning process executed by the computer 10.
FIG. 6 is a diagram showing a flowchart of the sprout point storage process executed by the computer 10.
FIG. 7 is a diagram showing a flowchart of the second herbicide spraying support process executed by the computer 10.
FIG. 8 is a diagram showing a flowchart of the second learning process executed by the computer 10.
 Hereinafter, the best mode for carrying out the present invention will be described with reference to the drawings. Note that this is merely an example, and the technical scope of the present invention is not limited thereto.
 [Outline of herbicide spraying support system 1]
 An outline of a preferred embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is a diagram for explaining the outline of a herbicide spraying support system 1, which is a preferred embodiment of the present invention. The herbicide spraying support system 1 is a computer system composed of a computer 10 that supports the spraying of a herbicide.
 The herbicide spraying support system 1 may also include other terminals and devices, such as drones, agricultural machinery, high-performance agricultural machines, worker terminals possessed by workers who cultivate crops (for example, smartphones, tablet terminals, or personal computers), and other computers. Further, the herbicide spraying support system 1 may be realized by a single computer, such as the computer 10, or by a plurality of computers, as in cloud computing.
 The computer 10 is connected to drones, agricultural machinery, high-performance agricultural machines, worker terminals, other computers, and the like so as to be capable of data communication via a public network or the like, and executes the necessary transmission and reception of data.
 When a crop is sown, the computer 10 acquires position information of the seeding points of the crop. The computer 10 acquires this position information from, for example, the agricultural machine or high-performance agricultural machine that performed the seeding, or the worker terminal possessed by the worker who performed the seeding.
 The computer 10 stores the position information of the seeding points.
 The computer 10 acquires a captured image of the field and position information of the shooting point. For example, the computer 10 acquires captured images of each point of the field taken by a drone, and acquires from the drone the position information of the point at which each image was captured.
 The computer 10 analyzes the captured image and detects the plants shown in it. For example, as the image analysis, the computer 10 extracts feature points (for example, shape, contour, and hue) and feature amounts (for example, statistical values such as the average, variance, and histogram of pixel values) of the captured image, and detects the plants shown in the captured image based on them. The plants in this case include both the sown crop and weeds that may be subject to weeding.
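The statistical feature amounts mentioned above (average, variance, and histogram of pixel values) can be made concrete with a small sketch. A real system would use an image-processing library and richer features; the grayscale input and coarse histogram here are assumptions for illustration only.

```python
# Illustrative computation of simple feature amounts over a grayscale
# pixel grid, standing in for the image-analysis step described above.

def extract_feature_amounts(pixels, bins=4):
    """pixels: 2-D list of grayscale values in 0..255.

    Returns the mean, population variance, and a coarse intensity histogram.
    """
    flat = [v for row in pixels for v in row]
    n = len(flat)
    mean = sum(flat) / n
    variance = sum((v - mean) ** 2 for v in flat) / n
    hist = [0] * bins
    for v in flat:
        hist[min(v * bins // 256, bins - 1)] += 1
    return {"mean": mean, "variance": variance, "histogram": hist}
```

Feature vectors like this would then be compared against pre-registered plant or sprout entries to decide whether a plant appears in the image.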
 The computer 10 specifies the position information of a detected plant based on the acquired position information of the shooting point, treating the position of the plant as matching the position of the shooting point. As a result, the computer 10 specifies the position of the detected plant in the field.
 コンピュータ10は、記憶した播種地点の位置情報と、特定した植物の位置情報とに基づいて、この播種地点以外の位置に存在する植物に除草剤を散布させる。コンピュータ10は、特定した植物の位置情報が、播種地点の位置情報と一致する場合、この植物が作物であるものと判断し、除草剤を散布させない。一方、コンピュータ10は、特定した植物の位置情報が、播種地点の位置情報と一致しない場合、この植物が雑草であるものと判断し、除草剤を散布させる。 The computer 10 sprays the herbicide on the plants existing at positions other than this seeding point based on the stored position information of the seeding point and the stored position information of the plant. When the position information of the specified plant matches the position information of the seeding point, the computer 10 determines that this plant is a crop and does not spray the herbicide. On the other hand, when the position information of the specified plant does not match the position information of the seeding point, the computer 10 determines that this plant is a weed, and sprays a herbicide.
 Based on the position information of the plant judged to be a weed, the computer 10 moves the drone to the weed's position and transmits to the drone a command to spray the herbicide with the herbicide spraying device mounted on the drone. The drone receives this command and, based on it, sprays the herbicide on the plants located at positions other than the seeding points. As a result, the computer 10 has the herbicide sprayed on plants other than those at the seeding points.
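The crop/weed decision described above can be sketched as follows. This is a purely illustrative sketch, not part of the claimed embodiment: the function names and the positional tolerance (introduced here to absorb GPS error, where the text itself only speaks of positions "matching") are assumptions.

```python
from math import hypot

# Assumed positional tolerance in metres; the embodiment only requires
# the plant position to "match" a seeding point.
TOLERANCE_M = 0.10

def is_crop(plant_pos, seeding_points, tolerance=TOLERANCE_M):
    """True if plant_pos coincides with a stored seeding point."""
    px, py = plant_pos
    return any(hypot(px - sx, py - sy) <= tolerance
               for sx, sy in seeding_points)

def spray_targets(detected_plants, seeding_points):
    """Positions of plants judged to be weeds (no matching seeding point)."""
    return [p for p in detected_plants if not is_crop(p, seeding_points)]
```

For example, with a seeding point at (0.0, 0.0), a detected plant at (1.5, 2.0) has no matching seeding point and would be returned as a spraying target, while a plant at (0.0, 0.0) would be judged a crop and skipped.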
 The computer 10 can also learn from the images of the detected plants. In this case, when performing image analysis on an acquired captured image, the computer 10 detects plants taking this learning result into account.
 A modification of the present invention will be described.
 The difference from the above-described embodiment is that the position information of crop sprouts is used instead of the position information of the seeding points.
 The computer 10 acquires a first captured image of the field and the position information of the shooting point. For example, the computer 10 acquires a first captured image of each point in the field taken by a drone, and acquires from the drone the position information of the shooting point at which the drone took the first captured image.
 The computer 10 analyzes the first captured image and detects the crop sprouts shown in it. As the image analysis, the computer 10 extracts, for example, feature points and feature amounts from the first captured image, and detects the crop sprouts shown in the first captured image based on them.
 The computer 10 identifies the position information of the detected crop sprouts based on the acquired position information of the shooting point. Treating a detected sprout's position information as matching that of the shooting point, the computer 10 identifies the sprout's position as the position information of a sprout point. As a result, the computer 10 identifies the positions of the detected crop sprouts in the field as sprout points.
 The computer 10 stores the position information of the sprout points.
 The computer 10 acquires a second captured image of the field, taken at a different time from the first captured image, together with the position information of the shooting point. For example, the computer 10 acquires a second captured image of each point in the field taken by a drone, and acquires from the drone the position information of the shooting point at which the drone took the second captured image.
 The computer 10 analyzes the second captured image and detects the plants shown in it. As the image analysis, the computer 10 extracts, for example, feature points and feature amounts from the second captured image, and detects the plants shown in the second captured image based on them. The plants in this case include crops grown from sprouts and weeds that may be targets of weeding.
 The computer 10 identifies the position information of the detected plant based on the acquired position information of the shooting point. Treating the detected plant's position information as matching that of the shooting point, the computer 10 identifies the plant's position information. As a result, the computer 10 identifies the position of the detected plant in the field.
 Based on the stored position information of the sprout points and the identified position information of the plant, the computer 10 causes the herbicide to be sprayed on plants located at positions other than the sprout points. When the identified position information of a plant matches the position information of a sprout point, the computer 10 judges the plant to be a crop and does not have the herbicide sprayed. Conversely, when the identified position information of a plant does not match the position information of any sprout point, the computer 10 judges the plant to be a weed and has the herbicide sprayed.
 Based on the position information of the plant judged to be a weed, the computer 10 moves the drone to the weed's position and transmits to the drone a command to spray the herbicide with the herbicide spraying device mounted on the drone. The drone receives this command and, based on it, sprays the herbicide on the plants located at positions other than the sprout points. As a result, the computer 10 has the herbicide sprayed on plants other than those at the sprout points.
 The computer 10 can also learn from the images of the detected plants. In this case, when performing image analysis on an acquired second captured image, the computer 10 detects plants taking this learning result into account.
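The two-phase flow of this modification can be sketched as follows: sprout positions detected in the first captured image are stored, and plants detected in the second captured image are classified against them. The class name and tolerance value are illustrative assumptions, not part of the claimed embodiment.

```python
from math import hypot

class SproutPointStore:
    """Hypothetical sketch of the modification: sprout points from the
    first image are stored, then plants from the second image are
    classified as crop or weed against them."""

    def __init__(self, tolerance_m=0.10):  # assumed GPS tolerance in metres
        self.tolerance_m = tolerance_m
        self.sprout_points = []

    def record_sprouts(self, sprout_positions):
        # Corresponds to storing the sprout-point position information.
        self.sprout_points.extend(sprout_positions)

    def classify(self, plant_pos):
        # A plant at a stored sprout point is the crop; any other plant
        # is judged to be a weed and becomes a spraying target.
        px, py = plant_pos
        for sx, sy in self.sprout_points:
            if hypot(px - sx, py - sy) <= self.tolerance_m:
                return "crop"
        return "weed"
```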
 Next, an outline of the processing executed by the herbicide spraying support system 1 will be described.
 The computer 10 acquires the position information of the crop's seeding points at the time of sowing (step S01). The computer 10 acquires this position information from the farm implement or high-performance agricultural machine used for sowing, or from a worker terminal carried by the worker who performed the sowing.
 The computer 10 stores the position information of the seeding points (step S02).
 The computer 10 acquires a captured image of the field and the position information of the shooting point as shooting data (step S03). For example, the computer 10 acquires captured images of each point in the field taken by a drone, and acquires from the drone the position information of the shooting points at which the images were taken. The computer 10 acquires these captured images and shooting-point position information as the shooting data.
 The computer 10 analyzes the captured image and detects the plants shown in it (step S04). As the image analysis, the computer 10 extracts, for example, feature points and feature amounts from the captured image, and detects the plants shown in the image based on them. The plants in this case include the sown crop and weeds that may be targets of weeding.
 The computer 10 identifies the position information of the detected plant based on the acquired position information of the shooting point (step S05). Treating the detected plant's position information as matching that of the shooting point, the computer 10 identifies the plant's position information. As a result, the computer 10 identifies the position of the detected plant in the field.
 Based on the stored position information of the seeding points and the identified position information of the plant, the computer 10 causes the herbicide to be sprayed on plants located at positions other than the seeding points (step S06). When the identified position information of a plant matches the position information of a seeding point, the computer 10 judges the plant to be a crop and does not have the herbicide sprayed. Conversely, when it does not match, the computer 10 judges the plant to be a weed and has the herbicide sprayed. At this time, based on the position information of the plant judged to be a weed, the computer 10 moves the drone to the weed's position and transmits to the drone a command to spray the herbicide with the drone's herbicide spraying device. The drone receives this command and, based on it, sprays the herbicide on the plants located at positions other than the seeding points. As a result, the computer 10 has the herbicide sprayed on plants other than those at the seeding points.
 The above is an outline of the processing executed by the herbicide spraying support system 1.
 An outline of the processing executed by the modification of the herbicide spraying support system 1 will be described.
 The computer 10 acquires a first captured image of the field and the position information of the shooting point as first shooting data (step S10). For example, the computer 10 acquires a first captured image of each point in the field taken by a drone, and acquires from the drone the position information of the shooting point at which the drone took the first captured image.
 The computer 10 analyzes the first captured image and detects the crop sprouts shown in it (step S11). As the image analysis, the computer 10 extracts, for example, feature points and feature amounts from the first captured image, and detects the crop sprouts based on them.
 The computer 10 identifies the position information of the detected crop sprouts based on the acquired position information of the shooting point (step S12). Treating a detected sprout's position information as matching that of the shooting point, the computer 10 identifies the sprout's position as the position information of a sprout point. As a result, the computer 10 identifies the positions of the detected crop sprouts in the field as sprout points.
 The computer 10 stores the position information of the sprout points (step S13).
 The computer 10 acquires a second captured image of the field, taken at a different time from the first captured image, and the position information of the shooting point as second shooting data (step S14). For example, the computer 10 acquires a second captured image of each point in the field, taken by a drone at a time different from the first captured image, and acquires from the drone the position information of the shooting point at which the drone took the second captured image.
 The computer 10 analyzes the second captured image and detects the plants shown in it (step S15). As the image analysis, the computer 10 extracts, for example, feature points and feature amounts from the second captured image, and detects the plants based on them. The plants in this case include crops grown from sprouts and weeds that may be targets of weeding.
 The computer 10 identifies the position information of the detected plant based on the acquired position information of the shooting point (step S16). Treating the detected plant's position information as matching that of the shooting point, the computer 10 identifies the plant's position information. As a result, the computer 10 identifies the position of the detected plant in the field.
 Based on the stored position information of the sprout points and the identified position information of the plant, the computer 10 causes the herbicide to be sprayed on plants located at positions other than the sprout points (step S17). When the identified position information of a plant matches the position information of a sprout point, the computer 10 judges the plant to be a crop and does not have the herbicide sprayed. Conversely, when it does not match, the computer 10 judges the plant to be a weed and has the herbicide sprayed. At this time, based on the position information of the plant judged to be a weed, the computer 10 moves the drone to the weed's position and transmits to the drone a command to spray the herbicide with the drone's herbicide spraying device. The drone receives this command and, based on it, sprays the herbicide on the plants located at positions other than the sprout points. As a result, the computer 10 has the herbicide sprayed on plants other than those at the sprout points.
 The above is an outline of the processing executed by the modification of the herbicide spraying support system 1.
 [System configuration of the herbicide spraying support system 1]
 The system configuration of the herbicide spraying support system 1, a preferred embodiment of the present invention, will be described based on FIG. 2. FIG. 2 is a diagram showing the system configuration of the herbicide spraying support system 1. As shown in FIG. 2, the herbicide spraying support system 1 is a computer system that is composed of the computer 10 and supports the spraying of herbicides.
 The computer 10 is communicably connected, via a public network or the like, to drones, farm implements, high-performance agricultural machines, worker terminals, other computers, and so on, and performs the necessary transmission and reception of data.
 The herbicide spraying support system 1 may include drones, farm implements, high-performance agricultural machines, worker terminals, other computers, and other terminals and devices that are not shown. The herbicide spraying support system 1 may also be realized by a single computer such as the computer 10, or by a plurality of computers as in cloud computing.
 The computer 10 includes a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like; as a communication unit, it includes a device for enabling communication with other terminals and devices, for example a Wi-Fi (Wireless Fidelity) device compliant with IEEE 802.11. As a storage unit, the computer 10 includes data storage such as a hard disk, semiconductor memory, recording medium, or memory card. As a processing unit, the computer 10 includes various devices for executing various kinds of processing.
 In the computer 10, the control unit reads a predetermined program and thereby realizes, in cooperation with the communication unit, a seeding position acquisition module 20, a shooting data acquisition module 21, and a command transmission module 22. Likewise, the control unit reads a predetermined program and realizes, in cooperation with the storage unit, a storage module 30, and, in cooperation with the processing unit, an image analysis module 40, a position identification module 41, a crop identification module 42, a command creation module 43, and a learning module 44.
 [Seeding point storage processing]
 The seeding point storage processing executed by the herbicide spraying support system 1 will be described based on FIG. 3. FIG. 3 is a flowchart of the seeding point storage processing executed by the computer 10. The processing executed by each of the modules described above is explained together with this processing.
 The seeding position acquisition module 20 acquires the position information of the crop's seeding points (step S20). In step S20, the seeding position acquisition module 20 acquires this position information from the equipment or device used for sowing, or from a worker terminal carried by the worker who performed the sowing. For example, the worker terminal acquires its own position information from GPS (Global Positioning System) or the like at the position where sowing was performed, and transmits it to the computer 10 as the position information of the seeding point. The seeding position acquisition module 20 acquires the position information of the crop's seeding point by receiving the position information transmitted by the worker terminal.
 The storage module 30 stores the position information of the crop's seeding points (step S21). In step S21, the storage module 30 may store only the position information of the seeding points, or may store it in association with identifiers of the worker, the user, the field, and so on.
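The storage performed in steps S20 and S21 might be sketched as follows. The record layout, function name, and optional identifier fields are assumptions for illustration only, not the claimed data structure.

```python
# Hypothetical sketch of the storage module 30: a seeding point received
# from a worker terminal is recorded either on its own or in association
# with worker/field identifiers, as described above.
seeding_point_records = []

def store_seeding_point(lat, lon, worker_id=None, field_id=None):
    record = {"lat": lat, "lon": lon}
    if worker_id is not None:
        record["worker_id"] = worker_id
    if field_id is not None:
        record["field_id"] = field_id
    seeding_point_records.append(record)
    return record
```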
 The above is the seeding point storage processing.
 [First herbicide spraying support processing]
 The first herbicide spraying support processing executed by the herbicide spraying support system 1 will be described based on FIG. 4. FIG. 4 is a flowchart of the first herbicide spraying support processing executed by the computer 10. The processing executed by each of the modules described above is explained together with this processing.
 The shooting data acquisition module 21 acquires a captured image of the field and the position information of the shooting point as shooting data (step S30). In step S30, the shooting data acquisition module 21 acquires as shooting data, for example, captured images taken by the drone at a plurality of preset shooting points in the field and the position information of those shooting points. The drone photographs the area directly below itself; that is, it captures the image from a position perpendicular to the field. At each shooting point, the drone captures an image and acquires its own position information from GPS or the like, which it treats as the position information of the shooting point. The drone transmits the captured image and the position information of the shooting point to the computer 10 as shooting data. The shooting data acquisition module 21 acquires the shooting data by receiving it from the drone. As a result, the computer 10 acquires the captured image of the field and the position information of the shooting point.
 The image analysis module 40 analyzes the captured image based on the shooting data (step S31). In step S31, the image analysis module 40 extracts feature points and feature amounts from the captured image. For example, as the image analysis, the image analysis module 40 extracts the shape, hue, and the like of objects present in the captured image.
 The image analysis module 40 determines whether a plant has been detected based on the result of the image analysis (step S32). In step S32, the image analysis module 40 determines whether a plant is present in the captured image by comparing the extracted feature points and feature amounts against a plant database in which the feature points and feature amounts of plants are registered in advance in association with plant identifiers. The image analysis module 40 determines that a plant has been detected when the feature points and feature amounts extracted this time match those registered in the plant database, and that no plant has been detected when they do not match. When a plant has been detected, the image analysis module 40 also determines the plant's identifier based on its feature points and feature amounts and the plant database.
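The database comparison in step S32 can be sketched as follows. The feature representation (reduced here to a hue value and a shape label), the database entries, and the matching rule are all illustrative assumptions; the embodiment itself leaves the concrete features and matching criteria open.

```python
# Illustrative plant database pairing features with plant identifiers,
# as described for step S32. Entries and tolerances are assumptions.
PLANT_DB = [
    {"id": "crop_A", "hue": 95, "shape": "oval"},
    {"id": "weed_B", "hue": 120, "shape": "serrated"},
]

def detect_plant(features, hue_tolerance=10):
    """Return the matching plant identifier, or None if no plant is detected."""
    for entry in PLANT_DB:
        if (entry["shape"] == features["shape"]
                and abs(entry["hue"] - features["hue"]) <= hue_tolerance):
            return entry["id"]
    return None
```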
 If the image analysis module 40 determines in step S32 that no plant could be detected as a result of the image analysis (step S32: NO), the computer 10 ends this processing. In this case, the computer 10 may instead be configured to execute the processing of step S30 described above and acquire shooting data different from the shooting data analyzed this time.
 If, on the other hand, the image analysis module 40 determines in step S32 that a plant has been detected as a result of the image analysis (step S32: YES), the position identification module 41 identifies the position information of the detected plant based on the position information in the shooting data (step S33). In step S33, the position identification module 41 identifies, from the shooting data, the position information of the shooting point corresponding to the captured image analyzed this time, and identifies that position information as the position information of the plant. The position identification module 41 then identifies the plant's detailed position based on this position information and the plant's coordinates in the captured image. For example, the position identification module 41 sets an orthogonal coordinate system on the captured image and identifies the position at which the plant was detected as X and Y coordinates in that image. Based on the position information of the shooting point and these X and Y coordinates, the position identification module 41 identifies the plant's position information in the actual field. Here, the centre of the captured image corresponds to the position information of the shooting point, and the X and Y coordinates are identified as positions relative to that centre. As a result, the position identification module 41 identifies the position of the plant in the field.
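The coordinate conversion in step S33 can be given as a worked sketch: the image centre corresponds to the shooting point's position, and the plant's pixel offset from that centre, scaled by a ground resolution, gives its position in the field. The axis conventions and the resolution value are illustrative assumptions not fixed by the embodiment.

```python
def pixel_to_field(shoot_xy, plant_px, image_size, metres_per_pixel):
    """Convert a plant's pixel position to a field position.

    shoot_xy: field coordinates (m) corresponding to the image centre,
    plant_px: (column, row) of the detected plant in the captured image.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    dx = (plant_px[0] - cx) * metres_per_pixel
    dy = (cy - plant_px[1]) * metres_per_pixel  # image rows grow downward
    return (shoot_xy[0] + dx, shoot_xy[1] + dy)
```

For instance, with a 1920x1080 image at an assumed 1 cm per pixel, a plant detected at the image centre maps exactly to the shooting point's position, while a plant 100 pixels to the right maps 1 m east of it.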
 The crop identification module 42 determines whether the plant is the sown crop based on the position information of the seeding points stored by the processing of step S21 described above and the identified position information of the plant (step S34). In step S34, the crop identification module 42 compares the position information of the seeding points with the identified position information of the plant and makes this determination based on whether they match. If it determines that they match, the crop identification module 42 judges that the plant is a crop (the sown crop, now grown) (step S34: YES), and the computer 10 ends this processing without executing the processing described below. As with the processing of step S32 described above, the computer 10 may instead be configured to execute the processing of step S30 and acquire shooting data different from the shooting data analyzed this time.
 As a result, when the detected plant is a crop, the computer 10 does not create or transmit the commands needed to spray the herbicide, and therefore does not have the herbicide sprayed on that crop.
 If, on the other hand, the crop identification module 42 determines in step S34 that they do not match, it judges that the plant is not a crop (not the sown crop grown, but a weed) (step S34: NO), and the command creation module 43 creates commands for having the drone spray a herbicide to weed this plant (step S35). In step S35, the command creation module 43 creates a flight command to the field position corresponding to the plant's position information and a drive command for driving the herbicide spraying device mounted on the drone.
 The commands created by the command creation module 43 are as follows. As the flight command, the command creation module 43 creates the command needed to fly to the plant's position. The command creation module 43 also refers to a herbicide database in which plant identifiers are registered in advance in association with the identifiers of effective herbicides, and identifies a herbicide effective against the plant detected this time. Furthermore, the command creation module 43 determines the required spray amount of the herbicide based on the extracted feature points and feature amounts; for example, it determines the amount according to the size and shape of the extracted plant. As the drive command, the command creation module 43 creates the command needed to have the herbicide spraying device spray the determined herbicide in the determined amount.
 If the herbicide spraying device sprays only one preset herbicide, the command creation module 43 need only determine the spraying amount of that herbicide and create, as the drive command, the commands needed to make the herbicide spraying device spray that amount.
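The command-creation step described above (S35) might be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the database schema, the dose-per-area heuristic, and the command class names are all assumptions, since the text does not specify concrete data structures.

```python
# Illustrative sketch of command creation (step S35). HERBICIDE_DB,
# the dose heuristic, and the command classes are hypothetical.
from dataclasses import dataclass

# Hypothetical herbicide database: plant identifier -> effective herbicide identifier.
HERBICIDE_DB = {"weed_A": "herbicide_X", "weed_B": "herbicide_Y"}

@dataclass
class FlightCommand:
    lat: float   # field position corresponding to the plant's position information
    lon: float

@dataclass
class DriveCommand:
    herbicide_id: str  # which herbicide the spraying device should dispense
    amount_ml: float   # spraying amount

def create_commands(plant_id, lat, lon, plant_area_cm2, dose_ml_per_cm2=0.05):
    """Create a flight command to the plant's position and a drive command
    whose dose scales with the detected plant's size (an assumed heuristic)."""
    herbicide_id = HERBICIDE_DB[plant_id]
    amount = plant_area_cm2 * dose_ml_per_cm2
    return FlightCommand(lat, lon), DriveCommand(herbicide_id, amount)

fly, drive = create_commands("weed_A", 35.6812, 139.7671, plant_area_cm2=120.0)
print(drive.herbicide_id, drive.amount_ml)  # herbicide_X 6.0
```

When the spraying device carries only one preset herbicide, the database lookup drops out and only the amount calculation remains.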
 The computer 10 executes the above-described processing for a preset area of the field or for the entire field. For example, for the entire field, the computer 10 acquires photographing data, determines the presence or absence of a plant at each photographing point, determines whether each detected plant is a crop, and creates (or does not create) the corresponding commands. When processing has been completed for all of the photographing data, the computer 10 executes the processing described later.
 Alternatively, the computer 10 may be configured to execute the processing described later each time it determines, for an individual piece of photographing data, that the plant is not a crop, rather than waiting until processing has been completed for all of the photographing data.
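The batch variant described above, in which every photographing point is examined before any commands are issued, might be sketched as follows. The function names and the position-match tolerance are illustrative assumptions; the text says only that position information is "compared" without giving a tolerance.

```python
# Illustrative sketch of the field-wide batch loop: collect spray targets
# for every detected plant that does not match a stored seeding point.
def is_sown_crop(plant_pos, seeding_points, tol=0.5):
    """A plant counts as the sown crop if it lies within `tol` metres
    (assumed tolerance) of any stored seeding point."""
    px, py = plant_pos
    return any(abs(px - sx) <= tol and abs(py - sy) <= tol
               for sx, sy in seeding_points)

def process_field(shots, seeding_points):
    """shots: one entry per photographing point, either the detected
    plant's field position or None when no plant was detected.
    Returns the positions that should receive herbicide."""
    targets = []
    for plant_pos in shots:
        if plant_pos is None:          # no plant at this photographing point
            continue
        if not is_sown_crop(plant_pos, seeding_points):
            targets.append(plant_pos)  # weed: queue a spray command
    return targets

shots = [None, (10.0, 4.0), (10.2, 4.1), (25.0, 9.0)]
print(process_field(shots, seeding_points=[(10.0, 4.0)]))  # [(25.0, 9.0)]
```

The per-image variant simply calls the same check as each piece of photographing data arrives instead of accumulating a target list.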
 The command transmission module 22 transmits the created commands to the drone and causes the drone to spray the herbicide (step S36). In step S36, the command transmission module 22 transmits the flight command and drive command described above. The drone receives these commands, flies to the position of the target plant in accordance with the flight command, and, in accordance with the drive command, sprays the specified type of herbicide in the specified amount.
 As a result, the computer 10 causes the herbicide to be sprayed on plants located anywhere other than the seeding points.
 The target of the commands created and transmitted by the computer 10 is not limited to a drone; it may be other agricultural implements, high-performance agricultural machinery, or the like. In that case, the computer 10 simply replaces the flight command with a command suited to the implement or machine, such as a travel command.
 The above is the first herbicide spraying support process.
 [First learning process]
 The first learning process executed by the herbicide spraying support system 1 will be described with reference to FIG. 5. FIG. 5 is a flowchart of the first learning process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process.
 The learning module 44 learns the captured images of the plants on which the herbicide was sprayed (step S40). In step S40, the learning module 44 learns the captured images of the plants that the crop identification module 42 determined, in step S34 described above, not to be the sown crop. It learns the feature points and feature amounts of each such plant together with the fact that the plant is not the sown crop.
 The storage module 30 stores the learning result (step S41).
 The computer 10 executes the processing of steps S32 and S34 described above while taking the stored learning result into account. Specifically, the image analysis module 40 determines whether a plant has been detected by combining the image analysis result with the learning result. For example, based on the feature points and feature amounts extracted by the image analysis and those in the learning result, the image analysis module 40 can determine whether the extracted feature points and feature amounts belong to a plant. Likewise, the crop identification module 42 determines whether a plant is a crop based not only on the comparison of position information but also on the feature points and feature amounts in the learning result. For example, even when the position information matches, the crop identification module 42 can determine that a plant is not a crop if its image matches or resembles a non-crop plant in the learning result.
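The combined judgement described above, where a learned non-crop appearance overrides a positional match, might be sketched as follows. The feature vectors, the cosine-similarity measure, and the threshold are all illustrative assumptions; the text does not specify how "matches or resembles" is computed.

```python
# Illustrative sketch: a plant whose features resemble a previously learned
# non-crop is judged a weed even when its position matches a seeding point.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def is_crop(position_match, features, learned_weed_features, threshold=0.9):
    """position_match: result of comparing the plant's position with the
    stored seeding-point position. Returns False when the plant's features
    match a learned non-crop (assumed similarity rule)."""
    if not position_match:
        return False
    for weed in learned_weed_features:
        if cosine_similarity(features, weed) >= threshold:
            return False   # looks like a previously learned weed
    return True

learned = [[0.9, 0.1, 0.0]]
print(is_crop(True, [0.89, 0.11, 0.01], learned))  # False (resembles a weed)
print(is_crop(True, [0.0, 0.2, 0.9], learned))     # True
```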
 The above is the first learning process.
 [Sprout point storage process]
 The sprout point storage process executed by the herbicide spraying support system 1 will be described with reference to FIG. 6. FIG. 6 is a flowchart of the sprout point storage process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process; detailed description of processing identical to that described above is omitted. This process is performed at the time when the sown crop has grown and its sprouts have emerged.
 The photographing data acquisition module 21 acquires a captured image of the field and the position information of the photographing point as first photographing data (step S50). The processing of step S50 is the same as that of step S30 described above.
 The image analysis module 40 analyzes the captured image based on the first photographing data (step S51). The processing of step S51 is the same as that of step S31 described above.
 The image analysis module 40 determines, based on the result of the image analysis, whether a crop sprout has been detected (step S52). In step S52, the image analysis module 40 determines whether a crop sprout is present in the captured image by comparing the extracted feature points and feature amounts against a sprout database in which the feature points and feature amounts of crop sprouts are registered in advance in association with crop identifiers. If the feature points and feature amounts extracted this time match those registered in the sprout database, the image analysis module 40 determines that a crop sprout has been detected; otherwise, it determines that no crop sprout has been detected. When a crop sprout is detected, the image analysis module 40 also identifies the crop's identifier based on the sprout's feature points and feature amounts and the sprout database.
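The sprout-database comparison described above might be sketched as follows. The text does not state a concrete matching rule, so nearest-neighbour matching with a distance threshold is an illustrative assumption, as are the database contents and feature-vector form.

```python
# Illustrative sketch of the sprout-detection judgement (step S52):
# extracted features are matched against a pre-registered sprout database
# mapping crop identifiers to reference feature vectors (all hypothetical).
SPROUT_DB = {
    "rice":  [0.2, 0.8, 0.1],
    "wheat": [0.7, 0.3, 0.2],
}

def detect_sprout(features, threshold=0.15):
    """Return the matching crop identifier, or None when no sprout is
    detected (assumed Euclidean-distance rule)."""
    best_id, best_dist = None, float("inf")
    for crop_id, ref in SPROUT_DB.items():
        dist = sum((f - r) ** 2 for f, r in zip(features, ref)) ** 0.5
        if dist < best_dist:
            best_id, best_dist = crop_id, dist
    return best_id if best_dist <= threshold else None

print(detect_sprout([0.22, 0.78, 0.12]))  # rice
print(detect_sprout([0.5, 0.5, 0.5]))     # None
```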
 If the image analysis module 40 determines in step S52 that no crop sprout could be detected (step S52: NO), the computer 10 ends this process. In this case, the computer 10 may instead be configured to execute the processing of step S50 described above and acquire first photographing data different from the first photographing data analyzed this time.
 On the other hand, if the image analysis module 40 determines in step S52 that a crop sprout has been detected (step S52: YES), the position specifying module 41 identifies the position information of the detected crop sprout based on the position information in the first photographing data (step S53). In step S53, the position specifying module 41 identifies, from the first photographing data, the position information of the photographing point corresponding to the captured image analyzed this time, and takes this as the position information of the crop sprout. Furthermore, the position specifying module 41 identifies the detailed position of the crop sprout based on this position information and the coordinates in the captured image. For example, the position specifying module 41 sets an orthogonal coordinate system on the captured image and identifies the location at which the crop sprout was detected as an X coordinate and a Y coordinate in that image.
 The position specifying module 41 then identifies the position information of the crop sprout in the actual field based on the position information of the photographing point and these X and Y coordinates. Here, the centre of the captured image is taken to correspond to the position information of the photographing point, and the X and Y coordinates are interpreted as positions relative to that centre. As a result, the position specifying module 41 identifies the position of the crop sprout in the field.
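The image-centre-based position refinement described above might be sketched as follows. The image dimensions and the ground resolution (metres per pixel, which in practice depends on the camera and flight altitude) are assumed parameters not given in the text; positions are kept in a local metric frame for simplicity rather than as latitude/longitude.

```python
# Illustrative sketch of step S53's refinement: the photographing point's
# position fix is taken as the image centre, and the sprout's pixel offset
# from that centre is scaled to metres (m_per_px is an assumed parameter).
def sprout_field_position(shot_north_m, shot_east_m, px, py,
                          img_w=4000, img_h=3000, m_per_px=0.01):
    """Convert pixel coordinates (px, py) in the captured image into a
    field position, treating the image centre as the photographing point."""
    dx = (px - img_w / 2) * m_per_px   # east offset in metres
    dy = (img_h / 2 - py) * m_per_px   # north offset (image y grows downward)
    return shot_north_m + dy, shot_east_m + dx

print(sprout_field_position(100.0, 200.0, px=2500, py=1000))  # (105.0, 205.0)
```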
 The storage module 30 stores the position information of the sprout point at which the crop sprout exists (step S54). In step S54, the storage module 30 may store only the position information of the sprout point, or may store it in association with identifiers of the worker, the user, or the field.
 The above is the sprout point storage process.
 [Second herbicide spraying support process]
 The second herbicide spraying support process executed by the herbicide spraying support system 1 will be described with reference to FIG. 7. FIG. 7 is a flowchart of the second herbicide spraying support process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process; detailed description of processing identical to that described above is omitted.
 The photographing data acquisition module 21 acquires, as second photographing data, a captured image of the field taken at a time different from the first photographing data, together with the position information of the photographing point (step S60). The processing of step S60 is the same as that of step S30 described above.
 The image analysis module 40 analyzes the captured image based on the second photographing data (step S61). The processing of step S61 is the same as that of step S31 described above.
 The image analysis module 40 determines, based on the result of the image analysis, whether a plant has been detected (step S62). The processing of step S62 is the same as that of step S32 described above.
 If the image analysis module 40 determines in step S62 that no plant could be detected (step S62: NO), the computer 10 ends this process. In this case, the computer 10 may instead be configured to execute the processing of step S60 described above and acquire second photographing data different from the second photographing data analyzed this time.
 On the other hand, if the image analysis module 40 determines in step S62 that a plant has been detected (step S62: YES), the position specifying module 41 identifies the position information of the detected plant based on the position information in the photographing data (step S63). The processing of step S63 is the same as that of step S33 described above.
 The crop identification module 42 determines whether this plant is a crop based on the position information of the sprout point stored in step S54 described above and the identified position information of the plant (step S64). In step S64, the crop identification module 42 compares the position information of the sprout point with the identified position information of the plant and makes this determination according to whether they match. If it determines that they match, the crop identification module 42 determines that the plant is a crop (that is, a grown crop sprout) (step S64: YES), and the computer 10 ends this process without executing the processing described below. As in step S62 described above, the computer 10 may instead be configured to execute the processing of step S60 described above and acquire second photographing data different from the second photographing data analyzed this time.
 As a result, when the detected plant is a crop, the computer 10 does not create or transmit the commands needed to spray the herbicide, and therefore does not cause the herbicide to be sprayed on the crop.
 On the other hand, if the crop identification module 42 determines in step S64 that the position information does not match, it determines that this plant is not a crop (that is, it is a weed rather than a grown crop sprout) (step S64: NO), and the command creation module 43 creates commands that cause the drone to spray a herbicide onto this plant (step S65). The processing of step S65 is the same as that of step S35 described above.
 The command transmission module 22 transmits the created commands to the drone and causes the drone to spray the herbicide (step S66). The processing of step S66 is the same as that of step S36 described above.
 As a result, the computer 10 causes the herbicide to be sprayed on plants located anywhere other than the sprout points.
 The target of the commands created and transmitted by the computer 10 is not limited to a drone; it may be other agricultural implements, high-performance agricultural machinery, or the like. In that case, the computer 10 simply replaces the flight command with a command suited to the implement or machine, such as a travel command.
 The above is the second herbicide spraying support process.
 [Second learning process]
 The second learning process executed by the herbicide spraying support system 1 will be described with reference to FIG. 8. FIG. 8 is a flowchart of the second learning process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process.
 The learning module 44 learns the captured images of the plants on which the herbicide was sprayed (step S70). In step S70, the learning module 44 learns the captured images of the plants that the crop identification module 42 determined, in step S64 described above, not to be crops. It learns the feature points and feature amounts of each such plant together with the fact that the plant is not a crop.
 The storage module 30 stores the learning result (step S71).
 The computer 10 executes the processing of steps S62 and S64 described above while taking the stored learning result into account. Specifically, the image analysis module 40 determines whether a plant has been detected by combining the image analysis result with the learning result. For example, based on the feature points and feature amounts extracted by the image analysis and those in the learning result, the image analysis module 40 can determine whether the extracted feature points and feature amounts belong to a plant. Likewise, the crop identification module 42 determines whether a plant is a crop based not only on the comparison of position information but also on the feature points and feature amounts in the learning result. For example, even when the position information matches, the crop identification module 42 can determine that a plant is not a crop if its image matches or resembles a non-crop plant in the learning result.
 The above is the second learning process.
 The means and functions described above are realized by a computer (including a CPU, an information processing device, and various terminals) reading and executing a predetermined program. The program is provided, for example, in the form of software as a service (SaaS) delivered from a computer via a network, or in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM or the like), or a DVD (DVD-ROM, DVD-RAM, or the like). In the latter case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, and stores and executes it. The program may also be recorded in advance on a storage device (recording medium) such as a magnetic disk, optical disk, or magneto-optical disk, and provided from that storage device to the computer via a communication line.
 Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments. Further, the effects described in the embodiments are merely a list of the most favourable effects arising from the present invention, and the effects of the present invention are not limited to those described in the embodiments.
1 herbicide spraying support system, 10 computer

Claims (8)

  1.  A computer system for supporting the spraying of a herbicide, comprising:
     a first acquisition unit that acquires position information of a seeding point of a crop;
     a storage unit that stores the position information of the seeding point;
     a second acquisition unit that acquires a captured image of a field and position information of the photographing point;
     a detection unit that analyzes the captured image and detects a plant;
     a specifying unit that specifies position information of the plant based on the position information of the photographing point; and
     a spraying unit that, based on the stored position information of the seeding point and the specified position information of the plant, causes a herbicide to be sprayed on plants other than those at the seeding point.
  2.  A computer system for supporting the spraying of a herbicide, comprising:
     a first acquisition unit that acquires a first captured image of a field and position information of the photographing point;
     a first detection unit that analyzes the first captured image and detects a crop sprout;
     a first specifying unit that specifies position information of the sprout point at which the sprout exists, based on the position information of the photographing point;
     a storage unit that stores the specified position information of the sprout point;
     a second acquisition unit that acquires a second captured image of the field taken at a time different from the first captured image, and position information of the photographing point;
     a second detection unit that analyzes the second captured image and detects a plant;
     a second specifying unit that specifies position information of the plant based on the position information of the photographing point; and
     a spraying unit that, based on the stored position information of the sprout point and the specified position information of the plant, causes a herbicide to be sprayed on plants other than those at the sprout point.
  3.  The computer system according to claim 1, further comprising:
     a learning unit that learns images of the plants on which the herbicide was sprayed,
     wherein the detection unit detects the plant based on the learning result and the result of the image analysis.
  4.  The computer system according to claim 2, further comprising:
     a learning unit that learns images of the plants on which the herbicide was sprayed,
     wherein the second detection unit detects the plant based on the learning result and the result of the image analysis.
  5.  A herbicide spraying support method executed by a computer system for supporting the spraying of a herbicide, comprising the steps of:
     acquiring position information of a seeding point of a crop;
     storing the position information of the seeding point;
     acquiring a captured image of a field and position information of the photographing point;
     analyzing the captured image and detecting a plant;
     specifying position information of the plant based on the position information of the photographing point; and
     causing, based on the stored position information of the seeding point and the specified position information of the plant, a herbicide to be sprayed on plants other than those at the seeding point.
  6.  A herbicide spraying support method executed by a computer system for supporting the spraying of a herbicide, comprising the steps of:
     acquiring a first captured image of a field and position information of the photographing point;
     analyzing the first captured image and detecting a crop sprout;
     specifying position information of the sprout point at which the sprout exists, based on the position information of the photographing point;
     storing the specified position information of the sprout point;
     acquiring a second captured image of the field taken at a time different from the first captured image, and position information of the photographing point;
     analyzing the second captured image and detecting a plant;
     specifying position information of the plant based on the position information of the photographing point; and
     causing, based on the stored position information of the sprout point and the specified position information of the plant, a herbicide to be sprayed on plants other than those at the sprout point.
  7.  A computer-readable program for causing a computer system for supporting the spraying of a herbicide to execute the steps of:
     acquiring position information of a seeding point of a crop;
     storing the position information of the seeding point;
     acquiring a captured image of a field and position information of the photographing point;
     analyzing the captured image and detecting a plant;
     specifying position information of the plant based on the position information of the photographing point; and
     causing, based on the stored position information of the seeding point and the specified position information of the plant, a herbicide to be sprayed on plants other than those at the seeding point.
  8.  A computer-readable program for causing a computer system for supporting the spraying of a herbicide to execute the steps of:
     acquiring a first captured image of a field and position information of the photographing point;
     analyzing the first captured image and detecting a crop sprout;
     specifying position information of the sprout point at which the sprout exists, based on the position information of the photographing point;
     storing the specified position information of the sprout point;
     acquiring a second captured image of the field taken at a time different from the first captured image, and position information of the photographing point;
     analyzing the second captured image and detecting a plant;
     specifying position information of the plant based on the position information of the photographing point; and
     causing, based on the stored position information of the sprout point and the specified position information of the plant, a herbicide to be sprayed on plants other than those at the sprout point.
PCT/JP2019/003256 2019-01-30 2019-01-30 Computer system, crop growth support method, and program WO2020157878A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020569247A JP7068747B2 (en) 2019-01-30 2019-01-30 Computer system, crop growth support method and program
PCT/JP2019/003256 WO2020157878A1 (en) 2019-01-30 2019-01-30 Computer system, crop growth support method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/003256 WO2020157878A1 (en) 2019-01-30 2019-01-30 Computer system, crop growth support method, and program

Publications (1)

Publication Number Publication Date
WO2020157878A1 true WO2020157878A1 (en) 2020-08-06

Family

ID=71841293

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/003256 WO2020157878A1 (en) 2019-01-30 2019-01-30 Computer system, crop growth support method, and program

Country Status (2)

Country Link
JP (1) JP7068747B2 (en)
WO (1) WO2020157878A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7000619B1 (en) 2021-05-21 2022-02-14 株式会社オプティム Programs, methods, injection devices, and systems
JP2022065581A (en) * 2020-10-15 2022-04-27 西武建設株式会社 Weeder using unmanned aerial vehicle
JP7498203B2 (en) 2022-02-09 2024-06-11 フタバ産業株式会社 Mobile weeding equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2001292684A (en) * 2000-04-11 2001-10-23 Hokkaido National Agricultural Experiment Station System for controlling application of agrochemicals or fertilizer
JP2013059296A (en) * 2011-09-14 2013-04-04 Chugoku Electric Power Co Inc:The Method and system for managing farmland
WO2017106903A1 (en) * 2015-12-23 2017-06-29 Aerobugs Pty Ltd A particulate dispersal assembly and method of use
JP2018525976A (en) * 2015-07-02 2018-09-13 エコロボティクス・ソシエテ・アノニム Robotic vehicle for automatic processing of plant organisms and method of using a robot
WO2018173577A1 (en) * 2017-03-23 2018-09-27 日本電気株式会社 Vegetation index calculation device, vegetation index calculation method, and computer readable recording medium

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
JP2001292684A (en) * 2000-04-11 2001-10-23 Hokkaido National Agricultural Experiment Station System for controlling application of agrochemicals or fertilizer
JP2013059296A (en) * 2011-09-14 2013-04-04 Chugoku Electric Power Co Inc:The Method and system for managing farmland
JP2018525976A (en) * 2015-07-02 2018-09-13 エコロボティクス・ソシエテ・アノニム Robotic vehicle for automatic processing of plant organisms and method of using a robot
WO2017106903A1 (en) * 2015-12-23 2017-06-29 Aerobugs Pty Ltd A particulate dispersal assembly and method of use
WO2018173577A1 (en) * 2017-03-23 2018-09-27 日本電気株式会社 Vegetation index calculation device, vegetation index calculation method, and computer readable recording medium

Cited By (4)

Publication number Priority date Publication date Assignee Title
JP2022065581A (en) * 2020-10-15 2022-04-27 西武建設株式会社 Weeder using unmanned aerial vehicle
JP7000619B1 (en) 2021-05-21 2022-02-14 株式会社オプティム Programs, methods, injection devices, and systems
JP2022178975A (en) * 2021-05-21 2022-12-02 株式会社オプティム Program, method, injection device, and system
JP7498203B2 (en) 2022-02-09 2024-06-11 フタバ産業株式会社 Mobile weeding equipment

Also Published As

Publication number Publication date
JPWO2020157878A1 (en) 2021-06-03
JP7068747B2 (en) 2022-05-17

Similar Documents

Publication Publication Date Title
US12026944B2 (en) Generation of digital cultivation maps
WO2020157878A1 (en) Computer system, crop growth support method, and program
CN109197278B (en) Method and device for determining operation strategy and method for determining drug spraying strategy
EP3741214A1 (en) Method for plantation treatment based on image recognition
JP7086203B2 (en) Plant cultivation data measurement method, work route planning method and equipment, system
JP2019520631A (en) Weed recognition in the natural environment
US11632907B2 (en) Agricultural work apparatus, agricultural work management system, and program
JP2022542764A (en) Method for generating application maps for treating farms with agricultural equipment
JP7075171B2 (en) Computer systems, pest detection methods and programs
CN111479459A (en) System, method, and program for predicting growth status or disease/pest occurrence status
KR20200077808A (en) Video-based monitoring apparatus and operation method
CN106645147A (en) Method for pest and disease damage monitoring
JP7066258B2 (en) Computer system, crop growth support method and program
US20220392214A1 (en) Scouting functionality emergence
Liu et al. Development of a proximal machine vision system for off-season weed mapping in broadacre no-tillage fallows
EP4187344B1 (en) Work machine distance prediction and action control
WO2019244156A1 (en) System for in-situ imaging of plant tissue
JP2022064532A (en) Weeding device, automatic weeding method, and automatic weeding program
KR102371433B1 (en) System and method for generating farming map of agricultural robot based on artificial intelligence
TW202219879A (en) Plant growth identification system
WO2021149355A1 (en) Information processing device, information processing method, and program
KR20180133610A (en) Insect pest image acquisition method for insect pest prediction system of cash crops
US20220406039A1 (en) Method for Treating Plants in a Field
WO2023105112A1 (en) Weeding robot
Karkee et al. Advanced Technologies for Crop-Load Management

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913309

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020569247

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913309

Country of ref document: EP

Kind code of ref document: A1