WO2020174645A1 - Computer system, pest detection method, program - Google Patents

Computer system, pest detection method, program

Info

Publication number
WO2020174645A1
WO2020174645A1 (PCT/JP2019/007762)
Authority
WO
WIPO (PCT)
Prior art keywords
pest
position information
computer
image
pests
Prior art date
Application number
PCT/JP2019/007762
Other languages
French (fr)
Japanese (ja)
Inventor
Shunji Sugaya
Original Assignee
OPTiM Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OPTiM Corporation
Priority to JP2021501481A priority Critical patent/JP7075171B2/en
Priority to PCT/JP2019/007762 priority patent/WO2020174645A1/en
Publication of WO2020174645A1 publication Critical patent/WO2020174645A1/en

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass

Definitions

  • the present invention relates to a computer system for detecting pests by image analysis, a pest detection method, and a program.
  • a system for controlling pests based on a live image of a field taken by a drone in real time detects the presence or absence of a pest by acquiring the live image and the position information of the shooting point of the live image, and comparing the live image with a specific image in which a pest or the like is captured in advance.
  • when the system detects a pest, it outputs the position information of the shooting point where the live image was taken to a drone or the like to exterminate the pest.
  • in the configuration of Patent Document 1, the detection of pests is limited to live images taken in real time, and may not be effective for captured images other than such live images.
  • the present invention is not limited to live images, and an object of the present invention is to provide a computer system, a pest detection method, and a program that facilitate detection and control of pests based on captured images.
  • the present invention provides the following solutions.
  • the present invention is a computer system for detecting pests by image analysis, comprising: an acquisition means for acquiring a captured image of a field and the position information of the shooting point; a detection means for analyzing the acquired captured image and detecting pests; and a position information output means for outputting the position information of the shooting point of the captured image in which the pest was detected.
  • according to the present invention, a computer system for detecting pests by image analysis acquires a captured image of a field and the position information of the shooting point, analyzes the acquired captured image to detect pests, and outputs the position information of the shooting point of the captured image in which the pest was detected.
  • the present invention belongs to the system category, but in other categories such as methods and programs it exhibits the same actions and effects according to the category.
  • according to the present invention, it is possible to provide a computer system, a pest detection method, and a program that are not limited to live images and that facilitate detection and extermination of pests based on captured images.
  • FIG. 1 is a diagram showing an outline of the pest detection system 1.
  • FIG. 2 is an overall configuration diagram of the pest detection system 1.
  • FIG. 3 is a diagram showing a flowchart of the position information output process executed by the computer 10.
  • FIG. 4 is a diagram showing a flowchart of the instruction output process executed by the computer 10.
  • FIG. 5 is a diagram schematically illustrating an example of the pest map created by the computer 10.
  • FIG. 1 is a diagram for explaining an outline of a pest detection system 1 which is a preferred embodiment of the present invention.
  • the pest detection system 1 is a computer system that includes a computer 10 and detects a pest by image analysis.
  • the pest detection system 1 may also include terminals and devices not shown, such as a drone, agricultural machinery, a high-performance agricultural machine, a worker terminal possessed by a worker who cultivates crops (for example, a smartphone, a tablet terminal, or a personal computer), and other computers. Further, the pest detection system 1 may be realized by one computer such as the computer 10, or by a plurality of computers such as cloud computers.
  • the computer 10 is connected to a drone, an agricultural machine tool, a high-performance agricultural machine, a worker terminal, other computers, and the like so as to be able to perform data communication via a public line network, etc., and transmits and receives necessary data.
  • the computer 10 acquires a photographed image of the field and position information of the photographing point.
  • the computer 10 for example, acquires a photographed image of each point in the field by a drone.
  • the computer 10 acquires from the drone, for example, the position information of the shooting location where the drone shot the shot image.
  • the computer 10 analyzes the acquired photographed image and detects the pests shown in this photographed image.
  • the computer 10 extracts, for example, as image analysis, feature points (for example, shape, contour, and hue) and feature amounts (for example, statistical values such as the average, variance, and histogram of pixel values) of the captured image.
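The patent leaves the feature-amount extraction abstract. As a rough illustration of the statistics it names (average of pixel values, variance, histogram), here is a minimal sketch in Python with NumPy; the array layout and bin count are assumptions, not taken from the patent.

```python
import numpy as np

def extract_features(image):
    """Extract simple feature amounts from a captured image.

    `image` is assumed to be a uint8 NumPy array (H x W, or H x W x 3);
    the patent fixes no concrete representation, so this is illustrative.
    """
    hist, _ = np.histogram(image, bins=16, range=(0, 256))
    return {
        "mean": float(image.mean()),     # average of pixel values
        "variance": float(image.var()),  # variance of pixel values
        "histogram": hist.tolist(),      # 16-bin intensity histogram
    }

# A uniform gray patch: zero variance, all mass in one histogram bin.
features = extract_features(np.full((4, 4), 128, dtype=np.uint8))
```

In practice the extracted shape and contour features named above would come from a contour-detection step (e.g. in an image-processing library), which this sketch omits.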
  • the computer 10 identifies the position information of the detected pest based on the acquired position information of the shooting point. For example, the computer 10 regards the position information of the detected pest as matching the position information of the shooting point. As a result, the computer 10 specifies the position of the detected pest in the field.
  • the computer 10 outputs the position information of the specified pest. At this time, the computer 10 outputs this position information to the above-mentioned drone, agricultural equipment, high-performance agricultural machine, worker terminal, other computer, or the like.
  • the computer 10 can also be configured to output an instruction for spraying pesticide or fertilizing based on this position information.
  • the computer 10 moves the drone to the position of the pest based on the position information, and transmits to the drone a command instructing its pesticide-spraying device or fertilizer-applying device to spray the pesticide or apply the fertilizer.
  • the drone receives this command and sprays pesticides or fertilizers against the pests.
  • the computer 10 outputs a pesticide spraying or fertilizing instruction based on this position information.
  • the computer 10 acquires a photographed image of the field and position information of the photographing point as photographing data (step S01).
  • the computer 10 acquires, for example, captured images obtained by capturing images of various points in the field by the drone.
  • the computer 10 acquires from the drone, for example, the position information of the shooting location where the drone shot the shot image.
  • the computer 10 acquires such photographed images and positional information of photographing points as photographing data.
  • the computer 10 analyzes the photographed image and detects the pests shown in this photographed image (step S02).
  • the computer 10 extracts, for example, a feature point or a feature amount of a captured image as image analysis.
  • the computer 10 detects the pests shown in this captured image based on the extracted characteristic points and characteristic amounts.
  • the computer 10 identifies the position information of the detected pest based on the acquired position information of the shooting point (step S03).
  • the computer 10 specifies the position information of the detected pest by regarding it as matching the position information of the shooting point. As a result, the computer 10 specifies the position of the detected pest in the field.
  • the computer 10 outputs the position information of the specified pest (step S04).
  • the computer 10 outputs this position information to, for example, the above-mentioned drone, agricultural machinery, high-performance agricultural machine, worker terminal, and other computers.
  • FIG. 2 is a diagram showing the system configuration of the pest detection system 1 according to the preferred embodiment of the present invention.
  • the pest detection system 1 is a computer system that includes a computer 10 and detects a pest by image analysis.
  • the computer 10 is connected to a drone, an agricultural machine tool, a high-performance agricultural machine, a worker terminal, other computers, etc. so as to be able to perform data communication via a public network, etc., and transmits/receives necessary data.
  • the pest detection system 1 may include a drone, an agricultural machine, a high-performance agricultural machine, a worker terminal, other computers, and other terminals and devices that are not shown. Further, the pest detection system 1 may be realized by one computer such as the computer 10 or may be realized by a plurality of computers such as a cloud computer.
  • the computer 10 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and includes, as a communication unit, a device that enables communication with other terminals or devices, for example, a device compatible with Wi-Fi (Wireless Fidelity) compliant with IEEE 802.11.
  • the computer 10 also includes, as a recording unit, a data storage unit such as a hard disk, a semiconductor memory, a recording medium, and a memory card. Further, the computer 10 includes various devices that execute various processes as a processing unit.
  • in the computer 10, the control unit reads a predetermined program to realize the imaging data acquisition module 20, the pest data output module 21, and the command output module 22 in cooperation with the communication unit. Further, the control unit reads a predetermined program to realize the recording module 30 in cooperation with the recording unit. Further, the control unit reads a predetermined program to realize, in cooperation with the processing unit, the image analysis module 40, the position identification module 41, the countermeasure data identification module 42, the pest map creation module 43, and the command creation module 44.
  • FIG. 3 is a diagram showing a flowchart of the position information output process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the shooting data acquisition module 20 acquires the shooting image of the field and the position information of the shooting point as the shooting data (step S10).
  • the shooting data acquisition module 20 acquires, for example, the captured images taken by the drone at each of a plurality of preset shooting points in the field, together with the position information of each shooting point at which the drone took them, as shooting data.
  • the drone shoots the area directly below itself as a captured image. That is, the drone captures the captured image from a position perpendicular to the field.
  • the drone captures a captured image at the capturing location and acquires its own position information from the GPS or the like.
  • the drone handles the acquired position information of itself as position information of the shooting point.
  • the drone transmits the photographed image and the position information of the photographing point to the computer 10 as photographing data.
  • the imaging data acquisition module 20 acquires imaging data by receiving the imaging data transmitted by the drone.
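The shooting data described above pairs each captured image with the GPS fix of its shooting point. It can be pictured as a simple record; the field names and values below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ShootingData:
    """A captured image paired with the GPS fix of its shooting point."""
    image: bytes      # captured image payload as received from the drone
    latitude: float   # drone position when the image was taken
    longitude: float  # (the drone's own fix is treated as the shooting point)

# One record per preset shooting point in the field.
record = ShootingData(image=b"...", latitude=35.6812, longitude=139.7671)
```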
  • the computer 10 acquires the photographed image of the field and the position information of the photographing point.
  • the image analysis module 40 analyzes the captured image based on the captured data (step S11). In step S11, the image analysis module 40 extracts feature points and feature amounts in this captured image. The image analysis module 40 extracts, for example, the shape, hue, etc. of an object existing in a captured image as image analysis.
  • the image analysis module 40 determines whether the pest could be detected based on the result of the image analysis (step S12).
  • in step S12, the image analysis module 40 compares the extracted feature points and feature amounts with a pest database in which the feature points, feature amounts, and identifiers of pests are registered in advance, and determines whether a pest is present in the captured image.
  • the image analysis module 40 determines that a pest has been detected when the feature points or feature amounts extracted this time match those registered in the pest database; if they do not match, it determines that no pest has been detected.
  • the image analysis module 40 also identifies the pest identifier from the pest database based on the feature points and feature amounts of the pest.
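The comparison against the pest database can be pictured as a nearest-neighbour check on feature vectors. The database contents, vector layout, and threshold below are all invented for illustration; the patent does not specify a matching rule.

```python
import math

# Hypothetical pest database: pest identifier -> reference feature vector.
PEST_DB = {
    "aphid":  [0.9, 0.2, 0.1],
    "mite":   [0.3, 0.8, 0.5],
    "blight": [0.1, 0.1, 0.9],
}

def detect_pest(features, threshold=0.3):
    """Return the identifier of the closest registered pest, or None
    when nothing is near enough (i.e. no pest could be detected)."""
    best_id, best_dist = None, float("inf")
    for pest_id, ref in PEST_DB.items():
        dist = math.dist(features, ref)  # Euclidean distance (Python 3.8+)
        if dist < best_dist:
            best_id, best_dist = pest_id, dist
    return best_id if best_dist <= threshold else None
```

The threshold plays the role of the match/no-match decision in step S12: below it, the pest and its identifier are considered detected; above it, processing ends.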
  • in step S12, when the image analysis module 40 determines that no pest could be detected as a result of the image analysis (NO in step S12), the computer 10 ends this processing.
  • the computer 10 may be configured to execute the above-described processing of step S10 and acquire shooting data different from the shooting data subjected to the image analysis this time.
  • in step S12, when the image analysis module 40 determines that a pest has been detected as a result of the image analysis (YES in step S12), the position specifying module 41 specifies the position information of the detected pest based on the position information in the shooting data (step S13).
  • step S13 the position specifying module 41 specifies the position information of the shooting point corresponding to the shot image subjected to the image analysis this time based on the shooting data.
  • the position specifying module 41 specifies the position information of the shooting point as the position information of this pest.
  • the position specifying module 41 specifies the detailed position of the pest based on the position information of the pest and the coordinates in the captured image.
  • the position specifying module 41 sets an orthogonal coordinate system on the captured image and specifies the position at which this pest is detected as X and Y coordinates in this captured image.
  • the position specifying module 41 specifies the position information of the pest in the actual field based on the position information of the shooting point and the X and Y coordinates.
  • the position specifying module 41 treats the center of the captured image as the position information of the shooting point, and interprets the X and Y coordinates as positions relative to this center.
  • the position specifying module 41 specifies the position of this pest in the field.
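Treating the image centre as the shooting-point fix and offsetting by the pest's X and Y coordinates might look as follows. The ground sampling distance and the flat-earth degree conversion are assumptions not stated in the patent; they are reasonable only over a single field.

```python
import math

def pest_position(center_lat, center_lon, x_px, y_px, img_w, img_h, gsd_m):
    """Convert a pixel (x, y) in a nadir image to an approximate lat/lon.

    The shooting-point GPS fix is treated as the image centre; gsd_m is
    an assumed ground sampling distance in metres per pixel.
    """
    dx_m = (x_px - img_w / 2) * gsd_m    # east offset in metres
    dy_m = (img_h / 2 - y_px) * gsd_m    # north offset (image y grows downward)
    lat = center_lat + dy_m / 111_320.0  # ~metres per degree of latitude
    lon = center_lon + dx_m / (111_320.0 * math.cos(math.radians(center_lat)))
    return lat, lon
```

A pest at the exact image centre maps back to the shooting point itself; pixels above the centre map to positions north of it.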
  • the countermeasure data identification module 42 identifies, as countermeasure data, the identifier of the pesticide or fertilizer effective for the pest identified this time and the sprayed amount or fertilizer application amount of this pesticide or fertilizer (step S14).
  • the countermeasure data identification module 42 refers to a pest countermeasure database in which pest identifiers and the identifiers of effective pesticides or fertilizers are registered in advance, and identifies the pesticide or fertilizer effective for the pest detected this time.
  • the countermeasure data identification module 42 identifies the sprayed amount or fertilized amount of this pesticide or fertilizer based on the extracted characteristic points or characteristic amounts. For example, the countermeasure data identification module 42 identifies the required spraying amount of pesticides and fertilizer application amount according to the size and shape of the extracted pests.
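One way to realize this lookup and scaling is sketched below, with a made-up countermeasure table and a size-proportional dose rule; the patent names the lookup and the feature-dependent amount but prescribes neither the table contents nor the scaling formula.

```python
# Hypothetical pest-countermeasure database:
# pest identifier -> (effective pesticide/fertilizer, base dose in litres).
COUNTERMEASURES = {
    "aphid":  ("Pesticide A", 1.0),
    "blight": ("Fertilizer B", 2.0),
}

def countermeasure_data(pest_id, pest_size_px, reference_size_px=100):
    """Pick the registered chemical and scale its dose by the apparent
    size of the detected pest or damaged area (an illustrative rule)."""
    chemical, base_dose = COUNTERMEASURES[pest_id]
    dose = round(base_dose * pest_size_px / reference_size_px, 2)
    return {"chemical": chemical, "dose_litres": dose}
```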
  • the recording module 30 records the identifier of the specified pest, the position information of the pest, and the countermeasure data as the pest data (step S15).
  • the computer 10 repeats the processing of steps S10 to S14 described above, and records the identifiers, position information, and countermeasure data of each detected pest for the entire field.
  • the computer 10 may be configured to execute the process described below without recording pest data on the entire field.
  • the computer 10 may be configured to record each piece of pest data as it is specified and to use that pest data in the processing described later.
  • the pest map creation module 43 creates a pest map showing the identifiers and positions of the detected pests in this field based on the recorded pest data (step S16).
  • the pest map creation module 43 creates an entire map of a preset area or of the entire field. This entire map is created based on the position information of the preset area or of the entire field. For example, when a rectangular area is set, the entire map is created based on the position information of each vertex of this rectangle.
  • the pest map creation module 43 creates a pest map in which the pest icon is superimposed on the position corresponding to the position information of the specified pest in the entire map.
  • the pest icon is, for example, a symbol such as a circle, a square, or a star, text indicating an identifier of the pest, or an illustration of the pest.
  • the pest icon can also indicate the type of pest and the degree of damage at this location, depending on the shape, size, and the like.
  • the pest map creation module 43 can also superimpose on the pest map, based on the recorded pest data, a countermeasure icon indicating the identifier and spray amount of a pesticide, or the identifier and application amount of a fertilizer, that is effective as a countermeasure against this pest.
  • This countermeasure icon is, for example, text indicating the identifier of the necessary pesticide or fertilizer and its spray or application amount, such as "Pesticide A, 1 liter spray", or an illustration showing these.
  • the pest map creation module 43 can also show the position of the crop to be cultivated on this entire map with icons, texts, illustrations, and the like.
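As a toy stand-in for the graphical pest map, the placement of pest positions and countermeasure text on a field grid can be sketched like this; the grid size, record fields, and text rendering are all invented for illustration.

```python
def render_pest_map(width, height, pests):
    """Render a crude text pest map: '*' marks a pest position, and a
    legend lists the countermeasure text that would back each icon."""
    grid = [["." for _ in range(width)] for _ in range(height)]
    legend = []
    for pest in pests:
        grid[pest["y"]][pest["x"]] = "*"
        legend.append(f"({pest['x']},{pest['y']}) {pest['id']}: {pest['measure']}")
    return "\n".join("".join(row) for row in grid), legend

pest_map, legend = render_pest_map(
    3, 2, [{"x": 1, "y": 0, "id": "aphid", "measure": "Pesticide A 1 liter spray"}]
)
```

A real implementation would instead draw icons over a georeferenced base map, with icon shape and size encoding pest type and damage as described above.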
  • FIG. 5 schematically shows an example of the pest map.
  • the pest map 100 has a pest icon 110 and a countermeasure icon 120 superimposed on the entire map.
  • the pest icon 110 is arranged on the entire map at a position based on the pest data described above.
  • the pest icon 110 is also displayed together with text indicating the identifier of this pest.
  • the pest icon 110 also schematically shows the damage situation of the pest according to the size of the icon itself.
  • the pest icon 110 schematically shows the type of pest according to the shape of the icon.
  • the countermeasure icon 120 is arranged near the pest icon 110.
  • the countermeasure icon 120 displays a text indicating an identifier of a necessary pesticide or fertilizer and a spraying amount or a fertilizing amount.
  • the display mode of the pest icon 110 is not limited to the one described above, and can be changed as appropriate. Further, the position, display mode, and the like of the countermeasure icon 120 are not limited to those described above, and can be changed as appropriate.
  • the pest data output module 21 outputs the recorded pest data in the field (step S17).
  • step S17 the pest data output module 21 outputs the pest data by outputting the pest map described above.
  • the pest data output module 21 outputs the pest map to a drone, an agricultural machine, a high-performance agricultural machine, a worker terminal, another computer, or the like. At this time, the pest data output module 21 transmits this pest map to a drone, agricultural machinery, high-performance agricultural machine, worker terminal, other computer, or the like.
  • the pest data output module 21 may output the pest data instead of outputting the pest map. In this case, the pest data output module 21 may omit the process of step S16 described above and output the recorded pest data.
  • the pest data output module 21 outputs this pest map to the worker terminal.
  • the computer 10 causes the worker terminal to output the pest map by transmitting the pest map to the worker terminal and displaying the pest map on the worker terminal.
  • the pest data output module 21 sends the pest map to the worker terminal.
  • the worker terminal receives this pest map.
  • the worker terminal displays the received pest map on its own display unit. By browsing this pest map, the worker can easily understand where in the field pesticide application or fertilization is necessary and in what amount.
  • the worker terminal records the received pest map in its own recording unit, etc., and displays it by accepting the input from the worker.
  • the worker can easily browse the pest map at the time of working, and can easily apply the pesticide or apply fertilizer without using a real-time live image.
  • by sending the pest map to the drone, the computer 10 causes the drone to output this pest map.
  • the pest data output module 21 sends the pest map to the drone.
  • the drone receives this pest map.
  • the drone can easily fly or move to the position of the pest based on an operation input or the like from the operator of the drone.
  • if the settings for pesticide spraying or fertilizer application are made in advance, spraying or fertilizing is performed based on the recorded pest map, so that it becomes easy to spray pesticide or apply fertilizer without using real-time live images.
  • the pest data output module 21 may be configured to output the pest map after a predetermined time has elapsed from the time when the recording module 30 recorded the pest data.
  • the predetermined time is, for example, several hours, one day, or several days.
  • FIG. 4 is a diagram showing a flowchart of the instruction output process executed by the computer 10. The processing executed by each module described above will be described together with this processing. The detailed description of the same processes as those described above will be omitted.
  • the command creation module 44 creates, as a command, an instruction to spray or fertilize a pesticide or fertilizer as a countermeasure for the detected pest based on the recorded pest data (step S20).
  • the command creation module 44 creates this command according to the output-destination equipment or device. For example, when the output destination is equipment or a device that sprays a preset pesticide or applies a preset fertilizer, the command creation module 44 creates the commands required for that equipment or device to spray or apply the pesticide or fertilizer specified in the countermeasure data.
  • the command is a command for this equipment or device to spray the pesticide or apply the fertilizer, to specify the spraying amount of the pesticide or the application amount of the fertilizer, and to fly or move to the position of this pest.
  • when the output destination is equipment or a device capable of spraying or applying a plurality of pesticides or fertilizers, the command creation module 44 creates the commands required to select the pesticide or fertilizer specified in the countermeasure data and to spray or apply it.
  • the command in this case is a command for this equipment or device to select this pesticide or fertilizer, to spray or apply it, to specify the spraying amount of the pesticide or the application amount of the fertilizer, and to fly or move to the position of this pest.
  • when the output-destination equipment or device cannot move on its own, the command creation module 44 need not create a command for flying or moving to the position of the pest; any configuration that creates the other commands may be used.
  • the case where the command creation module 44 creates a command for a drone that can spray or apply a plurality of pesticides or fertilizers will be described as an example.
  • the command creation module 44 identifies, based on the countermeasure data in the pest data, a drone that can spray or apply the pesticide or fertilizer specified in the countermeasure data.
  • the command creation module 44 creates a command for this drone to spray or apply pesticide or fertilizer.
  • the command creation module 44 creates a command for the drone to specify the spraying amount of the pesticide or the fertilizer applying amount.
  • the command creation module 44 creates a command for flying or traveling to this position based on the position of the pest in the countermeasure data.
  • the command creation module 44 creates this command for each position of each pest in the pest map.
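The per-pest command sequence described above, namely fly to the position, select the chemical, set the amount, spray or apply, might be assembled as plain records. The operation names here are invented for illustration and do not correspond to any real drone API.

```python
def create_commands(pest_data):
    """Build one command sequence per recorded pest: move to its
    position, select the pesticide or fertilizer, set the amount,
    then spray or apply."""
    commands = []
    for pest in pest_data:
        commands.append([
            {"op": "goto", "lat": pest["lat"], "lon": pest["lon"]},
            {"op": "select", "chemical": pest["chemical"]},
            {"op": "set_amount", "litres": pest["dose_litres"]},
            {"op": "spray"},
        ])
    return commands
```

For an output destination that cannot move on its own, the "goto" entry would simply be omitted, as noted above.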
  • the command output module 22 outputs the created command (step S21).
  • step S21 the command output module 22 outputs the created command to the specified device or apparatus.
  • the computer 10 thereby outputs an instruction for pesticide spraying or fertilizing to the equipment or device. That is, the computer 10 outputs the command to the equipment or device to cause it to spray the pesticide or apply the fertilizer.
  • the case where the command output module 22 outputs a command to the drone will be described as an example.
  • the command output module 22 outputs the created command to the drone. At this time, the command output module 22 sends the created command to the drone specified when the command was created. The drone receives this command. Based on this command, the drone selects the pesticide or fertilizer, determines the spraying or application amount, and flies or travels to the position of the pest. After moving to the position of the pest, the drone sprays the pesticide or applies the fertilizer against the pest based on the selected chemical and the determined amount. That is, the computer 10 outputs the command to the drone to cause the drone to spray the pesticide or apply the fertilizer.
  • the computer 10 executes the processing of steps S20 and S21 described above at the same time as the processing of step S17 described above, or after executing the processing of step S17. That is, the computer 10 outputs the pest data and this command at the same time, or outputs the command after outputting the pest data.
  • the command output module 22 may be configured to output this command after a predetermined time has elapsed from the time when the recording module 30 recorded the pest data.
  • the predetermined time is, for example, several hours, one day, or several days.
  • the above-described means and functions are realized by a computer (including a CPU, an information processing device, various terminals) reading and executing a predetermined program.
  • the program is provided, for example, in a form provided from a computer via a network (SaaS: Software as a Service). Further, the program is provided in a form recorded in a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.).
  • the computer reads the program from the recording medium, transfers the program to an internal recording device or an external recording device, and records and executes the program. Further, the program may be recorded in advance in a recording device (recording medium) such as a magnetic disk, an optical disk, a magneto-optical disk, and provided from the recording device to a computer via a communication line.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Catching Or Destruction (AREA)

Abstract

[Problem] To provide a computer system, pest detection method, and program whereby pests can be easily detected and eradicated on the basis of captured images, not limited to live images. [Solution] A computer system that detects pests using image analysis: obtains a captured image of a field and position information for the imaging location; performs image analysis on the obtained captured image; detects pests; and outputs position information for the imaging location of the captured image in which pests were detected. In addition, the computer system can also output instructions for spraying pesticides or fertilizing on the basis of the output position information.

Description

Computer system, pest detection method, and program
 The present invention relates to a computer system for detecting pests by image analysis, a pest detection method, and a program.
 Conventionally, pests that have occurred in a field have been exterminated by a drone or the like. As such an extermination method, there is a configuration in which the position of a pest in the field is specified in real time and a drone is moved to the specified position to perform the extermination.
 As an example of such a technology, a system for exterminating pests based on a live image of a field taken by a drone in real time is disclosed (see Patent Document 1). This system detects the presence or absence of a pest by acquiring the live image and the position information of the shooting point of the live image, and comparing the live image with a specific image in which a pest or the like was captured in advance. When the system detects a pest, it outputs the position information of the shooting point where the live image was taken to a drone or the like to exterminate the pest.
JP 2017-16271 A
 However, with the configuration of Patent Document 1, pest detection is limited to live images taken in real time, and may not be effective for captured images other than such live images.
 The present invention is not limited to live images, and an object of the present invention is to provide a computer system, a pest detection method, and a program that facilitate detection and extermination of pests based on captured images.
 The present invention provides the following solution.
 The present invention provides a computer system for detecting pests by image analysis, comprising:
 an acquisition means for acquiring a captured image of a field and position information of the shooting point;
 a detection means for analyzing the acquired captured image and detecting a pest; and
 a position information output means for outputting the position information of the shooting point of the captured image in which the pest was detected.
 According to the present invention, a computer system for detecting pests by image analysis acquires a captured image of a field and position information of the shooting point, analyzes the acquired captured image to detect a pest, and outputs the position information of the shooting point of the captured image in which the pest was detected.
 The present invention falls in the system category, but in other categories such as methods and programs it exhibits the same actions and effects corresponding to each category.
 According to the present invention, it is possible to provide a computer system, a pest detection method, and a program that are not limited to live images and that facilitate the detection and extermination of pests based on captured images.
FIG. 1 is a diagram showing an overview of the pest detection system 1.
FIG. 2 is an overall configuration diagram of the pest detection system 1.
FIG. 3 is a flowchart of the position information output process executed by the computer 10.
FIG. 4 is a flowchart of the instruction output process executed by the computer 10.
FIG. 5 is a diagram schematically showing an example of the pest map created by the computer 10.
 Hereinafter, the best mode for carrying out the present invention will be described with reference to the drawings. Note that this is merely an example, and the technical scope of the present invention is not limited to it.
 [Overview of Pest Detection System 1]
 An overview of a preferred embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is a diagram for explaining an overview of a pest detection system 1, which is a preferred embodiment of the present invention. The pest detection system 1 is a computer system that comprises a computer 10 and detects pests by image analysis.
 Note that the pest detection system 1 may include other terminals and devices such as drones, agricultural implements, high-performance agricultural machines, worker terminals carried by workers who cultivate crops (for example, smartphones, tablet terminals, or personal computers), and other computers. Further, the pest detection system 1 may be realized by a single computer such as the computer 10, or by a plurality of computers, as in cloud computing.
 The computer 10 is connected to drones, agricultural implements, high-performance agricultural machines, worker terminals, other computers, and the like so as to be capable of data communication via a public network or the like, and transmits and receives the necessary data.
 The computer 10 acquires a captured image of a field and position information of the shooting point. For example, the computer 10 acquires captured images taken by a drone at various points in the field, and acquires from the drone the position information of the shooting point where the drone took each captured image.
 The computer 10 analyzes the acquired captured image and detects any pest shown in it. As image analysis, the computer 10 extracts, for example, feature points (for example, shape, contour, and hue) and feature amounts (for example, statistical values such as the average, variance, and histogram of pixel values) of the captured image. Based on the extracted feature points and feature amounts, the computer 10 detects the pest shown in the captured image.
 The computer 10 identifies the position information of the detected pest based on the acquired position information of the shooting point. For example, the computer 10 identifies the position information of the pest by treating it as matching the position information of the shooting point. As a result, the computer 10 identifies the position of the detected pest in the field.
 The computer 10 outputs the position information of the identified pest. At this time, the computer 10 outputs this position information to the above-mentioned drones, agricultural implements, high-performance agricultural machines, worker terminals, other computers, and the like.
 Note that the computer 10 can also be configured to output an instruction for pesticide spraying or fertilization based on this position information. For example, based on this position information, the computer 10 transmits to the drone, as commands, an instruction to move to the position of the pest and to spray a pesticide or fertilizer with the pesticide spraying device or fertilizing device the drone carries. The drone receives these commands and sprays the pesticide or fertilizer on the pest. As a result, the computer 10 outputs a pesticide spraying or fertilization instruction based on this position information.
 Next, an overview of the processing executed by the pest detection system 1 will be described.
 The computer 10 acquires a captured image of the field and position information of the shooting point as shooting data (step S01). For example, the computer 10 acquires captured images taken by a drone at various points in the field, and acquires from the drone the position information of the point where each image was taken. The computer 10 acquires these captured images and the position information of the shooting points as shooting data.
 The computer 10 analyzes the captured image and detects any pest shown in it (step S02). For example, as image analysis, the computer 10 extracts feature points and feature amounts of the captured image, and detects the pest shown in the captured image based on the extracted feature points and feature amounts.
 The computer 10 identifies the position information of the detected pest based on the acquired position information of the shooting point (step S03). The computer 10 identifies the position information of the pest by treating it as matching the position information of the shooting point. As a result, the computer 10 identifies the position of the detected pest in the field.
 The computer 10 outputs the position information of the identified pest (step S04). For example, the computer 10 outputs this position information to the above-mentioned drones, agricultural implements, high-performance agricultural machines, worker terminals, other computers, and the like.
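 The flow of steps S01 to S04 can be sketched as follows. This is a minimal illustration only: the record layout, the stand-in detection rule, and the destination names are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the S01-S04 pipeline; data layout and
# detection rule are illustrative assumptions.

def acquire_shot(drone_record):
    """S01: take the captured image and its shooting-point position."""
    return drone_record["image"], drone_record["position"]

def detect_pest(image):
    """S02: stand-in for image analysis; any nonzero pixel counts as a pest."""
    return any(pixel != 0 for row in image for pixel in row)

def locate_pest(position):
    """S03: treat the pest position as the shooting-point position."""
    return position

def output_position(position, destinations):
    """S04: deliver the position to drones, machinery, worker terminals, etc."""
    return {dest: position for dest in destinations}

record = {"image": [[0, 1], [0, 0]], "position": (33.25, 130.30)}
image, pos = acquire_shot(record)
if detect_pest(image):
    out = output_position(locate_pest(pos), ["drone", "worker_terminal"])
```

 In this toy run, the single nonzero pixel triggers detection, and the shooting-point position is fanned out to both destinations.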
 The above is an overview of the processing executed by the pest detection system 1.
 [System Configuration of Pest Detection System 1]
 The system configuration of the pest detection system 1, a preferred embodiment of the present invention, will be described with reference to FIG. 2. FIG. 2 is a diagram showing the system configuration of the pest detection system 1. In FIG. 2, the pest detection system 1 is a computer system that comprises a computer 10 and detects pests by image analysis.
 The computer 10 is connected to drones, agricultural implements, high-performance agricultural machines, worker terminals, other computers, and the like so as to be capable of data communication via a public network or the like, and transmits and receives the necessary data.
 Note that the pest detection system 1 may include drones, agricultural implements, high-performance agricultural machines, worker terminals, other computers, and other terminals and devices that are not shown. Further, the pest detection system 1 may be realized by a single computer such as the computer 10, or by a plurality of computers, as in cloud computing.
 The computer 10 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and includes, as a communication unit, a device for enabling communication with other terminals and devices, for example, a Wi-Fi (Wireless Fidelity) device conforming to IEEE 802.11. The computer 10 also includes, as a recording unit, a data storage unit such as a hard disk, semiconductor memory, recording medium, or memory card. Further, the computer 10 includes, as a processing unit, various devices for executing various kinds of processing.
 In the computer 10, the control unit reads a predetermined program and thereby realizes, in cooperation with the communication unit, a shooting data acquisition module 20, a pest data output module 21, and a command output module 22. Likewise, the control unit reads a predetermined program and thereby realizes, in cooperation with the recording unit, a recording module 30. Further, the control unit reads a predetermined program and thereby realizes, in cooperation with the processing unit, an image analysis module 40, a position identification module 41, a countermeasure data identification module 42, a pest map creation module 43, and a command creation module 44.
 [Position Information Output Process]
 The position information output process executed by the pest detection system 1 will be described with reference to FIG. 3. FIG. 3 is a flowchart of the position information output process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process.
 The shooting data acquisition module 20 acquires a captured image of the field and position information of the shooting point as shooting data (step S10). In step S10, the shooting data acquisition module 20 acquires as shooting data, for example, the images the drone captured at each of a plurality of preset shooting points in the field and the position information of each of those shooting points. The drone captures the area directly below itself as the captured image; that is, the drone captures the image from a position perpendicular to the field. At each shooting point, the drone captures an image and acquires its own position information from GPS or the like, and treats this acquired position information as the position information of the shooting point. The drone transmits the captured image and the position information of the shooting point to the computer 10 as shooting data. The shooting data acquisition module 20 acquires the shooting data by receiving what the drone transmitted. As a result, the computer 10 acquires the captured image of the field and the position information of the shooting point.
 The image analysis module 40 analyzes the captured image based on the shooting data (step S11). In step S11, the image analysis module 40 extracts feature points and feature amounts from the captured image. For example, as image analysis, the image analysis module 40 extracts the shape, hue, and the like of objects present in the captured image.
 The image analysis module 40 determines whether a pest has been detected based on the result of the image analysis (step S12). In step S12, the image analysis module 40 determines whether a pest is present in the captured image by comparing the extracted feature points and feature amounts with a pest database in which the feature points and feature amounts of pests are registered in advance in association with pest identifiers. If the feature points and feature amounts extracted this time match those registered in the pest database, the image analysis module 40 determines that a pest has been detected; if they do not match, it determines that no pest has been detected. When a pest has been detected, the image analysis module 40 also determines the identifier of the pest based on its feature points and feature amounts and the pest database.
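 The database comparison in step S12 can be sketched as a nearest-neighbor lookup over feature vectors. The database entries, the vector representation, and the match threshold below are invented for illustration; real feature points and feature amounts (shape, contour, hue, pixel-value statistics) are abstracted away.

```python
import math

# Toy pest database: pest identifier -> assumed feature vector.
PEST_DB = {
    "aphid": [0.9, 0.1, 0.3],
    "leaf_blight": [0.2, 0.8, 0.6],
}

def match_pest(features, threshold=0.15):
    """Return the identifier of the closest database entry, or None
    when nothing is close enough (i.e., no pest detected)."""
    best_id, best_dist = None, float("inf")
    for pest_id, ref in PEST_DB.items():
        dist = math.dist(features, ref)  # Euclidean distance (Python 3.8+)
        if dist < best_dist:
            best_id, best_dist = pest_id, dist
    return best_id if best_dist <= threshold else None
```

 A match returns the pest identifier used by the later steps; no match corresponds to the "NO" branch of step S12.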
 If, in step S12, the image analysis module 40 determines that no pest was detected as a result of the image analysis (step S12: NO), the computer 10 ends this process. In this case, the computer 10 may instead be configured to execute the above-described process of step S10 and acquire shooting data different from the shooting data analyzed this time.
 On the other hand, if, in step S12, the image analysis module 40 determines that a pest was detected as a result of the image analysis (step S12: YES), the position identification module 41 identifies the position information of the detected pest based on the position information in the shooting data (step S13). In step S13, the position identification module 41 identifies, based on the shooting data, the position information of the shooting point corresponding to the captured image analyzed this time, and identifies this shooting-point position information as the position information of the pest. Further, the position identification module 41 identifies the detailed position of the pest based on the position information of the pest and its coordinates in the captured image. For example, the position identification module 41 sets an orthogonal coordinate system on the captured image and identifies the position at which the pest was detected as X and Y coordinates in the captured image. The position identification module 41 then identifies the position information of the pest in the actual field based on the position information of the shooting point and these X and Y coordinates. At this time, the position identification module 41 treats the center of the captured image as corresponding to the position information of the shooting point, and identifies the X and Y coordinates as a position relative to this center. As a result, the position identification module 41 identifies the position of the pest in the field.
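 One way to realize the conversion in step S13 is to map the pixel offset from the image center to a geographic offset from the shooting point. The ground resolution, the north-aligned image axes, and the flat-earth degree conversion below are simplifying assumptions for illustration, not part of the disclosure.

```python
import math

METERS_PER_PIXEL = 0.05          # assumed ground sampling distance
METERS_PER_DEG_LAT = 111_320.0   # approximate meters per degree of latitude

def pest_position(shot_lat, shot_lon, px, py, width, height):
    """Map pixel (px, py) to latitude/longitude, with the image center
    at the shooting-point position and the image Y axis pointing north."""
    dx_m = (px - width / 2) * METERS_PER_PIXEL   # eastward offset in meters
    dy_m = (height / 2 - py) * METERS_PER_PIXEL  # northward offset in meters
    lat = shot_lat + dy_m / METERS_PER_DEG_LAT
    lon = shot_lon + dx_m / (METERS_PER_DEG_LAT * math.cos(math.radians(shot_lat)))
    return lat, lon
```

 With this convention, a pest at the exact image center resolves to the shooting-point position itself, and off-center detections are displaced proportionally.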
 The countermeasure data identification module 42 identifies, as countermeasure data, the identifier of a pesticide or fertilizer effective against the pest identified this time, together with the spraying amount of the pesticide or the application amount of the fertilizer (step S14). In step S14, the countermeasure data identification module 42 refers to a pest countermeasure database in which pest identifiers are registered in advance in association with identifiers of effective pesticides or fertilizers, and identifies the pesticide or fertilizer effective against the pest detected this time. Further, the countermeasure data identification module 42 identifies the spraying or application amount of this pesticide or fertilizer based on the extracted feature points and feature amounts. For example, the countermeasure data identification module 42 identifies the required amount of pesticide to spray or fertilizer to apply according to the size and shape of the extracted pest.
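 The lookup in step S14 can be sketched as a table mapping pest identifiers to agents and base doses, with the dose scaled by the detected pest's extent. The database entries, agent names, and scaling rule are hypothetical.

```python
# Hypothetical countermeasure database: pest identifier ->
# (agent identifier, base dose in liters). Entries are invented.
COUNTERMEASURE_DB = {
    "aphid": ("pesticide_A", 1.0),
    "leaf_blight": ("fertilizer_B", 0.5),
}

def countermeasure(pest_id, pest_area_m2):
    """Pick the effective agent and scale its amount by the affected
    area (clamped to at least one square meter)."""
    agent, base_dose = COUNTERMEASURE_DB[pest_id]
    return {"agent": agent, "amount_l": round(base_dose * max(pest_area_m2, 1.0), 2)}
```

 The resulting record corresponds to the countermeasure data that step S15 records alongside the pest identifier and position.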
 The recording module 30 records the identifier of the identified pest, the position information of the pest, and the countermeasure data as pest data (step S15).
 The computer 10 repeats the processing of steps S10 to S14 described above, and records the identifier, position information, and countermeasure data of each detected pest for the entire field.
 Note that the computer 10 may be configured to execute the processing described below without recording pest data for the entire field. For example, when the computer 10 has identified one piece of pest data, it may record that piece of pest data and use it in the processing described below.
 The pest map creation module 43 creates, based on the recorded pest data, a pest map showing the identifiers and positions of the pests detected in this field (step S16). In step S16, the pest map creation module 43 creates an overall map of a preset area or of the entire field. This overall map is created based on the position information of the preset area or of the entire field; for example, when a rectangular area is set, the overall map is created based on the position information of each vertex of the rectangle. The pest map creation module 43 creates the pest map by superimposing a pest icon on the overall map at the position corresponding to the position information of each identified pest. The pest icon is, for example, a symbol such as a circle, square, or star, text indicating the identifier of the pest, or an illustration of the pest. By its shape, size, and the like, the pest icon can also indicate the type of pest and the extent of the damage at that location.
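 The map construction in step S16 can be sketched as placing icons onto a rectangular grid covering the field. The grid resolution, the field-local metric coordinates, and the single-character icons are arbitrary illustrative choices.

```python
# Minimal sketch of step S16: place pest icons on a rectangular field
# map. Cell size and icon characters are arbitrary choices.
def make_pest_map(field_w_m, field_h_m, pests, cell_m=10.0):
    cols = int(field_w_m // cell_m)
    rows = int(field_h_m // cell_m)
    grid = [["." for _ in range(cols)] for _ in range(rows)]
    for pest in pests:
        c = min(int(pest["x_m"] // cell_m), cols - 1)
        r = min(int(pest["y_m"] // cell_m), rows - 1)
        grid[r][c] = pest["icon"]  # e.g. first letter of the pest identifier
    return ["".join(row) for row in grid]
```

 A real implementation would render graphical icons sized and shaped by pest type and damage extent, as described above; the text grid only conveys the placement logic.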
 Further, based on the recorded pest data, the pest map creation module 43 can also superimpose on the pest map, or place near each pest, a countermeasure icon indicating the identifier and spraying amount of a pesticide, or the identifier and application amount of a fertilizer, that is effective as a countermeasure against the pest. The countermeasure icon is, for example, text indicating the identifier of the required pesticide or fertilizer and the spraying or application amount, such as "spray 1 liter of pesticide A", or an illustration indicating these.
 Note that the pest map creation module 43 can also indicate the positions of the crops being cultivated on this overall map with icons, text, illustrations, or the like.
 The pest map created by the pest map creation module 43 will be described with reference to FIG. 5. FIG. 5 schematically shows an example of the pest map. In FIG. 5, the pest map 100 has pest icons 110 and countermeasure icons 120 superimposed on the overall map. Each pest icon 110 is placed on the overall map at a position based on the pest data described above, and is displayed together with text indicating the identifier of the pest. The pest icon 110 also schematically indicates the extent of the pest damage by the size of the icon itself, and schematically indicates the type of pest by the shape of the icon. Each countermeasure icon 120 is placed near a pest icon 110 and displays text indicating the identifier of the required pesticide or fertilizer and the spraying or application amount.
 Note that the display mode and the like of the pest icon 110 are not limited to those described above and can be changed as appropriate. Likewise, the position, display mode, and the like of the countermeasure icon 120 are not limited to those described above and can be changed as appropriate.
 The pest data output module 21 outputs the recorded pest data for the field (step S17). In step S17, the pest data output module 21 outputs the pest data by outputting the pest map described above. The pest data output module 21 outputs this pest map to drones, agricultural implements, high-performance agricultural machines, worker terminals, other computers, and the like; at this time, it transmits the pest map to those destinations.
 Note that the pest data output module 21 may be configured to output the pest data itself instead of outputting the pest map. In this case, the pest data output module 21 may omit the processing of step S16 described above and output the recorded pest data.
 The case where the pest data output module 21 outputs this pest map to a worker terminal will now be described.
 The computer 10 transmits the pest map to the worker terminal and causes the worker terminal to display it, thereby causing the worker terminal to output the pest map.
 The pest data output module 21 transmits the pest map to the worker terminal. The worker terminal receives the pest map and displays it on its own display unit. By viewing this pest map, a worker can easily grasp the positions in the field where pesticide spraying or fertilization is required and the amounts needed.
 The worker terminal also records the received pest map in its own recording unit or the like and displays it upon receiving input from the worker. As a result, the worker can easily view the pest map at the time of working, and can easily spray pesticide or apply fertilizer without relying on a real-time live image.
 The case where the pest data output module 21 outputs this pest map to a drone will now be described.
 The computer 10 transmits the pest map to the drone, thereby causing the drone to output the pest map.
 The pest data output module 21 transmits the pest map to the drone. The drone receives the pest map and records it in its own recording unit or the like, which makes it easy for the drone to fly or move to the positions of the pests based on, for example, operation input from its operator. Further, when the drone has been set in advance to spray pesticide or apply fertilizer, it can easily do so based on the recorded pest map, without relying on a real-time live image.
 Note that the pest data output module 21 may be configured to output this pest map after a predetermined time has elapsed from the time the recording module 30 recorded the pest data. The predetermined time is, for example, several hours, one day, or several days.
 The above is the position information output process.
 [Instruction Output Process]
 The instruction output process executed by the pest detection system 1 will be described with reference to FIG. 4. FIG. 4 is a flowchart of the instruction output process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process. Detailed description of processing identical to that described above is omitted.
 The command creation module 44 creates, as a command, an instruction to spray a pesticide or apply a fertilizer as a countermeasure against the detected pest, based on the recorded pest data (step S20). In step S20, the command creation module 44 creates this command according to the output destination device or apparatus. For example, when the output destination is a device or apparatus that sprays or applies a preset pesticide or fertilizer, the command creation module 44 identifies, from the countermeasure data, the device or apparatus that is to spray the pesticide or apply the fertilizer, and creates the commands that device or apparatus needs in order to do so. The commands in this case are a command for the device or apparatus to spray the pesticide or apply the fertilizer, a command specifying the spraying amount of the pesticide or the application amount of the fertilizer, and a command for flying or moving to the position of the pest. When the output destination is a device or apparatus capable of spraying or applying a plurality of pesticides or fertilizers, the command creation module 44 identifies, from the countermeasure data, the device or apparatus that is to spray or apply the pesticide or fertilizer, selects the pesticide or fertilizer, and creates the commands needed to spray or apply it. The commands in this case are a command for the device or apparatus to select the pesticide or fertilizer, a command to spray or apply it, a command specifying the spraying amount of the pesticide or the application amount of the fertilizer, and a command for flying or moving to the position of the pest.
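The device-dependent command creation of step S20 can be sketched as follows. This is an illustrative sketch only: the names (`Device`, `PestRecord`, `build_commands`) and the tuple-based command encoding are assumptions introduced for clarity, not part of the patent's actual implementation.

```python
# Hypothetical sketch of step S20: building a command set for a spraying
# device based on the recorded pest (countermeasure) data.
from dataclasses import dataclass


@dataclass
class Device:
    device_id: str
    chemicals: list        # pesticides/fertilizers the device can apply
    self_propelled: bool   # can it fly or travel to the pest position?


@dataclass
class PestRecord:
    pest_name: str
    chemical: str          # countermeasure pesticide or fertilizer
    amount: float          # spraying or application amount
    position: tuple        # coordinates of the pest


def build_commands(device: Device, record: PestRecord) -> list:
    """Create the command sequence for one detected pest (step S20)."""
    commands = []
    if len(device.chemicals) > 1:
        # A multi-chemical device must first select the chemical.
        commands.append(("select_chemical", record.chemical))
    commands.append(("set_amount", record.amount))
    commands.append(("apply", record.chemical))
    if device.self_propelled:
        # Only devices that can move on their own receive a move command.
        commands.append(("move_to", record.position))
    return commands
```

As in the description, a fixed sprayer with a single preset chemical receives neither a selection command nor a movement command, while a multi-chemical drone receives both.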
 Note that when the output destination device or apparatus cannot move on its own, the command creation module 44 need not create a command for flying or moving to the position of the pest; it suffices to create the remaining commands.
 A case where the command creation module 44 creates commands for a drone capable of spraying or applying a plurality of pesticides or fertilizers will be described as an example. Based on the countermeasure data in the pest data, the command creation module 44 identifies a drone capable of spraying or applying the pesticide or fertilizer indicated in the countermeasure data. The command creation module 44 creates a command for this drone to spray the pesticide or apply the fertilizer, and a command for the drone specifying the spraying amount of the pesticide or the application amount of the fertilizer. In addition, based on the pest position in the countermeasure data, the command creation module 44 creates a command for flying or traveling to that position.
 The command creation module 44 creates these commands for each pest position in the pest map.
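Producing one command set per pest position in the pest map can be sketched as below. The record layout (dicts with `position`, `chemical`, and `amount` keys) and the command tuples are assumptions for illustration, not the patent's actual data format.

```python
# Hedged sketch: one command set is created for every pest position
# recorded in the pest map (step S20, applied map-wide).
def commands_for_map(pest_map):
    """Return a command set keyed by each pest position in the map."""
    command_sets = {}
    for rec in pest_map:
        command_sets[rec["position"]] = [
            ("select_chemical", rec["chemical"]),
            ("set_amount", rec["amount"]),
            ("apply", rec["chemical"]),
            ("move_to", rec["position"]),
        ]
    return command_sets
```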
 The command output module 22 outputs the created commands (step S21). In step S21, the command output module 22 outputs the created commands to the identified device or apparatus. As a result, the computer 10 outputs an instruction for pesticide spraying or fertilizer application to the device or apparatus. That is, by outputting the commands to the device or apparatus, the computer 10 causes it to carry out the pesticide spraying or fertilizer application.
 A case where the command output module 22 outputs commands to a drone will be described as an example.
 The command output module 22 outputs the created commands to the drone. At this time, the command output module 22 sends the created commands to the drone identified when the commands were created. The drone receives the commands and, based on them, selects the pesticide or fertilizer, determines the spraying or application amount, and flies or travels to the position of the pest. After moving to that position, the drone sprays the selected pesticide or applies the fertilizer to the pest according to the specified amount. In other words, by outputting the commands to the drone, the computer 10 causes the drone to carry out the pesticide spraying or fertilizer application.
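The output step S21 can be sketched as below. The in-memory "link" object standing in for the wireless transport, and all names (`DroneLink`, `output_commands`), are assumptions introduced for illustration; a real system would transmit the commands over a network connection to the drone.

```python
# Minimal sketch of step S21: the computer outputs the created command set
# to the drone identified at command-creation time.
import queue


class DroneLink:
    """Assumed stand-in for the communication channel to one drone."""

    def __init__(self, drone_id):
        self.drone_id = drone_id
        self.inbox = queue.Queue()

    def send(self, commands):
        # A real system would transmit over a wireless link here.
        self.inbox.put(commands)


def output_commands(links, drone_id, commands):
    """Output the command set to the specified drone (step S21)."""
    links[drone_id].send(commands)


links = {"drone-1": DroneLink("drone-1")}
output_commands(links, "drone-1", [("apply", "chem-A")])
```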
 The computer 10 executes the processing of steps S20 and S21 described above either simultaneously with the processing of step S17 described above or after executing step S17. That is, the computer 10 outputs the pest data and the commands at the same time, or outputs the commands after outputting the pest data.
 Note that the command output module 22 may be configured to output the commands after a predetermined time has elapsed from the point at which the recording module 30 recorded the pest data. The predetermined time is, for example, several hours, one day, or several days.
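The delayed-output variant amounts to comparing elapsed time since recording against the predetermined delay. The timestamp-comparison approach below is an assumption for illustration; the patent does not specify the scheduling mechanism.

```python
# Hedged sketch of the delayed-output variant: the command is output only
# after a predetermined time (e.g. hours to days) has elapsed since the
# pest data was recorded.
import time

HOURS = 3600.0  # seconds per hour


def ready_to_output(recorded_at, delay_seconds, now=None):
    """True once the predetermined time has elapsed since recording."""
    if now is None:
        now = time.time()
    return (now - recorded_at) >= delay_seconds
```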
 This concludes the instruction output process.
 The means and functions described above are realized by a computer (including a CPU, an information processing device, and various terminals) reading and executing a predetermined program. The program is provided, for example, from a computer via a network (SaaS: Software as a Service). The program may also be provided in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.). In this case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it. Alternatively, the program may be recorded in advance on a storage device (recording medium) such as a magnetic disk, optical disk, or magneto-optical disk, and provided from that storage device to the computer via a communication line.
 Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments. Furthermore, the effects described in the embodiments merely list the most favorable effects arising from the present invention, and the effects of the present invention are not limited to those described in the embodiments.
 1 pest detection system, 10 computer

Claims (4)

  1.  A computer system for detecting pests by image analysis, comprising:
      acquisition means for acquiring a captured image of a field and position information of the image-capture point;
      detection means for analyzing the acquired captured image and detecting a pest; and
      position information output means for outputting the position information of the capture point of the captured image in which the pest was detected.
  2.  The computer system according to claim 1, further comprising:
      instruction output means for outputting an instruction for pesticide spraying or fertilizer application based on the output position information.
  3.  A pest detection method executed by a computer system that detects pests by image analysis, comprising the steps of:
      acquiring a captured image of a field and position information of the image-capture point;
      analyzing the acquired captured image and detecting a pest; and
      outputting the position information of the capture point of the captured image in which the pest was detected.
  4.  A computer-readable program for causing a computer system that detects pests by image analysis to execute:
      a step of acquiring a captured image of a field and position information of the image-capture point;
      a step of analyzing the acquired captured image and detecting a pest; and
      a step of outputting the position information of the capture point of the captured image in which the pest was detected.
PCT/JP2019/007762 2019-02-28 2019-02-28 Computer system, pest detection method, program WO2020174645A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021501481A JP7075171B2 (en) 2019-02-28 2019-02-28 Computer systems, pest detection methods and programs
PCT/JP2019/007762 WO2020174645A1 (en) 2019-02-28 2019-02-28 Computer system, pest detection method, program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/007762 WO2020174645A1 (en) 2019-02-28 2019-02-28 Computer system, pest detection method, program

Publications (1)

Publication Number Publication Date
WO2020174645A1 (en) 2020-09-03

Family

ID=72239652

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/007762 WO2020174645A1 (en) 2019-02-28 2019-02-28 Computer system, pest detection method, program

Country Status (2)

Country Link
JP (1) JP7075171B2 (en)
WO (1) WO2020174645A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06424A (en) * 1992-06-18 1994-01-11 Yanmar Agricult Equip Co Ltd Disease injury detection and control machine
JPH11235124A (en) * 1998-02-23 1999-08-31 Yanmar Agricult Equip Co Ltd Precise farming
JP2016144990A (en) * 2015-02-07 2016-08-12 ヤンマー株式会社 Aerial spraying device
WO2017130236A1 (en) * 2016-01-29 2017-08-03 パナソニックIpマネジメント株式会社 Turf growing device, turf growing system, and turf management system
US20180065747A1 (en) * 2016-09-08 2018-03-08 Wal-Mart Stores, Inc. Systems and methods for dispensing an insecticide via unmanned vehicles to defend a crop-containing area against pests

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6427301B2 (en) 2016-06-30 2018-11-21 株式会社オプティム Mobile control application and mobile control method


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220080276A (en) * 2020-12-07 2022-06-14 서명랑 Smart farm pest control system based on super-directional speakers and artificial intelligence streaming
KR102584785B1 (en) * 2020-12-07 2023-10-05 서명랑 Smart farm pest control system based on super-directional speakers and artificial intelligence streaming
CN113229248A (en) * 2021-04-16 2021-08-10 北京农业智能装备技术研究中心 Target identification accurate pesticide application system and method
CN117875571A (en) * 2024-03-12 2024-04-12 中国林业科学研究院林业研究所 Forest vegetation growth condition analysis method and system

Also Published As

Publication number Publication date
JPWO2020174645A1 (en) 2021-09-13
JP7075171B2 (en) 2022-05-25

Similar Documents

Publication Publication Date Title
WO2020174645A1 (en) Computer system, pest detection method, program
US10638744B2 (en) Application and method for controlling moving vehicle
Psirofonia et al. Use of unmanned aerial vehicles for agricultural applications with emphasis on crop protection: Three novel case-studies
US10405535B2 (en) Methods, systems and devices relating to real-time object identification
CN111201496A (en) System and method for aerial video traffic analysis
WO2020078396A1 (en) Method for determining distribution information, and control method and device for unmanned aerial vehicle
WO2022094854A1 (en) Growth monitoring method for crops, and devices and storage medium
US9719973B2 (en) System and method for analyzing the effectiveness of an application to a crop
JP7152836B2 (en) UNMANNED AIRCRAFT ACTION PLAN CREATION SYSTEM, METHOD AND PROGRAM
JP2022526368A (en) Targeted weed control using chemical and mechanical means
JP7068747B2 (en) Computer system, crop growth support method and program
JP6326009B2 (en) Wireless aircraft, position information output method, and wireless aircraft program.
CN114206110A (en) Method for generating an application map for processing a field with agricultural equipment
WO2020157879A1 (en) Computer system, crop growth assistance method, and program
CN109507967B (en) Operation control method and device
CA3135084A1 (en) Camera based pest management sprayer
JP6212662B1 (en) Drone automatic flight control application, smart device, drone, server, drone automatic flight control method and program.
WO2021081896A1 (en) Operation planning method, system, and device for spraying unmanned aerial vehicle
CN113454558A (en) Obstacle detection method and device, unmanned aerial vehicle and storage medium
CN111818796B (en) Device for spray management
CN109492541B (en) Method and device for determining type of target object, plant protection method and plant protection system
WO2020107354A1 (en) Control method and device
JP2007303831A (en) Chemical concentration analyzing program, recording medium and chemical concentration analyzer
KR20160065250A (en) Method and System for Fertilizer Distribution Control according to Crop Position Recognition
CN112926359A (en) Crop identification method and device, and control method of operation equipment

Legal Events

Code — Description
121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19917319; Country of ref document: EP; Kind code of ref document: A1)
DPE2 — Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
ENP — Entry into the national phase (Ref document number: 2021501481; Country of ref document: JP; Kind code of ref document: A)
NENP — Non-entry into the national phase (Ref country code: DE)
122 — Ep: pct application non-entry in european phase (Ref document number: 19917319; Country of ref document: EP; Kind code of ref document: A1)