CN116630663A - Intelligent pesticide application method and system based on Internet of things - Google Patents

Intelligent pesticide application method and system based on Internet of things

Info

Publication number
CN116630663A
CN116630663A (application CN202310527466.6A)
Authority
CN
China
Prior art keywords
information
crops
monitoring area
soil environment
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310527466.6A
Other languages
Chinese (zh)
Inventor
王杰
刘瑜
卢书云
魏星
郭从
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Capital Airport Property Management Co ltd
Qingzhou Tobacco Research Institute of China National Tobacco Corp of Institute of Tobacco Research of CAAS
Original Assignee
Beijing Capital Airport Property Management Co ltd
Qingzhou Tobacco Research Institute of China National Tobacco Corp of Institute of Tobacco Research of CAAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Capital Airport Property Management Co ltd, Qingzhou Tobacco Research Institute of China National Tobacco Corp of Institute of Tobacco Research of CAAS filed Critical Beijing Capital Airport Property Management Co ltd
Priority to CN202310527466.6A
Publication of CN116630663A
Legal status: Pending (current)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/242 Query formulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application relates to an intelligent pesticide application method and system based on the Internet of Things, comprising the following steps: collecting image information of crops in a monitoring area and soil environment characteristic information of the monitoring area, and preprocessing the collected information; establishing a comparison database, and storing image characteristic information of each growth stage of the crops, soil environment characteristic information suitable for crop growth, image characteristic information of various pests, and pesticide type and proportion information for crops under different growth conditions; establishing a deep convolutional neural network model, and performing deep learning and training on the neural network model with training sample data; identifying and judging, through the deep convolutional neural network model, the crop growth condition and soil environment information in the monitoring area, whether pests exist, and whether pesticide application is needed and which type to apply; and selecting and blending the pesticide type and proportion according to the identification and judgment result information, and applying the pesticide to the monitoring area in a self-adaptive working mode.

Description

Intelligent pesticide application method and system based on Internet of things
Technical Field
The application relates to the technical fields of the Internet of Things, image recognition, and agricultural environment detection, and in particular to an intelligent pesticide application method and system based on the Internet of Things.
Background
Grain is the basis of human survival. Agricultural planting in China has developed over thousands of years and has accumulated a great deal of planting experience, with different responses for crops in different situations: when crops become diseased or suffer insect damage, corresponding pesticides are applied to prevent the resulting losses. However, manual pesticide application suffers from uneven spraying, excessive pesticide use and similar defects, which pollute the environment and affect crop growth. How to respond to insect damage and disease without harming the surrounding environment and crop growth is therefore an important problem.
Modern agricultural technology combines accumulated planting experience with image recognition: the growth condition of crops is identified from images, potential damage is predicted from the crop growth condition and the soil environment condition, and corresponding countermeasures are taken based on the combined identification and prediction results.
Disclosure of Invention
The application overcomes the defects of the prior art and provides an intelligent pesticide application method and system based on the Internet of Things, which aim mainly at improving the efficiency and accuracy of agricultural pesticide application.
To achieve the above object, a first aspect of the present application provides an intelligent pesticide application method based on the internet of things, including:
collecting image information of crops in a monitoring area and environmental characteristic information of the monitoring area, and preprocessing the collected information;
establishing a comparison database, and storing image characteristic information of various crops at each growth stage, soil environment characteristic information of the growth of the crops, image characteristic information of various pests, and crop application type information and proportion information of different growth conditions;
establishing a deep convolutional neural network model, and performing deep learning and training on the neural network model through a test sample;
identifying and judging, through the deep convolutional neural network model, the crop growth condition and soil environment information in the monitoring area, whether pests exist, and whether pesticide application is needed and which type to apply;
selecting the type and proportion of the pesticide application to be allocated according to the result information after identification and judgment, and applying the pesticide to the monitoring area in a self-adaptive working mode;
In this scheme, collecting image information of crops in the monitoring area and environmental characteristic information of the monitoring area, and preprocessing the collected information, specifically comprises:
collecting real-time image information of various crops in a monitoring area through high-definition camera equipment;
collecting soil environment characteristic information in a monitoring area by adopting a remote sensing shooting technology;
the environment characteristic information comprises soil environment temperature information, soil environment humidity information, and insect and microorganism information in soil;
collecting soil environment temperature information and soil environment humidity information through a high-sensitivity sensor;
and carrying out noise reduction, filtering and screening pretreatment on the image information and the soil environment characteristic information acquired in real time.
In this scheme, establishing a comparison database and storing image characteristic information of each growth stage of various crops, soil environment characteristic information suitable for crop growth, image characteristic information of various pests, and pesticide type and proportion information for crops under different growth conditions specifically comprises:
collecting image information of each growth stage of each crop appearing in history;
collecting pest image characteristic information appearing in history;
collecting soil environment characteristic information suitable for the growth of each crop;
collecting information of the use types and proportion of crop application under different growth conditions;
and establishing a comparison database, and storing the collected information data for comparison judgment with the output value of the neural network.
In this scheme, establishing a deep convolutional neural network model and performing deep learning and training on the neural network model with training sample data specifically comprises:
establishing a deep convolutional neural network model, including identifying the deep convolutional neural network model and predicting the deep convolutional neural network model;
the identification neural network model is divided into a crop identification neural network model and a pest identification neural network model
The prediction neural network model is divided into a plant disease and insect pest prediction neural network model;
deep learning and training are carried out on the deep convolutional neural network model through training sample data;
collecting image information of different crops, preprocessing the image information to obtain growth stage information of the crops, searching the disease-prone type of the growth stage, obtaining the image information of the disease-prone crops according to the disease-prone type, and taking the image information of the disease-prone crops as training sample data;
collecting soil environment characteristic information causing diseases through the searched disease types, and taking the soil environment characteristic information as training sample data;
deep learning and training are carried out on the deep convolutional neural network model through training sample data;
deep learning and training are carried out on the established neural network model by adopting an automatic segmentation method, and network parameters and characteristics are automatically learned;
and continuously performing deep learning and training on the deep convolutional neural network model, and reducing the deviation between the calculated value and the expected value to obtain the deep convolutional neural network model with the deviation value within the tolerable error range.
In this scheme, identifying and judging, through the deep convolutional neural network model, the crop growth condition and soil environment information in the monitoring area, whether pests exist, and whether pesticide application is needed and which type to apply, specifically comprises:
inputting the collected soil environment characteristic information and crop image information into an identification depth convolution neural network model to obtain an output value;
adopting a similarity calculation method to calculate the similarity between the output value and the data in the comparison database;
if the similarity value is larger than a threshold value, obtaining any one or more of identification result information of poor growth condition of crops, poor soil environment and existence of pests in the monitoring area;
if the similarity value is smaller than the threshold value, obtaining any one or more of identification result information of good growth condition, good soil environment and no pests of the crops in the monitored area;
inputting the identification result information into the prediction deep convolutional neural network model, and predicting whether potential pests exist in the soil of the monitoring area, to obtain prediction result information;
and judging whether the medicine is required to be applied and the type and proportion of the medicine to be applied according to the identification result information and the prediction result information, and obtaining the judged result information.
In this scheme, selecting and blending the pesticide type and proportion according to the identification and judgment result information, and applying the pesticide to the monitoring area in a self-adaptive working mode, specifically comprises:
according to the result information after the identification and judgment, the working equipment automatically prepares pesticides or fertilizers according to a preset proportion;
dividing similar influence factors into the same category through a Euclidean clustering algorithm, and setting operation references for different conditions, so as to realize the self-adaptive work;
the self-adaptive working mode is adopted, so that the self-adaptive spraying device is adaptive to various environmental influence factors during the application of the pesticide, and automatically adjusts the spraying direction, the spraying intensity and the spraying time;
after the steps are finished, the working equipment automatically starts spraying pesticides in a preset area according to a preset track.
To achieve the above object, a second aspect of the present application provides an intelligent pesticide application system based on the Internet of Things, including:
the information acquisition module is divided into a first acquisition unit and a second acquisition unit;
the first acquisition unit is used for acquiring image information of crops in a monitoring area and is composed of a plurality of high-definition shooting devices;
the second acquisition unit is used for acquiring the soil environment characteristic information of the monitored area and comprises remote sensing shooting equipment and a high-sensitivity sensor array.
The information acquisition module acquires image information of crops in the monitoring area and soil environment characteristic information of the monitoring area in real time, so that the conditions of the crops and the soil environment in the monitoring area are known in real time;
the data processing module is divided into a first processing unit and a second processing unit;
the first processing unit is used for preprocessing the collected image information of crops in the monitoring area and the soil environment characteristic information of the monitoring area, and primarily eliminating interference data affecting identification judgment.
The second processing unit is used for identifying and judging the image information of the preprocessed crops and the soil environment characteristic information of the monitoring area, identifying and judging the growth condition of the crops and the condition of the soil environment, predicting the potential hazard condition in the monitoring area, and judging whether the pesticide is needed to be applied and the type and the proportion of the pesticide to be applied;
the control and implementation module is divided into a first control unit, a second control unit and an implementation unit;
the first control unit is used for remotely controlling the operation of the equipment, and the second control unit is used for controlling the operation of the equipment in the field;
the implementation unit is used for executing various working instructions, collecting information and applying medicine;
the allocation and storage module is divided into an allocation unit and a storage unit;
the allocation unit receives the judging result information and automatically selects the corresponding types and proportions of the applied medicines for allocation;
the storage unit is used for storing the medicines for preventing crop diseases and insect pests, so that the proportion of the required medicines can be conveniently and rapidly prepared;
and the visualization module is used for displaying the judging result, the soil environment information in the monitoring area, the growth condition of crops in the monitoring area and the working state of each device.
The application discloses a pesticide application method and system based on the Internet of Things. The intelligent pesticide application method based on the Internet of Things comprises the following steps: collecting image information of crops in a monitoring area and environmental characteristic information of the monitoring area, and preprocessing the collected information; establishing a comparison database, and storing image characteristic information of various crops at each growth stage, soil environment characteristic information suitable for crop growth, image characteristic information of various pests, and pesticide type and proportion information for crops under different growth conditions; establishing a deep convolutional neural network model, and performing deep learning and training on the neural network model with training sample data; identifying and judging, through the deep convolutional neural network model, the crop growth condition and soil environment information in the monitoring area, whether pests exist, and whether pesticide application is needed and which type to apply; and selecting and blending the pesticide type and proportion according to the identification and judgment result information, and applying the pesticide to the monitoring area in a self-adaptive working mode. Through intelligent identification and judgment, the crop growth condition and soil environment condition of the monitoring area are known effectively and in real time, and awareness of potential hazards is improved; the Internet of Things technology makes the equipment more intelligent and enables remote control of its operation, improving convenience of use; and the self-adaptive pesticide application mode improves application accuracy and automatically controls the amount of pesticide used, greatly reducing the risk of environmental pollution caused by excessive use.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments or examples of the present application, the drawings required in the embodiments or examples are briefly described below. It is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained from them without inventive effort by those skilled in the art.
Fig. 1 is a flowchart of an intelligent pesticide application method based on the internet of things according to an embodiment of the present application;
FIG. 2 is a flowchart of information processing of an intelligent pesticide application system based on the Internet of things according to an embodiment of the present application;
FIG. 3 is a block diagram of an intelligent pesticide application system based on the Internet of things according to an embodiment of the present application;
the achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will be more clearly understood, a more particular description of the application will be rendered by reference to the appended drawings and appended detailed description. It should be noted that, without conflict, the embodiments of the present application and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, however, the present application may be practiced in other ways than those described herein, and therefore the scope of the present application is not limited to the specific embodiments disclosed below.
Fig. 1 is a flowchart of an intelligent pesticide application method based on the internet of things according to an embodiment of the present application;
As shown in fig. 1, the present application provides a flowchart of an intelligent pesticide application method based on the Internet of Things, including:
s102, acquiring image information of crops in a monitoring area and soil environment characteristic information of the monitoring area, and preprocessing the acquired information;
collecting real-time image information of various crops in a monitoring area through high-definition camera equipment;
collecting soil environment characteristic information in a monitoring area by adopting a remote sensing shooting technology;
the environment characteristic information comprises soil environment temperature information, soil environment humidity information, and insect and microorganism information in soil;
collecting soil environment temperature information and soil environment humidity information through a high-sensitivity sensor;
noise reduction, filtering and screening pretreatment are carried out on the image information and the soil environment characteristic information which are acquired in real time;
furthermore, the image information and the soil environment characteristic information acquired in real time are preprocessed by noise reduction, filtering and the like, and better comparison and identification are facilitated by improving the stability, accuracy and definition of the acquired data;
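As an illustration of this preprocessing step, the following sketch shows one way the noise reduction, filtering and screening could be implemented; it assumes OpenCV-style image arrays and dictionary-shaped sensor readings, and the function names, parameters and plausibility ranges are illustrative rather than taken from the application.

    import cv2
    import numpy as np

    def preprocess_crop_image(image: np.ndarray) -> np.ndarray:
        """Denoise and filter one raw crop image before comparison and identification."""
        # Noise reduction: non-local means suppresses sensor noise in field images.
        denoised = cv2.fastNlMeansDenoisingColored(image, None, h=10, hColor=10,
                                                   templateWindowSize=7, searchWindowSize=21)
        # Filtering: a light Gaussian blur removes remaining high-frequency artifacts.
        return cv2.GaussianBlur(denoised, (3, 3), 0)

    def screen_soil_readings(readings: list) -> list:
        """Screen out implausible soil temperature/humidity samples from the sensor array."""
        # Plausibility ranges are assumed placeholders, not values specified in the application.
        return [r for r in readings
                if -20.0 <= r["temperature_c"] <= 60.0 and 0.0 <= r["humidity_pct"] <= 100.0]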
s104, establishing a comparison database, and storing image characteristic information of various crops at each growth stage, soil environment characteristic information of various crops, image characteristic information of various pests, and crop application types and proportion information of different growth conditions;
collecting image information of each growth stage of each crop appearing in history;
collecting pest image characteristic information appearing in history;
collecting soil environment characteristic information suitable for the growth of each crop;
collecting information of the use types and proportion of crop application under different growth conditions;
establishing a comparison database, and storing the collected information data for comparison judgment with the output value of the neural network;
further, the collected image information of each growth stage of the crops is used for comparing and identifying with the image information of the crops collected in real time, and the growth stage and the condition of the crops in the monitoring area are identified;
further, the collected pest image characteristic information is used for identifying possible pests in the crop image information acquired in real time, so that pesticide application measures can be adopted in time;
further, the collected crop soil environment characteristic information is compared with the soil environment characteristic information collected in real time, the condition of the soil environment in the monitoring area is judged, and whether potential harm exists in the environment is predicted, so that preventive measures are taken in advance.
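One possible shape for such a comparison database is sketched below using SQLite; the table and column names are assumptions introduced for illustration and are not specified in the application.

    import sqlite3

    # Hypothetical schema: growth-stage features, pest features, suitable soil profiles,
    # and the pesticide type/proportion to use for each recognised condition.
    SCHEMA = """
    CREATE TABLE IF NOT EXISTS crop_stage_features (crop TEXT, growth_stage TEXT, feature_vector BLOB);
    CREATE TABLE IF NOT EXISTS pest_features (pest TEXT, feature_vector BLOB);
    CREATE TABLE IF NOT EXISTS soil_profiles (crop TEXT, temperature_c REAL, humidity_pct REAL, microbe_index REAL);
    CREATE TABLE IF NOT EXISTS treatment_plans (condition_label TEXT, pesticide_type TEXT, mix_ratio TEXT);
    """

    def build_comparison_db(path: str = "comparison.db") -> sqlite3.Connection:
        conn = sqlite3.connect(path)
        conn.executescript(SCHEMA)   # executescript runs the multi-statement schema above
        conn.commit()
        return conn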
S106, establishing a deep convolutional neural network model, and performing deep learning and training on the neural network model through training sample data;
establishing a deep convolutional neural network model, including identifying the deep convolutional neural network model and predicting the deep convolutional neural network model;
the identification neural network model is divided into a crop identification neural network model and a pest identification neural network model
The prediction neural network model is divided into a plant disease and insect pest prediction neural network model;
deep learning and training are carried out on the deep convolutional neural network model through training sample data;
collecting image information of different crops, preprocessing the image information to obtain growth stage information of the crops, searching the disease-prone type of the growth stage, obtaining the image information of the disease-prone crops according to the disease-prone type, and taking the image information of the disease-prone crops as training sample data;
collecting soil environment characteristic information causing diseases through the searched disease types, and taking the soil environment characteristic information as training sample data;
deep learning and training are carried out on the deep convolutional neural network model through training sample data;
deep learning and training are carried out on the established neural network model by adopting an automatic segmentation method, and network parameters and characteristics are automatically learned;
and continuously performing deep learning and training on the deep convolutional neural network model, and reducing the deviation between the calculated value and the expected value to obtain the deep convolutional neural network model with the deviation value within the tolerable error range.
Further, the training sequence for training the deep convolutional neural network model is as follows: firstly training an identification neural network model to obtain an identification neural network model which meets expectations, then training a prediction neural network model by using the output value of the trained neural network model and training sample data, and finally obtaining the neural network model with prediction accuracy within the tolerable error range.
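The application does not fix the network architecture or training hyperparameters, so the following PyTorch sketch should be read as a minimal assumed example of an identification network and its training loop, with the running loss standing in for the deviation between calculated and expected values.

    import torch
    import torch.nn as nn

    class IdentificationCNN(nn.Module):
        """Small convolutional classifier for crop growth stage or pest identification."""
        def __init__(self, num_classes: int):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(1))

    def train(model: nn.Module, loader, epochs: int = 10, lr: float = 1e-3, tolerance: float = 0.05):
        """Train until the average loss (a proxy for the calculated/expected deviation) is small."""
        optimiser = torch.optim.Adam(model.parameters(), lr=lr)
        criterion = nn.CrossEntropyLoss()
        for _ in range(epochs):
            total = 0.0
            for images, labels in loader:
                optimiser.zero_grad()
                loss = criterion(model(images), labels)
                loss.backward()
                optimiser.step()
                total += loss.item()
            if total / max(len(loader), 1) < tolerance:   # deviation within the tolerable error range
                break
        return model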
S108, identifying and judging, through the deep convolutional neural network model, the growth condition of the crops, the soil environment condition, whether pests exist, and whether pesticide application is needed and which type to apply in the monitoring area;
inputting the collected soil environment characteristic information and crop image information into an identification depth convolution neural network model to obtain an output value;
adopting a similarity calculation method to calculate the similarity between the output value and the data in the comparison database;
if the similarity value is larger than a threshold value, obtaining any one or more of identification result information of poor growth condition of crops, poor soil environment and existence of pests in the monitoring area;
if the similarity value is smaller than the threshold value, obtaining any one or more of identification result information of good growth condition, good soil environment and no pests of the crops in the monitored area;
inputting the identification result information into the prediction deep convolutional neural network model, and predicting whether potential pests exist in the soil of the monitoring area;
if the similarity value is larger than a threshold value, obtaining prediction result information of potential hazard of the soil in the monitoring area;
if the similarity value is smaller than a threshold value, obtaining prediction result information that potential hazards do not exist in the soil in the monitored area;
and judging whether the medicine is required to be applied and the type and proportion of the medicine to be applied according to the identification result information and the prediction result information, and obtaining the judged result information.
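The similarity comparison and the application decision described above could look roughly like the following; cosine similarity is used here only as one plausible choice of similarity measure, and the threshold value and condition labels are assumptions.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def identify(output_vec: np.ndarray, abnormal_refs: dict, threshold: float = 0.8) -> dict:
        """Compare a network output vector with 'abnormal condition' references from the
        comparison database; a similarity above the threshold raises the corresponding flag
        (e.g. poor growth, poor soil environment, pests present)."""
        return {label: cosine_similarity(output_vec, ref) > threshold
                for label, ref in abnormal_refs.items()}

    def needs_application(identification: dict, predicted_hazard: bool) -> bool:
        # Apply pesticide if any abnormal condition was identified or a potential hazard predicted.
        return predicted_hazard or any(identification.values())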
S110, selecting the type and proportion of the pesticide application to be allocated according to the result information after the identification and judgment, and applying the pesticide to the monitoring area in a self-adaptive working mode;
according to the result information after the identification and judgment, the working equipment automatically prepares pesticides or fertilizers according to a preset proportion;
dividing similar influence factors into the same category through a Euclidean clustering algorithm, and setting operation references for different conditions, so as to realize the self-adaptive work;
the self-adaptive working mode is adopted, so that the self-adaptive spraying device is adaptive to various environmental influence factors during the application of the pesticide, and automatically adjusts the spraying direction, the spraying intensity and the spraying time;
after the steps are finished, the working equipment automatically starts spraying pesticides in a preset area according to a preset track.
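A minimal sketch of the Euclidean clustering step mentioned above is given below, using k-means (which clusters by Euclidean distance) to group similar environmental influence factors; the number of groups and the per-group spraying references are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_influence_factors(factors: np.ndarray, n_groups: int = 3) -> np.ndarray:
        """Group rows of environmental influence factors (e.g. wind speed, humidity, canopy
        density) so that each group can share one spraying operation reference."""
        return KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(factors)

    # Hypothetical operation references per group: spray direction (deg), intensity, duration (s).
    SPRAY_REFERENCES = {0: (0.0, "low", 30), 1: (15.0, "medium", 45), 2: (30.0, "high", 60)}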
Further, the monitoring area is divided into a plurality of sub-areas. The disease and pest degree of the crops in each sub-area is obtained from the identification and judgment result information and the prediction result information, the deviation between the disease and pest degrees of the sub-areas is calculated, sub-areas with similar disease and pest degrees are grouped into the same category, and the average disease and pest degree of each category is calculated and used as the average application standard for that category. An optimal application scheme is then formulated according to this average standard, selecting the optimal pesticide type and proportion as well as the droplet size and spraying rate, which improves pesticide utilization, avoids the crop losses caused by excessive application, and reduces the risk of environmental pollution;
further, selecting a serious illness region from the classified sub-regions, extracting environmental features of the serious illness region, defining a nearby region of the serious illness region as a high-risk region, extracting environmental feature parameters of the high-risk region, calculating Manhattan distances of the environmental feature parameters of the serious illness region and the high-risk region, taking the calculated Manhattan distances as environmental feature similarity values of the serious illness region and the high-risk region, comparing the similarity values with a preset threshold value, judging the illness probability of the high-risk region according to the magnitude relation between the similarity values and the preset threshold value, and adopting a corresponding optimal administration scheme according to a judging result to effectively prevent the illness degree of other regions from rising;
further, by calculating the ratio of the diseased features in the crop image information acquired in the region to the basic features of the whole crop, for example, calculating the occupancy rate of the diseased leaf area of the diseased crop to the leaf area of the whole crop, the disease and pest degree in the region is obtained and is classified into serious, medium and extremely small, the diseased degree and the diseased probability of the sub-region can be judged according to the disease and pest degree, and the corresponding optimal application scheme is adopted for the diseased region and the high-risk region.
The crop growth condition and soil environment condition of the monitoring area are identified and judged by collecting the crop image information and soil environment information in the monitoring area in real time; a prediction is then made from the identification and judgment result, and countermeasures are taken in advance to prevent potential hazards from occurring;
After the identification and judgment result and the prediction result are obtained, a corresponding solution is matched according to the result information, the pesticides are automatically blended according to the type and proportion in the corresponding solution, and the flow and spraying direction are controlled according to the different environmental influence factors when spraying in the self-adaptive working mode, which greatly improves the intelligence of pesticide application while protecting the environment.
FIG. 2 is a flowchart of information processing of an intelligent pesticide application system based on the Internet of things according to an embodiment of the present application;
As shown in fig. 2, the present application provides a flowchart of information processing of the intelligent pesticide application system based on the Internet of Things, which includes:
S202, acquiring crop image information and soil environment information in the monitoring area;
s204, preprocessing the collected various information data;
preprocessing such as noise reduction and filtering is carried out on various acquired information data, and better comparison and identification are facilitated by improving the stability and definition of the acquired data;
s206, performing recognition judgment through a recognition neural network model, and performing comparison calculation with data in a comparison database;
adopting a similarity calculation method to calculate the similarity between the output value and the data in the comparison database;
if the similarity value is larger than a threshold value, obtaining any one or more of identification result information of poor growth condition of crops, poor soil environment and existence of pests in the monitoring area;
if the similarity value is smaller than the threshold value, obtaining any one or more of identification result information of good growth condition, good soil environment and no pests of the crops in the monitored area;
inputting the identification result information into the prediction deep convolutional neural network model, and predicting whether potential pests exist in the soil of the monitoring area;
if the similarity value is larger than a threshold value, obtaining prediction result information of potential hazard of the soil in the monitoring area;
if the similarity value is smaller than a threshold value, obtaining prediction result information that potential hazards do not exist in the soil in the monitored area;
judging whether the medicine needs to be applied and the type and proportion of the medicine to be applied according to the identification result information and the prediction result information, and obtaining judged result information;
s208, obtaining identification result information, and inputting the identification result information into a prediction neural network model;
inputting the identification result information into the prediction deep convolutional neural network model, and predicting whether potential pests exist in the soil of the monitoring area;
if the similarity value is larger than a threshold value, obtaining prediction result information of potential hazard of the soil in the monitoring area;
if the similarity value is smaller than a threshold value, obtaining prediction result information that potential hazards do not exist in the soil in the monitored area;
s210, obtaining prediction result information;
s212, judging the types and the use proportion of the medicines to be used according to the identification result information and the prediction result information;
judging whether the medicine needs to be applied and the type and proportion of the medicine to be applied according to the identification result information and the prediction result information, and obtaining judged result information;
s214, judging result information is obtained;
It should be noted that the recognition result of the neural network model is important information for judging intelligent application and predicting potential hazards. The growth condition of the crops, whether pests are present and their types, and the condition of the soil environment are identified from the collected image information; potential hazards are then predicted and judged on the basis of the identification result, the corresponding scheme is selected, and the pesticide type and mixing proportion are matched automatically. This greatly improves the accuracy of application, achieves targeted treatment and avoids losses.
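To make the S206-S214 flow concrete, the sketch below maps the identification flags and the prediction result to a stored treatment plan; the condition labels and the structure of the plans dictionary mirror the hypothetical treatment_plans table above and are assumptions.

    from typing import Optional

    def select_treatment(identification: dict, prediction: dict, plans: dict) -> Optional[dict]:
        """Return the pesticide type and mix ratio for the first abnormal condition found,
        or None if no application is needed."""
        for condition, abnormal in {**identification, **prediction}.items():
            if abnormal and condition in plans:
                return plans[condition]      # e.g. {"pesticide_type": "...", "mix_ratio": "1:500"}
        return None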
FIG. 3 is a block diagram of an intelligent pesticide application system based on the Internet of things according to an embodiment of the present application;
as shown in fig. 3, the present application provides a structural diagram of an intelligent pesticide application system based on the internet of things, which includes:
the system comprises an information acquisition module, a data processing module, a control and implementation module, a deployment and storage module and a visualization module;
the information acquisition module is divided into a first acquisition unit and a second acquisition unit;
the first acquisition unit is used for acquiring image information of crops in a monitoring area and is composed of a plurality of high-definition shooting devices;
the second acquisition unit is used for acquiring the soil environment characteristic information of the monitored area and comprises remote sensing shooting equipment and a high-sensitivity sensor array.
The information acquisition module acquires image information of crops in the monitoring area and soil environment characteristic information of the monitoring area in real time, so that the conditions of the crops and the soil environment in the monitoring area are known in real time;
the data processing module is divided into a first processing unit and a second processing unit;
the first processing unit is used for preprocessing the collected image information of crops in the monitoring area and the soil environment characteristic information of the monitoring area, and primarily eliminating interference data affecting identification judgment.
The second processing unit is used for identifying and judging the image information of the preprocessed crops and the soil environment characteristic information of the monitoring area, identifying and judging the growth condition of the crops and the condition of the soil environment, predicting the potential hazard condition in the monitoring area, and judging whether the pesticide is needed to be applied and the type and the proportion of the pesticide to be applied;
the control and implementation module is divided into a first control unit, a second control unit and an implementation unit;
the first control unit is used for remotely controlling the operation of the equipment, and the second control unit is used for controlling the operation of the equipment in the field;
the implementation unit is used for executing various working instructions, collecting information and applying medicine;
further, the first control unit is arranged in terminal equipment, which may be intelligent equipment with a control function such as a computer, a mobile phone or a console; remote control of the working state of the equipment end is realized through the first control unit in the terminal equipment;
further, the second control unit is a control unit arranged at the equipment end and is used for controlling each module at the equipment end to normally operate and executing real-time working instructions of the terminal;
furthermore, the normal operation of each module of the equipment end can be controlled only by operating the second control unit at the equipment end, the second control unit is responsible for the normal operation of all the modules of the equipment end, and the first control unit remotely controls the second control unit to enable the equipment end to normally operate;
further, the implementation module is composed of various working devices, and performs daily work by executing various operation instructions.
The allocation and storage module is divided into an allocation unit and a storage unit;
the allocation unit receives the judging result information and automatically selects the corresponding types and proportions of the applied medicines for allocation;
the storage unit is used for storing the medicines for preventing crop diseases and insect pests, so that the proportion of the required medicines can be conveniently and rapidly prepared;
the visualization module is divided into terminal display and equipment end display and is used for displaying judging results, soil environment information in a monitoring area, growth conditions of crops in the monitoring area and working states of all equipment.
Further, the terminal display is to display the prediction result information, the identification judgment result information and various data acquired in real time on a terminal device, wherein the terminal device can be an intelligent device with a display function such as a computer, a mobile phone or a console;
further, the device side display is to display the prediction result, the identification judgment result and various data acquired in real time on a display screen installed on the device side, wherein the display screen is a screen capable of displaying information data, and can be a screen with a touch function or a screen with a display function only;
It should be noted that, based on the control units and the display module of the present application, a user can control the operation of the equipment from either end and can query the data collected in real time, the identification and judgment results and the prediction results from either end. This makes it convenient for the user to know the crop growth condition and soil environment condition of the monitored area in real time and to make a secondary judgment on the monitored area according to their own experience, greatly improving convenience and simplicity of use.
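As an architectural sketch only, the module structure described above could be composed roughly as follows; the class and method names are invented for illustration and do not come from the application.

    from dataclasses import dataclass

    @dataclass
    class IntelligentSprayingSystem:
        """Illustrative composition of the described modules; each field is expected to expose
        the hypothetical methods used in run_cycle()."""
        information_acquisition: object    # cameras, remote sensing, soil sensor array
        data_processing: object            # preprocessing plus identification/prediction models
        control_and_implementation: object # remote/field control units and working equipment
        blending_and_storage: object       # pesticide storage and automatic blending
        visualization: object              # terminal and equipment-end displays

        def run_cycle(self) -> None:
            raw = self.information_acquisition.collect()
            result = self.data_processing.evaluate(raw)
            plan = self.blending_and_storage.prepare(result)
            self.control_and_implementation.execute(plan)
            self.visualization.display(result)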
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are only illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in practice, for example: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
From the foregoing description of the embodiments, those skilled in the art will clearly understand that the above method may be implemented by means of software plus a necessary hardware platform; with a system combining software and hardware, the intelligent pesticide application method can be implemented more intelligently, systematically and precisely. Based on this understanding, the part of the technical solution of the present application that contributes to the prior art may be embodied in the form of a combination of a software product and hardware facilities. The system program is stored on a storage medium such as ROM/RAM, a magnetic disk or an optical disk capable of storing program code, and is executed by a terminal device, which may be a computer, a cloud server or a network device, to carry out the software-controlled embodiments of the present application, while the actual collection and driving operations are carried out by hardware devices.
Alternatively, the above modules of the present application, if implemented as a complete or partial product and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the embodiments of the present application that contributes to the prior art may be embodied in the form of a software product stored in a storage medium, together with hardware, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, ROM/RAM, a magnetic disk or an optical disk.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the application, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (10)

1. An intelligent pesticide application method based on the Internet of things is characterized by comprising the following steps:
collecting image information of crops in a monitoring area and soil environment characteristic information of the monitoring area, and preprocessing the collected information;
establishing a comparison database, and storing image characteristic information of various crops at each growth stage, soil environment characteristic information of various crops, image characteristic information of various pests, and crop application types and proportion information of different growth conditions;
establishing a deep convolutional neural network model, and performing deep learning and training on the deep convolutional neural network model through training sample data;
identifying and judging, through the deep convolutional neural network model, the growth condition of the crops, the soil environment condition, whether pests exist, and whether pesticide application is needed and the type of pesticide to apply in the monitoring area;
and selecting the type and proportion of the pesticide application to be allocated according to the result information after the identification and judgment, and applying the pesticide to the monitoring area in a self-adaptive working mode.
2. The intelligent pesticide application method based on the internet of things according to claim 1, wherein the method is characterized by collecting image information of crops in a monitoring area and environmental characteristic information of the monitoring area, and preprocessing the collected information, and specifically comprises the following steps:
collecting real-time image information of various crops in a monitoring area;
collecting soil environment characteristic information in a monitoring area;
the environmental characteristic information comprises soil temperature information, soil humidity information, insects and microorganism information in soil;
and carrying out noise reduction, filtering and screening pretreatment on the image information and the environmental characteristic information acquired in real time.
3. The intelligent pesticide application method based on the Internet of things according to claim 1, wherein a comparison database is established to store image characteristic information of various crop growth stages, environment characteristic information of various crops suitable for growth, image characteristic information of various pests, and pesticide application type information and proportion information of the crops in different growth conditions, and the method specifically comprises the following steps:
collecting image information of each growth stage of each crop appearing in history;
collecting pest image characteristic information appearing in history;
collecting soil environment characteristic information suitable for the growth of each crop;
collecting the proportion information and the use type information of crop application under different growth conditions;
a comparison database is established to store the collected information data.
4. The intelligent pesticide application method based on the Internet of Things according to claim 1, characterized in that establishing a deep convolutional neural network model and performing deep learning and training on the neural network model through training sample data specifically comprises:
establishing a deep convolutional neural network model, including identifying the deep convolutional neural network model and predicting the deep convolutional neural network model;
collecting soil environment characteristic information of crops susceptible to disease and image information of the diseased crops, and taking the soil environment characteristic information and the image information of the diseased crops as training sample data;
deep learning and training are carried out on the deep convolutional neural network model through training sample data;
and performing deep learning and training on the established neural network model by adopting an automatic segmentation method, and automatically learning network parameters and characteristics.
5. The intelligent pesticide application method based on the Internet of Things according to claim 1, wherein the crop growth condition, the soil environment information, whether pests exist, and whether pesticide application is needed and the pesticide type in the monitored area are identified and judged through the deep convolutional neural network model, specifically comprising:
inputting the collected soil environment characteristic information and crop image information into an identification depth convolution neural network model to obtain an output value;
adopting a similarity calculation method to calculate the similarity between the output value and the data in the comparison database;
if the similarity value is larger than a threshold value, obtaining any one or more of identification result information of poor growth condition of crops, poor soil environment and existence of pests in the monitoring area;
if the similarity value is smaller than the threshold value, obtaining any one or more of identification result information of good growth condition, good soil environment and no pests of the crops in the monitored area;
inputting the identification result information into the prediction deep convolutional neural network model, and predicting whether potential pests exist in the soil of the monitoring area, to obtain prediction result information;
and judging whether the medicine is required to be applied and the type and proportion of the medicine to be applied according to the identification result information and the prediction result information, and obtaining the judged result information.
6. The intelligent pesticide application method based on the internet of things according to claim 1, wherein the pesticide application type and the pesticide application proportion are selected for allocation according to the result information after identification and judgment, and the pesticide is applied to the monitoring area in a self-adaptive working mode, and the method specifically comprises the following steps:
according to the result information after the identification and judgment, the working equipment automatically prepares pesticides or fertilizers according to a preset proportion;
dividing similar influence factors into the same category through a Euclidean clustering algorithm, and setting operation references for different conditions, so as to realize the self-adaptive working mode;
the self-adaptive working mode is adopted, so that the self-adaptive spraying device is adaptive to various environmental influence factors during the application of the pesticide, and automatically adjusts the spraying direction, the spraying intensity and the spraying time;
after the steps are finished, the working equipment automatically starts spraying pesticides in a preset area according to a preset track.
7. An intelligent pesticide application system based on the Internet of Things, characterized by comprising:
the information acquisition module acquires image information of crops in a monitoring area and soil environment characteristic information of the monitoring area in real time;
the data processing module is used for preprocessing various acquired information data, and then identifying and judging the preprocessed information data to obtain judging result information;
the control and implementation module receives various instructions of the terminal in real time and controls the normal operation of each working device;
the blending and storage module is used for storing the required pesticide and blending the required pesticide according to the required proportion;
and the visualization module is used for displaying the judging result, the soil environment information in the monitoring area, the growth condition of crops in the monitoring area and the working state of each device.
8. The intelligent pesticide application system based on the Internet of Things according to claim 7, wherein:
the information acquisition module is divided into a first acquisition unit and a second acquisition unit;
the first acquisition unit is used for acquiring image information of crops in a monitoring area and is composed of a plurality of high-definition shooting devices;
the second acquisition unit is used for acquiring the soil environment characteristic information of the monitored area and comprises remote sensing shooting equipment and a high-sensitivity sensor array.
9. The intelligent pesticide application system based on the Internet of Things according to claim 7, wherein:
the data processing module is divided into a first processing unit and a second processing unit;
the first processing unit is used for preprocessing the collected image information of crops in the monitoring area and the soil environment characteristic information of the monitoring area, and primarily eliminating interference data affecting identification judgment.
The second processing unit is used for identifying and judging the image information of the preprocessed crops and the soil environment characteristic information of the monitoring area, identifying and judging the growth condition of the crops and the condition of the soil environment, predicting the potential hazard condition in the monitoring area, and judging whether the pesticide is needed to be applied and the type and the proportion of the pesticide to be applied.
10. A computer-readable storage medium, wherein the computer-readable storage medium comprises an intelligent pesticide application method program based on the Internet of Things, and when the intelligent pesticide application method program based on the Internet of Things is executed by a processor, the steps of the intelligent pesticide application method based on the Internet of Things according to any one of claims 1 to 6 are implemented.
CN202310527466.6A 2023-05-11 2023-05-11 Intelligent pesticide application method and system based on Internet of things Pending CN116630663A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310527466.6A CN116630663A (en) 2023-05-11 2023-05-11 Intelligent pesticide application method and system based on Internet of things

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310527466.6A CN116630663A (en) 2023-05-11 2023-05-11 Intelligent pesticide application method and system based on Internet of things

Publications (1)

Publication Number Publication Date
CN116630663A true CN116630663A (en) 2023-08-22

Family

ID=87591204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310527466.6A Pending CN116630663A (en) 2023-05-11 2023-05-11 Intelligent pesticide application method and system based on Internet of things

Country Status (1)

Country Link
CN (1) CN116630663A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116681544A (en) * 2023-08-02 2023-09-01 四川永坚新能源科技有限公司 Crop environment information processing method, electronic device, and computer-readable medium
CN117110242A (en) * 2023-10-18 2023-11-24 北京英视睿达科技股份有限公司 Monitoring method, device and storage medium for use of pesticide fertilizer
CN117110242B (en) * 2023-10-18 2024-01-16 北京英视睿达科技股份有限公司 Monitoring method, device and storage medium for use of pesticide fertilizer
CN117557914A (en) * 2024-01-08 2024-02-13 成都大学 Crop pest identification method based on deep learning
CN117557914B (en) * 2024-01-08 2024-04-02 成都大学 Crop pest identification method based on deep learning

Similar Documents

Publication Publication Date Title
CN116630663A (en) Intelligent pesticide application method and system based on Internet of things
CN111582055B (en) Unmanned aerial vehicle aviation pesticide application route generation method and system
CN113110207A (en) Insect pest remote monitoring method and system based on sensor of Internet of things and storage medium
CN113095555A (en) Crop disease and insect pest monitoring method and system based on Internet of things and storage medium
US10729117B2 (en) Pest monitoring method based on machine vision
US20170071188A1 (en) Methods, Systems and Devices Relating to Real-Time Object Identification
CN110926430B (en) Air-ground integrated mangrove forest monitoring system and control method
CN108073908B (en) Pest identification method and device, computer device and storage medium
CN111767802A (en) Method and device for detecting abnormal state of object
CN109197273B (en) Method and device for determining pest activity time period and method for determining pesticide application time
US20190107521A1 (en) System and method for field test management
KR20200068052A (en) Method for diagnosis and control of diseases and insect pests using multiple camera module
CN109496622A (en) The recognition methods of pest and device, the determination method, the plant protection system that are administered information
CN113207511A (en) Pesticide application method and system based on pesticide resistance monitoring and readable storage medium
CN116543347A (en) Intelligent insect condition on-line monitoring system, method, device and medium
WO2022079172A1 (en) Treatment system for plant specific treatment
CN114723667A (en) Agricultural fine planting and disaster prevention control system
CN116740644A (en) Comprehensive evaluation and control method and system for plant diseases and insect pests in passion fruit cultivation process
CN115342859A (en) Multifunctional grain condition detection system and detection method
CN117057946A (en) Intelligent agricultural park data monitoring device and method
CN108874910A (en) The Small object identifying system of view-based access control model
CN114586760A (en) Pesticide spraying method and system based on big data and readable storage medium
CN113377141A (en) Artificial intelligence agricultural automatic management system
CN116740645A (en) Fruit fly monitoring and comprehensive prevention and control method, system and storage medium based on Internet of things
CN113435825A (en) Intelligent management method, system and storage medium based on soil-borne disease control

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination