WO2019179270A1 - Plant planting data measurement method, work route planning method, device, and system - Google Patents

Plant planting data measurement method, work route planning method, device, and system

Info

Publication number
WO2019179270A1
Authority
WO
WIPO (PCT)
Prior art keywords
plant
information
planting
area
historical
Prior art date
Application number
PCT/CN2019/075457
Other languages
English (en)
French (fr)
Inventor
代双亮
Original Assignee
广州极飞科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州极飞科技有限公司 filed Critical 广州极飞科技有限公司
Priority to EP19771748.1A priority Critical patent/EP3770830A4/en
Priority to US16/976,891 priority patent/US11321942B2/en
Priority to JP2020548944A priority patent/JP7086203B2/ja
Priority to AU2019238712A priority patent/AU2019238712B2/en
Publication of WO2019179270A1 publication Critical patent/WO2019179270A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047Optimisation of routes or paths, e.g. travelling salesman problem
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01CPLANTING; SOWING; FERTILISING
    • A01C21/00Methods of fertilising, sowing or planting
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00Arrangements or adaptations of instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06313Resource planning in a project environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06315Needs-based resource requirements planning or analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/32UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/40UAVs specially adapted for particular uses or applications for agriculture or forestry operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture

Definitions

  • the present application relates to the field of plant protection technology, and in particular, to a plant planting data measuring method, a working route planning method, device and system.
  • At present, the main factors affecting agricultural output are planting data such as the number of plants and plant size.
  • Such planting data are very important for agricultural managers when estimating the economic return of their crops, yet they are usually collected by manual counting, which makes the compilation of planting data inefficient; for crops with a large planting area in particular, compiling the planting data becomes very cumbersome.
  • The embodiments of the present application provide a plant planting data measurement method, a work route planning method, a device, and a system, to simplify the process of compiling plant planting data, thereby improving its efficiency and making route planning more accurate.
  • In a first aspect, the present application provides a method for measuring plant planting data, the method comprising: receiving image information of a planting area; and
  • processing the image information of the planting area with a preset recognition model to obtain plant planting data.
  • Compared with the prior art, in the plant planting data measurement method provided by the present application, the image information of the planting area transmitted by the surveying device is processed with a preset recognition model to obtain plant planting data, which simplifies the compilation of plant planting data, improves its efficiency,
  • makes the route planning more accurate, and avoids the low efficiency and the cumbersome procedure of manually compiling the planting data of a planting area.
  • In a second aspect, the present application further provides a plant planting data measuring device, the device comprising:
  • a receiving module configured to receive image information of the planting area; and
  • a processing module configured to process the image information of the planting area with a preset recognition model to obtain plant planting data.
  • In a third aspect, the present application further provides a plant planting data measuring system, comprising a surveying and mapping device and the plant planting data measuring device of the above technical solution; an output end of the surveying and mapping device is connected to the receiving module included in the plant planting data measuring device.
  • In a fourth aspect, the present application further provides a work route planning method, the work route planning method comprising:
  • obtaining plant planting data of a work area from a work area image used for work route planning, according to the plant planting data measurement method described above; and
  • planning a work route of a movable device in the work area according to the plant planting data of the work area.
  • The beneficial effects of the work route planning method provided by the present application are the same as those of the plant planting data measurement method of the above technical solution and are not repeated here.
  • In addition, planning the work route with the measured plant planting data makes the work route more accurate and improves both working efficiency and the accuracy of job execution.
  • In a fifth aspect, the present application further provides a work route planning system, the work route planning system comprising:
  • the plant planting data measuring device described above, configured to obtain plant planting data of the work area from the work area image used for work route planning, according to the plant planting data measurement method of the above technical solution; and
  • a planning module configured to plan a work route of the movable device in the work area according to the plant planting data of the work area.
  • FIG. 1 is a flow chart of a method for planning a work route according to an embodiment of the present application
  • FIG. 2 is a flow chart showing a selection operation of plant planting data in the embodiment of the present application.
  • FIG. 3 is a flow chart 1 of planning a working route of a mobile device in a work area according to an embodiment of the present application
  • FIG. 4 is a second flowchart of a working route for planning a mobile device in a work area according to an embodiment of the present application
  • FIG. 5 is a schematic structural diagram of a job route planning system according to an embodiment of the present application.
  • FIG. 6 is a flow chart of a method for measuring plant planting data provided by an embodiment of the present application.
  • FIG. 7 is a flowchart of training learning of a deep network model by using planting region history information in an embodiment of the present application
  • FIG. 8 is a schematic structural diagram of a deep network model in an embodiment of the present application.
  • FIG. 9 is a flowchart of processing image information of a planting area by using a deep network model in an embodiment of the present application.
  • FIG. 10 is a structural block diagram of a plant planting data measuring device according to an embodiment of the present application.
  • FIG. 11 is a structural block diagram of a deep network model according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of information interaction of a plant planting data measurement system according to an embodiment of the present application.
  • To address the low efficiency of manually compiled plant planting data, which becomes especially cumbersome for crops with a large planting area, the embodiments of the present application provide a plant planting data measurement method, a work route planning method, a device, and a system.
  • The method can measure planting data for any crop that is recognizable in images, is not affected by the size of the planting area, and obtains plant planting data conveniently and quickly.
  • Moreover, the plant planting data measured in the embodiments of the present application can be used not only to predict plant yield but also to observe plant growth, and to plan the work route when operations need to be performed on the plants.
  • the embodiment of the present application provides a work route planning method. As shown in FIG. 1 , the work route planning method includes:
  • Step S110 Obtain plant planting data of the work area from the work area image for the work route planning by using the plant planting data measurement method;
  • Step S130 Plan the working route of the mobile device in the work area according to the plant planting data of the work area.
  • The plant planting data measurement method measures plant planting data on the basis of image recognition techniques such as training and learning.
  • The image recognition technique may be deep network learning or another image recognition technique, for example simple color recognition, shape recognition, image segmentation, or edge detection, used to separate the desired plant planting data from the work area image.
  • With these image recognition techniques, the plant planting data of the required work area can be separated quickly from the work area image, and the work route of the movable device in the work area can then be planned according to that data, which improves the accuracy of job execution, avoids blind operation, and improves surveying efficiency.
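  • For illustration only, the following minimal Python sketch shows what such a simple color-plus-contour recognition step could look like; the excess-green index, the thresholds, and the use of OpenCV are assumptions made for the sketch and are not specified by the present application.

```python
# Illustrative sketch of a simple color + contour recognition step (OpenCV).
# The excess-green index and the thresholds are assumed values, not values
# taken from the present application.
import cv2
import numpy as np

def extract_plant_blobs(image_path, min_area_px=200):
    bgr = cv2.imread(image_path)                      # H x W x 3, uint8
    b, g, r = cv2.split(bgr.astype(np.float32))
    exg = 2 * g - r - b                               # excess-green vegetation index
    mask = np.where(exg > 20, 255, 0).astype(np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    plants = []
    for c in contours:
        if cv2.contourArea(c) < min_area_px:          # drop speckle noise
            continue
        (cx, cy), radius = cv2.minEnclosingCircle(c)  # center + radius per plant
        plants.append({"center_px": (cx, cy), "radius_px": radius})
    return plants
```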
  • The movable device includes an unmanned aerial vehicle (UAV), an aircraft, a tractor, a cultivator, a harvester, an intelligent robot, and the like; the UAV can be used to spray pesticides, fertilizers, seeds, etc. over the area to be worked, the tractor, cultivator, and harvester can be used to till and harvest farmland, and the intelligent robot can automatically pick crops in the work area, top cotton plants, and perform laser weeding and pest removal.
  • It should be noted that the work route planned in the embodiments of the present application may include one or more of a bow-shaped (serpentine) route, a spiral route, a concentric-circle route, a polyline route, and an in-place rotation route.
  • In one embodiment of the present application, the bow-shaped route first determines several parallel work passes in the area to be worked and connects them end to end into one continuous work route;
  • the spiral route gradually rotates outward from the work center point, similar to the outward spiral of a conch shell;
  • the concentric-circle route consists of several concentric circular paths centered on the work center;
  • the polyline route connects the necessary work points in sequence, and in one embodiment of the present application it can be flexibly set as a free route;
  • in one embodiment of the present application, the in-place rotation route characterizes the working machine, such as a UAV, rotating in place, for example to spray a fruit tree uniformly while hovering over it.
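  • As a purely illustrative sketch (the field bounds and swath width are assumed parameters, not values from the present application), a bow-shaped route over a rectangular work area could be generated as follows:

```python
# Illustrative bow-shaped (serpentine) route: parallel passes connected end to
# end. The field bounds and swath width below are hypothetical example values.
def bow_shaped_route(x_min, x_max, y_min, y_max, swath_width):
    """Return an ordered list of (x, y) waypoints forming a serpentine route."""
    waypoints, y, leftward = [], y_min, False
    while y <= y_max:
        pass_pts = [(x_max, y), (x_min, y)] if leftward else [(x_min, y), (x_max, y)]
        waypoints.extend(pass_pts)      # fly one full pass across the area
        y += swath_width                # step over by one working width
        leftward = not leftward         # reverse direction for the next pass
    return waypoints

# Example: a 100 m x 60 m field covered with a 5 m spray swath.
route = bow_shaped_route(0, 100, 0, 60, 5)
```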
  • To make the work route planning method provided by the embodiments of the present application more user-friendly, after the plant planting data of the work area is obtained and before the work route of the movable device in the work area is planned, the method further includes step S120 shown in FIG. 2: performing a selection operation on the plant planting data, which specifically includes the following steps:
  • Step S121: visually marking the plant planting data of the work area, so that the user can find the plant planting data conveniently and quickly;
  • Step S122: obtaining, according to the user's selection instruction, the plant planting data of the plant area to be worked from the visually marked plant planting data of the work area.
  • In application, the visually marked plant planting data of the work area can be displayed on a terminal device such as a mobile phone or a tablet computer; the user sees the data on the terminal device and issues a selection instruction according to actual needs, and the terminal device then extracts the plant planting data of the plant area to be worked accordingly.
  • In other words, the selection function in the embodiments of the present application can be integrated on the terminal device, or it can be integrated, together with the functions of the route planning method, on a processor.
  • In this case, planning the work route of the movable device in the work area according to the plant planting data of the work area includes:
  • planning the work route of the movable device in the plant area to be worked according to the plant planting data of that area.
  • For example, if the terrain of the work area is relatively flat and the plant radii are large, the working path of individual plants can be planned.
  • When the plant planting data includes plant edge information and plant position information, as shown in FIG. 3, planning the work route of the movable device in the work area according to the plant planting data of the work area includes:
  • Step S131a: determining a work center point of each plant according to the plant position information of the work area, and determining a working radius of each plant according to the plant edge information of the work area; the work center point of each plant is essentially the geometric center of that plant;
  • Step S132a: generating a plant work route according to the work center point of each plant, the working radius of each plant, and the working width of the movable device, the plant work route being used to control the route of the movable device within the work area.
  • Specifically, the working path of each plant is generated from its working radius and work center point together with the working width of the movable device;
  • in other words, the generated plant work route is essentially the combination of the working paths of the individual plants, as sketched below.
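  • A minimal sketch of steps S131a and S132a, under assumed units and waypoint spacing, might look as follows; the concentric-pass pattern inside each plant's working radius is an illustrative choice rather than a requirement of the present application.

```python
# Sketch of steps S131a/S132a: per-plant concentric passes spaced by the
# working width, chained plant by plant. Units and waypoint density are
# assumptions made only for this sketch.
import math

def plant_route(center, radius, work_width, pts_per_circle=36):
    cx, cy = center
    waypoints, r = [], radius
    while r > 0:                                       # outermost pass first
        for k in range(pts_per_circle):
            a = 2 * math.pi * k / pts_per_circle
            waypoints.append((cx + r * math.cos(a), cy + r * math.sin(a)))
        r -= work_width                                # move one swath inward
    waypoints.append((cx, cy))                         # finish over the center point
    return waypoints

def work_area_route(plants, work_width):
    """plants: iterable of dicts with 'center' and 'radius' (cf. Table 1 rows)."""
    full_route = []
    for p in plants:                                   # combine per-plant paths
        full_route.extend(plant_route(p["center"], p["radius"], work_width))
    return full_route
```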
  • If, however, the terrain of the work area is terraced, and especially if the individual plants in the work area are small, it is impractical to plan the operation for each single plant, and the work route can instead be planned terrace by terrace; on this basis, when the plant planting data includes plant edge information and plant position information, as shown in FIG. 4, planning the work route of the movable device in the work area includes:
  • Step S131b: determining at least one work center line according to the plant position information, and determining a plant width corresponding to the at least one work center line according to the plant edge information; specifically, the width direction of each plant is perpendicular to the corresponding work center line.
  • The work center line here refers to the line formed by the work center points of all plants on one terrace level, and may be a curve, a straight line, or a wavy line; each work center line corresponds to a plant width, which may be the average width of all plants corresponding to that center line, or the width of the widest plant among them.
  • Step S132b: generating a plant work route according to the working width of the movable device, each work center line, and the corresponding plant width, the plant work route being used to control the route of the movable device within the work area.
  • Specifically, in one embodiment of the present application, the plant work route is essentially the sum of the plant working paths on the individual terrace levels; a rough sketch follows.
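  • The following rough sketch illustrates steps S131b and S132b for a single straight work center line; handling of curved or wavy center lines and of multiple terrace levels is omitted, and all inputs are assumed values rather than details from the present application.

```python
# Sketch of steps S131b/S132b for one straight work center line: lay out
# parallel passes on either side of the line until the plant width is covered.
# All inputs are assumed values; curved center lines are not handled here.
import math

def center_line_route(line_start, line_end, plant_width, work_width):
    (x0, y0), (x1, y1) = line_start, line_end
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length                  # unit normal to the line
    n_passes = max(1, math.ceil(plant_width / work_width))
    route = []
    for i in range(n_passes):                           # symmetric offsets
        off = (i - (n_passes - 1) / 2) * work_width
        a = (x0 + off * nx, y0 + off * ny)
        b = (x1 + off * nx, y1 + off * ny)
        route.extend([a, b] if i % 2 == 0 else [b, a])  # serpentine ordering
    return route
```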
  • It should be noted that the "job" in the embodiments of the present application may be spraying insecticide or nutrient agents on the plants in the work area, performing monitoring tasks, and so on; the possibilities are not enumerated here.
  • As for the ultimate goal of the job, the best target is to complete the operation on every part of every plant; how well this target is met can be checked by having the movable device work repeatedly along the generated plant work route, and other feasible verification methods are of course not excluded.
  • The embodiments of the present application further provide a plant planting data measurement method, which can be used to obtain the plant planting data of the work area in the above work route planning and can also be used to analyze plant growth, as shown in FIG. 5 and FIG. 6.
  • The plant planting data measurement method comprises the following steps:
  • Step S220: receiving image information of the planting area; the image information of the planting area may be provided by an image acquisition device such as a surveying device or a camera, and the image information includes, but is not limited to, one or more of surveying image information, map information, and picture information.
  • Step S230: processing the image information of the planting area with a preset recognition model to obtain plant planting data.
  • It can be understood that the plant planting data measured by the method provided in the embodiments of the present application can also be output as required; if output is required, the method further includes step S240: outputting the plant planting data according to actual needs for further use.
  • As can be seen from the above, in the plant planting data measurement method provided by the embodiments of the present application, the image information of the planting area is processed with a preset recognition model to obtain plant planting data, which simplifies the compilation of plant planting data, improves its
  • efficiency, makes route planning more accurate, and avoids the low efficiency and cumbersome procedure of manually compiling the planting data of a planting area.
  • It can be understood that when the image information of the planting area is processed with a preset recognition model in the embodiments of the present application, the preset recognition model is a model with an image recognition function; it may also be an algorithm based on color recognition, as long as the plant data can be identified from the image.
  • To make the recognition results more accurate, the preset recognition model in the embodiments of the present application is a deep network model; however, the deep network model must first be trained before it can recognize the image information of the planting area efficiently. That is, as shown in FIG. 5 to FIG. 7, before the image information of the planting area is received, the plant planting data measurement method provided by the embodiments of the present application includes:
  • Step S210: training the deep network model with planting area history information, which specifically includes the following steps:
  • Step S211: receiving planting area history information;
  • the planting area history information includes historical image information and historical plant calibration information corresponding to the historical image information;
  • Step S212: extracting historical plant graphic features from the historical image information, which may be implemented by a convolutional neural network (CNN) model as shown in FIG. 3, or by other image recognition models;
  • Step S213: processing the historical plant graphic features and the historical plant calibration information by deep learning to obtain a deep network model loss value;
  • Step S214: optimizing the deep network model loss value with the historical plant calibration information to obtain the deep network model, so that processing the image information of the planting area with the deep network model gives accurate results; the deep network model includes the optimized identification strategy for plant planting data.
  • Specifically, the optimization may be implemented by a back-propagation method, although other optimization methods may of course be chosen; a hedged training sketch is given below.
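  • The training procedure of steps S211 to S214 (CNN feature extraction, an FPN/RPN-based loss combining mask, box, and class terms, and back-propagation) closely resembles a Mask R-CNN style detector. The sketch below is one plausible realization using torchvision's off-the-shelf model, not the implementation of the present application; the data loader yielding historical images with their calibration targets is assumed to exist.

```python
# Hedged sketch of steps S211-S214 using torchvision's Mask R-CNN (ResNet
# backbone + FPN + RPN with mask, box-regression and classification losses).
# This is one plausible realization, not the patent's implementation; the
# `history_loader` yielding (images, targets) built from the historical image
# information and calibration information is assumed to exist.
import torch
import torchvision

# Two classes: background + "plant" (more if plant species are calibrated).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(num_classes=2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()

for images, targets in history_loader:
    # targets[i] = {"boxes": FloatTensor[N, 4], "labels": Int64Tensor[N],
    #               "masks": UInt8Tensor[N, H, W]}  -- historical calibration
    loss_dict = model(images, targets)       # mask, box and class loss terms
    loss = sum(loss_dict.values())           # cf. summing Lmask + Lbox + Lcls
    optimizer.zero_grad()
    loss.backward()                          # back-propagation optimization (S214)
    optimizer.step()

torch.save(model.state_dict(), "plant_model.pth")   # cf. step S250: save the model
```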
  • To facilitate subsequent use of the deep network model for image recognition, as shown in FIG. 6, the plant planting data measurement method provided by the embodiments of the present application further includes step S250: saving the deep network model, which may be saved in the storage module 130 or in a memory.
  • the historical plant calibration information provided by the embodiment of the present application includes historical plant edge calibration information and/or historical plant species calibration information.
  • In this case, as shown in FIG. 8, processing the historical plant graphic features and the historical plant calibration information by deep learning to obtain the deep network model loss value includes:
  • performing image segmentation on the historical plant graphic features with a feature pyramid network (FPN) model to obtain plant history image segmentation results;
  • the plant history image segmentation results are essentially a series of plant edge track point predictions, and
  • the plant edges formed by these predicted track points are irregular edges.
  • The feature pyramid network model is built on top of a feature extraction network, which can be a common network such as ResNet or DenseNet, and can be used in applications such as object detection, instance segmentation, pose recognition, and face recognition.
  • Taking a pre-trained model in a common deep learning framework is enough to implement an FPN.
  • Targets in an image come in many sizes, and the objects in a data set cannot cover every scale, so an image pyramid (downsampling at different resolutions) is needed to help the CNN learn; this, however, is too slow, so prediction is often done at a single scale, and some approaches predict from intermediate feature maps instead.
  • Such architectures are widely used when auxiliary information and auxiliary loss functions are available.
  • The FPN improves on these approaches in a clever way:
  • in addition to lateral connections, it adds top-down connections, and the top-down results are fused with the lateral results by element-wise addition.
  • This yields feature maps at different resolutions that all contain the semantic information of the original deepest feature map; a minimal sketch follows.
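  • A minimal PyTorch sketch of the lateral-plus-top-down fusion just described is given below; the channel counts are illustrative and the module is not the exact network of the present application.

```python
# Minimal PyTorch sketch of the FPN idea described above: 1x1 lateral
# convolutions plus a top-down path merged by element-wise addition, so every
# output level carries the semantics of the deepest feature map. Channel
# counts are illustrative and do not come from the present application.
import torch.nn as nn
import torch.nn.functional as F

class TinyFPN(nn.Module):
    def __init__(self, in_channels=(256, 512, 1024, 2048), out_channels=256):
        super().__init__()
        self.lateral = nn.ModuleList([nn.Conv2d(c, out_channels, 1) for c in in_channels])
        self.smooth = nn.ModuleList([nn.Conv2d(out_channels, out_channels, 3, padding=1)
                                     for _ in in_channels])

    def forward(self, feats):              # feats: backbone maps, finest -> coarsest
        laterals = [lat(f) for lat, f in zip(self.lateral, feats)]
        for i in range(len(laterals) - 2, -1, -1):          # top-down pathway
            up = F.interpolate(laterals[i + 1], size=laterals[i].shape[-2:],
                               mode="nearest")
            laterals[i] = laterals[i] + up                   # fuse by addition
        return [s(l) for s, l in zip(self.smooth, laterals)]
```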
  • To quantify the error between the plant history image segmentation results and the actual plant edge calibration, the method further comprises: obtaining a historical plant image segmentation loss value Lmask from the plant history image segmentation results and the historical plant edge calibration information;
  • and/or, processing the historical plant graphic features with the region proposal network (RPN) model to obtain target plant region results.
  • The target plant region results are likewise essentially a series of plant edge track point predictions, but they differ from the plant history image segmentation results in that the plant edges formed by these
  • predicted track points are regular circles, and these track point predictions are further used to obtain plant position predictions.
  • The region proposal network model is a fully convolutional network that can be trained end to end for a specific task to generate region proposals.
  • The input of the RPN is an image,
  • and the output has two branches: one gives the probability of target versus non-target, and the other gives the four parameters of the bounding box, namely its center coordinates x and y, its width w and
  • its height h. To obtain the output region proposals, a small network is slid over the feature map of the last of the five convolutional layers; each sliding window is mapped to a lower-dimensional feature, and two parallel fully connected layers follow: a box regression layer and a box classification layer.
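  • The two-branch RPN head described above can be sketched as follows; the 1x1 convolutions play the role of the per-window fully connected layers, anchor generation and proposal sampling are omitted, and the number of anchors per location is an assumed value.

```python
# Sketch of the two-branch RPN head described above: a small network slid over
# the feature map, then one branch scoring target vs. non-target and one
# regressing the box parameters (x, y, w, h) per anchor. Anchor handling and
# proposal sampling are omitted; nine anchors per location is an assumed value.
import torch.nn as nn

class TinyRPNHead(nn.Module):
    def __init__(self, in_channels=256, num_anchors=9):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, in_channels, 3, padding=1)  # sliding window
        self.cls = nn.Conv2d(in_channels, num_anchors, 1)        # target / non-target
        self.reg = nn.Conv2d(in_channels, num_anchors * 4, 1)    # x, y, w, h per anchor

    def forward(self, feature_map):
        t = self.conv(feature_map).relu()
        return self.cls(t), self.reg(t)
```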
  • To quantify the error between the plant position predictions obtained from the target plant region results and the actual plant edge calibration, the method further comprises: obtaining a target plant region regression loss value Lbox from the target plant region results and the historical plant edge calibration information;
  • and/or, processing the historical plant graphic features with the region proposal network model RPN to obtain target plant species information, and obtaining a target plant species regression loss value Lcls from the target plant species information and the historical plant species calibration information;
  • the deep network model loss value is then obtained from one or more of the historical plant image segmentation loss value Lmask, the target plant region regression loss value Lbox, and the target plant species regression loss value Lcls.
  • Each of Lmask, Lbox, and Lcls is essentially a scalar, so obtaining the deep network model loss value from one or more of them amounts to summing the selected loss values.
  • Optimizing the deep network model loss value with the historical plant calibration information then makes the identification strategy for plant planting data more accurate.
  • Optionally, as shown in FIG. 8 and FIG. 9, processing the image information of the planting area with the deep network model to obtain plant planting data includes:
  • Step S231: extracting, under the control of the deep network model, graphic features of the plants to be measured from the image information of the planting area;
  • Step S232: processing, under the control of the deep network model, the graphic features of the plants to be measured to obtain plant planting data.
  • Processing the graphic features of the plants to be measured under the control of the deep network model to obtain plant planting data includes:
  • controlling the feature pyramid network model FPN to perform image segmentation on the graphic features of the plants to be measured to obtain plant edge information,
  • where the plant edge information includes the plant crown shape, the plant size, and the like;
  • and/or, controlling the feature pyramid network model to process the graphic features of the plants to be measured to obtain plant growth information; the plant growth status may also be determined from the plant edge information.
  • The plant growth information characterizes the growth status of the plants, including whether a plant has germinated, the germination rate, flowering, pollination, damage from pests and weeds, and ripeness; in one embodiment, the deep
  • recognition model automatically identifies plant pests and diseases: by calibrating pictures of pest and weed damage in the historical image information of the planting area, the deep recognition model can automatically recognize pests and diseases, so that plant health can be monitored and plant growth information generated.
  • And/or, controlling the region proposal network model RPN to process the graphic features of the plants to be measured to obtain plant position information.
  • With the deep network model, information such as the sparseness of the plants, their positions, and their distribution can be obtained in combination with the image information of the planting area.
  • For example, if the image information of the planting area is geotagged at 23 degrees 06 minutes 32 seconds north latitude, 113 degrees 15 minutes 53 seconds east longitude, the geographic position of each plant is determined from the sampling location of the image and the relative position of the plant within the image (see the sketch after this list).
  • And/or, controlling the region proposal network model to process the graphic features of the plants to be measured to obtain plant quantity information; the plant quantity information may of course also be obtained from the plant position information.
  • And/or, controlling the region proposal network model RPN to process the plant graphic features to obtain plant species information.
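  • The geolocation step mentioned above can be sketched as follows, assuming a nadir-looking image with a known ground sampling distance and a locally flat earth; both assumptions, and the example numbers, are illustrative only.

```python
# Hedged sketch of the geolocation step: combine the image geotag with each
# plant's pixel position, assuming a nadir image with a known ground sampling
# distance (GSD) and a locally flat earth. The GSD and example numbers are
# illustrative assumptions only.
import math

def pixel_to_latlon(plant_px, image_center_px, image_latlon, gsd_m_per_px):
    """plant_px, image_center_px: (col, row); image_latlon: (lat, lon) of the image center."""
    east_m = (plant_px[0] - image_center_px[0]) * gsd_m_per_px
    north_m = (image_center_px[1] - plant_px[1]) * gsd_m_per_px
    lat0, lon0 = image_latlon
    dlat = north_m / 111320.0                                  # meters per degree latitude
    dlon = east_m / (111320.0 * math.cos(math.radians(lat0)))
    return lat0 + dlat, lon0 + dlon

# Example with the coordinates quoted above (23 deg 06 min 32 s N, 113 deg 15 min 53 s E)
# and the first plant center from Table 1, assuming a 5 cm/pixel GSD:
lat, lon = pixel_to_latlon((317, 101), (256, 256), (23.1089, 113.2647), 0.05)
```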
  • If the plant planting data needs to be output, then after the image information of the planting area has been processed with the deep network model to obtain the plant planting data, the plant planting data measurement method provided by the embodiments of the present application further includes:
  • outputting one or more of the plant edge information, plant position information, plant growth information, plant quantity information, and plant species information as the plant planting data.
  • By way of example, Table 1 lists the output plant edge information and plant position information (e.g., plant No. 1: center point (317, 101) px, radius 58.69 px).
  • As can be seen from the listed values, the plant planting data measured by the method provided in the embodiments of the present application includes the plant center point position and the plant radius; the plant radius is one representation of the plant edge information and the plant growth information,
  • and the plant center point position is one representation of the plant quantity information and the plant position information.
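  • For illustration, one way to reduce a detected instance mask to the Table 1 representation (center point and radius in pixels) is sketched below; defining the radius via an equal-area circle is an assumption, since the present application does not prescribe how the radius is computed.

```python
# Sketch of reducing one detected instance mask to the Table 1 representation
# (plant center point and radius in pixels). The equal-area-circle definition
# of the radius and the `predicted_masks` output are assumptions of this sketch.
import numpy as np

def mask_to_record(index, mask):
    """mask: boolean H x W array for one detected plant instance."""
    rows, cols = np.nonzero(mask)
    center = (float(cols.mean()), float(rows.mean()))     # centroid in pixels
    radius = float(np.sqrt(mask.sum() / np.pi))           # radius of the equal-area circle
    return {"id": index, "center_px": center, "radius_px": radius}

# `predicted_masks` is assumed to be the list of per-plant masks produced by
# the recognition model for one planting-area image.
records = [mask_to_record(i + 1, m) for i, m in enumerate(predicted_masks)]
```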
  • the embodiment of the present application further provides a work route planning system.
  • the work route planning system includes:
  • the plant planting data measuring device 100, configured to obtain plant planting data of the work area from the work area image used for work route planning, according to the plant planting data measurement method provided in the second embodiment; and
  • the planning module 400 is configured to plan a working route of the movable device in the working area according to the planting data of the working area.
  • the work route includes one or more of an arcuate route, a spiral route, a concentric circle route, a polygonal line route, and an in-situ rotation route, but is not limited thereto.
  • the work route planning system provided by the embodiment of the present application further includes:
  • the data marking module 200, configured to visually mark the plant planting data of the work area after the plant planting data of the work area is obtained and before the work route of the movable device in the work area is planned;
  • the selection control module 300, configured to obtain, according to a selection instruction sent by the user, the plant planting data of the plant area to be worked from the visually marked plant planting data of the work area;
  • the planning module 400 is then specifically configured to plan the work route of the movable device in the plant area to be worked according to the plant planting data of that area.
  • As one implementation, in relatively flat areas with large plants, if the plant planting data includes plant edge information and plant position information, the planning module 400 in the embodiments of the present application is specifically configured to: determine a work center point of each plant according to the
  • plant position information of the work area, and determine a working radius of each plant according to the plant edge information of the work area; and
  • generate a plant work route according to the work center point of each plant, the working radius of each plant, and the working width of the movable device, the plant work route being used to control the route of the movable device within the work area.
  • As another implementation, when the terrain is steep and terraced and the plants are small, if the plant planting data includes plant edge information and plant position information,
  • the planning module 400 in the embodiments of the present application is specifically configured to: determine at least one work center line according to the plant position information; determine the plant width corresponding to the at least one work center line according to the plant edge information, the width direction of each plant being perpendicular to the corresponding work center line; and generate a plant work route according to the working width of the movable device, each work center line, and the corresponding plant width, the plant work route being used to control the route of the movable device within the work area.
  • The embodiments of the present application provide a plant planting data measuring device 100, which can be used as the plant planting data measuring device 100 included in the work route planning system provided in the third embodiment; as shown in FIG. 6 and FIG. 10,
  • the plant planting data measuring device 100 comprises:
  • the receiving module 110 is configured to receive image information of the planting area
  • the processing module 120 is configured to process image information of the planting area by using a preset recognition model to obtain plant planting data;
  • the beneficial effects of the plant planting data measuring device provided by the embodiment of the present application are the same as those of the plant planting data measuring method provided in the second embodiment, and are not described herein.
  • the image information includes one or more of mapping image information, map information, and picture information, and may of course be other forms of image information, such as an infrared image, etc., which are not enumerated here.
  • the plant planting data measuring apparatus 100 provided by the embodiment of the present application further includes a storage module 130 for storing a deep network model;
  • the receiving module 110 is further configured to receive the planting region history information before receiving the image information of the planting region;
  • the planting region history information includes historical image information and historical plant calibration information corresponding to the historical image information;
  • the processing module 120 includes a feature extraction unit 121, configured to extract historical plant graphic features from the historical image information before the image information of the planting area is received, and, after the image information of the planting area is received, to extract the graphic
  • features of the plants to be measured from the image information of the planting area under the control of the deep network model; in practice, the historical plant graphic features may be extracted with a model having an image recognition function, such as a convolutional
  • neural network (CNN) model.
  • The information identifying unit 122 is configured to process the historical plant graphic features and the historical plant calibration information by deep learning to obtain the deep network model loss value before the image information of the planting area is received, and, after the image information of the planting area is received, to
  • process the graphic features of the plants to be measured under the control of the deep network model to obtain plant planting data.
  • The model optimization unit 123 is configured to optimize the deep network model loss value with the historical plant calibration information before the image information of the planting area is received (the optimization method may be back-propagation), so as to obtain the deep
  • network model, the deep network model including the optimized identification strategy for plant planting data.
  • the historical plant calibration information in the embodiment of the present application includes historical plant edge calibration information and/or historical plant species calibration information.
  • As shown in FIG. 7 and FIG. 11, the information identifying unit 122 includes a first identifying unit 122a, configured to, before the image information of the planting area is received, perform image segmentation on the historical plant graphic features with the feature pyramid network model FPN to obtain the plant history image segmentation results and, from these results and the historical plant edge calibration information, the historical plant image segmentation loss value Lmask; and, after the image information of the planting area is received, to
  • control, according to the deep network model, the feature pyramid network model FPN to perform image segmentation on the graphic features of the plants to be measured to obtain plant edge information, and/or to process those graphic features to obtain plant growth information.
  • The information identifying unit 122 further includes identifying units configured to: before the image information of the planting area is received, process the historical plant graphic features with the region proposal network model RPN to obtain the target plant region results and, from these results and the historical plant edge calibration information, the target plant region regression loss value Lbox; and,
  • after the image information of the planting area is received, process the graphic features of the plants to be measured with the region proposal network model RPN to obtain plant position information, and/or control, according to the deep network
  • model, the region proposal network model to process the graphic features of the plants to be measured to obtain plant quantity information;
  • and, before the image information of the planting area is received, process the historical plant graphic features with the region proposal network model RPN to obtain the target plant species information and, from this information and the historical plant species calibration information, the target plant species
  • regression loss value Lcls; and, after the image information of the planting area is received, process the plant graphic features with the region proposal network model RPN according to the deep network model to obtain plant species information.
  • An information calculation unit is configured to obtain the deep network model loss value from one or more of the historical plant image segmentation loss value Lmask, the target plant region regression loss value Lbox, and the target plant species regression loss value Lcls before the image information of the planting area is received.
  • In the embodiments of the present application, the plant planting data measuring device 100 further comprises a sending module 140, configured to output one or more of the plant edge information, plant position information, plant quantity information, plant growth information, and plant species information as the plant planting data.
  • The embodiments of the present application further provide a plant planting data measuring system; as shown in FIG. 10 and FIG. 12, the plant planting data measuring system includes an image acquisition device 001 and the plant planting data measuring device 100 provided by the above embodiments; the output end of the image acquisition device 001 is connected to the receiving module 110 included in the plant planting data measuring device 100.
  • The image acquisition device 001 can collect the historical image information included in the planting area history information, or the image information of the current planting area, and transmit it to the plant planting data measuring device 100, so that the plant planting data measuring device 100 can optimize the deep network model from the
  • historical image information included in the planting area history information together with the corresponding historical plant calibration information, and, once the image information of the current planting area is received, can quickly invoke the deep network model to complete the measurement of the planting data.
  • The beneficial effects of the plant planting data measuring device 100 provided by the embodiments of the present application are the same as those of the plant planting data measurement method of the above technical solution and are not repeated here.
  • The image acquisition device 001 is generally a surveying and mapping aircraft, although it may of course be another camera device.
  • Specifically, the surveying and mapping aircraft photographs the planting area from high altitude, and the plant planting data measuring device 100 can be set up on the ground in the form of a server, with the image information of the planting area collected by the aircraft transmitted to the ground server wirelessly.
  • The plant planting data measuring device 100 can also be disposed in the surveying and mapping aircraft, so that it can process the image information of the planting area collected by the aircraft in real time.
  • The surveying and mapping aircraft includes a mapping module and a flight control module;
  • in this case the plant planting data measuring device 100 provided in the embodiments of the present application is disposed in the flight control module, and the mapping module is connected to the receiving module 110.
  • the image collecting device in the embodiment of the present application includes at least a positioning unit and an image collecting unit; the positioning unit can be used to locate plant position information; the image collecting unit is configured to collect image information; and the positioning unit and the image collecting unit are respectively connected to the receiving module.
  • the receiving module is connected to the sending module.
  • The position information of the surveying and mapping aircraft may be used as a reference for determining the plant positions, so that the plant positions are determined indirectly from the aircraft's position information.
  • In summary, the image information of the planting area transmitted by the surveying and mapping device is processed with the preset recognition model to obtain the plant planting data, which simplifies the compilation of the planting data, improves its efficiency, and avoids the low efficiency and cumbersome procedure of manually compiling the planting data of a planting area.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Marketing (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Data Mining & Analysis (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Quality & Reliability (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Operations Research (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Molecular Biology (AREA)
  • Mining & Mineral Resources (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)

Abstract

A plant planting data measurement method, a work route planning method, a device, and a system, relating to the field of plant protection technology. The method comprises: training a deep network model with planting area history information (S210); receiving image information of a planting area (S220); processing the image information of the planting area with the deep network model to obtain plant planting data (S230); and outputting the plant planting data (S240). The work route planning method plans a work route using the plant planting data obtained by the above measurement method.

Description

植株种植数据测量方法、作业路线规划方法及装置、系统 技术领域
本申请涉及植保技术领域，具体而言，涉及一种植株种植数据测量方法、作业路线规划方法及装置、系统。
背景技术
在农业生产过程中,农产品产量长期受到人们的关注,其直接决定了农业管理人员的收入,因此,农业管理人员往往通过计算农产品产量,估算农产品的经济收益。
目前,影响农产品产量的因素主要包括种植数量、植株大小等植株种植数据,这些植株种植数据对于农业管理人员预测农产品的经济收益非常重要,经常采用人工统计的方式获得,这使得植株种植数据的统计效率比较低,尤其是对于一些种植面积较大的农作物,植株种植数据的统计将会变得十分繁杂。
发明内容
本申请实施例提供了一种植株种植数据测量方法、作业路线规划方法及装置、***,以简化植株种植数据的统计过程,从而提高植株种植数据的统计效率,使得路线规划更为准确。
为了实现上述目的,本申请提供如下技术方案:
第一方面,本申请提供了一种植株种植数据测量方法,该植株种植数据测量方法包括:
接收种植区域的图像信息;
利用预设识别模型对所述种植区域的图像信息进行处理,获得植株种植数据。
与现有技术相比,本申请提供的植株种植数据测量方法中,利用预设识别模型对测绘设备传送的种植区域的图像信息进行处理,获得植株种植数据,以简化了植株种植数据的统计过程,从而提高植株种植数据的统计效率,使得路线规划更为准确,避免了人工统计种植区域的植株种植数据所导致的统计效率低下,以及统计过程繁复的问题。
第二方面,本申请还提供了一种植株种植数据测量装置,该植株种植数据测量装置包括:
接收模块,用于接收种植区域的图像信息;
处理模块,用于利用预设识别模型对所述种植区域的图像信息进行处理,获得植株种植数据。
与现有技术相比,本申请提供的植株种植数据测量装置的有益效果与上述技术方案提供的植株种植数据测量装置的有益效果相同,在此不做赘述。
第三方面,本申请还提供了一种植株种植数据测量***,该植株种植数据测量***包括测绘设备以及上述技术方案所述的植株种植数据测量装置;所述测绘装置的输出端与所述植株种植数据测量装置所包括的接收模块连接。
与现有技术相比,本申请提供的植株种植数据测量***的有益效果与上述技术方案提供的植株种植数据测量方法的有益效果相同,在此不做赘述。
第四方面,本申请还提供了一种作业路线规划方法,该作业路线规划方法包括:
根据上述技术方案所述的植株种植数据测量方法,从用于作业路线规划的作业区域图像中获取作业区域的植株种植数据;
根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线。
与现有技术相比,本申请提供的作业路线规划方法的有益效果与上述技术方案提供的植株种植数据测量方法的方案相同,在此不做赘述。同时,利用所测量的植株种植数据规划作业路线,能够使得作业路线更加准确,提高工作效率,提高作业执行准确度。
第五方面,本申请还提供了一种作业路线规划***,该作业路线规划***包括:
上述技术方案所述的植株种植数据测量装置,用于根据上述技术方案提供的所述的植株种植数据测量方法,从所述作业路线规划的作业区域图像获取作业区域的植株种植数据;
规划模块,用于根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线。
与现有技术相比,本申请提供的作业路线规划装置的有益效果与上述技术方案提供的作业路线规划方法的有益效果相同,在此不做赘述。
附图说明
此处所说明的附图用来提供对本申请的进一步理解,构成本申请的一部分,本申请的示意性实施例及其说明用于解释本申请,并不构成对本申请的不当限定。在附图中:
图1为本申请实施例提供的作业路线规划方法的流程框图;
图2为本申请实施例中对植株种植数据进行选取操作的流程图;
图3为本申请实施例中规划可移动设备在作业区域的作业路线流程图一;
图4为本申请实施例中规划可移动设备在作业区域的作业路线流程图二;
图5为本申请实施例提供的作业路线规划***结构示意图;
图6为本申请实施例提供的植株种植数据测量方法流程图;
图7为本申请实施例中利用种植区域历史信息对深度网络模型进行训练学习的流程图;
图8为本申请实施例中深度网络模型的结构原理图;
图9为本申请实施例中利用深度网络模型对种植区域的图像信息进行处理的流程图;
图10为本申请实施例提供的植株种植数据测量装置结构框图;
图11为本申请实施例的一种深度网络模型的结构框图;
图12为本申请实施例提供的植株种植数据测量***的信息交互示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
针对人工统计植株种植数据,导致的植株种植数据统计效率低,尤其是对于一些种植面积较大的农作物,植株种植数据的统计将会变得十分繁杂的问题,本申请实施例提供了一种植株种植数据测量方法、作业路线规划方法及装置、***,以解决问题,其可用于图像可识别的作物种植数据的测量,且不受种植面积影响,方便快捷的获得植株种植数据,而且,本申请实施例所测量的植株种植数据不仅可以用于预测植株产量,而且还可以对植株的成长状况进行观察,并在需要对植株进行作业时,利用植株 种植数据规划作业路线。
实施例一
本申请实施例提供了一种作业路线规划方法,如图1所示,该作业路线规划方法包括:
步骤S110:采用植株种植数据测量方法从用于作业路线规划的作业区域图像中获取作业区域的植株种植数据;
步骤S130:根据作业区域的植株种植数据,规划可移动设备在作业区域的作业路线。
其中,植株种植数据测量方法是基于训练学习等图像识别技术测量植株种植数据;图像识别技术可以是深度网络学习技术或其他图像识别技术;例如:可利用简单的颜色识别,形状识别,图像分割,边缘识别算方法等从作业区域图像中分离出所需的植株种植数据。通过这些图像识别技术可以快速从作业区域图像中分离出所需的作业区域的植株种植数据,并按照作业区域的植株种植数据规划可移动设备在作业区域的作业路线,从而提高作业执行准确性,避免盲目作业的现象出现,提高测绘效率。
所述的可移动设备包括无人机、飞行器、拖拉机、耕种机、收割机和智能机器人等,所述的无人机可以用于在待作业区域喷洒农药、化肥、种子等,所述拖拉机、耕种机、收割机可以用于对农田进行耕种、收割等,所述智能机器人可以对待作业区域进行自动采摘、棉花打尖、激光除草除虫等。
需要说明的是,本申请实施例所规划的作业路线可以包括弓形路线、螺旋路线、同心圆路线、折线形路线、原地旋转路线的一种或多种。在本申请的其中一个实施例中,所述弓形路线是首先确定在待作业区域作业的若干条平行作业路线,并将这些作业路线首尾连接成连贯的作业路线;所述螺旋航线是从作业中心点逐渐向外旋转,类似于海螺一样向外螺旋的作业路线;所述同心圆路线包括若干个以作业中心为中心的同心圆路线;所述折线形路线用于将必要的作业点依次相连,而形成的折线形路线,在本申请一个实施例中,折线形路线可灵活设置成自由路线;在本申请一个实施例中,原地旋转路线用于表征作业机器原地旋转,例如无人机对果树喷洒药液时,可通过原地旋转的方式进行均匀喷洒。
为了使得本申请实施例提供的作业规划方法人性化,获取作业区域的植株种植数据后,规划可移动设备在作业区域的作业路线前,本申请实施例提供的作业路线规划 方法还包括:如图2所示的步骤S120:对植株种植数据进行选取操作,具体包括如下步骤:
步骤S121:将作业区域的植株种植数据进行可视化标记,以使得用户能够方便和快速的法线植株种植数据;
步骤S122:根据用户的选择指令,从可视化标记的作业区域的植株种植数据中获取待作业植株区域的植株种植数据;
应用时,可以将可视化标记的作业区域的植株种植数据显示在手机、平板电脑等终端设备上,用户在终端设备上看到可视化标记的作业区域的植株种植数据,根据实际需要输出选择指令,终端设备根据用户的选择指令,从可视化标记的作业区域的植株种植数据中获取待作业植株区域的植株种植数据;也就是说,本申请实施例中选取操作的功能,可以集成在终端设备上,当然也可以与规划线路的方法所具有的功能集成在处理器上。
其中,根据作业区域的植株种植数据,规划可移动设备在作业区域的作业路线包括:
根据待作业植株区域的植株种植数据,规划可移动设备在作业植株区域的作业路线。
示例性的,如果作业区域的地形地貌比较平坦,且植株半径较大的区域,可对单个植株的作业路径进行规划。例如:当植株种植数据包括植株边缘信息和植株位置信息,如图3所示,上述根据作业区域的植株种植数据,规划可移动设备在作业区域的作业路线包括:
步骤S131a:根据作业区域的植株位置信息,确定各个植株的作业中心点;根据作业区域的植株边缘信息,确定各个植株的作业半径;每个植株的作业中心点实质上为该植株的几何中心位置;
步骤S132a:根据各个植株的作业中心点、各个植株的作业半径,以及所述可移动设备的作业宽度生成植株作业路线,植株作业路线用于控制可移动设备在作业区域内的作业路线。
具体的,根据每个植株的作业半径和作业中心点,以及可移动设备的作业宽度,生成每个植株的作业路径,换句话说,所生成的植株作业路线实质上是由每棵植株的作业路径组合而成的。
而如果作业区域的地形地貌呈现阶梯状,尤其是作业区域内的单个植株尺寸较小,对于单个植株的作业方式进行限定就不切实际,可考虑逐级对阶梯状区域进行作业路线规划;基于此,当植株种植数据包括植株边缘信息和植株位置信息,如图4所示,上述根据作业区域的植株种植数据,规划可移动设备在作业区域的作业路线包括:
步骤S131b:根据植株位置信息确定至少一条作业中心线;根据植株边缘信息确定至少一条作业中心线对应的植株宽度;具体地,每个植株的宽度方向所在直线垂直于对应作业中心线;
具体的,此处的作业中心线是指每一级阶梯区域的所有植株的作业中心点所形成的连线,可以为曲线、直线或波浪线;每条作业中心线都对应有植株宽度,植株宽度可以为该作业中心线对应的所有植株的平均宽度,也可以是该作业中心线对应的所有植株中宽度最大的一个植株的宽度。
步骤S132b:根据可移动设备的作业宽度、每条作业中心线以及对应植株宽度,生成植株作业路线,植株作业路线用于控制所述可移动设备在所述作业区域内的作业路线。具体的,在本申请一个实施例中,植株作业路线实质上是各级阶梯区域上植株作业路径的总和。
需要说明的是,本申请实施例中的“作业”可以是对作业区域的植株喷洒杀虫药剂、营养药剂或执行监测任务等操作,在此不一一列举。而对于“作业”的最终目的,则应当以对每个植株的各个部分完成作业为最佳目标,至于如何达到最佳目标,可通过可移动设备按照所生成的植株作业路线进行反复作业进行检测,当然也不排除其他可实现的检测方式。
实施例二
本申请实施例还提供了一种植株种植数据测量方法,其可以用于上述作业路线规划获取作业区域的植株种植数据,当然也可以用于分析植株生长状况,如图5和图6所示,该植株种植数据测量方法包括以下步骤:
步骤S220:接收种植区域的图像信息;种植区域的图像信息可以由测绘设备、照相机等图像采集设备提供,图像信息包括测绘图像信息、地图信息、图片信息中的一种或多种,不仅限于此。
步骤S230:利用预设识别模型对种植区域的图像信息进行处理,获得植株种植数据。
可以理解的是,本申请实施例提供的植株种植数据测量方法还可以根据实际需要输出,若需要输出时,还包括步骤S240:根据实际需要将植株种植数据输出,以供进行进一步的应用。
由上可见,本申请实施例提供的植株种植数据测量方法中,利用预设识别模型对种植区域的图像信息进行处理,获得植株种植数据,以简化了植株种植数据的统计过程,从而提高植株种植数据的统计效率,使得路线规划更为准确,避免了人工统计种植区域的植株种植数据所导致的统计效率低下,以及统计过程繁复的问题。
可以理解的是,本申请实施例中利用预设识别模型对种植区域的图像信息进行处理时,预设识别模型是具有图像识别功能的模型,也可以是具有色彩识别的一些算法,只要能够从图像中识别出植株数据即可。
而为了使得所识别出的结果比较准确,本申请实施例中预设识别模型为深度网络模型,但是深度网络模型需要进行训练学习,才能对种植区域的图像信息进行高效识别。也就是说,如图5至图7所示,接收种植区域的图像信息前,本申请实施例提供的植株种植数据测量方法包括:
步骤S210:利用种植区域历史信息对深度网络模型进行训练学习,其具体包括以下步骤;
步骤S211:接收种植区域历史信息;种类区域历史信息包括历史图像信息以及历史图像信息对应的历史植株标定信息;
步骤S212:从历史图像信息提取历史植株图形特征,具体可以通过如图3所示的卷积神经网络模型CNN实现,当然也可以通过其他图像识别模型识别。
步骤S213:采用深度学习方式对历史植株图形特征和历史植株标定信息进行处理,获得深度网络模型损失值;
步骤S214:利用历史植株标定信息对深度网络模型损失值进行优化,获得深度网络模型,使得利用深度网络模型对种植区域的图像信息进行处理的结果比较准确,深度网络模型包括优化后的植株种植数据的识别策略;
具体的,优化方法可采用反向传递优化方法实现,当然也可以选择其他优化方法。
为了方便后续应用深度网络模型进行图像识别,如图6所示,本申请实施例提供的植株种植数据测量方法还包括步骤S250:保存深度网络模型,深度网络模型可保存在具有存储功能的存储模块130或存储器内。
示例性的,本申请实施例提供的历史植株标定信息包括历史植株边缘标定信息和/或历史植株种类标定信息。如图8所示,此时采用深度学习方式对历史植株图形特征和历史植株标定信息进行处理,获得深度网络模型损失值包括:
采用特征金字塔模型FPN对历史植株图形特征进行图像分割,获得植株历史图像分割结果;植株历史图像分割结果实质是一系列植株边缘轨迹点预测信息,这些植株边缘轨迹点预测信息所形成的植株边缘为不规则的边缘。
特定金字塔模型(FPN)是基于一个特征提取网络的,它可以是常见的ResNet或者DenseNet之类的网络,可以用在目标检测、实例分割、姿态识别、面部识别等应用中。在常用的深度学习框架下取一个预训练模型,就可以用来实现FPN。
图像里的目标尺寸大小各种各样,数据集里的物体不可能涵盖所有的尺度,所以需要利用图像金字塔(不同分辨率的下采样)来帮助CNN学习。但这样的速度太慢,所以只能使用单一尺度来预测,也有人会取中间结果来预测。这种架构在有辅助信息和辅助损失函数时被大量使用。
FPN用一种很巧妙的方法提高了上述方法,除了侧向的连接,还加入了自上而下地连接,把从上到下的结果和侧向得到的结果通过相加办法融合到一起,可以得到不同分辨率的特征图,而他们都包含了原来最深层特征图的语义信息。
为了验证植株历史图像分割结果与实际的植株边缘标定信息之间的误差,还包括:根据植株历史图像分割结果和历史植株边缘标定信息,获得历史植株图像分割损失值Lmask;
和/或,
采用区域推荐网络模型RPN对历史植株图形特征进行处理,获得目标植株区域结果;目标植株区域结果实质是一些列植株边缘轨迹点预测信息,但与植株历史图像分割结果不同之处在于:这些植株边缘轨迹点预测信息所形成的植株边缘为规则的圆形,以进一步利用这些植株边缘轨迹点预测信息获得植株位置预测信息;
区域推荐网络模型(RPN)是一种完全卷积网络,可以为特定的任务进行端到端的训练来产生推荐。RPN网络的输入是一张图像,输出分为两支,一支是目标和非目标的概率,另一支是bbox的四个参数,分别是bbox的中心坐标x,y以及bbox的宽w和长h。为了得到输出的区域建议,用一个小网络在5个卷积层最后一个卷积层特征图上滑动,每个滑动窗口都映射到一个维度更低的特征,最后是形成两个并列的全连 接层,一个box回归层和一个box分类层。
为了根据验证目标植株区域结果所获得的植株位置预测信息与实际的植株边缘标定信息之间的误差,还包括:根据目标植株区域结果和历史植株边缘标定信息,获得目标植株区域回归损失值Lbox;
和/或,
采用区域推荐网络模型RPN对历史植株图形特征进行处理,获得目标植株种类信息;根据目标植株种类信息和历史植株种类标定信息,获得目标植株种类回归损失值Lcls;
根据历史植株图像分割损失值Lmask、目标植株区域回归损失值Lbox、目标植株种类回归损失值Lcls中的一种或多种,获得深度网络模型损失值。
其中，历史植株图像分割损失值Lmask、目标植株区域回归损失值Lbox、目标植株种类回归损失值Lcls实质上均为标量，根据其中的一种或多种获得深度网络模型损失值的过程，相当于将这一种或多种损失值加和；此时利用所述历史植株标定信息对深度网络模型损失值进行优化，能够使得植株种植数据的识别策略更加准确。
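下面给出“将Lmask、Lbox、Lcls中的一种或多种加和得到深度网络模型损失值”的示意实现（Python）。其中加权系数为本示例引入的假设超参数，原文仅描述为加和：

```python
def total_loss(l_mask=None, l_box=None, l_cls=None, weights=(1.0, 1.0, 1.0)):
    """对可用的标量损失加权求和，缺省项（None）不参与加和。"""
    terms = (l_mask, l_box, l_cls)
    return sum(w * t for w, t in zip(weights, terms) if t is not None)

print(total_loss(l_mask=0.42, l_box=0.18, l_cls=0.07))  # 三项都存在时，约为0.67
print(total_loss(l_mask=0.42, l_box=0.18))              # 仅有两项时，约为0.60
```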
可选地,如图8和图9所示,利用深度网络模型对种植区域的图像信息进行处理,获得植株种植数据包括:
步骤S231:根据深度网络模型控制从种植区域的图像信息提取待测植株图形特征;
步骤S232:根据深度网络模型控制对待测植株图形特征进行处理,获得植株种植数据;
其中,根据深度网络模型控制对待测植株图形特征进行处理,获得植株种植数据包括:
根据深度网络模型,控制特征金字塔模型FPN对待测植株图形特征进行图像分割,获得植株边缘信息;
所述植株边缘信息包括植株树冠形状、植株大小等。
和/或,
根据深度网络模型，控制特征金字塔模型对待测植株图形特征进行处理，获得植株生长信息；也可以是根据植株边缘信息确定植株生长状况。
所述的植株生长信息用于表征植株的生长状况，包括植株是否发芽、植株发芽率、植株开花情况、植株授粉情况、植株遭受虫草害情况以及植株成熟情况等。在一个实施例中，通过标定所述种植区域的历史图像信息中的虫害、草害等受害图片，深度识别模型便可以自动将植株病虫害识别出来，从而用于监测植株健康情况，生成植株生长信息。
和/或,
根据深度网络模型,控制区域推荐网络模型RPN对待测植株图形特征进行处理,获得植株位置信息;
根据所述深度网络模型，结合种植区域的图像信息即可获得植株的稀疏程度、植株的位置、植株的分布等信息。例如种植区域的图像信息采集于地理坐标北纬23度06分32秒、东经113度15分53秒，则可根据图像采样地理位置及图像中植株的相对位置确定植株的地理位置（换算的示意代码见下文）。
和/或,
根据深度网络模型,控制区域推荐网络模型对所述待测植株图形特征进行处理,获得植株数量信息;当然植株数量信息也可以通过植株位置信息获得。
和/或,
根据深度网络模型，控制区域推荐网络模型RPN对待测植株图形特征进行处理，获得植株种类信息。
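针对前文“根据图像采样地理位置及图像中植株的相对位置确定植株地理位置”的说明，下面给出一个基于局部平面近似的换算草图（Python）。其中地面采样距离 gsd、像素偏移的方向约定以及把度分秒换算成十进制度数，均为本示例的假设：

```python
import math

EARTH_RADIUS = 6378137.0  # 米，WGS84长半轴

def plant_geolocation(img_lat, img_lon, dx_pix, dy_pix, gsd):
    """根据图像采样地理位置与植株在图像中的相对位置估算植株地理坐标（局部平面近似）。

    img_lat/img_lon: 图像中心的纬度/经度（度）；
    dx_pix/dy_pix: 植株中心相对图像中心的像素偏移（向东/向北为正）；
    gsd: 假设的地面采样距离（米/像素）。
    """
    d_north = dy_pix * gsd
    d_east = dx_pix * gsd
    lat = img_lat + math.degrees(d_north / EARTH_RADIUS)
    lon = img_lon + math.degrees(d_east / (EARTH_RADIUS * math.cos(math.radians(img_lat))))
    return lat, lon

# 例：图像中心约位于北纬23.1089度、东经113.2647度（即北纬23度06分32秒、东经113度15分53秒），
# 某植株相对图像中心偏移(120, -80)像素，假设gsd为0.05米/像素
print(plant_geolocation(23.1089, 113.2647, 120, -80, 0.05))
```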
如果需要输出植株种植数据,则利用深度网络模型对种植区域的图像信息进行处理,获得植株种植数据后,本申请实施例提供的植株种植数据测量方法还包括:
将植株边缘信息、植株位置信息、植株生长信息、植株数量信息、植株种类信息中的一个或多个作为植株种植数据输出。示例性的,表1列出了所输出的植株边缘信息和植株位置信息。
表1 本申请实施例提供的植株种植数据测量方法所测量的植株种植数据
编号 植株中心点位置/pix 植株半径/pix
1 (317,101) 58.69
2 (472,379) 65.0
3 (486,189) 48.373
4 (326,34) 52.49
5 (364,256) 53.03
6 (214,318) 56.58
7 (20,116) 37.73
8 (388,130) 57.3
9 (185,397) 86.97
10 (139,238) 72.83
11 (433,56) 67.89
12 (109,482) 60.03
13 (381,402) 69.31
14 (106,353) 68.59
15 (296,421) 78.51
16 (299,355) 57.28
17 (32,20) 35.22
从表1所列出的植株种植数据可以发现,本申请实施例提供的植株种植数据测量方法所列出的植株种植数据包括植株中心点位置和植株半径,植株半径是植株边缘信息和植株生长信息的一种表现形式,植株中心点位置是植株数量信息和植株位置信息的一种表现形式。
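作为示意，下面给出把表1中以像素为单位的植株中心点与植株半径换算为米制作业参数的一个草图（Python），以便作为后续作业路线规划的输入。其中地面采样距离 gsd（米/像素）为本示例假设的已知量：

```python
def table_row_to_work_params(center_pix, radius_pix, gsd=0.05):
    """将像素坐标/像素半径按假设的地面采样距离 gsd 换算为米制的作业中心点与作业半径。"""
    cx, cy = center_pix
    return {"work_center_m": (cx * gsd, cy * gsd), "work_radius_m": radius_pix * gsd}

# 例：表1中编号1的植株，中心点(317, 101)像素、半径58.69像素，换算后约为(15.85, 5.05)米与2.93米
print(table_row_to_work_params((317, 101), 58.69))
```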
实施例三
本申请实施例还提供了一种作业路线规划系统，如图1和图5所示，该作业路线规划系统包括：
植株种植数据测量装置100,用于根据上述实施例二提供的植株种植数据测量方法,从所述作业路线规划的作业区域图像获取作业区域的植株种植数据;
规划模块400,用于根据作业区域的植株种植数据,规划可移动设备在作业区域的作业路线。
与现有技术相比，本申请实施例提供的作业路线规划系统的有益效果与上述实施例一提供的作业路线规划方法的有益效果相同，在此不做赘述。
其中,作业路线包括弓形路线、螺旋路线、同心圆路线、折线形路线、原地旋转路线的一种或多种,但不仅限于此。
可选地，如图2和图5所示，本申请实施例提供的作业路线规划系统还包括：
数据标记模块200,用于获取作业区域的植株种植数据后,规划可移动设备在作业区域的作业路线前,将所述作业区域的植株种植数据进行可视化标记;
选取控制模块300，用于根据用户发送的选取指令，从可视化标记的作业区域的植株种植数据中获取待作业植株区域的植株种植数据；
规划模块400具体用于根据所述待作业植株区域的植株种植数据，规划可移动设备在待作业植株区域的作业路线。
作为一种可实现的方式，在地势较为平坦且植株较大的区域，若植株种植数据包括植株边缘信息和植株位置信息，本申请实施例中规划模块400具体用于：根据作业区域的植株位置信息，确定各个植株的作业中心点；根据作业区域的植株边缘信息，确定各个植株的作业半径；
根据各个植株的作业中心点、各个植株的作业半径,以及可移动设备的作业宽度生成植株作业路线,植株作业路线用于控制可移动设备在所述作业区域内的作业路线。
作为另一种可实现的方式，在地势呈较为陡峭的阶梯状且植株较小的区域，若植株种植数据包括植株边缘信息和植株位置信息，本申请实施例中规划模块400具体用于：根据所述植株位置信息确定至少一条作业中心线；根据植株边缘信息确定至少一条作业中心线对应的植株宽度；每个植株的宽度方向所在直线垂直于对应作业中心线；根据所述可移动设备的作业宽度、每条所述作业中心线以及对应植株宽度，生成植株作业路线，所述植株作业路线用于控制所述可移动设备在所述作业区域内的作业路线。
实施例四
本申请实施例提供了一种植株种植数据测量装置100,可作为上述实施例三提供的作业路线规划***所包括的植株种植数据测量装置100;如图6和图10所示,该植株种植数据测量装置100包括:
接收模块110,用于接收种植区域的图像信息;
处理模块120,用于利用预设识别模型对种植区域的图像信息进行处理,获得植株种植数据;
与现有技术相比,本申请实施例提供的植株种植数据测量装置的有益效果与上述实施例二提供的植株种植数据测量方法的有益效果相同,在此不做赘述。
其中,上述图像信息包括测绘图像信息、地图信息、图片信息中的一种或多种,当然还可以是其他形式的图像信息,如红外图等,在此不一一列举。
具体的,如图7和图10所示,当预设识别模型为深度网络模型,本申请实施例提供的植株种植数据测量装置100还包括存储模块130,用于存储深度网络模型;
上述接收模块110还用于接收种植区域的图像信息前，接收种植区域历史信息；种植区域历史信息包括历史图像信息以及历史图像信息对应的历史植株标定信息；
如图7至图11所示，处理模块120包括特征提取单元121，用于接收种植区域的图像信息前，从历史图像信息提取历史植株图形特征；以及接收种植区域的图像信息后，根据深度网络模型，控制从种植区域的图像信息提取待测植株图形特征；具体而言，特征提取单元121的作用实质是从历史图像信息提取历史植株图形特征，具体可以采用卷积神经网络模型CNN等具有图像识别功能的模型提取历史植株图形特征。
信息识别单元122,用于接收种植区域的图像信息前,采用深度学习方式对历史植株图形特征和历史植株标定信息进行处理,获得深度网络模型损失值;以及接收种植区域的图像信息后,根据深度网络模型,控制对待测植株图形特征进行处理,获得植株种植数据。
模型优化单元123，用于接收种植区域的图像信息前，利用历史植株标定信息对深度网络模型损失值进行优化（优化方法可以为反向传递优化方法），获得深度网络模型，深度网络模型包括优化后的植株种植数据的识别策略。
可选地,本申请实施例中历史植株标定信息包括历史植株边缘标定信息和/或历史植株种类标定信息。
如图7和图11所示,信息识别单元122包括第一识别单元122a,用于接收种植区域的图像信息前,采用特征金字塔模型FPN对历史植株图形特征进行图像分割,获得植株历史图像分割结果;根据植株历史图像分割结果和所述历史植株边缘标定信息,获得历史植株图像分割损失值Lmask;以及,
接收种植区域的图像信息后,根据深度网络模型,采用特征金字塔模型FPN对所述待测植株图形特征进行图像分割,获得植株边缘信息;和/或,根据深度网络模型,控制特征金字塔模型对所述待测植株图形特征进行处理,获得植株生长信息;
第二识别单元122b用于接收种植区域的图像信息前,采用区域推荐网络模型RPN对历史植株图形特征进行处理,获得目标植株区域结果;根据所述目标植株区域结果和所述历史植株边缘标定信息,获得目标植株区域回归损失值Lbox;以及,
以及，接收种植区域的图像信息后，根据深度网络模型，采用区域推荐网络模型RPN对所述待测植株图形特征进行处理，获得植株位置信息；和/或，根据深度网络模型，控制区域推荐网络模型对所述待测植株图形特征进行处理，获得植株数量信息；
和/或,
接收种植区域的图像信息前，采用区域推荐网络模型RPN对所述历史植株图形特征进行处理，获得目标植株种类信息；根据所述目标植株种类信息和所述历史植株种类标定信息，获得目标植株种类回归损失值Lcls；以及接收种植区域的图像信息后，根据深度网络模型，采用区域推荐网络模型RPN对所述待测植株图形特征进行处理，获得植株种类信息；
信息计算单元,用于接收种植区域的图像信息前,根据历史植株图像分割损失值Lmask、目标植株区域回归损失值Lbox、目标植株种类回归损失值Lcls中的一种或多种,获得深度网络模型损失值;
如果需要输出植株种植数据，如图10所示，本申请实施例提供的植株种植数据测量装置100还包括发送模块140，用于将植株边缘信息、植株位置信息、植株数量信息、植株生长信息、植株种类信息中的一个或多个作为植株种植数据输出。
实施例五
本申请实施例还提供了一种植株种植数据测量系统，如图10和图12所示，该植株种植数据测量系统包括图像采集设备001以及上述实施例提供的植株种植数据测量装置100；图像采集设备001的输出端与植株种植数据测量装置100所包括的接收模块110连接。
图像采集设备001可以采集种植区域历史信息所包括的历史图像信息或当前种植区域的图像信息，并传递给植株种植数据测量装置100，使得植株种植数据测量装置100能够根据种植区域历史信息所包括的历史图像信息，结合种植区域历史信息所包括的历史植株标定信息，实现深度网络模型的优化；以便在接收当前种植区域的图像信息后，能够快速调用深度网络模型完成植株种植数据的测量。
与现有技术相比，本申请实施例提供的植株种植数据测量系统的有益效果与上述技术方案提供的植株种植数据测量方法的有益效果相同，在此不做赘述。
具体的,如图12所示,本申请实施例提供的植株种植数据测量***中,图像采集设备001一般为测绘飞行器,当然也可以是其他摄像机等拍摄设备。
如图6和图9所示，测绘飞行器在高空对种植区域进行摄影，而植株种植数据测量装置100可以以服务器的形式设在地面，测绘飞行器所采集的种植区域的图像信息通过无线传输的方式传送给服务器。
当然,也可以将植株种植数据测量装置100设在测绘飞行器中,使得植株种植数据测量装置100能够实时处理测绘飞行器所采集的种植区域的图像信息。当植株种植数据测量装置100设在测绘飞行器时,测绘飞行器包括测绘模块和飞行控制模块,本申请实施例提供的植株种植数据测量装置100设在飞行控制模块中,且测绘模块与接收模块110连接。
可选地，本申请实施例中图像采集设备至少包括定位单元和图像采集单元；定位单元可用于定位植株位置信息；图像采集单元用于采集图像信息；定位单元与图像采集单元分别与接收模块连接，接收模块与发送模块连接。这样当图像采集设备采集种植区域的图像信息时，每采集一个植株的图像信息，就可以对该植株进行定位，使得该植株的位置在图像采集阶段即可获得，后期只要对所采集的图像信息进行边缘轮廓的分析即可；在其他实施例中，所述测绘飞行器的位置可以作为确定植株位置的参考，从而通过测绘飞行器的位置信息间接确定出植株的位置。当然为了更为精确地定位植株的中心位置，有必要对植株的中心点位置进行分析。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。
工业实用性
本申请实施例提供的方案,可应用于植保技术领域。在本申请实施例中,通过利用预设识别模型对测绘设备传送的种植区域的图像信息进行处理,获得植株种植数据,简化了植株种植数据的统计过程,从而提高植株种植数据的统计效率,避免了人工统计种植区域的植株种植数据所导致的统计效率低下,以及统计过程繁复的问题。

Claims (23)

  1. 一种植株种植数据测量方法,包括:
    接收种植区域的图像信息;
    利用预设识别模型对所述种植区域的图像信息进行处理,获得植株种植数据。
  2. 根据权利要求1所述的植株种植数据测量方法,其中,所述图像信息包括测绘图像信息、地图信息、图片信息中的一种或多种。
  3. 根据权利要求1所述的植株种植数据测量方法，其中，所述预设识别模型为深度网络模型，在接收种植区域的图像信息之前，所述植株种植数据测量方法还包括：
    接收种植区域历史信息,所述种植区域历史信息包括历史图像信息以及所述历史图像信息对应的历史植株标定信息;
    从所述历史图像信息提取历史植株图形特征;
    采用深度学习方式对所述历史植株图形特征和所述历史植株标定信息进行处理,获得深度网络模型损失值;
    利用所述历史植株标定信息对所述深度网络模型损失值进行优化,获得深度网络模型,所述深度网络模型包括优化后的植株种植数据的识别策略;
    保存所述深度网络模型。
  4. 根据权利要求3所述的植株种植数据测量方法,其中,
    所述采用深度学习方式对历史植株图形特征和历史植株标定信息进行处理,获得深度网络模型损失值包括:
    在所述历史植株标定信息包括历史植株边缘标定信息时,采用特征金字塔模型对所述历史植株图形特征进行图像分割,获得植株历史图像分割结果;根据所述植株历史图像分割结果和所述历史植株边缘标定信息,获得历史植株图像分割损失值;
    在所述历史植株标定信息包括历史植株边缘标定信息时,采用区域推荐网络模型对所述历史植株图形特征进行处理,获得目标植株区域结果;根据所述目标植株区域结果和所述历史植株边缘标定信息,获得目标植株区域回归损失值;
    在所述历史植株标定信息包括历史植株种类标定信息时,采用区域推荐网络模型对所述历史植株图形特征进行处理,获得目标植株种类信息;根据所述目标植株种类信息和所述历史植株种类标定信息,获得目标植株种类回归损失值;
    根据所述历史植株图像分割损失值、所述目标植株区域回归损失值、所述目标植株种类回归损失值中的一种或多种,获得深度网络模型损失值。
  5. 根据权利要求1至4任一项所述的植株种植数据测量方法,其中,所述利用预设识别模型对所述种植区域的图像信息进行处理,获得植株种植数据包括:
    根据所述预设识别模型控制从所述种植区域的图像信息提取待测植株图形特征;
    根据所述预设识别模型控制对所述待测植株图形特征进行处理,获得植株种植数据。
  6. 根据权利要求1至4任一项所述的植株种植数据测量方法,其中,所述根据所述预设识别模型控制对待测植株图形特征进行处理,获得植株种植数据包括以下至少之一:
    根据所述深度网络模型,控制特征金字塔模型对所述待测植株图形特征进行图像分割,获得植株边缘信息;
    根据所述深度网络模型,控制区域推荐网络模型对所述待测植株图形特征进行处理,获得植株位置信息;
    根据所述深度网络模型,控制区域推荐网络模型对所述待测植株图形特征进行处理,获得植株数量信息;
    根据所述深度网络模型,控制特征金字塔模型对所述待测植株图形特征进行处理,获得植株生长信息;
    根据所述深度网络模型，控制区域推荐网络模型对所述待测植株图形特征进行处理，获得植株种类信息。
  7. 一种植株种植数据测量装置,包括:
    接收模块,设置为接收种植区域的图像信息;
    处理模块,设置为利用预设识别模型对所述种植区域的图像信息进行处理,获得植株种植数据。
  8. 根据权利要求7所述的植株种植数据测量装置,其中,所述图像信息包括测绘图像信息、地图信息、图片信息中的一种或多种。
  9. 根据权利要求7所述的植株种植数据测量装置,其中,所述预设识别模型为深度网络模型,所述植株种植数据测量装置还包括存储模块,设置为存储深度网络模型;
    所述接收模块还设置为接收种植区域的图像信息前，接收种植区域历史信息；所述种植区域历史信息包括历史图像信息以及所述历史图像信息对应的历史植株标定信息；
    所述处理模块包括特征提取单元,设置为接收种植区域的图像信息前,从所述历史图像信息提取历史植株图形特征;以及接收种植区域的图像信息后,根据所述深度网络模型,控制从所述种植区域的图像信息提取待测植株图形特征;
    信息识别单元,设置为接收种植区域的图像信息前,采用深度学习方式对历史植株图形特征和历史植株标定信息进行处理,获得深度网络模型损失值;以及接收种植区域的图像信息后,根据所述深度网络模型,控制对所述待测植株图形特征进行处理,获得植株种植数据;
    模型优化单元,设置为接收种植区域的图像信息前,利用所述历史植株标定信息对所述深度网络模型损失值进行优化,获得深度网络模型,所述深度网络模型包括优化后的植株种植数据的识别策略。
  10. 根据权利要求9所述的植株种植数据测量装置,其中,
    所述信息识别单元包括第一识别单元,设置为在接收种植区域的图像信息前,在所述历史植株标定信息包括历史植株边缘标定信息时,采用特征金字塔模型对所述历史植株图形特征进行图像分割,获得植株历史图像分割结果;根据所述植株历史图像分割结果和所述历史植株边缘标定信息,获得历史植株图像分割损失值;以及,
    在接收种植区域的图像信息后,根据深度网络模型,采用特征金字塔模型对所述待测植株图形特征进行图像分割,获得植株边缘信息;根据所述深度网络模型,控制特征金字塔模型对所述待测植株图形特征进行处理,获得植株生长信息;
    第二识别单元设置为接收种植区域的图像信息前，在所述历史植株标定信息包括历史植株边缘标定信息时，采用区域推荐网络模型对所述历史植株图形特征进行处理，获得目标植株区域结果；根据所述目标植株区域结果和所述历史植株边缘标定信息，获得目标植株区域回归损失值；以及，
    在接收种植区域的图像信息后,根据所述深度网络模型,采用区域推荐网络模型对所述待测植株图形特征进行处理,获得植株位置信息;根据所述深度网络模型,控制区域推荐网络模型对所述待测植株图形特征进行处理,获得植株数量信息;
    接收种植区域的图像信息前，在所述历史植株标定信息包括历史植株种类标定信息时，采用区域推荐网络模型对所述历史植株图形特征进行处理，获得目标植株种类信息；根据所述目标植株种类信息和所述历史植株种类标定信息，获得目标植株种类回归损失值；以及接收测绘设备传送的种植区域的图像信息后，根据深度网络模型，采用区域推荐网络模型对所述待测植株图形特征进行处理，获得植株种类信息；
    信息计算单元,设置为接收种植区域的图像信息前,根据所述历史植株图像分割损失值、所述目标植株区域回归损失值、所述目标植株种类回归损失值中的一种或多种,获得深度网络模型损失值;
    所述植株种植数据测量装置还包括发送模块,设置为将所述植株边缘信息、所述植株位置信息、所述植株种类信息、植株数量信息、植株生长信息中的一个或多个作为植株种植数据输出。
  11. 一种植株种植数据测量系统，包括图像采集设备以及权利要求7至10任一项所述的植株种植数据测量装置；所述图像采集设备的输出端与所述植株种植数据测量装置所包括的接收模块连接。
  12. 根据权利要求11所述的植株种植数据测量系统，其中，若所述图像采集设备为测绘飞行器，所述测绘飞行器包括测绘模块和飞行控制模块，所述植株种植数据测量装置设在所述飞行控制模块中，所述测绘模块与所述接收模块连接。
  13. 根据权利要求11所述的植株种植数据测量系统，其中，所述图像采集设备至少包括定位单元和图像采集单元；所述定位单元设置为定位植株位置信息；所述图像采集单元设置为采集图像信息；所述定位单元与所述图像采集单元分别与所述接收模块连接。
  14. 一种作业路线规划方法,包括:
    根据权利要求1至6任一项所述的植株种植数据测量方法,从用于作业路线规划的作业区域图像中获取作业区域的植株种植数据;
    根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线。
  15. 根据权利要求14所述的作业路线规划方法,其中,当所述植株种植数据包括植株边缘信息和植株位置信息,所述根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线包括:
    根据所述作业区域的植株位置信息,确定各个植株的作业中心点;
    根据所述作业区域的植株边缘信息,确定各个植株的作业半径;
    根据各个植株的作业中心点、各个植株的作业半径,以及所述可移动设备的作业宽度生成植株作业路线,所述植株作业路线用于控制所述可移动设备在所述作业区域内的作业路线。
  16. 根据权利要求14所述的作业路线规划方法,其中,当所述植株种植数据包括植株边缘信息和植株位置信息,所述根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线包括:
    根据所述植株位置信息确定至少一条作业中心线;
    根据植株边缘信息确定至少一条作业中心线对应的植株宽度;
    根据所述可移动设备的作业宽度、每条所述作业中心线以及对应植株宽度,生成植株作业路线,所述植株作业路线用于控制所述可移动设备在所述作业区域内的作业路线。
  17. 根据权利要求14所述的作业路线规划方法,其中,所述作业路线包括弓形路线、螺旋路线、同心圆路线、折线形路线、原地旋转路线的一种或多种。
  18. 根据权利要求14所述的作业路线规划方法,其中,所述获取作业区域的植株种植数据后,规划可移动设备在作业区域的作业路线前,所述作业路线规划方法还包括:
    将所述作业区域的植株种植数据进行可视化标记;
    根据用户的选择指令,从可视化标记的所述作业区域的植株种植数据中获取待作业植株区域的植株种植数据;
    所述根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线包括:
    根据所述待作业植株区域的植株种植数据，规划可移动设备在待作业植株区域的作业路线。
  19. 一种作业路线规划系统，包括：
    权利要求7至10任一项所述的植株种植数据测量装置,设置为根据权利要求1至6任一项所述的植株种植数据测量方法,从所述作业路线规划的作业区域图像获取作业区域的植株种植数据;
    规划模块,设置为根据所述作业区域的植株种植数据,规划可移动设备在作业区域的作业路线。
  20. 根据权利要求19所述的作业路线规划系统，其中，当所述植株种植数据包括植株边缘信息和植株位置信息，所述规划模块具体设置为：根据所述作业区域的植株位置信息，确定各个植株的作业中心点；根据所述作业区域的植株边缘信息，确定各个植株的作业半径；根据各个植株的作业中心点、各个植株的作业半径，以及所述可移动设备的作业宽度生成植株作业路线，所述植株作业路线用于控制所述可移动设备在所述作业区域内的作业路线。
  21. 根据权利要求19所述的作业路线规划系统，其中，当所述植株种植数据包括植株边缘信息和植株位置信息，所述规划模块具体设置为：根据所述植株位置信息确定至少一条作业中心线；
    根据植株边缘信息确定至少一条作业中心线对应的植株宽度;
    根据所述可移动设备的作业宽度、每条所述作业中心线以及对应植株宽度,生成植株作业路线,所述植株作业路线用于控制所述可移动设备在所述作业区域内的作业路线。
  22. 根据权利要求19所述的作业路线规划系统，其中，所述作业路线包括弓形路线、螺旋路线、同心圆路线、折线形路线、原地旋转路线的一种或多种。
  23. 根据权利要求19所述的作业路线规划系统，其中，所述作业路线规划系统还包括：
    数据标记模块,设置为获取作业区域的植株种植数据后,规划可移动设备在作业区域的作业路线前,将所述作业区域的植株种植数据进行可视化标记;
    选取控制模块,设置为根据用户发送的选取指令,从可视化标记的所述作业区域的植株种植数据中获取待作业植株区域的植株种植数据;
    所述规划模块具体设置为根据所述待作业植株区域的植株种植数据，规划可移动设备在待作业植株区域的作业路线。
PCT/CN2019/075457 2018-03-23 2019-02-19 植株种植数据测量方法、作业路线规划方法及装置、系统 WO2019179270A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP19771748.1A EP3770830A4 (en) 2018-03-23 2019-02-19 METHOD OF MEASURING PLANTING DATA, METHOD OF PLANNING WORK ROUTE, DEVICE AND SYSTEM
US16/976,891 US11321942B2 (en) 2018-03-23 2019-02-19 Method for measuring plant planting data, device and system
JP2020548944A JP7086203B2 (ja) 2018-03-23 2019-02-19 植物体耕作データ測定方法、作業経路計画方法及び装置、システム
AU2019238712A AU2019238712B2 (en) 2018-03-23 2019-02-19 Plant planting data measuring method, working route planning method, device and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810247447.7 2018-03-23
CN201810247447.7A CN110309933A (zh) 2018-03-23 2018-03-23 植株种植数据测量方法、作业路线规划方法及装置、系统

Publications (1)

Publication Number Publication Date
WO2019179270A1 true WO2019179270A1 (zh) 2019-09-26

Family

ID=67988117

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/075457 WO2019179270A1 (zh) 植株种植数据测量方法、作业路线规划方法及装置、系统

Country Status (6)

Country Link
US (1) US11321942B2 (zh)
EP (1) EP3770830A4 (zh)
JP (1) JP7086203B2 (zh)
CN (1) CN110309933A (zh)
AU (1) AU2019238712B2 (zh)
WO (1) WO2019179270A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT201900025831A1 (it) * 2019-12-31 2021-07-01 Twicky S R L Metodo per la pianificazione del percorso di movimentazione di un oggetto all'interno di un’area urbana e programma informatico atto ad implementare tale metodo
WO2021194894A1 (en) * 2020-03-25 2021-09-30 Iunu, Inc. Horticulture aided by autonomous systems
WO2021194897A1 (en) * 2020-03-25 2021-09-30 Iunu, Inc. Horticulture aided by autonomous systems
CN115299245A (zh) * 2022-09-13 2022-11-08 南昌工程学院 一种智能水果采摘机器人的控制方法及控制***
US11580729B2 (en) * 2019-11-22 2023-02-14 Intelinair, Inc. Agricultural pattern analysis system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113407755A (zh) * 2020-03-17 2021-09-17 北京百度网讯科技有限公司 植物生长状况信息获取方法、装置及电子设备
CN111860626B (zh) * 2020-07-04 2023-08-29 广东粤源工程咨询有限公司 基于无人机遥感和面向对象分类的水土保持监测方法及***
CN112183329A (zh) * 2020-09-27 2021-01-05 广州极飞科技有限公司 植株的补种信息识别方法、装置、计算机设备和存储介质
CN112068572A (zh) * 2020-10-09 2020-12-11 四川长虹电器股份有限公司 一种无人收割***及方法
CN112487936A (zh) * 2020-11-26 2021-03-12 郑州数农通大数据科技有限公司 一种基于机器视觉技术的玉米田间管理机器人
CN112987779A (zh) * 2021-02-03 2021-06-18 湖南祥柏生态环保科技有限公司 基于环形作业的种植区
CN113074740B (zh) * 2021-04-29 2023-11-17 广州极飞科技股份有限公司 一种作业区域内的航线规划方法、装置、设备及介质
CN113778110B (zh) * 2021-11-11 2022-02-15 山东中天宇信信息技术有限公司 一种基于机器学习的智能农机控制方法及***
CN116222547B (zh) * 2023-05-10 2023-08-01 北京市农林科学院智能装备技术研究中心 适用于等高种植的农机导航方法、装置及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102749290A (zh) * 2012-07-02 2012-10-24 浙江大学 一种樱桃树冠层树枝生长状态的检测方法
US20130028487A1 (en) * 2010-03-13 2013-01-31 Carnegie Mellon University Computer vision and machine learning software for grading and sorting plants
CN106846334A (zh) * 2017-01-19 2017-06-13 江南大学 基于支持向量数据描述的田间玉米植株识别方法
CN106993167A (zh) * 2017-04-28 2017-07-28 深圳前海弘稼科技有限公司 一种植株的监控方法及监控***
CN107808375A (zh) * 2017-09-28 2018-03-16 中国科学院合肥物质科学研究院 融合多种上下文深度学习模型的水稻病害图像检测方法

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8855405B2 (en) * 2003-04-30 2014-10-07 Deere & Company System and method for detecting and analyzing features in an agricultural field for vehicle guidance
US20130173321A1 (en) * 2011-12-30 2013-07-04 Jerome Dale Johnson Methods, apparatus and systems for generating, updating and executing a crop-harvesting plan
CN103017753B (zh) * 2012-11-01 2015-07-15 中国兵器科学研究院 一种无人机航路规划方法及装置
CN103336517B (zh) 2013-07-02 2015-06-24 周庆芬 一种智能农业综合管理***
EP3032946B1 (en) * 2013-07-11 2022-10-05 Blue River Technology Inc. Method for automatic phenotype measurement and selection
CN104461318B (zh) * 2013-12-10 2018-07-20 苏州梦想人软件科技有限公司 基于增强现实技术的点读方法及***
US20170161560A1 (en) 2014-11-24 2017-06-08 Prospera Technologies, Ltd. System and method for harvest yield prediction
IL236606B (en) * 2015-01-11 2020-09-30 Gornik Amihay Standards and methods for agricultural monitoring
CN104807457A (zh) * 2015-04-29 2015-07-29 广州快飞计算机科技有限公司 飞行器航线的生成方法、装置及终端设备
CN104866970B (zh) * 2015-05-26 2018-07-24 徐吉祥 智能种植管理方法和智能种植设备
CN105116911B (zh) * 2015-07-20 2017-07-21 广州极飞科技有限公司 无人机喷药方法
SG10201506012SA (en) * 2015-07-31 2017-02-27 Accenture Global Services Ltd Inventory, growth, and risk prediction using image processing
CN107209854A (zh) * 2015-09-15 2017-09-26 深圳市大疆创新科技有限公司 用于支持顺畅的目标跟随的***和方法
CN105159319B (zh) * 2015-09-29 2017-10-31 广州极飞科技有限公司 一种无人机的喷药方法及无人机
US10491879B2 (en) * 2016-01-15 2019-11-26 Blue River Technology Inc. Plant feature detection using captured images
CN106382933B (zh) * 2016-11-04 2019-09-10 北京农业智能装备技术研究中心 一种用于航空植保飞行器的作业航线获取方法及***
US10699185B2 (en) * 2017-01-26 2020-06-30 The Climate Corporation Crop yield estimation using agronomic neural network
CN107315529B (zh) * 2017-06-19 2020-05-26 维沃移动通信有限公司 一种拍照方法及移动终端
CN107451602A (zh) * 2017-07-06 2017-12-08 浙江工业大学 一种基于深度学习的果蔬检测方法
CN107633202A (zh) * 2017-08-11 2018-01-26 合肥嘉浓航空科技有限公司 一种基于农田图像特征识别的植保无人机飞控方法和***

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130028487A1 (en) * 2010-03-13 2013-01-31 Carnegie Mellon University Computer vision and machine learning software for grading and sorting plants
CN102749290A (zh) * 2012-07-02 2012-10-24 浙江大学 一种樱桃树冠层树枝生长状态的检测方法
CN106846334A (zh) * 2017-01-19 2017-06-13 江南大学 基于支持向量数据描述的田间玉米植株识别方法
CN106993167A (zh) * 2017-04-28 2017-07-28 深圳前海弘稼科技有限公司 一种植株的监控方法及监控***
CN107808375A (zh) * 2017-09-28 2018-03-16 中国科学院合肥物质科学研究院 融合多种上下文深度学习模型的水稻病害图像检测方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3770830A4 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11580729B2 (en) * 2019-11-22 2023-02-14 Intelinair, Inc. Agricultural pattern analysis system
IT201900025831A1 (it) * 2019-12-31 2021-07-01 Twicky S R L Metodo per la pianificazione del percorso di movimentazione di un oggetto all'interno di un’area urbana e programma informatico atto ad implementare tale metodo
WO2021137045A1 (en) * 2019-12-31 2021-07-08 Twicky S.R.L Method for planning the handling path of an object in an urban area and computer program suited to implement said method
WO2021194894A1 (en) * 2020-03-25 2021-09-30 Iunu, Inc. Horticulture aided by autonomous systems
WO2021194897A1 (en) * 2020-03-25 2021-09-30 Iunu, Inc. Horticulture aided by autonomous systems
US11656624B2 (en) 2020-03-25 2023-05-23 Iunu, Inc. Horticulture aided by autonomous systems
US11730089B2 (en) 2020-03-25 2023-08-22 Iunu, Inc. Horticulture aided by autonomous systems
CN115299245A (zh) * 2022-09-13 2022-11-08 南昌工程学院 一种智能水果采摘机器人的控制方法及控制***
CN115299245B (zh) * 2022-09-13 2023-07-14 南昌工程学院 一种智能水果采摘机器人的控制方法及控制***

Also Published As

Publication number Publication date
JP2021516060A (ja) 2021-07-01
US11321942B2 (en) 2022-05-03
CN110309933A (zh) 2019-10-08
AU2019238712A1 (en) 2020-09-24
AU2019238712B2 (en) 2022-07-21
US20210004594A1 (en) 2021-01-07
EP3770830A1 (en) 2021-01-27
EP3770830A4 (en) 2021-12-08
JP7086203B2 (ja) 2022-06-17

Similar Documents

Publication Publication Date Title
WO2019179270A1 (zh) 植株种植数据测量方法、作业路线规划方法及装置、系统
CN111582055B (zh) 一种无人机的航空施药航线生成方法及***
Shafi et al. A multi-modal approach for crop health mapping using low altitude remote sensing, internet of things (IoT) and machine learning
Fernández‐Quintanilla et al. Is the current state of the art of weed monitoring suitable for site‐specific weed management in arable crops?
US11765542B2 (en) Hybrid vision system for crop land navigation
US11564357B2 (en) Capture of ground truthed labels of plant traits method and system
US5771169A (en) Site-specific harvest statistics analyzer
US20170042081A1 (en) Systems, methods and apparatuses associated with soil sampling
US20210209490A1 (en) Using optical remote sensors and machine learning models to predict agronomic field property data
US20170109395A1 (en) Computer-generated accurate yield map data using expert filters and spatial outlier detection
US11716985B2 (en) Method for remediating developmentally delayed plants
CN113920474B (zh) 一种智能监管柑橘种植态势的物联网***及方法
CN111985724B (zh) 一种农作物产量预估的方法、装置、设备及存储介质
CN111967441A (zh) 一种基于深度学习的农作物病害分析方法
CN115697037A (zh) 用于精准农业的增强的管理区
JP2022082636A (ja) 情報処理装置
US20220392214A1 (en) Scouting functionality emergence
CN117115769A (zh) 一种基于语义分割网络的植株检测与定位方法
CN116739739A (zh) 一种贷款额度评估方法、装置、电子设备及存储介质
Kaivosoja Role of spatial data uncertainty in executions of precision farming operations
Jagli et al. Smart Farming Using Artificial Intelligence.
US20230368312A1 (en) Method for automated weed control of agricultural land and associated stand-alone system
Grego et al. Technologies developed in precision agriculture.
CN118195122A (zh) 作业辅助方法、作业辅助***以及程序
CN117893635A (zh) 一种在线生成草害治理处方图的方法及***

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19771748

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020548944

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2019238712

Country of ref document: AU

Date of ref document: 20190219

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2019771748

Country of ref document: EP