CN116048082A - Automatic famous tea picking control system and method based on unmanned aerial vehicle identification - Google Patents

Automatic famous tea picking control system and method based on unmanned aerial vehicle identification

Info

Publication number
CN116048082A
CN116048082A (application CN202310038982.2A)
Authority
CN
China
Prior art keywords
picking
tea
bud
coordinates
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310038982.2A
Other languages
Chinese (zh)
Inventor
刘立超 (Liu Lichao)
梁静 (Liang Jing)
陈黎卿 (Chen Liqing)
俞传阳 (Yu Chuanyang)
王喆 (Wang Zhe)
王奎 (Wang Kui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Agricultural University AHAU
Original Assignee
Anhui Agricultural University AHAU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Agricultural University AHAU filed Critical Anhui Agricultural University AHAU
Priority to CN202310038982.2A priority Critical patent/CN116048082A/en
Publication of CN116048082A publication Critical patent/CN116048082A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D46/00 Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
    • A01D46/30 Robotic devices for individually picking crops
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Robotics (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Mechanical Engineering (AREA)
  • Electromagnetism (AREA)
  • Environmental Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of artificial intelligence and discloses an automatic famous tea picking control system and method based on unmanned aerial vehicle identification. The system comprises a base station, an information acquisition module, an identification processing module and a picking device. Tea garden images are collected by low-altitude photography and processed into a data set; a deep learning algorithm is trained on the data set, and target detection outputs the tea bud positions. The picking device obtains the actual coordinates of the famous tea buds through coordinate transformation and, from these coordinates, determines the motion track of the picking machine and the order in which the end manipulator picks the tea, effectively solving the problems of rapid picking-point positioning and automatic picking in the automatic harvesting of famous tea.

Description

Automatic famous tea picking control system and method based on unmanned aerial vehicle identification
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to an automatic famous tea picking control system and method based on unmanned aerial vehicle identification.
Background
As a major tea-growing country, China has a tea-planting history of thousands of years. In recent years tea consumption has risen, tea production has developed rapidly with it, and the demand for tea picking has grown accordingly. At present tea is picked mainly by hand, which is inefficient and depends on the pickers' proficiency; the small amount of semi-automatic picking still requires human participation, produces severe noise and vibration, and imposes a harsh working environment. Against these problems, some enterprises have carried out research on unmanned tea picking, chiefly autonomous-navigation picking schemes based on multiple cameras and preset-track picking schemes, but these unmanned schemes cannot meet the requirements of high efficiency and universality. The tea picking systems currently on the market lack an automatic system with high accuracy and high field universality.
Disclosure of Invention
The invention aims to provide an automatic famous tea picking control system and method based on unmanned aerial vehicle identification, which solve the following technical problem:
how to provide a control system capable of automatically identifying and picking tender tea buds.
The aim of the invention can be achieved by the following technical scheme:
an automatic famous tea picking control system based on unmanned aerial vehicle identification comprises a base station, an information acquisition module, an identification processing module and a picking device;
the information acquisition module is connected with the base station and the identification processing module and is used for acquiring an overall tea garden image of the target tea garden;
the identification processing module is used for segmenting the overall tea garden image according to a preset segmentation rule and outputting the center coordinate information of the tea bud positions;
the picking device is connected with the identification processing module and is used for generating a picking path from the center coordinate information and picking the tender buds along the picking path.
As a further scheme of the invention: the information acquisition module comprises an unmanned aerial vehicle and a communication unit, a shooting unit and a positioning unit mounted on the unmanned aerial vehicle;
the shooting unit and the positioning unit are connected with the base station through the communication unit; the positioning unit and the base station locate each other through polar coordinates, with the base station position set as the coordinate origin.
As a further scheme of the invention: the unmanned aerial vehicle is driven along a preset height and path to acquire images of the tea in the target tea garden, and the acquired images are fused into an overall tea garden image;
the preset segmentation rule comprises the following steps:
dividing the overall tea garden image according to a preset division ratio;
inputting the segmented pictures into a trained target detection model;
judging the validity of the target detection model's output; if valid, the target detection model issues a first working instruction, otherwise recognition is repeated;
the target detection model is a trained neural network model, and the first working instruction contains the center coordinate information.
As a further scheme of the invention: the method for acquiring the center coordinate information comprises the following steps:
numbering each divided picture, the numbers running from small to large along the X axis and from small to large along the Y axis;
obtaining the local center coordinates of the buds in each picture, the center coordinates being calculated by the following formula:
(x, y) = ((x1 + x2) / 2, (y1 + y2) / 2)
where x is the local bud's x-axis coordinate and y is the local bud's y-axis coordinate;
x1 is the x coordinate of the left edge of the local bud's labeling frame and x2 is the x coordinate of the right edge;
y1 is the y coordinate of the upper edge of the labeling frame and y2 is the y coordinate of the lower edge;
converting the local bud coordinates into global bud coordinates, the center coordinates being calculated by the following formula:
(X,Y)=(x*(n-1),y*(m-1))
where n is the tile's index in the X-axis direction, m is the tile's index in the Y-axis direction, X is the global bud x-axis coordinate and Y is the global bud y-axis coordinate.
As a further scheme of the invention: the picking path is generated by the following steps:
D1, defining one corner of the target tea garden as the zero-point coordinate, extracting each ridge where the tea trees stand by image processing, and extracting the center line of each ridge;
D2, placing the picking device at the ridge starting point closest to the zero point, with the center point of the picking device aligned with the ridge center point; the area covered by the picking device at this moment is one unit;
D3, finding, within the one-unit area covered by the picking device, the coordinates of the bud closest to the zero point among the identified famous tea bud center coordinates;
D4, planning the path by a sequential method: the lower left corner is selected as the starting point, all rows and columns are traversed from left to right according to the ordering rule, and the order of each target is determined.
As a further scheme of the invention: the picking device comprises a frame and a visual positioning module, a picking module and a driving module mounted on the frame; a controller is fixedly mounted inside the frame;
the picking module comprises a picking parallel mechanical arm, an end picker and a tender bud collecting suction pipe; the picking parallel mechanical arm is fixedly mounted on the frame, the end picker is fixedly mounted at the end of the parallel mechanical arm, and the tender bud collecting suction pipe is fixedly mounted inside the mechanical arm;
the visual positioning module comprises a depth camera and an industrial personal computer mounted on the frame; the depth camera is fixedly mounted above the end picker, and the industrial personal computer is mounted on the right side of the frame;
the driving module comprises a chassis and crawler tracks; the chassis is fixedly mounted at the bottom of the frame, and the crawler tracks are fixedly mounted on both sides of the chassis.
As a further scheme of the invention: when the picking module reaches a picking point, the picking parallel mechanical arm moves directly above the tea leaves to be picked;
the depth camera acquires the tea height information;
the depth camera performs target detection on the famous tea buds from directly above, and the height information of the buds is obtained by converting between image coordinates and world coordinates;
the mechanical arm cuts the tea buds according to the height information;
the controller judges whether the mechanical arm's cutter has completed the tea collection;
and the industrial personal computer judges whether picking of the tea garden is finished.
A control method for automatic famous tea picking based on unmanned aerial vehicle identification comprises the following steps:
acquiring an overall tea garden image of the target tea garden;
segmenting the overall tea garden image according to a preset segmentation rule and outputting the center coordinate information of the tea bud positions;
and generating a picking path from the center coordinate information and picking the tender buds along the picking path.
The invention has the beneficial effects that: tea garden images are acquired by low-altitude unmanned aerial vehicle photography and processed into a data set; a deep learning algorithm is trained on the data set, and target detection outputs the tea bud positions; the industrial personal computer obtains the actual coordinates of the famous tea buds through coordinate transformation and determines, from these coordinates, the motion track of the picking machine and the order in which the end manipulator picks the tea; the depth camera recognizes the three-dimensional coordinates of the picking points of the tea to be picked and feeds them back to the industrial personal computer; after the end picker cuts a bud, the bud is collected through the tender bud collecting suction pipe. Picking points are thereby located quickly and picking proceeds automatically during the harvesting of famous tea.
Drawings
The invention is further described below with reference to the accompanying drawings.
FIG. 1 is a schematic view of a picking apparatus according to the present invention;
FIG. 2 is a schematic diagram of the coordinates of the buds in the target tea garden and the structure of the target tea garden according to the invention;
fig. 3 is a schematic flow chart of an overall method for automatic famous tea picking control based on unmanned aerial vehicle identification.
Reference numerals: 1. industrial personal computer; 2. tender bud collecting device; 3. frame; 4. picking parallel mechanical arm; 5. tender bud collecting suction pipe; 6. mobile platform; 7. depth camera; 8. end picker; 9. driving module.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the invention.
Referring to fig. 1, the invention discloses an automatic famous tea picking control system based on unmanned aerial vehicle identification, which comprises a base station, an information acquisition module, an identification processing module and a picking device;
the information acquisition module is connected with the base station and the identification processing module and is used for acquiring an overall tea garden image of the target tea garden;
the identification processing module is used for segmenting the overall tea garden image according to a preset segmentation rule and outputting the center coordinate information of the tea bud positions;
the picking device is connected with the identification processing module and is used for generating a picking path from the center coordinate information and picking the tender buds along the picking path.
As a further scheme of the invention: the information acquisition module comprises an unmanned aerial vehicle and a communication unit, a shooting unit and a positioning unit mounted on the unmanned aerial vehicle;
the shooting unit and the positioning unit are connected with the base station through the communication unit; the positioning unit and the base station locate each other through polar coordinates, with the base station position set as the coordinate origin.
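Since the positioning unit and the base station locate each other in polar coordinates with the base station as origin, converting a recorded (range, bearing) fix into Cartesian tea garden coordinates is a single trigonometric step. A minimal sketch; the function name and the assumption that the bearing is measured in radians from the positive X axis are illustrative, not specified in the patent:

```python
import math

def polar_to_cartesian(r: float, theta: float) -> tuple[float, float]:
    """Convert a (range, bearing) fix relative to the base station
    (the coordinate origin) into Cartesian tea garden coordinates.
    Assumes theta is in radians from the positive X axis."""
    return (r * math.cos(theta), r * math.sin(theta))

# Example: the drone is 12 m from the base station at a 30-degree bearing.
x, y = polar_to_cartesian(12.0, math.radians(30))
```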
As a further scheme of the invention: the unmanned aerial vehicle is driven along a preset height and path to acquire images of the tea in the target tea garden, and the acquired images are fused into an overall tea garden image;
the preset segmentation rule comprises the following steps:
dividing the overall tea garden image according to a preset division ratio;
inputting the segmented pictures into a trained target detection model;
judging the validity of the target detection model's output; if valid, the target detection model issues a first working instruction, otherwise recognition is repeated;
the target detection model is a trained neural network model, and the first working instruction contains the center coordinate information.
As a further scheme of the invention: the method for acquiring the center coordinate information comprises the following steps:
numbering each divided picture, the numbers running from small to large along the X axis and from small to large along the Y axis;
obtaining the local center coordinates of the buds in each picture, the center coordinates being calculated by the following formula:
(x, y) = ((x1 + x2) / 2, (y1 + y2) / 2)
where x is the local bud's x-axis coordinate and y is the local bud's y-axis coordinate;
x1 is the x coordinate of the left edge of the local bud's labeling frame and x2 is the x coordinate of the right edge;
y1 is the y coordinate of the upper edge of the labeling frame and y2 is the y coordinate of the lower edge;
converting the local bud coordinates into global bud coordinates, the center coordinates being calculated by the following formula:
(X,Y)=(x*(n-1),y*(m-1))
where n is the tile's index in the X-axis direction, m is the tile's index in the Y-axis direction, X is the global bud x-axis coordinate and Y is the global bud y-axis coordinate.
As a further scheme of the invention: the picking path is generated by the following steps:
D1, defining one corner of the target tea garden as the zero-point coordinate, extracting each ridge where the tea trees stand by image processing, and extracting the center line of each ridge;
D2, placing the picking device at the ridge starting point closest to the zero point, with the center point of the picking device aligned with the ridge center point; the area covered by the picking device at this moment is one unit;
D3, finding, within the one-unit area covered by the picking device, the coordinates of the bud closest to the zero point among the identified famous tea bud center coordinates;
D4, planning the path by a sequential method: the lower left corner is selected as the starting point, all rows and columns are traversed from left to right according to the ordering rule, and the order of each target is determined.
As a further scheme of the invention: the picking device comprises a frame and a visual positioning module, a picking module and a driving module mounted on the frame; a controller is fixedly mounted inside the frame;
the picking module comprises a picking parallel mechanical arm, an end picker and a tender bud collecting suction pipe; the picking parallel mechanical arm is fixedly mounted on the frame, the end picker is fixedly mounted at the end of the parallel mechanical arm, and the tender bud collecting suction pipe is fixedly mounted inside the mechanical arm and communicates with a tender bud collecting device fixed on the frame;
the visual positioning module comprises a depth camera and an industrial personal computer mounted on the frame; the depth camera is fixedly mounted above the end picker, and the industrial personal computer is mounted on the right side of the frame;
the driving module comprises a chassis and crawler tracks; the chassis is fixedly mounted at the bottom of the frame, and the crawler tracks are fixedly mounted on both sides of the chassis.
As a further scheme of the invention: when the picking module reaches a picking point, the picking parallel mechanical arm moves directly above the tea leaves to be picked;
the depth camera acquires the tea height information;
the depth camera performs target detection on the famous tea buds from directly above, and the height information of the buds is obtained by converting between image coordinates and world coordinates;
the mechanical arm cuts the tea buds according to the height information;
the controller judges whether the mechanical arm's cutter has completed the tea collection;
and the industrial personal computer judges whether picking of the tea garden is finished.
A control method for automatic famous tea picking based on unmanned aerial vehicle identification comprises the following steps:
acquiring an overall tea garden image of the target tea garden;
segmenting the overall tea garden image according to a preset segmentation rule and outputting the center coordinate information of the tea bud positions;
and generating a picking path from the center coordinate information and picking the tender buds along the picking path.
Specifically, as shown in fig. 3:
S1, placing the base station at the corner calibration plate of the target tea garden and determining the reference coordinate position of the tea garden image from the base station position; proceed to S2;
S2, the unmanned aerial vehicle collects tea image information of the tea garden along the preset height and path, and the collected images are fused into an overall tea garden image; proceed to S3;
S3, dividing the overall tea garden image according to a fixed ratio; proceed to S4;
S4, labeling the tea buds in each partial picture to form a tea bud data set; proceed to S5;
S5, expanding the data set by data preprocessing and augmentation, and training a deep learning algorithm on the data set to obtain the optimal detection model; proceed to S6;
The expansion of the data set by data preprocessing and augmentation mentioned in step S5 comprises the following steps (an illustrative sketch follows these steps):
A1, dividing the complete tea garden picture in a certain ratio into a tea image data set whose tiles have length and width (a, b);
A2, labeling the famous tea buds in the pictures with LabelImg software to generate txt files, each file containing: class, x-center of the labeled object, y-center of the labeled object, width and height;
A3, expanding the data set by image enhancement operations such as blurring, brightness adjustment, cropping, rotation, translation and mirroring;
A4, randomly dividing the expanded data set into a training set, a validation set and a test set in the ratio 8:1:1.
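A hedged sketch of steps A1 to A4, using Pillow for image handling; the tile grid, the augmentation parameters and the helper names are illustrative choices, not taken from the patent. The txt rows produced in A2 (class, x-center, y-center, width, height) match LabelImg's YOLO export convention, which is presumably the format intended here.

```python
import random
from PIL import Image, ImageEnhance, ImageFilter

def split_into_tiles(img: Image.Image, n: int, m: int):
    """A1: cut the complete tea garden picture into an n x m grid of
    tiles of size (a, b), returned with their (n, m) grid indices."""
    w, h = img.size
    a, b = w // n, h // m
    return [((col + 1, row + 1),
             img.crop((col * a, row * b, (col + 1) * a, (row + 1) * b)))
            for row in range(m) for col in range(n)]

def augment(img: Image.Image):
    """A3: a few of the named enhancement operations (blur, brightness
    adjustment, rotation, mirroring); cropping and translation are
    analogous."""
    return [
        img.filter(ImageFilter.GaussianBlur(radius=2)),
        ImageEnhance.Brightness(img).enhance(1.3),
        img.rotate(10, expand=True),
        img.transpose(Image.Transpose.FLIP_LEFT_RIGHT),
    ]

def split_dataset(samples, seed=0):
    """A4: random 8:1:1 split into training, validation and test sets."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n_train, n_val = int(0.8 * len(shuffled)), int(0.1 * len(shuffled))
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])
```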
Training a deep learning algorithm on the data set to obtain the optimal detection model, as mentioned in step S5, comprises the following steps (an illustrative sketch of the acceptance check follows these steps):
B1, putting the preprocessed data set and the corresponding XML label files into the deep learning algorithm for training, and setting the initial hyperparameters (batch size, number of classes, initial learning rate, number of iterations, etc.) to optimize the recognition effect of the model.
After training, a weight file and the change curve of each parameter during training are obtained;
B2, observing whether the result curves conform to the expected training behaviour and whether the training accuracy meets the requirement;
B3, loading the weight file into the deep learning algorithm to detect the test files and observing the detections; if the recognition rate exceeds 90%, the model is considered qualified.
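The patent does not name a specific network, so the training itself is abstracted here; the sketch below only makes the hyperparameter block of B1 and the acceptance rule of B3 concrete, with all values illustrative assumptions:

```python
# Illustrative initial hyperparameters for step B1 (values assumed,
# not specified in the patent).
hyperparams = {
    "batch_size": 16,
    "num_classes": 1,          # a single class: famous tea bud
    "initial_learning_rate": 1e-3,
    "iterations": 300,
}

def model_qualified(detected: int, ground_truth: int) -> bool:
    """B3: the weight file is considered qualified if the recognition
    rate on the test files exceeds 90%."""
    return ground_truth > 0 and detected / ground_truth > 0.90
```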
S6, judging the effectiveness of the optimal detection model's tea bud identification; if the effect meets the requirement, the first working instruction is obtained, otherwise return to S4;
S7, inputting the complete tea garden image into the target detection algorithm according to the first working instruction of S6 and outputting the center coordinate information of the tea bud positions to the industrial personal computer, which plans the picking path; proceed to S8;
the center coordinate information acquisition of the tea leaf bud position mentioned in S7 comprises the following steps:
c1, numbering each divided picture, wherein the numbering sequence is arranged from small to large along the X axis, and the Y axis is arranged from small to large as (n, m);
and C2, obtaining local center coordinates of the buds in each graph, wherein the center coordinates are calculated by the following formula:
(x, y) = ((x1 + x2) / 2, (y1 + y2) / 2)
where x is the local bud's x-axis coordinate and y is the local bud's y-axis coordinate;
x1 is the x coordinate of the left edge of the local bud's labeling frame and x2 is the x coordinate of the right edge;
y1 is the y coordinate of the upper edge of the labeling frame and y2 is the y coordinate of the lower edge;
converting the local bud coordinates into global bud coordinates, the center coordinates being calculated by the following formula:
(X,Y)=(x*(n-1),y*(m-1))
where n is the tile's index in the X-axis direction, m is the tile's index in the Y-axis direction, X is the global bud x-axis coordinate and Y is the global bud y-axis coordinate.
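The two formulas of C2 translate directly into code. Note that the global conversion below reproduces the text's formula (X, Y) = (x*(n-1), y*(m-1)) literally; a tile-offset form such as X = x + a*(n-1) would be the more common convention, but only what the text states is implemented here:

```python
def local_center(x1: float, y1: float, x2: float, y2: float):
    """C2: center of a bud's labeling frame from its edge coordinates."""
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def local_to_global(x: float, y: float, n: int, m: int):
    """Local-to-global conversion, exactly as the formula above is
    written, for a bud in the tile with grid index (n, m)."""
    return (x * (n - 1), y * (m - 1))
```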
Planning the picking path by the industrial personal computer mentioned in step S7 comprises the following steps (an illustrative sketch follows these steps):
D1, defining one tea garden corner as the zero-point coordinate, extracting each ridge where the tea trees stand by image processing, and extracting the center line of each ridge;
D2, placing the picking machine at the ridge starting point closest to the zero point, with the center point of the picking machine aligned with the ridge center point; the area covered by the picking machine is one unit;
D3, finding, within the area covered by the picking machine, the coordinates of the bud closest to the zero point among the identified famous tea bud center coordinates;
D4, planning the path by a sequential method: as shown by the arrow in FIG. 2, the lower left corner is selected as the starting point, all rows and columns are traversed from left to right according to the ordering rule, and the order of each target is determined.
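A minimal sketch of D4's sequential ordering, assuming global (X, Y) bud coordinates with the origin at the lower-left zero point; the row-banding parameter is an assumption introduced to group the buds of one ridge pass:

```python
def picking_order(buds, row_height=0.10):
    """D4: start from the lower left corner and traverse all rows and
    columns from left to right. Buds whose Y values fall within the
    same band of height `row_height` (an assumed parameter, in the
    same unit as the coordinates) are treated as one row."""
    return sorted(buds, key=lambda p: (int(p[1] // row_height), p[0]))

# Example: the three buds are visited bottom row first, left to right.
order = picking_order([(2.4, 0.31), (0.8, 0.05), (1.6, 0.12)])
```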
S8, the picking machine reaches a picking point according to the planning, the mechanical arm moves to the position right above the tea leaves to be picked, and S9 is carried out;
s9, the depth camera collects tea height information and enters S10;
the obtaining of the tea leaf height information mentioned in S9 includes the steps of:
e1, detecting and identifying famous tea buds from the right above through a target by using a depth camera, and obtaining the height information of the buds through conversion of image coordinates and world coordinates;
and E2, the depth camera is fixedly arranged right above the end effector of the picking mechanical arm.
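One plausible reading of E1 is a pinhole back-projection for a downward-looking depth camera: the depth value gives the camera-to-bud range, and the bud height follows from the camera's mounting height. A hedged sketch; the intrinsics and the mounting-height parameter are assumptions, not values from the patent:

```python
def bud_height_from_depth(u, v, depth, fx, fy, cx, cy, camera_height):
    """Back-project the bud pixel (u, v) with its depth reading into
    world coordinates for a camera looking straight down. fx, fy, cx,
    cy are the camera intrinsics; camera_height is the camera's height
    above the ground plane (both assumed inputs)."""
    x = (u - cx) * depth / fx       # lateral offset from the optical axis
    y = (v - cy) * depth / fy
    z = camera_height - depth       # bud height above the ground
    return x, y, z
```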
S10, cutting tea buds by the mechanical arm according to the height information, and entering S11;
s11, the controller judges whether the mechanical arm cutter completes tea collection, if so, the process goes to S12, otherwise, the process goes back to S9;
s12, the industrial personal computer judges whether picking of the tea garden is finished, if so, the process goes to S13, otherwise, the process returns to S8 and the processes of S8 to S12 are repeated;
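Read together, S8 to S12 form a nested control loop. The skeleton below makes that flow explicit; the four collaborator objects and all method names are hypothetical interfaces invented only for illustration:

```python
def picking_loop(path_points, vehicle, arm, camera, controller):
    """Skeleton of the S8-S12 loop: drive to each planned point and,
    for each bud there, position the arm, read the bud height from the
    depth camera, cut, and retry until the collection succeeds."""
    for point in path_points:            # S8: reach the picking point
        vehicle.drive_to(point.coords)
        for bud in point.buds:
            arm.move_above(bud)          # S8: arm above the tea
            while True:
                height = camera.bud_height(bud)   # S9
                arm.cut(height)                   # S10
                if controller.collection_done():  # S11
                    break                # S12: next bud, then next point
```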
Taking famous tea picking as an example: a typical famous tea garden covers 3000 to 10000 square meters, with tea rows 50 meters long, 1.5 meters wide and 1 meter high.
First, the base station is placed at one corner of the tea garden and its coordinates serve as the reference coordinates of the tea garden. The unmanned aerial vehicle collects tea image information along the preset height and path, the collected images are fused into an overall tea garden image, and during flight the unmanned aerial vehicle records its position relative to the base station in polar coordinates. After the overall tea garden image is built, it is divided into n x m tea garden tiles of length and width (a, b) as the data set, arranged in order of X coordinate from small to large and then Y coordinate from small to large. The famous tea buds in the pictures are labeled with LabelImg software, the data set is augmented and expanded, and the expanded data set is randomly divided into a training set, a validation set and a test set in the ratio 8:1:1. The training set is trained by deep learning; after training, the weight file is loaded into the deep learning algorithm to detect the test files, and the model is considered qualified if the recognition rate exceeds 90%. The tender buds in all n x m tea garden tiles are then recognized by the deep learning algorithm, the global coordinates of each tea bud are calculated from its local coordinates, and the global coordinates are output to the industrial personal computer. The industrial personal computer calculates the optimal path points and their collection areas and drives the traveling motor to each point; the tea buds are collected in order of X coordinate from small to large and then Y coordinate from small to large. During collection, the mechanical arm moves the cutter directly above a tea bud, the depth camera obtains the relative Z-axis height between the cutter and the bud, and the collecting device is driven to complete the collection. In this way the horizontal coordinate information of the tea buds is obtained by unmanned aerial vehicle positioning, so that the picking machine can collect the tender tea buds more efficiently.
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment and should not be construed as limiting the scope of the invention. All equivalent changes and modifications made within the scope of the present invention shall fall within its protection scope.

Claims (8)

1. An automatic famous tea picking control system based on unmanned aerial vehicle identification, characterized by comprising a base station, an information acquisition module, an identification processing module and a picking device;
the information acquisition module is connected with the base station and the identification processing module and is used for acquiring an overall tea garden image of the target tea garden;
the identification processing module is used for segmenting the overall tea garden image according to a preset segmentation rule and outputting the center coordinate information of the tea bud positions;
the picking device is connected with the identification processing module and is used for generating a picking path from the center coordinate information and picking the tender buds along the picking path.
2. The automatic famous tea picking control system based on unmanned aerial vehicle identification according to claim 1, wherein the information acquisition module comprises an unmanned aerial vehicle and a communication unit, a shooting unit and a positioning unit mounted on the unmanned aerial vehicle;
the shooting unit and the positioning unit are connected with the base station through the communication unit; the positioning unit and the base station locate each other through polar coordinates, with the base station position set as the coordinate origin.
3. The automatic famous tea picking control system based on unmanned aerial vehicle identification according to claim 2, wherein the unmanned aerial vehicle is driven along a preset height and path to acquire images of the tea in the target tea garden, and the acquired images are fused into an overall tea garden image;
the preset segmentation rule comprises the following steps:
dividing the overall tea garden image according to a preset division ratio;
inputting the segmented pictures into a trained target detection model;
judging the validity of the target detection model's output; if valid, the target detection model issues a first working instruction, otherwise recognition is repeated;
the target detection model is a trained neural network model, and the first working instruction contains the center coordinate information.
4. The automatic famous tea picking control system based on unmanned aerial vehicle identification according to claim 3, wherein the method for acquiring the center coordinate information comprises the following steps:
numbering each divided picture, the numbers running from small to large along the X axis and from small to large along the Y axis;
obtaining the local center coordinates of the buds in each picture, the center coordinates being calculated by the following formula:
(x, y) = ((x1 + x2) / 2, (y1 + y2) / 2)
where x is the local bud's x-axis coordinate and y is the local bud's y-axis coordinate;
x1 is the x coordinate of the left edge of the local bud's labeling frame and x2 is the x coordinate of the right edge;
y1 is the y coordinate of the upper edge of the labeling frame and y2 is the y coordinate of the lower edge;
converting the local bud coordinates into global bud coordinates, the center coordinates being calculated by the following formula:
(X,Y)=(x*(n-1),y*(m-1))
where n is the tile's index in the X-axis direction, m is the tile's index in the Y-axis direction, X is the global bud x-axis coordinate and Y is the global bud y-axis coordinate.
5. The automatic famous tea picking control system based on unmanned aerial vehicle identification according to claim 1, wherein the picking path is generated by the following steps:
D1, defining one corner of the target tea garden as the zero-point coordinate, extracting each ridge where the tea trees stand by image processing, and extracting the center line of each ridge;
D2, placing the picking device at the ridge starting point closest to the zero point, with the center point of the picking device aligned with the ridge center point; the area covered by the picking device at this moment is one unit;
D3, finding, within the one-unit area covered by the picking device, the coordinates of the bud closest to the zero point among the identified famous tea bud center coordinates;
D4, planning the path by a sequential method: the lower left corner is selected as the starting point, all rows and columns are traversed from left to right according to the ordering rule, and the order of each target is determined.
6. The automatic famous tea picking control system based on unmanned aerial vehicle identification according to claim 1, wherein the picking device comprises a frame and a visual positioning module, a picking module and a driving module mounted on the frame; a controller is fixedly mounted inside the frame;
the picking module comprises a picking parallel mechanical arm, an end picker and a tender bud collecting suction pipe; the picking parallel mechanical arm is fixedly mounted on the frame, the end picker is fixedly mounted at the end of the parallel mechanical arm, and the tender bud collecting suction pipe is fixedly mounted inside the mechanical arm;
the visual positioning module comprises a depth camera and an industrial personal computer mounted on the frame; the depth camera is fixedly mounted above the end picker, and the industrial personal computer is mounted on the right side of the frame;
the driving module comprises a chassis and crawler tracks; the chassis is fixedly mounted at the bottom of the frame, and the crawler tracks are fixedly mounted on both sides of the chassis.
7. The automatic famous tea picking control system based on unmanned aerial vehicle identification according to claim 6, wherein when the picking module reaches a picking point, the picking parallel mechanical arm moves directly above the tea leaves to be picked;
the depth camera acquires the tea height information;
the depth camera performs target detection on the famous tea buds from directly above, and the height information of the buds is obtained by converting between image coordinates and world coordinates;
the mechanical arm cuts the tea buds according to the height information;
the controller judges whether the mechanical arm's cutter has completed the tea collection;
and the industrial personal computer judges whether picking of the tea garden is finished.
8. The automatic famous tea picking control method based on unmanned aerial vehicle identification according to claim 1, comprising the following steps:
acquiring an overall tea garden image of the target tea garden;
segmenting the overall tea garden image according to a preset segmentation rule and outputting the center coordinate information of the tea bud positions;
and generating a picking path from the center coordinate information and picking the tender buds along the picking path.
CN202310038982.2A 2023-01-11 2023-01-11 Automatic famous tea picking control system and method based on unmanned aerial vehicle identification Pending CN116048082A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310038982.2A CN116048082A (en) 2023-01-11 2023-01-11 Automatic famous tea picking control system and method based on unmanned aerial vehicle identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310038982.2A CN116048082A (en) 2023-01-11 2023-01-11 Automatic famous tea picking control system and method based on unmanned aerial vehicle identification

Publications (1)

Publication Number Publication Date
CN116048082A true CN116048082A (en) 2023-05-02

Family

ID=86123431

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310038982.2A Pending CN116048082A (en) 2023-01-11 2023-01-11 Automatic famous tea picking control system and method based on unmanned aerial vehicle identification

Country Status (1)

Country Link
CN (1) CN116048082A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210337734A1 (en) * 2018-10-08 2021-11-04 Advanced Farm Technologies, Inc. Autonomous crop harvester
US12004451B2 (en) * 2018-10-08 2024-06-11 Advanced Farm Technologies, Inc. Autonomous crop harvester
US20220081226A1 (en) * 2020-09-14 2022-03-17 Yamaha Hatsudoki Kabushiki Kaisha Movable harvesting apparatus and harvesting unit
CN116935235A (en) * 2023-09-19 2023-10-24 深圳市索威尔科技开发有限公司 Fresh tea leaf identification method and related device based on unmanned tea picking machine
CN116935235B (en) * 2023-09-19 2024-04-05 深圳市索威尔科技开发有限公司 Fresh tea leaf identification method and related device based on unmanned tea picking machine
CN117616999A (en) * 2024-01-08 2024-03-01 华南农业大学 Intelligent tea picking actuator, device and method

Similar Documents

Publication Publication Date Title
CN116048082A (en) Automatic famous tea picking control system and method based on unmanned aerial vehicle identification
CN109685066B (en) Mine target detection and identification method based on deep convolutional neural network
CN107741234A (en) The offline map structuring and localization method of a kind of view-based access control model
CN112476434A (en) Visual 3D pick-and-place method and system based on cooperative robot
CN111462135A (en) Semantic mapping method based on visual S L AM and two-dimensional semantic segmentation
US11406061B2 (en) Automated walnut picking and collecting method based on multi-sensor fusion technology
CN111968048B (en) Method and system for enhancing image data of less power inspection samples
CN104850832B (en) A kind of large-scale image sample mask method and system based on classification iteration
CN110675453B (en) Self-positioning method for moving target in known scene
CN110827353B (en) Robot positioning method based on monocular camera assistance
CN103198477A (en) Apple fruitlet bagging robot visual positioning method
CN107309876A (en) The control method of manipulator harvesting
CN109792951A (en) For the unmanned plane course line correction system of hybrid rice pollination and its bearing calibration
CN106871902A (en) A kind of method of Navigation of Pilotless Aircraft, device and system
CN105225225A (en) A kind of leather system for automatic marker making method and apparatus based on machine vision
CN109729835B (en) Binocular vision-based oil tea fruit picking system and control method
CN107588723A (en) Circular mark leak source detection method on a kind of High-speed target based on two-step method
CN112700498A (en) Wind driven generator blade tip positioning method and system based on deep learning
CN111862200B (en) Unmanned aerial vehicle positioning method in coal shed
CN107272037A (en) A kind of road equipment position, image information collecting device and the method for gathering information
CN113724387A (en) Laser and camera fused map construction method
CN114492070A (en) High-precision mapping geographic information virtual simulation technology and device
Hu et al. Recognition and localization of strawberries from 3D binocular cameras for a strawberry picking robot using coupled YOLO/Mask R-CNN
CN113793385A (en) Method and device for positioning fish head and fish tail
CN107990825B (en) High-precision position measuring device and method based on priori data correction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination