CN103891697A - Drug spraying robot capable of moving indoors autonomously and variable drug spraying method thereof - Google Patents
- Publication number
- CN103891697A CN103891697A CN201410119127.5A CN201410119127A CN103891697A CN 103891697 A CN103891697 A CN 103891697A CN 201410119127 A CN201410119127 A CN 201410119127A CN 103891697 A CN103891697 A CN 103891697A
- Authority
- CN
- China
- Prior art keywords
- spraying
- support
- crop
- drug
- human body
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Guiding Agricultural Machines (AREA)
- Manipulator (AREA)
Abstract
The invention discloses a pesticide-spraying robot capable of moving autonomously indoors, together with its working method. The robot comprises a wheeled robot body, on top of which a pesticide tank and a control unit are mounted. A movable support is fixed to one side of the tank and comprises a transverse support rod and a longitudinal support rod arranged perpendicular to each other. A spray boom is fixed at the far end of the transverse rod, a nozzle is fitted at the tip of the boom, and the rear of the boom is connected to the tank by a hose. A crop-recognition camera and an ultrasonic sensor are fixed on the transverse rod, and a navigation camera and another ultrasonic sensor are fitted at the front of the wheeled robot body. The variable-rate spraying method effectively reduces the system's image-processing load, achieves precise, metered, targeted spraying, markedly improves the robot's actual working efficiency, and actively promotes the application and adoption of greenhouse spraying robots.
Description
Technical field
The present invention relates to the field of agricultural plant-protection equipment, and in particular to an indoor autonomous spraying robot and its variable-rate method of spraying indoor row crops.
Background technology
Plant diseases seriously threaten agricultural production, and chemical control applied by machine is the main means of disease prevention. At present the effective utilization rate of pesticides in China is only about 30%, versus 60%–70% in developed countries; this not only wastes pesticide but also pollutes the environment. Since the 1990s, "precision agriculture" has drawn wide attention around the world. Precision spraying delivers pesticide to the target "quantitatively and at fixed points" according to actual need, pursuing the minimum dosage and the highest efficiency of the spray liquid, and is the direction in which pesticide-application technology is developing. Compared with the complex conditions of open fields, the structured environment of facilities, typified by the greenhouse, creates favorable conditions for spraying robots and their precision-spraying technology to be adopted first.
A survey of the existing literature shows that work in this field, at home and abroad, currently leans toward the detection of crop rows and the discrimination of crops from weeds, and is mostly applied in open-field environments; domestic studies of precise spraying by mobile robots inside facilities are rarely reported.
The invention with application number CN 200910211739.6, entitled "Intelligent variable-rate sprayer based on prescription-map control", relates to an intelligent variable-rate sprayer that sprays on demand according to the disease, pest, and weed information obtained through GPS/GIS technology, and comprises an intelligent control system and a spraying system. Its control system works from a prescription map rather than from real-time machine-vision discrimination, and it lacks a visual-navigation function, so the flexibility and accuracy of its spraying are insufficient.
The invention with application number CN201310272642, entitled "Greenhouse automatic spraying robot", relates to a greenhouse spraying robot that runs on rails and comprises rail-running parts, a liquid-spraying part fixed on the running gear, and the associated controller. Its main characteristic is the rails laid on the ground, but the absence of an autonomous-navigation function limits the flexibility of its operation.
Summary of the invention
Object of the invention: the present invention overcomes the inefficiency of existing precise spraying operations and provides an efficient variable-rate spraying method for short row crops and for potted ornamental plants arranged in similar rows.
Technical scheme: an indoor autonomous spraying robot comprises a wheeled robot body, on top of which a pesticide tank and a control unit are mounted. A movable support is fixed to one side of the tank and comprises a transverse support and a longitudinal support arranged perpendicular to each other. A spray boom is fixed at the far end of the transverse support, a nozzle is fitted at the tip of the boom, and the rear of the boom is connected to the tank by a hose. A crop-recognition camera and an ultrasonic sensor are fixed on the transverse support, and a navigation camera and another ultrasonic sensor are fitted at the front of the wheeled robot body.
The transverse support comprises a first support and a second support that are mutually nested and can telescope relative to each other.
The longitudinal support comprises a third support and a fourth support that are mutually nested and can telescope relative to each other.
The variable-rate spraying method of the indoor autonomous spraying robot is a continuous working process based on monocular vision, comprising the following rapid steps executed in a cycle:
A. Read the ranging data gathered by the ultrasonic sensor at the front of the robot body and confirm that the direction of travel is unobstructed; otherwise keep ranging and waiting until the obstacle is gone;
B. Read the current frame taken by the crop-recognition camera of the wheeled robot body, rapidly segment the target crop out of the vertical close-range image by the color-difference method, and binarize the image;
C. Judge whether the target crop row ends in the current frame; if so, terminate the program directly; otherwise proceed to the next step;
D. On the basis of the binary image from step B, fit the crop-row center line, then immediately extract two parameters: the yaw angle and the lateral offset;
E. Using the yaw angle and lateral offset extracted in step D, apply adaptive fuzzy control to steer the wheeled robot body so that the optical center of the crop-recognition camera vertically tracks the crop-row center line;
F. Each time the wheeled robot body travels a set distance, compute the proportion of target-pixel area in the current navigation frame; from the imaging distance and this proportion, combined with the manually entered seasonal information, a pre-trained fuzzy neural network decides and controls the spray flow in real time, achieving precise, metered, targeted spraying; then return to step A.
Beneficial effects: the disclosed indoor autonomous spraying robot and its variable-rate spraying method fuse two normally separate monocular-vision processes, navigation-information extraction and crop detection and recognition, into a single image-processing pipeline, effectively reducing the image-processing load and clearly speeding up row-by-row spraying. While spraying along the crop row, the robot navigates quickly and efficiently in the structured indoor environment and at the same time performs target detection and recognition in the same scene, with the optical center of the crop-recognition camera vertically tracking the row center line. This markedly reduces the system's image-processing load, achieves precise, metered, targeted spraying, significantly raises the robot's real working efficiency, and actively promotes the adoption of greenhouse spraying robots.
Brief description of the drawings
Fig. 1 is a structural diagram of the indoor autonomous spraying robot of the invention;
Fig. 2 is a structural diagram of the movable support of the invention;
Fig. 3 is the overall flow chart of the working method of the invention;
Fig. 4 is the flow chart of fast segmentation of the target crop;
Fig. 5 is the flow chart of crop-row center-line extraction;
Fig. 6 is the motion-control flow chart of the wheeled robot body;
Fig. 7 is the flow chart of the precision-spraying decision.
Embodiment
As shown in Figs. 1 and 2, the disclosed indoor autonomous spraying robot comprises a wheeled robot body 1, on top of which a pesticide tank 2 and a control unit 3 are mounted. Fixed to one side of the tank 2 is a movable support 4, adjustable in vertical height and in lateral extension length, comprising a transverse support 41 and a longitudinal support 42 arranged perpendicular to each other. The transverse support 41 has a first support 411 and a second support 412 that are mutually nested and can telescope; the longitudinal support 42 has a third support 421 and a fourth support 422 that are mutually nested and can telescope. A spray boom 5 is strapped to the first support 411, on which a crop-recognition camera 6 and an ultrasonic sensor 7 are also fixed. A nozzle 8 is fitted at the tip of the spray boom 5, whose rear end is connected to the tank 2 by a hose 9. A navigation camera 11 and an ultrasonic sensor 12 are fitted at the front of the robot body 1.
The transverse support 41 and longitudinal support 42 of the movable support 4 can be manually adjusted in extension length according to plant height, to suit spraying liquid on crops of different heights and row widths.
As a whole, the indoor autonomous spraying robot of the invention travels along the crop-row direction while precisely spraying the crop to its side. The crop-recognition camera 6, mounted vertically downward on the transverse support 41, extracts navigation information and detects and recognizes the crop at the same time: while its optical center tracks the crop-row center line to extract the navigation reference, it also detects the target crop. Two monocular-vision processes of different functions are thus merged into one, simplifying image computation and processing and improving the actual working efficiency of the spraying equipment. To avoid obstacles in the direction of travel, the ranging data of the ultrasonic sensor 12 at the front of the robot body 1 are collected at the same time.
As shown in Fig. 3, the variable-rate spraying method implemented by the above indoor autonomous spraying robot comprises the following rapid steps, executed in a cycle and based on monocular vision:
A. Read the ranging data gathered by the ultrasonic sensor 12 at the front of the robot body 1 and confirm that the direction of travel is unobstructed; otherwise keep ranging and waiting until the obstacle is gone.
B. Read the current frame taken by the crop-recognition camera 6 of the wheeled robot body 1, rapidly segment the target crop out of the vertical close-range image by the color-difference method, and binarize the image.
C. Judge whether the target crop row ends in the current frame; if so, terminate the program directly; otherwise proceed to the next step.
D. On the basis of the binary image from step B, fit the crop-row center line, then immediately extract two parameters: the yaw angle and the lateral offset.
E. Using the yaw angle and lateral offset extracted in step D, apply adaptive fuzzy control to steer the wheeled robot body 1 so that the optical center of the crop-recognition camera 6 vertically tracks the crop-row center line.
F. Each time the wheeled robot body 1 travels a set distance, compute the proportion of target-pixel area in the current navigation frame; from the imaging distance and this proportion, combined with the manually entered seasonal information, a pre-trained fuzzy neural network decides and controls the spray flow in real time, achieving precise, metered, targeted spraying; then return to step A.
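The A–F cycle above can be sketched as a toy simulation. Everything in it — the class name, the steering gains, the spray interval, the queued frames standing in for sensors — is an illustrative assumption, not part of the patent; real hardware drivers would replace the queue and the log.

```python
SPRAY_STEP = 5    # frames between spray decisions (assumed for step F)

class SprayCycleSim:
    """Toy stand-in for the robot's A-F working cycle; every interface
    is an assumption replacing the real sensor and motor drivers."""
    def __init__(self, frames):
        # Each frame is a (yaw_angle, lateral_offset) pair already
        # extracted from vision (steps B-D); None marks the row end (C).
        self.frames = list(frames)
        self.log = []

    def run(self):
        step = 0
        while self.frames:
            # A: range ahead for obstacles -- the simulated path is
            # always clear, so the cycle proceeds directly.
            frame = self.frames.pop(0)          # B: current camera frame
            if frame is None:                   # C: crop row has ended
                self.log.append("stop")
                return
            theta, lam = frame                  # D: yaw angle / lateral offset
            u = 0.6 * theta + 0.4 * lam         # E: weighted steering term (gains assumed)
            self.log.append(("steer", round(u, 3)))
            step += 1
            if step % SPRAY_STEP == 0:          # F: periodic spray decision
                self.log.append("spray")
```

Feeding the simulation a few aligned frames followed by a row-end marker exercises one full pass of the cycle, ending with the step-C stop.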
The main steps are described in detail below.
As shown in Fig. 4, the fast segmentation of the target crop proceeds as follows: the crop image is first collected vertically by the crop-recognition camera; the image is then gray-scaled with a color-feature factor, binarized, and finally processed with median filtering and morphological operations.
Image segmentation means extracting the target from the background with machine-vision techniques and is the first task in precise pesticide application. Many segmentation schemes exist; separating the plant from the background by color features is the most common one in plant monitoring. The captured crop image is usually converted from RGB into another color space such as HSI or Lab to weaken the influence of ambient lighting on pixel values, but the color-space conversion is computationally expensive and unsuitable when the real-time requirement is high.
Taking the most common green crops as an example, and weighing segmentation time against segmentation quality, the invention gray-scales the image with the excess-green factor 2G − R − B of the RGB color system and binarizes it with Otsu adaptive thresholding, separating green crop from background well. In general the excess-green value of the plant parts is larger than that of non-plant parts such as soil, and both G − R and G − B of green plants are slightly higher than those of non-green material. R, G, and B are therefore compared first, and the gray value of the image is then computed as:

Gray(x, y) = 2G − R − B when G > R and G > B, and Gray(x, y) = 0 otherwise, with the result clipped to 255.
After gray-scaling, and to allow for illumination changes, Otsu adaptive thresholding performs the binarization, separating green crop from background well. The thresholded image still contains noise and holes, so a median filter is applied to the segmented image, followed by morphological operations to fill the holes in the background, yielding a fairly complete crop-row target.
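A minimal NumPy sketch of this segmentation chain — excess-green gray-scaling with the piecewise comparison above, then Otsu thresholding (the median filter and morphological cleanup are omitted for brevity):

```python
import numpy as np

def exg_gray(rgb):
    """Excess-green (2G - R - B) gray-scaling: pixels where G exceeds
    both R and B get 2G - R - B (clipped to 255); all others get 0."""
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    greenish = (g > r) & (g > b)
    return np.where(greenish, np.clip(2 * g - r - b, 0, 255), 0).astype(np.uint8)

def otsu_threshold(gray):
    """Otsu's adaptive threshold: pick the level that maximizes the
    between-class variance of the gray-level histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    omega = np.cumsum(p)                     # class-0 probability
    mu = np.cumsum(p * np.arange(256))       # class-0 cumulative mean
    mu_t = mu[-1]                            # global mean
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan               # ignore degenerate splits
    sigma_b2 = (mu_t * omega - mu) ** 2 / denom
    return int(np.nanargmax(sigma_b2))

def segment_crop(rgb):
    """Binary crop mask: 1 = crop, 0 = background."""
    gray = exg_gray(rgb)
    return (gray > otsu_threshold(gray)).astype(np.uint8)
```

On a synthetic image with a green patch over soil-colored background, the mask isolates exactly the patch.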
As shown in Fig. 5, the crop-row center line is extracted as follows: from the binary image above, the coordinates of the upper- and lower-edge points of the target pixels are scanned in order and averaged; the crop-row center line is then fitted by least squares as the navigation reference line, and the yaw angle and lateral offset are computed.
On the binary image, to speed up scanning, the upper- and lower-edge point coordinates of the target-pixel region are scanned from left to right every fixed number of columns and averaged, extracting discrete center-point coordinates. Obtaining the discrete midpoint of every single column would take too long, so computing only at column intervals reduces the workload. In principle, the wider the interval, the fewer columns need to be examined and the shorter the computation, which favors real-time processing by the machine-vision system; but too wide an interval inevitably introduces calculation error. For each crop's characteristics an optimal column interval can therefore be chosen that shortens the computation without introducing error, optimizing the algorithm's execution efficiency.
Most agricultural vision-navigation systems detect straight lines with the Hough transform, whose advantage is robustness — it is little affected by noise or missing crop — but whose real-time performance is poor. Because the close-range vertical imaging of this system contains few outliers, the more efficient least-squares method is chosen instead to fit the crop-row center line rapidly through the extracted discrete navigation points as the navigation reference line, quickly yielding the yaw angle θ and lateral offset λ between the optical center of the crop camera (the origin of the image physical coordinate system) and the crop-row surface center line. Ignoring lens distortion, the yaw angle in the image coordinate system of the vertical imaging can be taken as equal to that in the world coordinate system, but the lateral offset λ varies with imaging height and must be obtained through the camera-calibration relation. Since the crop-recognition camera is mounted vertically, calibration is greatly simplified: with the shooting height fixed, a calibration chart is laid on level ground and the conversion between pixel distance and actual distance is measured. In the actual measurement of λ, the imaging height obtained by fusing the readings of the support-mounted ultrasonic sensor is applied to the right-triangle model, quickly giving the actual lateral offset λ.
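Assuming the crop row runs roughly along the image x-axis, the column-interval midpoint scan and least-squares fit might look like this (the pixel-to-metric conversion of λ via the calibrated height is omitted; `step` is the column interval discussed above):

```python
import numpy as np

def crop_row_centerline(mask, step=4):
    """Scan every `step`-th column of the binary mask, take the midpoint
    of the first/last target pixel as a discrete center point, then fit
    the center line y = k*x + b by least squares.  Returns the yaw angle
    theta (rad) and the signed perpendicular offset lam (pixels) of the
    line from the image center, standing in for the optical center."""
    h, w = mask.shape
    xs, ys = [], []
    for x in range(0, w, step):
        col = np.flatnonzero(mask[:, x])
        if col.size:                              # column contains target pixels
            xs.append(x)
            ys.append((col[0] + col[-1]) / 2.0)   # upper/lower edge midpoint
    k, b = np.polyfit(xs, ys, 1)                  # least-squares navigation line
    theta = np.arctan(k)                          # yaw angle
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0         # image center
    lam = (k * cx + b - cy) * np.cos(theta)       # perpendicular pixel offset
    return theta, lam
```

A horizontal band of target pixels gives a near-zero yaw angle and an offset equal to the band midline's distance from the image center.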
As shown in Fig. 6, the motion control of the wheeled robot body proceeds as follows: the current yaw angle and lateral offset are obtained, converted into the fuzzy domain, and mapped onto the corresponding fuzzy subsets and membership degrees; adaptive fuzzy inference yields the fuzzy subset and membership degree of the control quantity, which is defuzzified by the center-of-gravity method to drive the wheeled robot body.
The motion control of wheeled mobile robots currently divides into vehicle-model methods, represented by linear-model and optimal control, and intelligent methods, represented by fuzzy control and neural-network control. The wheeled mechanism itself is strongly nonlinear and uncertain and, given imperfect ground conditions and the complexity of tires and mechanisms, an accurate and reasonable model is hard to establish, so model-based control is avoided. Fuzzy control, by contrast, does not depend on a precise mathematical model and is robust. A plain fuzzy controller, however, is not refined in its control and has poor steady-state accuracy. To improve control precision and dynamic performance, fuzzy control with a self-adjusting scale factor is therefore adopted; compared with plain fuzzy control, its control rule can adapt itself to the actual deviation.
In the robot's path tracking, the main actions are straight running and differential steering. The forward speed during operation is fixed; using the yaw angle and lateral offset extracted in the previous stage, the speed difference between the two sides of the body is regulated by computing the pulse count to set for the left and right wheel motors in each control cycle, driving the robot straight ahead or steering it differentially so that the camera's optical center vertically tracks the crop-row center line.
The input variables of the two-dimensional fuzzy controller are the yaw angle θ and the lateral offset λ, and its output is the desired speed difference u between the two sides of the two-wheel model. Its fuzzy control rule can be expressed as

u = α·θ + (1 − α)·λ

evaluated on the fuzzified, normalized quantities.
where α is the rule-modifying factor, also called the weighting factor. By adjusting α, the relative weighting of the yaw angle θ and the lateral offset λ can be changed, adjusting the control rule. When the offset λ is large, λ is weighted more heavily, so the lateral position error is eliminated quickly; when λ is small, θ is weighted more heavily, improving the stability of the system. The weighting factor can be determined quickly by linear interpolation from specific experimental data. In each control cycle the system computes, from the actual state, the different pulse counts to send to the left and right motors for straight running and differential steering according to

v_L = v − u/2,  v_R = v + u/2

where v_L, v_R, and v denote the left wheel speed, the right wheel speed, and the speed of the robot's center of mass, respectively.
As shown in Fig. 7, the spraying decision and control proceed as follows: after the robot has travelled a certain distance, the target-pixel imaging-area proportion in the current image is computed (because the imaging distance varies, the actual crop area cannot be obtained directly); the ultrasonic data from the support are then extracted and fused into the imaging distance, and the manually entered seasonal factor is imported. These three quantities are fused and the spray amount is decided by the fuzzy neural network, which controls the spray flow in real time. Once the targeted spraying is complete, the spraying robot continues forward and enters the next cycle according to the flow shown in Fig. 3.
The imaging distance is obtained from the continuous measurements of the ultrasonic sensor mounted next to the camera on the support. To obtain it correctly and reduce measurement error, a short segment of data up to and including the imaging moment is extracted and combined by simple weighted fusion into the final imaging distance.
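The weighted fusion of the ultrasonic window, together with a simple monotone stand-in for the trained fuzzy neural network, can be sketched as follows. The real network's structure and training data are not disclosed, so `spray_rate` only illustrates the decision's three inputs and its direction; the decay weight and flow cap are assumptions:

```python
import numpy as np

def fuse_distance(samples, decay=0.7):
    """Weighted fusion of the last few ultrasonic readings before the
    imaging moment; newer samples get exponentially larger weights."""
    s = np.asarray(samples, dtype=float)
    w = decay ** np.arange(len(s) - 1, -1, -1)   # oldest ... newest
    return float(np.dot(s, w) / w.sum())

def spray_rate(area_ratio, distance_m, season_factor, q_max=2.0):
    """Stand-in for the trained fuzzy neural network: the same pixel
    area ratio seen from farther away means a larger real canopy, so
    flow grows with area_ratio * distance^2, scaled by the manually
    entered seasonal factor and capped at an assumed q_max (L/min)."""
    canopy = area_ratio * distance_m ** 2        # proportional to real area
    return min(q_max, q_max * canopy * season_factor)
```

The distance-squared term reflects the remark in the text that the pixel proportion alone cannot give the actual crop area without the imaging distance.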
The invention may be embodied in other specific forms within its spirit and scope. For example, the embodiment above takes green crops as its object, but the invention is not limited to them: crop rows of other characteristic colors can be sprayed, provided a suitable color-feature factor is chosen for gray-scaling, after which the procedure of this embodiment can be followed. Likewise, steps such as the extraction of the crop-row center line and the motion control of the mobile platform can all be replaced by various other methods without changing the essential efficiency of the monocular-vision continuous working process that the invention sets out. The described embodiment should be regarded as illustrative; the invention may vary within the scope of the claims and the full range of their equivalents.
Claims (4)
1. An indoor autonomous spraying robot, characterized in that it comprises a wheeled robot body, on top of which a pesticide tank and a control unit are mounted; a movable support is fixed to one side of the tank and comprises a transverse support and a longitudinal support arranged perpendicular to each other; a spray boom is fixed at the far end of the transverse support, a nozzle is fitted at the tip of the boom, and the rear of the boom is connected to the tank by a hose; a crop-recognition camera and an ultrasonic sensor are fixed on the transverse support; and a navigation camera and an ultrasonic sensor are fitted at the front of the wheeled robot body.
2. The indoor autonomous spraying robot according to claim 1, characterized in that the transverse support comprises a first support and a second support that are mutually nested and can telescope relative to each other.
3. The indoor autonomous spraying robot according to claim 1, characterized in that the longitudinal support comprises a third support and a fourth support that are mutually nested and can telescope relative to each other.
4. A variable-rate spraying method of an indoor autonomous spraying robot, characterized in that it is a continuous working process based on monocular vision, comprising the following rapid steps executed in a cycle:
A. Read the ranging data gathered by the ultrasonic sensor at the front of the robot body and confirm that the direction of travel is unobstructed; otherwise keep ranging and waiting until the obstacle is gone;
B. Read the current frame taken by the crop-recognition camera of the robot body, rapidly segment the target crop out of the vertical close-range image by the color-difference method, and binarize the image;
C. Judge whether the target crop row ends in the current frame; if so, terminate the program directly; otherwise proceed to the next step;
D. On the basis of the binary image from step B, fit the crop-row center line, then immediately extract two parameters: the yaw angle and the lateral offset;
E. Using the yaw angle and lateral offset extracted in step D, apply adaptive fuzzy control to steer the robot body so that the optical center of the crop-recognition camera vertically tracks the crop-row center line;
F. Each time the wheeled robot body travels a set distance, compute the proportion of target-pixel area in the current navigation frame; from the imaging distance and this proportion, combined with the manually entered seasonal information, a pre-trained fuzzy neural network decides and controls the spray flow in real time, achieving precise, metered, targeted spraying; then return to step A.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410119127.5A CN103891697B (en) | 2014-03-28 | 2014-03-28 | Variable-rate spraying method of an indoor autonomous spraying robot
Publications (2)
Publication Number | Publication Date |
---|---|
CN103891697A (application publication) | 2014-07-02
CN103891697B (granted publication) | 2015-08-12
Family
ID=50983531
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410119127.5A Expired - Fee Related CN103891697B (en) | 2014-03-28 | 2014-03-28 | Variable-rate spraying method of an indoor autonomous spraying robot
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103891697B (en) |
2014-03-28: Application CN201410119127.5A filed; patent granted as CN103891697B (status: not_active Expired - Fee Related)
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4132637A1 (en) * | 1991-10-01 | 1993-04-08 | Walter Prof Dr Kuehbauch | Automatic weed removal on railway track with image recognition control - has tv cameras and processor to compute weed coverage and with spatially controlled spraying of track |
US5348226A (en) * | 1992-11-12 | 1994-09-20 | Rhs Fertilizing/Spraying Systems | Spray boom system with automatic boom end height control |
CN201541615U (en) * | 2009-10-16 | 2010-08-11 | 西北农林科技大学 | Variable-amplitude spraying machine |
CN101807252A (en) * | 2010-03-24 | 2010-08-18 | 中国农业大学 | Crop row center line extraction method and system |
CN102428904A (en) * | 2011-09-20 | 2012-05-02 | 上海交通大学 | Automatic targeting and variable atomizing flow control system for weeding robot |
CN102914967A (en) * | 2012-09-21 | 2013-02-06 | 浙江工业大学 | Autonomous navigation and man-machine coordination picking operating system of picking robot |
CN103196441A (en) * | 2013-03-19 | 2013-07-10 | 江苏大学 | Integrated navigation method and system for a spraying machine |
CN103530643A (en) * | 2013-10-11 | 2014-01-22 | 中国科学院合肥物质科学研究院 | Targeted pesticide spraying method and system based on automatic crop inter-row identification |
CN203762122U (en) * | 2014-03-28 | 2014-08-13 | 南通职业大学 | Indoor autonomous mobile pesticide spraying robot |
Non-Patent Citations (5)
Title |
---|
D.K.GILES ET AL: "Precision band spraying with machine-vision guidance and adjustable yaw nozzles", 《TRANSACTIONS OF THE ASAE》, vol. 40, no. 1, 31 December 1997 (1997-12-31), pages 29 - 36, XP000683251 *
L.TIAN ET AL: "Machine vision identification of tomato seedlings for automated weed control", 《TRANSACTIONS OF THE ASAE》, vol. 40, no. 6, 1 February 2000 (2000-02-01) * |
An Qiu et al.: "Visual navigation test platform for agricultural robots", Journal of Henan University of Science and Technology (Natural Science), vol. 33, no. 3, 25 June 2012 (2012-06-25) *
Guo Weibin et al.: "Autonomous navigation of a weeding robot based on fuzzy control", Robot, vol. 32, no. 2, 15 March 2010 (2010-03-15) *
Rao Honghui et al.: "Research on machine-vision-based row-following spray control for crops", Journal of Nanjing Agricultural University, vol. 30, no. 1, 15 February 2007 (2007-02-15) *
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104082268A (en) * | 2014-07-08 | 2014-10-08 | 西北农林科技大学 | Self-adaptive orchard sprayer |
CN104082268B (en) * | 2014-07-08 | 2015-05-13 | 西北农林科技大学 | Self-adaptive orchard sprayer |
CN104186451A (en) * | 2014-08-19 | 2014-12-10 | 西北农林科技大学 | Insect-killing weeding pesticide spraying robot based on machine vision |
CN104186451B (en) * | 2014-08-19 | 2017-06-20 | 西北农林科技大学 | Insect-killing weeding pesticide spraying robot based on machine vision |
CN104396927A (en) * | 2014-12-05 | 2015-03-11 | 重庆利贞元农林科技有限公司 | Directional spraying method for fruit tree |
CN104488843A (en) * | 2014-12-17 | 2015-04-08 | 江苏大学 | Air-supply-type variable spraying rod automatic tracking system adopting laser sensor and control method |
CN104574405A (en) * | 2015-01-15 | 2015-04-29 | 北京天航华创科技股份有限公司 | Color image threshold segmentation method based on Lab space |
CN104574405B (en) * | 2015-01-15 | 2018-11-13 | 北京天航华创科技股份有限公司 | Color image threshold segmentation method based on Lab space |
CN104615059A (en) * | 2015-02-09 | 2015-05-13 | 聊城大学 | Control system of spraying robot for hyphantria cunea larva net curtain |
CN104615059B (en) * | 2015-02-09 | 2017-03-22 | 聊城大学 | Control system of spraying robot for hyphantria cunea larva net curtain |
CN104742734A (en) * | 2015-03-30 | 2015-07-01 | 钟长瑞 | Four-wheel drive combined type multifunctional agricultural electric platform car |
CN105528004A (en) * | 2015-12-02 | 2016-04-27 | 安徽农业大学 | Intelligent spraying machine and intelligent spraying method |
CN105528004B (en) * | 2015-12-02 | 2018-02-02 | 安徽农业大学 | Intelligent spraying machine and intelligent spraying method |
CN105638389B (en) * | 2016-02-18 | 2019-04-16 | 中国农业科学院农田灌溉研究所 | Multifunctional self-walking urban road green belt irrigation vehicle |
CN105638389A (en) * | 2016-02-18 | 2016-06-08 | 中国农业科学院农田灌溉研究所 | Multifunctional self-walking urban road green belt irrigation vehicle |
CN106774420A (en) * | 2017-01-23 | 2017-05-31 | 东莞理工学院 | A kind of automation agriculture pollination method based on micro-robot |
CN106774420B (en) * | 2017-01-23 | 2019-11-05 | 东莞理工学院 | A kind of automation agriculture pollination method based on micro-robot |
CN106908062A (en) * | 2017-04-21 | 2017-06-30 | 浙江大学 | Self-propelled chlorophyll fluorescence image acquisition robot and acquisition method thereof |
CN106908062B (en) * | 2017-04-21 | 2019-08-13 | 浙江大学 | A kind of self-propelled chlorophyll fluorescence Image Acquisition robot and its acquisition method |
CN107593629A (en) * | 2017-10-31 | 2018-01-19 | 四川省农业机械研究设计院 | A kind of intelligent robot and intelligent cocoonery |
CN107593629B (en) * | 2017-10-31 | 2024-04-26 | 四川省农业机械科学研究院 | Intelligent robot and intelligent silkworm house |
CN109042595B (en) * | 2018-08-13 | 2021-10-12 | 江苏大学 | Telescopic spraying mechanism carried on rail moving device of greenhouse sprayer |
CN109042595A (en) * | 2018-08-13 | 2018-12-21 | 江苏大学 | A kind of scalable humidifier being equipped on greenhouse atomizing machine track mobile device |
CN110328666A (en) * | 2019-07-16 | 2019-10-15 | 汕头大学 | Identifying system and material mechanism for picking |
CN110710519A (en) * | 2019-10-08 | 2020-01-21 | 桂林理工大学 | Intelligent pesticide spraying robot |
CN110909668A (en) * | 2019-11-20 | 2020-03-24 | 广州极飞科技有限公司 | Target detection method and device, computer readable storage medium and electronic equipment |
CN111436414A (en) * | 2020-04-01 | 2020-07-24 | 江苏大学 | Greenhouse strawberry canopy inner circumference wind-conveying pesticide applying robot and implementation method thereof |
CN111436414B (en) * | 2020-04-01 | 2021-11-23 | 江苏大学 | Greenhouse strawberry canopy inner circumference wind-conveying pesticide applying robot and implementation method thereof |
CN112189645B (en) * | 2020-10-27 | 2023-12-15 | 江苏大学 | Double-online medicine mixing sprayer suitable for intercropping and working method |
CN112189645A (en) * | 2020-10-27 | 2021-01-08 | 江苏大学 | Double-online pesticide mixing sprayer suitable for intercropping and working method |
CN112690265A (en) * | 2020-12-24 | 2021-04-23 | 吉林农业大学 | Timed pesticide spraying device and method for pest control |
CN113100207A (en) * | 2021-04-14 | 2021-07-13 | 郑州轻工业大学 | Precision pesticide application robot system based on wheat disease information and pesticide application method |
CN113973607A (en) * | 2021-09-14 | 2022-01-28 | 山东省农业科学院作物研究所 | Self-propelled maize leaf monitoring and marking device |
CN113973607B (en) * | 2021-09-14 | 2023-09-08 | 山东省农业科学院作物研究所 | Self-propelled maize leaf monitoring and marking device |
CN113925036A (en) * | 2021-09-19 | 2022-01-14 | 南京农业大学 | Accurate automatic pesticide applying device based on machine vision |
CN117397661A (en) * | 2023-10-27 | 2024-01-16 | 佛山市天下谷科技有限公司 | Medicine spraying control method of medicine spraying robot and medicine spraying robot |
Also Published As
Publication number | Publication date |
---|---|
CN103891697B (en) | 2015-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103891697B (en) | Drug spraying robot capable of moving indoors autonomously and variable drug spraying method thereof | |
CN109471434B (en) | Novel variable spray path planning autonomous navigation system and method | |
Meng et al. | Development of agricultural implement system based on machine vision and fuzzy control | |
CN109885063A (en) | Farmland path planning method for a pesticide application robot fusing vision and laser sensors | |
CN103988824A (en) | Automatic targeting and spraying system based on binocular vision technology | |
CN106584451A (en) | Visual navigation based transformer substation automatic composition robot and method | |
CN113057154A (en) | Greenhouse liquid medicine spraying robot | |
CN204104581U (en) | A kind of automatic target detection spraying system based on binocular vision technology | |
CN111066442B (en) | Targeted variable fertilization method and device for corn and application | |
CN109964905B (en) | Self-propelled targeting pesticide application robot based on fruit tree identification and positioning and control method thereof | |
CN203860304U (en) | Automatic targeting and spraying system | |
CN111587872A (en) | Robot for spraying pesticide | |
CN203762122U (en) | Indoor autonomous mobile pesticide spraying robot | |
CN108669046B (en) | Plant protection unmanned vehicle integrating visual navigation and Beidou positioning and control method | |
CN109947115A (en) | Mower control system and control method thereof | |
CN108279678A (en) | Automatic field traveling device for detecting plant growth conditions and travel control method thereof | |
CN108196538A (en) | Three-dimensional point cloud model-based field agricultural robot autonomous navigation system and method | |
CN107509399A (en) | A kind of green intelligent weed-eradicating robot | |
CN207139809U (en) | A kind of agriculture inspecting robot with navigation barrier avoiding function | |
Roman et al. | Stereo vision controlled variable rate sprayer for specialty crops: Part II. Sprayer development and performance evaluation | |
Wang et al. | The identification of straight-curved rice seedling rows for automatic row avoidance and weeding system | |
Möller | Computer vision–a versatile technology in automation of agricultural machinery | |
CN117115769A (en) | Plant detection and positioning method based on semantic segmentation network | |
CN115451965B (en) | Relative heading information detection method for transplanting system of transplanting machine based on binocular vision | |
Chen et al. | Development and performance test of a height-adaptive pesticide spraying system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 2015-08-12; Termination date: 2019-03-28 |