CN114967763A - Plant protection unmanned aerial vehicle sowing control method based on image positioning - Google Patents

Plant protection unmanned aerial vehicle sowing control method based on image positioning

Info

Publication number
CN114967763A
CN114967763A (application CN202210914951.4A)
Authority
CN
China
Prior art keywords
image
aerial vehicle
unmanned aerial
farmland
network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210914951.4A
Other languages
Chinese (zh)
Other versions
CN114967763B (en)
Inventor
蒋一民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202210914951.4A
Publication of CN114967763A
Application granted
Publication of CN114967763B
Current legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a plant protection unmanned aerial vehicle (UAV) sowing control method based on image positioning, belonging to the field of image processing. The control method comprises the following steps: acquiring a first image, the first image being an image of the area to be sown; acquiring an operation request instruction; extracting feature data from the first image based on the operation request instruction, the feature data being pixel data marked as an already-sown area; and controlling the plant protection UAV, based on the feature data, to perform the sowing operation on the area to be sown along an operation path, the operation path being the path along which the area to be sown is sown. By identifying the image, positioning the UAV and sowing accurately along the planned route, the method determines whether the current area should be sown, so that the area to be sown is covered once without repetition and seed is saved.

Description

Plant protection unmanned aerial vehicle sowing control method based on image positioning
Technical Field
The invention relates to the field of image processing and matching, and in particular to a sowing control method for a plant protection unmanned aerial vehicle.
Background
A plant protection unmanned aerial vehicle (UAV) is, as the name implies, an unmanned aircraft used for plant protection operations in agriculture and forestry. This type of UAV consists of three parts: a flight platform (fixed wing, helicopter or multi-rotor), navigation and flight control, and a spraying mechanism. Operated by ground remote control or by autonomous navigation, it carries out spraying operations and can spray agents, seeds, powders and the like in agricultural production, saving labor and reducing labor intensity.
However, when a prior-art UAV performs a sowing operation, the area to be sown is usually selected by manual control. When that area is very large or the terrain is complex, manual control often leads to missed sowing or repeated sowing, and therefore to seed waste.
In view of this, the present application is specifically made.
Disclosure of Invention
The technical problem to be solved by the invention is that, in the prior art, manually controlling the area a UAV sows easily causes missed or repeated sowing. The invention therefore provides a sowing control method for a plant protection UAV that achieves automatic, non-repetitive sowing of the area to be sown.
The invention is realized by the following technical scheme:
a plant protection unmanned aerial vehicle broadcast control method based on image positioning comprises the following steps:
step 1: acquiring a first image, wherein the first image is a ground image comprising an area to be broadcast;
and 2, step: recognizing the farmland region in the first image by adopting a neural network, and partitioning the farmland according to the actual division of the farmland in the image; the blocking method comprises the following steps: recognizing the boundary of a road, a tree, a ridge, a ditch or a man-made mark in the image, fitting the recognized boundary of the road, the tree, the ridge, the ditch or the man-made mark into straight lines which are regarded as the boundary of a farmland, partitioning the farmland by adopting the straight lines, and storing a network graph formed by all the straight lines, wherein the network graph is called a graph library;
and step 3: manually selecting farmland blocks to be sown, virtually dividing each individual farmland to be sown into a plurality of small squares, wherein each small square is called a second image, the width of each small square is the width of unmanned aerial vehicle sowing, and the unmanned aerial vehicle sowing width is adjustable;
and 4, step 4: planning an unmanned aerial vehicle sowing path with shortest length and least repetition according to the virtual small grids divided in the step 3; transmitting the image of the farmland block to be sown and the planned path to an unmanned aerial vehicle;
and 5: the unmanned aerial vehicle takes off after acquiring the image of the farmland blocks to be sown and the planned path, preliminarily sets the flight direction of the unmanned aerial vehicle, enables the unmanned aerial vehicle to fly towards the area to be sown, acquires the front image in real time, matches the image of the farmland blocks to be sown in the image acquired in real time until the image of the farmland blocks to be sown is matched in the image acquired in real time, and flies to the initial sowing position to prepare sowing operation according to the set sowing path;
the method for matching the image of the farmland block to be sown in the image acquired in real time comprises the following steps:
step 5.1: identifying a farmland area in the real-time acquired image, then identifying boundary lines of the farmland by adopting the same method in the step 2, carrying out size normalization on the network graph formed by all the boundary lines, and normalizing to obtain that the size represented by each pixel is the same as the size represented by each pixel of the network graph in the step 2;
step 5.2: the method comprises the steps that a farmland boundary network in an image is obtained in real time through sliding interception and is called as a network 1, the size of the network 1 is counted as C x D, and the interception step length is 5-10 pixels; counting the number a of nodes in the network 1; counting the number b of nodes around each node in the size of C x D in the graph library, selecting the nodes in the graph library corresponding to the number of the nodes around the node as a = b +/-3, and intercepting a graph library area which takes the node as the center and takes C x D as the size and is called as a network 2;
step 5.3: counting the number of straight lines in the network 1 and the network 2, and selecting the network 2 with the same number of straight lines;
step 5.4: matching the shape of the network 2 selected in the step 5.3 with the shape of the network 1, wherein the shape similarity is greater than a set threshold value, and considering that the matching is successful, otherwise, performing the next matching;
step 5.5: after matching is successful, corresponding the image acquired in real time with the first image according to the positions of the network 1 and the network 2 in the respective images; determining farmland blocks to be sown in the images acquired in real time;
step 6: identifying the edge of the image of the farmland block to be sown according to the obtained image of the farmland block to be sown, measuring the distances from the unmanned aerial vehicle to all the edges of the image of the farmland block to be sown, selecting the two closest edges as references, and positioning the position of the unmanned aerial vehicle; in the step 4, the unmanned aerial vehicle sowing path is determined, the distance from each point on the path to two nearest edges of the image of the farmland block to be sown is calculated in advance, in the actual sowing process, the records of the current unmanned aerial vehicle to the two nearest edges of the image of the farmland block to be sown are calculated, and the deviation of the unmanned aerial vehicle in the flying process is corrected according to the predetermined distance;
and 7: recording the farmland area which is sowed, and preventing repeated sowing; after the sowing is finished, the unmanned aerial vehicle flies back to the flying point.
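The matching procedure in steps 5.1 to 5.5 can be sketched as follows. This is a minimal illustration, assuming the boundary networks are already available as binary line masks together with node (intersection) coordinates and per-window straight-line counts; the function match_block, the IoU-based shape similarity and the default threshold are illustrative assumptions, not the patent's own implementation.

```python
import numpy as np

def match_block(net1_mask, net1_nodes, net1_lines,
                lib_mask, lib_nodes, lib_line_counts, sim_threshold=0.8):
    """Match the real-time boundary network (network 1) against the graph library.

    net1_mask       : C x D binary mask of boundary lines in the real-time image
    net1_nodes      : (N, 2) array of node coordinates in network 1
    net1_lines      : number of straight lines in network 1
    lib_mask        : binary mask of the whole graph library
    lib_nodes       : (M, 2) array of node coordinates in the graph library
    lib_line_counts : straight-line count of the C x D window around each library node
    """
    C, D = net1_mask.shape
    a = len(net1_nodes)                                   # step 5.2: nodes in network 1
    best = None
    for i, (r, c) in enumerate(lib_nodes):
        # step 5.2: number of library nodes b inside a C x D window around this node
        b = np.sum((np.abs(lib_nodes[:, 0] - r) <= C // 2) &
                   (np.abs(lib_nodes[:, 1] - c) <= D // 2))
        if abs(a - b) > 3:                                # keep candidates with a = b +/- 3
            continue
        r0, c0 = int(r) - C // 2, int(c) - D // 2
        if r0 < 0 or c0 < 0 or r0 + C > lib_mask.shape[0] or c0 + D > lib_mask.shape[1]:
            continue                                      # window falls outside the library
        net2 = lib_mask[r0:r0 + C, c0:c0 + D]             # candidate network 2
        if lib_line_counts[i] != net1_lines:              # step 5.3: same number of lines
            continue
        # step 5.4: shape similarity, here a simple IoU of the two line masks
        inter = np.logical_and(net1_mask, net2).sum()
        union = np.logical_or(net1_mask, net2).sum()
        sim = inter / union if union else 0.0
        if sim > sim_threshold and (best is None or sim > best[1]):
            best = ((r0, c0), sim)                        # step 5.5: match location in library
    return best
```

The location returned by match_block ties the real-time image to the first image, from which the farmland block to be sown is located as in step 5.5.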
Further, the flight control method of the UAV along the sowing path specified in step 4 is segment control: each time, the UAV flies from the center of one second image to the center of the next second image. Specifically:
The flight direction is calculated as:
θ = arctan((y2 - y1) / (x2 - x1))
where θ is the flight angle, x1 and y1 are the x-axis and y-axis coordinates of the actual center point of the n-th second image, and x2 and y2 are the x-axis and y-axis coordinates of the actual center point of the (n+1)-th second image;
The flight distance is calculated as:
L = sqrt((x2 - x1)^2 + (y2 - y1)^2)
where L is the actual flight distance.
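As a small illustration of the segment control above (a sketch, not the patent's own code), the flight angle and distance between the centers of consecutive second images can be computed as follows; atan2 keeps the direction valid in every quadrant.

```python
import math

def segment_heading_and_distance(x1, y1, x2, y2):
    """Heading (rad) and distance from the center of the n-th second image
    to the center of the (n+1)-th second image."""
    theta = math.atan2(y2 - y1, x2 - x1)   # flight direction
    L = math.hypot(x2 - x1, y2 - y1)       # actual flight distance
    return theta, L

# e.g. flying from center (10.0, 4.0) to center (13.0, 8.0)
theta, L = segment_heading_and_distance(10.0, 4.0, 13.0, 8.0)  # theta ~ 0.927 rad, L = 5.0
```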
With the sowing control method for a plant protection UAV described above, a sowing path is planned for the UAV, the UAV's position is accurately determined, and sowing is carried out along that path. By checking whether the pixels corresponding to an area have been marked, the method decides whether the current area still needs to be sown, so that the area to be sown is covered once without repetition and seed is saved.
Drawings
FIG. 1 is a schematic diagram of a control method according to the present invention;
FIG. 2 is a flow chart of a control method of the present invention;
FIG. 3 is a positioning flowchart of the present invention.
Detailed Description
This embodiment provides a sowing control method for a plant protection unmanned aerial vehicle (UAV). As shown in Fig. 1, the control method includes the following steps:
S1: acquiring a first image, wherein the first image is an image of the area to be sown;
In step S1, before the area to be sown is sown, the extent of the area and of its edges must first be known, together with whether any part of it has already been sown; the pixels of the already-sown portions are marked in the first image so that sown and unsown areas can be distinguished.
S2: acquiring an operation request instruction, namely an instruction for performing the sowing operation on the area to be sown, which triggers the plant protection UAV to perform the related operations.
S3: extracting feature data from the first image based on the operation request instruction, wherein the feature data are pixel data marked as an already-sown area;
In step S3, the marked feature data specifically refer to pixels marked with a color. Before an area is sown, it is first determined whether it has already been sown; in this embodiment this is done by checking the color of the pixels corresponding to the area.
The feature data are marked as follows: acquire the pixels of the first image corresponding to an area on which the sowing operation has been completed, and color-mark those pixels.
S4: controlling the plant protection UAV, based on the feature data, to perform the sowing operation on the area to be sown along an operation path, wherein the operation path is the path along which the area to be sown is sown.
In step S4, the path of the plant protection UAV for the sowing operation in the area to be sown is planned according to the extracted feature data. During the sowing operation, it is also necessary to check whether the pixels corresponding to the relevant area have been marked, so as to decide whether that area still needs to be sown.
The sub-steps of S4 include:
acquiring the pixels of the first image corresponding to the area to be sown;
and judging whether a pixel belongs to the feature data: if so, the current area is not sown; otherwise, the current area is sown.
The specific sub-steps of judging whether a pixel belongs to the feature data are:
acquiring the LAB parameter values of the pixel;
and judging whether the LAB parameter values lie within the parameter range of the marked color; if so, the pixel is treated as feature data.
In this embodiment, the LAB color model is used to detect whether a pixel has been marked. The LAB values of the pixels corresponding to the area to be sown are first obtained and compared with the parameter range of the marked color. If the values fall within that range, the area has already been sown; otherwise it has not been sown and the current area still needs to be sown. Deciding whether the current area has been sown by checking whether its pixels are marked avoids repeated or missed sowing of the same area within the area to be sown and reduces seed waste.
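The color marking and the color check above can be sketched in a few lines, assuming OpenCV and 8-bit BGR images; the marked-color range LAB_LOW/LAB_HIGH, the marking value and the decision ratio are illustrative placeholders rather than values given in the patent.

```python
import cv2
import numpy as np

# Illustrative marked-color range in OpenCV's 8-bit LAB space.
LAB_LOW = np.array([0, 140, 140], dtype=np.uint8)
LAB_HIGH = np.array([255, 200, 200], dtype=np.uint8)

def mark_sown(first_image_bgr, region_mask, lab_value=(128, 170, 170)):
    """Color-mark the pixels of an already-sown region in the first image."""
    lab = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2LAB)
    lab[region_mask > 0] = lab_value
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

def is_sown(first_image_bgr, region_mask, ratio=0.5):
    """True if most pixels of the region fall inside the marked-color LAB range."""
    lab = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2LAB)
    in_range = cv2.inRange(lab, LAB_LOW, LAB_HIGH)   # 255 where the LAB value is in range
    hits = np.count_nonzero(in_range[region_mask > 0])
    total = np.count_nonzero(region_mask)
    return total > 0 and hits / total >= ratio
```

A region is then skipped when is_sown returns True and sown otherwise, after which its pixels are marked with mark_sown.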
In this embodiment, the control method further includes: while controlling the plant protection UAV to perform the sowing operation, accurately positioning the UAV's flight path. As shown in Fig. 3, the specific steps are:
dividing the first image into n second images;
acquiring first information, wherein the first information is a preset distance and a preset direction from the center point of the nth second image to the center point of the (n + 1) th second image;
calculating second information, wherein the second information is the actual flying distance and flying direction between the nth second image center point and the (n + 1) th second image center point;
the positioning method of the unmanned aerial vehicle in the flight process comprises the following steps:
the main innovation point in full-field positioning is partition positioning, the traditional visual positioning can be carried out only under the condition that the whole map can be seen in the visual field of the unmanned aerial vehicle, the operation area is divided into four areas, different boundaries are recognized in different areas, and the area capable of being positioned is enlarged by four times.
The calculation is as follows. Let the coordinates of the leftmost, rightmost, uppermost and lowermost boundaries be G1, G2, G3 and G4, let the correction factor from camera 2D coordinates to real 2D coordinates be corr, and let the transverse width of the operation area be X.
[Equation images in the original define the field-of-view center coordinate of the UAV, the longitudinal widths of the upper and lower half regions, and the transformed positioning coordinates (x, y) for a UAV located in the upper-left, upper-right, lower-left and lower-right regions.]
The second information is matched with the first information: if they are the same, the plant protection UAV sows the (n+1)-th second image; otherwise, the position of the plant protection UAV is adjusted.
By dividing the first image into a plurality of second images, the corresponding area to be sown is divided into a plurality of sub-areas, each corresponding to one second image. As shown in Fig. 2, the area to be sown can thus be sown block by block along a segmented path. Because of environmental factors such as wind, the flight path of the UAV may deviate somewhat from the planned path.
The flight direction is specifically expressed as:
θ = arctan((y2 - y1) / (x2 - x1))
where θ is the flight angle, x1 and y1 are the x-axis and y-axis coordinates of the actual center point of the n-th second image, and x2 and y2 are the x-axis and y-axis coordinates of the actual center point of the (n+1)-th second image;
The actual flight distance is specifically expressed as:
L = sqrt((x2 - x1)^2 + (y2 - y1)^2)
where L is the actual flight distance.
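The comparison of the actual flight (the second information) against the preset distance and direction (the first information) can be sketched as follows; the tolerances dist_tol and angle_tol and the function name check_segment are assumptions for illustration, since the patent only states that the two must be the same.

```python
import math

def check_segment(preset_dist, preset_theta, x1, y1, x2, y2,
                  dist_tol=0.5, angle_tol=math.radians(5)):
    """Compare the actual flight between two second-image centers with the
    preset first information; return (ok, distance error, angle error)."""
    actual_theta = math.atan2(y2 - y1, x2 - x1)          # actual flight direction
    actual_dist = math.hypot(x2 - x1, y2 - y1)           # actual flight distance
    # wrap the angle difference into [-pi, pi] before comparing
    angle_err = abs(math.atan2(math.sin(actual_theta - preset_theta),
                               math.cos(actual_theta - preset_theta)))
    ok = abs(actual_dist - preset_dist) <= dist_tol and angle_err <= angle_tol
    return ok, actual_dist - preset_dist, angle_err
```

When check_segment reports a failure, the UAV's position is corrected before the (n+1)-th second image is sown, as described above.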
The control method further includes controlling the plant protection UAV to identify the edges of the area to be sown. The specific identification method is:
acquiring a data set of area-edge images;
constructing a first model and training it on the area-edge image data set with a deep learning network to obtain an optimal model;
and using the optimal model to identify the edges of the area to be sown.
If the colors on the two sides of an edge of the area to be sown differ clearly, the edge is identified by recognizing the color blocks; otherwise, images of the area edges are collected in advance, a corresponding data set is built, a corresponding deep learning network is constructed and trained, and the trained network is deployed on the UAV to identify the area edges.
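A minimal training sketch for such an edge model is shown below, assuming PyTorch and a dataset that yields (image, edge-mask) pairs; the tiny EdgeNet architecture, the hyperparameters and the selection of the "optimal model" by lowest training loss are illustrative assumptions, since the patent does not specify the network.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

class EdgeNet(nn.Module):
    """Illustrative per-pixel edge classifier (not the patent's 'first model')."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),            # one edge logit per pixel
        )

    def forward(self, x):
        return self.body(x)

def train_edge_model(dataset, epochs=20, lr=1e-3, device="cpu"):
    """dataset yields (image, mask): image 3xHxW float, mask 1xHxW float in {0, 1}."""
    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    model = EdgeNet().to(device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    best_loss, best_state = float("inf"), None
    for _ in range(epochs):
        total = 0.0
        for img, mask in loader:
            img, mask = img.to(device), mask.to(device)
            opt.zero_grad()
            loss = loss_fn(model(img), mask)
            loss.backward()
            opt.step()
            total += loss.item()
        if total < best_loss:               # keep the best ("optimal") weights seen so far
            best_loss = total
            best_state = {k: v.clone() for k, v in model.state_dict().items()}
    model.load_state_dict(best_state)
    return model
```

The trained model is then deployed on the UAV to produce an edge mask for each frame, from which the two nearest field edges used for positioning in step 6 are extracted.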
With the sowing control method for a plant protection UAV described above, whether the image pixels corresponding to an area to be sown have been marked is checked: if they are marked, the corresponding area is not sown; otherwise it is sown. Accurate sowing of the area to be sown is thus achieved, and missed or repeated sowing of individual areas is avoided.

Claims (2)

1. A plant protection unmanned aerial vehicle sowing control method based on image positioning, characterized by comprising the following steps:
Step 1: acquiring a first image, wherein the first image is a ground image containing the area to be sown;
Step 2: recognizing the farmland region in the first image with a neural network and partitioning the farmland according to its actual division in the image; the partitioning method is as follows: recognize the boundaries formed by roads, trees, ridges, ditches or man-made marks in the image, fit the recognized boundaries into straight lines, treat these lines as farmland boundaries, partition the farmland with them, and store the network graph formed by all the straight lines; this network graph is called the graph library;
Step 3: manually selecting the farmland blocks to be sown and virtually dividing each individual farmland block to be sown into a number of small squares; each small square is called a second image, and its width equals the UAV sowing swath, which is adjustable;
Step 4: planning a UAV sowing path of the shortest length and least repetition over the virtual squares divided in step 3, and transmitting the image of the farmland block to be sown and the planned path to the UAV;
Step 5: after receiving the image of the farmland block to be sown and the planned path, the UAV takes off, its flight direction is initially set so that it flies towards the area to be sown, and it acquires forward images in real time and matches the image of the farmland block to be sown against the real-time images; once the farmland block is matched in a real-time image, the UAV flies to the initial sowing position and prepares to sow along the set sowing path;
the method for matching the image of the farmland block to be sown in the real-time images is as follows:
Step 5.1: identify the farmland area in the real-time image, then identify the farmland boundary lines with the same method as in step 2, and normalize the size of the network graph formed by all the boundary lines so that each pixel represents the same physical size as each pixel of the network graph in step 2;
Step 5.2: obtain, by sliding interception, the farmland boundary network in the real-time image, called network 1; its size is C x D and the interception step length is 5-10 pixels; count the number a of nodes in network 1; in the graph library, count the number b of nodes within a C x D window around each node, select the graph-library nodes for which a = b ± 3, and intercept the graph-library region of size C x D centered on each selected node, called network 2;
Step 5.3: count the number of straight lines in network 1 and network 2, and keep the networks 2 whose number of straight lines is the same as that of network 1;
Step 5.4: match the shape of each network 2 selected in step 5.3 with the shape of network 1; if the shape similarity is greater than a set threshold, the matching is considered successful, otherwise proceed to the next candidate;
Step 5.5: after the matching succeeds, align the real-time image with the first image according to the positions of network 1 and network 2 in their respective images, and determine the farmland block to be sown in the real-time image;
Step 6: identify the edges of the image of the farmland block to be sown, measure the distances from the UAV to all edges of that image, select the two nearest edges as references, and position the UAV; the UAV sowing path is determined in step 4, and the distance from each point on the path to the two nearest edges of the farmland-block image is calculated in advance; during actual sowing, the current distances from the UAV to the two nearest edges of the farmland-block image are calculated, and the deviation of the UAV during flight is corrected according to the predetermined distances;
Step 7: recording the farmland area that has been sown to prevent repeated sowing; after the sowing is finished, the UAV flies back to the take-off point.
2. The plant protection unmanned aerial vehicle sowing control method based on image positioning according to claim 1, wherein the flight control method of the UAV along the sowing path defined in step 4 is segment control: each time, the UAV flies from the center of one second image to the center of the next second image, specifically:
the flight direction is calculated as:
θ = arctan((y2 - y1) / (x2 - x1))
wherein θ is the flight angle, x1 and y1 are the x-axis and y-axis coordinates of the actual center point of the n-th second image, and x2 and y2 are the x-axis and y-axis coordinates of the actual center point of the (n+1)-th second image;
the flight distance is calculated as:
L = sqrt((x2 - x1)^2 + (y2 - y1)^2)
wherein L is the actual flight distance.
CN202210914951.4A 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning Expired - Fee Related CN114967763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210914951.4A CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210914951.4A CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Publications (2)

Publication Number Publication Date
CN114967763A (en) 2022-08-30
CN114967763B (en) 2022-11-08

Family

ID=82969098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210914951.4A Expired - Fee Related CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Country Status (1)

Country Link
CN (1) CN114967763B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004310525A (en) * 2003-04-08 2004-11-04 Toyota Motor Corp Vehicular image processor
JP2006288467A (en) * 2005-04-06 2006-10-26 Fuji Photo Film Co Ltd Device and method for judging irradiation field and its program
CN103581501A (en) * 2012-07-31 2014-02-12 天津书生软件技术有限公司 Color correction method
CN104615146A (en) * 2015-02-05 2015-05-13 广州快飞计算机科技有限公司 Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal
US9429953B1 (en) * 2015-08-25 2016-08-30 Skycatch, Inc. Autonomously landing an unmanned aerial vehicle
US20180357909A1 (en) * 2015-12-09 2018-12-13 Dronesense Llc Drone Flight Operations
CN107633202A (en) * 2017-08-11 2018-01-26 合肥嘉浓航空科技有限公司 A kind of plant protection unmanned plane based on the identification of farmland characteristics of image flies control method and system
CN109358643A (en) * 2018-10-31 2019-02-19 阮镇荣 A kind of multi-mode unmanned plane pesticide spraying system and method based on image procossing
CN109859158A (en) * 2018-11-27 2019-06-07 邦鼓思电子科技(上海)有限公司 A kind of detection system, method and the machinery equipment on the working region boundary of view-based access control model
CN110140704A (en) * 2019-05-17 2019-08-20 安徽舒州农业科技有限责任公司 A kind of intelligent pesticide spraying method and system for plant protection drone
CN110728745A (en) * 2019-09-17 2020-01-24 上海大学 Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model
CN110929598A (en) * 2019-11-07 2020-03-27 西安电子科技大学 Unmanned aerial vehicle-mounted SAR image matching method based on contour features
CN112434880A (en) * 2020-12-10 2021-03-02 清研灵智信息咨询(北京)有限公司 Patrol route planning and patrol personnel management system based on deep learning
CN112816939A (en) * 2020-12-31 2021-05-18 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN113409338A (en) * 2021-06-24 2021-09-17 西安交通大学 Super-pixel method based on probability distribution

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
LI LIU: "Boundary Tracking of Continuous Objects Based on Feasible Region Search in Underwater Acoustic Sensor Networks", IEEE Transactions on Mobile Computing (Early Access) *
刘晓晨: "Research on Moving Object Detection Algorithms Based on Bayesian Theory" (in Chinese), China Master's Theses Full-text Database, Information Science and Technology *
吴婕: "Research on Key Technologies of a Vision-based UAV Navigation and Positioning ***" (in Chinese), China Master's Theses Full-text Database, Engineering Science and Technology II *
王林惠 et al.: "Research on a Precision Spraying Control *** for UAVs Based on Image Recognition" (in Chinese), Journal of South China Agricultural University *
薛武: "Research on UAV Image Positioning Optimization Technology" (in Chinese), China Master's and Doctoral Dissertations Database, Basic Science *
贾银江: "Research on Key Technologies for UAV Remote Sensing Image Stitching" (in Chinese), China Master's Theses Full-text Database, Engineering Science and Technology II *

Also Published As

Publication number Publication date
CN114967763B (en) 2022-11-08

Similar Documents

Publication Publication Date Title
CN107148633B (en) Method for agronomic and agricultural monitoring using unmanned aerial vehicle system
WO2019179270A1 (en) Plant planting data measuring method, working route planning method, device and system
CN104615146B (en) Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal
EP3503025B1 (en) Utilizing artificial intelligence with captured images to detect agricultural failure
CN110282135B (en) Accurate pesticide spraying system and method for plant protection unmanned aerial vehicle
US10417753B2 (en) Location identifying device, location identifying method, and program therefor
CN110456820B (en) Pesticide spraying system based on ultra-bandwidth wireless positioning and control method
US11908074B2 (en) Method of identifying and displaying areas of lodged crops
CN112154447A (en) Surface feature recognition method and device, unmanned aerial vehicle and computer-readable storage medium
CN115761535B (en) Soil quality data analysis method and system
CN115657706B (en) Landform measurement method and system based on unmanned aerial vehicle
WO2021087685A1 (en) Operation planning method and device combining multispectral and earth surface semantic information, and apparatus
CN115689795A (en) Hillside orchard crop growth analysis method and system based on unmanned aerial vehicle remote sensing
CN112650215A (en) Plant protection operation method and plant protection operation device for unmanned vehicle, and unmanned vehicle control system
CN114967763B (en) Plant protection unmanned aerial vehicle sowing control method based on image positioning
CN117193347B (en) Unmanned aerial vehicle flight height control method and device, electronic equipment and storage medium
CN110487251B (en) Operation method for carrying out large-scale mapping by using unmanned aerial vehicle without measuring camera
CN113807128A (en) Seedling shortage marking method and device, computer equipment and storage medium
CN115619286B (en) Method and system for evaluating quality of sample plot of breeding field district
Peña-Barragán et al. Discrimination of Crop Rows using Object-Based Analysis in UAV Images for early Site-Specific Weed Management in Maize Fields
CN114485612B (en) Route generation method and device, unmanned operation vehicle, electronic equipment and storage medium
CN115661691A (en) Corn seedling supplementing scheme planning method based on unmanned aerial vehicle remote sensing image deep learning
CN118228944B (en) Protection forest degradation investigation and evaluation system based on artificial intelligence visual recognition
CN112565726B (en) Method for determining job prescription chart, job control method and related device
CN117389310B (en) Agricultural unmanned aerial vehicle sprays operation control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20221108