CN115909113B - Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring - Google Patents

Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring

Info

Publication number
CN115909113B
CN115909113B CN202310029641.9A
Authority
CN
China
Prior art keywords
pest
point
area
rlsub
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310029641.9A
Other languages
Chinese (zh)
Other versions
CN115909113A (en)
Inventor
刘春燕
刘华
许伟杰
周志新
梁祖锋
官东清
余国城
廖艳平
陈梦婷
曾泽方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Bohuan Ecological Technology Co ltd
Original Assignee
Guangdong Bohuan Ecological Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Bohuan Ecological Technology Co ltd filed Critical Guangdong Bohuan Ecological Technology Co ltd
Priority to CN202310029641.9A priority Critical patent/CN115909113B/en
Publication of CN115909113A publication Critical patent/CN115909113A/en
Application granted granted Critical
Publication of CN115909113B publication Critical patent/CN115909113B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Catching Or Destruction (AREA)

Abstract

The invention belongs to the technical field of forestry big data and provides a method for investigating forestry pests through unmanned aerial vehicle remote sensing monitoring. The method continuously collects spectral images of a region to be monitored to form a spectral image sequence, obtains the pest areas of each spectral image in an identification sequence with the RX algorithm, and corrects the positions of the pest areas on a reference image using the identification sequence. The method can identify the offsets between pest areas and the identification points on the unmanned aerial vehicle route, can remove the strong interference with accurate pest-area identification that arises when plants in the reference image change locally in their natural state, and improves pest-area identification accuracy. It also judges, in a balanced way, the difference between the positions of non-pest areas and pest areas, and can intelligently add pest areas to and delete them from the reference image according to that difference.

Description

Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring
Technical Field
The invention belongs to the technical field of forestry big data, and particularly relates to a method for remotely monitoring and investigating forestry harmful organisms by an unmanned aerial vehicle.
Background
At present, remote-sensing-based monitoring of forestry pests (such as pine caterpillars, trees killed by pine wood nematode, and affected broadleaf species) collects remote sensing images of a monitoring area and compares the spectral characteristic information of the pixels with that of historical remote sensing images of the same area to judge whether pests are present. However, changes of the plants in their natural state (vegetation growth, changes in leaf water content caused by a dry environment, or changes in leaf coverage) also alter the spectral characteristics of the pixels, which strongly interferes with the spectral identification of pests and leads to a high false-identification rate.
To address this problem, Chinese patent publication No. CN115131683A, published on September 30, 2022, discloses a forestry information identification method based on high-resolution remote sensing images. It partitions the different tree types of the region to be detected by clustering pixel points according to the spectral feature vectors of the pixels in the image; it computes the variability of wavelength and gray value in each pixel's gray-value sequence to obtain the deformation of the pixel's spectral curve and the offsets of the gray values of different bands, and combines this with the uniformity of the gray-value variation of each band to judge the damage probability, so as to eliminate the high curve offsets caused by high vegetation coverage and improve the accuracy of forestry disease and pest detection; the damage probability is then adjusted by the uniformity of the damage probability within the same tree species to obtain a final damage probability for judging the occurrence of diseases and pests, so that diseases and pests of the same tree species can be detected in different seasons. However, in this and many other prior techniques, if the reference image already contains a pest area, or if the plants themselves change locally in their natural state, strong interference occurs, which lowers recognition accuracy and may even prevent the pest area from being recognized at all; and because the image of the whole region is large, recognition is slow and requires a large computation cost.
Disclosure of Invention
The invention aims to provide a method for investigating forestry pests through unmanned aerial vehicle remote sensing monitoring, so as to solve one or more technical problems in the prior art and at least provide a beneficial alternative or improvement.
In order to achieve the above purpose, the invention provides a method for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle, which specifically comprises the following steps:
S1: acquire an electronic map of the forest land to be monitored as the region to be monitored, wherein the forest land to be monitored contains pine-family plants, broad-leaved tree species, eucalyptus species and/or vine plants (pine-family plants such as wetland pine, masson pine and larch; broad-leaved species such as poplar, elm and camphor tree; eucalyptus species such as Eucalyptus citriodora, Guangjiu and the like; vine plants such as Mikania micrantha, five-claw dragon, jingjingteng and the like);
S2: set the starting point, end point and obstacle position information for the unmanned aerial vehicle flight in the region to be monitored, and set a route path for the unmanned aerial vehicle from the starting point to the end point within the region to be monitored;
S3: starting from the starting point, take a number of points on the route path at a fixed acquisition interval as identification points;
S4: the unmanned aerial vehicle flies along the route path and continuously acquires spectral images of the area to be monitored; the spectral images, taken in order, form the spectral image sequence, and within this sequence one spectral image is acquired at each identification point position (because the narrow field of view of the spectrometer carried by the unmanned aerial vehicle limits the ground coverage of each spectral image, spectral images within the spectrometer's field of view must be collected continuously during the flight to form a spectral image data set, and only after the images in this data set are stitched into a whole spectral image can the area to be monitored be covered effectively);
S5: take the sequence formed by all the spectral images at the identification point positions within the spectral image sequence as the identification sequence;
S6: obtain the pest areas of each spectral image in the identification sequence with the RX algorithm;
S7: correct the positions of the pest areas on the reference image using the identification sequence;
S8: mark the positions of the pest areas of the corrected reference image on the electronic map of the forest land to be monitored.
Further, in S1, the electronic map consists of discrete data, acquired by satellites and unmanned aerial vehicles, of ground features and phenomena with determined coordinates and attributes in a given coordinate system.
Further, in S2, the route path of the unmanned aerial vehicle from the starting point to the end point within the area to be monitored is set with any one of the A* algorithm, a K-shortest-path algorithm, the Dijkstra algorithm, the APF algorithm (APF: Artificial Potential Field) or the SAA algorithm (SAA: Simulated Annealing Algorithm).
The route path is a set of continuously varying position coordinates on the electronic map, and the unmanned aerial vehicle flies in the region to be monitored according to the position coordinates corresponding to the route path on the electronic map.
The starting point is the take-off position of the unmanned aerial vehicle in the area to be monitored (generally its current position), the end point is the position where the unmanned aerial vehicle's flight ends in the area to be monitored, and the obstacle position information gives the obstacle positions in the area to be monitored; the unmanned aerial vehicle must keep a preset obstacle safety distance (2-5 m) from each obstacle position.
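As an illustration of the route-path setting above, the sketch below runs a plain Dijkstra search on an occupancy grid derived from the electronic map, in which every cell lying within the obstacle safety distance of an obstacle has already been marked as blocked. This is only a minimal sketch under those assumptions; the grid representation, cell size and function names are not taken from the patent, which only lists A*, K-shortest-path, Dijkstra, APF and SAA as candidate algorithms.

```python
import heapq

def plan_route(grid, start, goal):
    """Dijkstra search on a 2D occupancy grid.

    grid[r][c] == 1 marks cells blocked by an obstacle (already inflated by the
    obstacle safety distance); start and goal are (row, col) tuples.
    Returns the route path as a list of grid cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, cell = heapq.heappop(heap)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale heap entry
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(heap, (nd, (nr, nc)))
    if goal != start and goal not in prev:
        return None
    # Reconstruct the path from the goal back to the start.
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]
```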
Further, initial parameters of the unmanned aerial vehicle are set: the field angle of the spectrum imaging system (15-25 degrees), the route distance (8-35 m), the flying height (15-55 m), the flying speed (10-20 m/s), the boundary safety distance (2-5 m) and the obstacle safety distance (2-5 m).
Preferably, the field angle of the spectrum imaging system is set to 16 degrees, the flying height is 30m, and the flying speed is 13m/s.
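For convenience, the initial parameters above can be collected into a single configuration object, as in the following illustrative sketch; only the field angle, flying height and flying speed defaults come from the preferred values in the text, the other defaults are arbitrary mid-range choices, and the dataclass and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class FlightConfig:
    field_angle_deg: float = 16.0   # spectral imaging system field angle; allowed range 15-25 degrees
    route_distance_m: float = 20.0  # route distance; allowed range 8-35 m (mid-range placeholder)
    altitude_m: float = 30.0        # flying height; allowed range 15-55 m
    speed_mps: float = 13.0         # flying speed; allowed range 10-20 m/s
    boundary_safety_m: float = 3.0  # boundary safety distance; allowed range 2-5 m (placeholder)
    obstacle_safety_m: float = 3.0  # obstacle safety distance; allowed range 2-5 m (placeholder)
```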
Further, in S3, the fixed acquisition interval is 8 to 35m.
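One way to realize S3 is to walk along the route path and emit an identification point each time the accumulated arc length reaches the fixed acquisition interval given above; the sketch below follows that reading, and the function name and polyline representation are assumptions.

```python
import math

def sample_identification_points(route, interval_m):
    """Return points spaced interval_m apart (by arc length) along a polyline route.

    route is a list of (x, y) coordinates in metres, starting at the take-off point;
    the starting point itself is taken as the first identification point.
    """
    points = [route[0]]
    carried = 0.0  # distance already walked since the last emitted point
    for (x0, y0), (x1, y1) in zip(route, route[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        pos = 0.0  # distance consumed on the current segment
        while carried + (seg - pos) >= interval_m:
            pos += interval_m - carried
            t = pos / seg
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            carried = 0.0
        carried += seg - pos
    return points
```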
Further, in S4, continuously collecting spectral images of the area to be monitored means that the spectrometer carried by the unmanned aerial vehicle collects spectral images of the area to be monitored while the vehicle flies along the route path.
Further, in S6, the method for acquiring the pest areas of each spectrum image in the identification sequence according to the RX algorithm includes: and (3) inputting the spectral reflectivity of each pixel point in the spectral image into an RX algorithm (Reed-Xiaoli hyperspectral target detection algorithm) to identify the insect pest area at the corresponding position of the area to be monitored in the spectral image.
Remote sensing monitoring of diseases and pests works by measuring changes in the chlorophyll content of plants; the spectral reflectance of chlorophyll has distinctive features and varies with wavelength. Plant reflectance is very low in the 0.5-0.7 μm visible band, because green plants absorb the radiant energy in this band, and rises significantly in the 0.7-0.9 μm near-infrared band. As chlorophyll in diseased or pest-infested plants gradually decreases, light absorption weakens, visible reflectance rises markedly, and infrared reflectance, especially in the near-infrared band, falls markedly.
Preferably, the method for acquiring the pest areas of each spectral image in the identification sequence according to the RX algorithm is as follows: pest areas at the corresponding positions in the unmanned aerial vehicle operation area are obtained by the RX anomaly detection method disclosed in patent publication No. CN107347849A.
The pest area is an area formed by pest pixel points, or the interior of an edge line formed by pest pixel points. A pest pixel point is a pixel in the spectral image whose visible-light reflectance is higher than the mean visible-light reflectance of all pixels in the spectral image, or whose near-infrared spectral reflectance is lower than the mean near-infrared spectral reflectance of all pixels in the spectral image.
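For step S6, the following sketch shows a global RX (Reed-Xiaoli) anomaly score computed from the background mean and covariance of the per-pixel spectra, together with the simple reflectance-mean test for pest pixels described above. The band-index arguments, thresholds and function names are illustrative assumptions rather than values fixed by the patent.

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly score per pixel.

    cube: hyperspectral image as an (H, W, B) array of spectral reflectances.
    Returns an (H, W) array of Mahalanobis-type distances to the image background.
    """
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(np.float64)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    cov_inv = np.linalg.pinv(cov)  # pseudo-inverse guards against a singular covariance
    centered = x - mu
    scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
    return scores.reshape(h, w)

def pest_pixel_mask(cube, visible_bands, nir_bands):
    """Pest pixels per the definition above: visible reflectance above the image mean,
    or near-infrared reflectance below the image mean (band index lists are assumptions)."""
    vis = cube[..., visible_bands].mean(axis=-1)
    nir = cube[..., nir_bands].mean(axis=-1)
    return (vis > vis.mean()) | (nir < nir.mean())
```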
Further, in S7, the reference image is the whole spectral image of the area to be monitored obtained by stitching the spectral images acquired in steps S1 to S6 during the previous time interval (typically 7-30 days), with the positions of the pest areas identified in step S6 marked at the corresponding positions of the area to be monitored; spectral image stitching comprises geometric correction, image preprocessing, image registration and image fusion.
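The registration and fusion part of the stitching described above can be prototyped with OpenCV's high-level stitcher, assuming the spectral frames have already been geometrically corrected and rendered as 8-bit RGB previews; this is only a rough sketch of one possible implementation, not the patent's own procedure.

```python
import cv2

def stitch_previews(rgb_previews):
    """Stitch a list of 8-bit RGB preview frames into one mosaic.

    rgb_previews: list of HxWx3 uint8 images rendered from the spectral frames.
    Returns the stitched image, or None if OpenCV cannot find a consistent registration.
    """
    # SCANS mode suits flat, nadir-looking captures better than the default panorama mode.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(rgb_previews)
    return mosaic if status == cv2.Stitcher_OK else None
```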
Preferably, in S7, the reference image is a spectrum image of the area to be monitored obtained by an onboard full spectrum multi-mode imaging spectrometer or a spectrum image obtained by a hyperspectral remote sensing satellite, the pest area of the spectrum image is identified according to an RX algorithm, and the pest area is marked on the area to be monitored at a position corresponding to the pest area of the spectrum image.
Further, the unmanned aerial vehicle is a coaxial twin-rotor unmanned aerial vehicle, a miniature rotor unmanned aerial vehicle or a multi-rotor unmanned aerial vehicle carrying a spectrometer; preferably, the spectrometer is a GaiaField-mini spectrometer, a Specim AFX series hyperspectral camera or an ATH9020 hyperspectral imager.
Further, in S7, the method of correcting the position of the pest area on the reference image by the recognition sequence includes the steps of:
Record the identification sequence as RLocal and take each spectral image in RLocal as an identification partition, where RLocal = {RL(i)}, i is the serial number of the identification partition, i ∈ [1, N1], N1 is the number of identification partitions, and RL(i) is the i-th identification partition in the sequence RLocal;
within the value range of i, take the identification point corresponding to RL(i) as P1(i), and calculate the Euclidean distance from the geometric center of gravity of each pest area in RL(i) to P1(i); the mean of all these Euclidean distances within RL(i) is RLmean(i). Take the point on the reference image at the position corresponding to the geometric center of gravity, among the pest areas in RL(i), with the largest distance to P1(i) as the telecentric point P2(i), and the point on the reference image at the position corresponding to the geometric center of gravity with the smallest distance to P1(i) as the near-center point P3(i);
on the reference image, screen out the pest areas whose geometric centers of gravity are at a distance from P2(i) smaller than RLmean(i), and record the set they form as RLSUB(i); and/or, among those pest areas, screen out the ones whose mean near-infrared spectral reflectance over all their pixels is lower than mean(R), the mean near-infrared spectral reflectance of all pixels of the pest area in RL(i) corresponding to the telecentric point P2(i), and record the set they form as RLSUB(i);
let N2 be the number of elements in RLSUB(i), RLSUB(i, j) be the geometric center of gravity of the j-th pest area in RLSUB(i), and j be the serial number of elements in RLSUB(i), j ∈ [1, N2];
if N2 = 0, RL(i) is marked as an identification partition that needs no correction; if N2 > 0, the position of the pest area at the position on the reference image corresponding to RL(i) is corrected within the value range of i.
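A minimal sketch of the bookkeeping in these steps is given below: for one identification partition RL(i) it computes RLmean(i), the telecentric point P2(i), the near-center point P3(i) and the distance-based screening that forms RLSUB(i). Pest areas are represented only by their geometric centers of gravity, the partition and the reference image are assumed to share map coordinates, the optional near-infrared screening is omitted, and all names are illustrative.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def analyze_partition(p1, partition_centroids, reference_centroids):
    """partition_centroids: centers of gravity of the pest areas in RL(i), in map coordinates.
    reference_centroids: centers of gravity of the pest areas on the reference image.
    Returns (rl_mean, p2, p3, rlsub), where rlsub collects the reference centroids that lie
    closer to P2(i) than RLmean(i)."""
    if not partition_centroids:
        return None
    dists = [dist(c, p1) for c in partition_centroids]
    rl_mean = sum(dists) / len(dists)                    # RLmean(i)
    p2 = partition_centroids[dists.index(max(dists))]    # telecentric point P2(i)
    p3 = partition_centroids[dists.index(min(dists))]    # near-center point P3(i)
    rlsub = [c for c in reference_centroids if dist(c, p2) < rl_mean]   # RLSUB(i)
    return rl_mean, p2, p3, rlsub
```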
This scheme can identify the offsets between the pest areas and the identification points on the unmanned aerial vehicle route, identify the accurate relative positions of the pest areas, and guarantee the relative accuracy of the pest position coordinates.
Further, the method for correcting the position of the pest area at the position on the reference image corresponding to RL(i), within the value range of i, comprises the following steps:
within the value range of j, calculate the mean of the distances from the telecentric point P2(i) to each RLSUB(i, j) in RLSUB(i), recorded as the telecentric distance AD; calculate the mean of the distances from the near-center point P3(i) to each RLSUB(i, j), recorded as the near-center distance BD; record the point on the reference image corresponding to the RLSUB(i, j) closest to the telecentric point P2(i) as PF; record the point on the reference image corresponding to the RLSUB(i, j) closest to the near-center point P3(i) as PN;
when AD ≥ BD, take the direction from the telecentric point P2(i) to PF as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PF by the telecentric distance AD in the adjustment direction;
when AD < BD, take the direction from the near-center point P3(i) to PN as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PN by the near-center distance BD in the adjustment direction.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if, within correction range A, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P2(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if, within correction range B, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P3(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if the corresponding area on RL(i) within correction range A contains no pest area other than the pest area corresponding to P2(i), the pest areas within correction range A on the reference image are deleted.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if the corresponding area on RL(i) within correction range B contains no pest area other than the pest area corresponding to P3(i), the pest areas within correction range B on the reference image are deleted.
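The move and copy/delete rules above can be summarized as in the following sketch, which again works only on center-of-gravity coordinates; shifting, copying and deleting the actual pest areas (pixel regions) is left to caller-supplied callbacks, and all names are illustrative.

```python
from math import dist

def correct_partition(p2, p3, rlsub, move_area, reconcile_range):
    """Apply the AD/BD correction for one identification partition RL(i).

    p2, p3: telecentric / near-center points; rlsub: RLSUB(i) centroids on the reference image.
    move_area(anchor, direction, distance): shift the reference pest area whose center of
        gravity is nearest to `anchor` by `distance` along `direction`.
    reconcile_range(center, radius): inside the correction range (a circle), copy the extra
        pest areas found on RL(i) onto the reference image, or delete the reference pest areas
        there if RL(i) has none besides the anchoring one (the preferred copy/delete rules).
    """
    if not rlsub:
        return  # N2 = 0: partition needs no correction
    ad = sum(dist(p2, c) for c in rlsub) / len(rlsub)    # telecentric distance AD
    bd = sum(dist(p3, c) for c in rlsub) / len(rlsub)    # near-center distance BD
    pf = min(rlsub, key=lambda c: dist(p2, c))           # PF: RLSUB member nearest P2(i)
    pn = min(rlsub, key=lambda c: dist(p3, c))           # PN: RLSUB member nearest P3(i)
    if ad >= bd:
        move_area(pf, (pf[0] - p2[0], pf[1] - p2[1]), ad)
        reconcile_range(p2, dist(p2, pf))                # correction range A
    else:
        move_area(pn, (pn[0] - p3[0], pn[1] - p3[1]), bd)
        reconcile_range(p3, dist(p3, pn))                # correction range B
```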
In this technical scheme, the telecentric distance AD and the near-center distance BD are offset distances of the geometric centers of gravity of the pest areas, measured with the identification point as the reference. Correcting these offsets removes the strong interference with accurate pest-area identification that arises when plants in the reference image change locally in their natural state, improving pest-area identification accuracy; and because the processing uses only the current local image, the whole image does not need to be identified, which greatly improves identification speed and reduces computation cost.
To reduce the influence of some non-pest areas produced when plants change locally in their natural state, and to further improve the accuracy of pest-area positioning and correction, the invention provides the following preferred scheme:
preferably, the method for correcting the position of the pest area on the reference image within the value range of i comprises the following steps:
calculating pest position deviation index dev (i) of RL (i), wherein the specific method comprises the following steps:
dev(i) is given by the formula shown in the original only as an image (Figure BDA0004046301620000051), expressed through a degree-of-deviation function, itself shown only as an image (Figure BDA0004046301620000052), of the following quantities:
|P2(i) - RLSUB(i, j)|, the distance from the telecentric point P2(i) to RLSUB(i, j);
|P3(i) - RLSUB(i, j)|, the distance from the near-center point P3(i) to RLSUB(i, j);
Calculate the pest position deviation index dev(i) of every RL(i) within the value range of i, compute the mean of all dev(i) as meandev, and mark every identification partition RL(i) with dev(i) ≥ meandev as an identification partition to be optimized;
when RL(i) is an identification partition to be optimized, calculate, within the value range of j, the mean of the distances from the telecentric point P2(i) to each RLSUB(i, j) in RLSUB(i), recorded as the telecentric distance AD; calculate the mean of the distances from the near-center point P3(i) to each RLSUB(i, j), recorded as the near-center distance BD; record the point on the reference image corresponding to the RLSUB(i, j) closest to the telecentric point P2(i) as PF; record the point on the reference image corresponding to the RLSUB(i, j) closest to the near-center point P3(i) as PN;
when AD ≥ BD, take the direction from the telecentric point P2(i) to PF as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PF by the telecentric distance AD in the adjustment direction;
when AD < BD, take the direction from the near-center point P3(i) to PN as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PN by the near-center distance BD in the adjustment direction.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if, within correction range A, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P2(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if, within correction range B, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P3(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if the corresponding area on RL(i) within correction range A contains no pest area other than the pest area corresponding to P2(i), the pest areas within correction range A on the reference image are deleted.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if the corresponding area on RL(i) within correction range B contains no pest area other than the pest area corresponding to P3(i), the pest areas within correction range B on the reference image are deleted.
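Because the exact formulas for dev(i) and the degree-of-deviation function are given only as images in the original, the sketch below takes the degree-of-deviation function as a parameter and only shows the surrounding flow: aggregating the deviation degrees over RLSUB(i) (the aggregation used here is an assumption) and selecting the identification partitions with dev(i) ≥ meandev for optimization.

```python
from math import dist

def partitions_to_optimize(partitions, deviation_fn):
    """Select identification partitions whose pest position deviation index is above average.

    partitions: list of (p2, p3, rlsub) tuples, one per identification partition RL(i).
    deviation_fn(d_far, d_near): the patent's degree-of-deviation function, whose exact form
        is only given as an image in the original text, so it is supplied by the caller.
    Returns the indices i with dev(i) >= meandev.
    """
    devs = []
    for p2, p3, rlsub in partitions:
        # dev(i) is assumed here to average the deviation degrees over RLSUB(i);
        # the aggregation in the original formula image may differ.
        terms = [deviation_fn(dist(p2, c), dist(p3, c)) for c in rlsub]
        devs.append(sum(terms) / len(terms) if terms else 0.0)
    if not devs:
        return []
    meandev = sum(devs) / len(devs)
    return [i for i, d in enumerate(devs) if d >= meandev]
```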
In this preferred scheme, the pest position deviation index measures how far the telecentric point and the near-center point of the pest positions deviate from the identification point, so the difference between the positions of non-pest areas and pest areas is judged in a balanced way; according to this difference, pest areas can be intelligently added to and deleted from the reference image, the pest area positions can be accurately adjusted, and the influence of some non-pest areas produced when plants change locally in their natural state is reduced.
The invention also provides a system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle, comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor. When executing the computer program, the processor implements the steps of the method for remotely monitoring and investigating forestry pests by unmanned aerial vehicle described above. The system can run on computing devices such as desktop computers, notebook computers, palm computers and cloud data centers; a runnable system may include, but is not limited to, a processor, a memory and a server cluster. The processor executes the computer program in the following units of the system:
the map acquisition unit is used for acquiring an electronic map of the forest land to be monitored as a region to be monitored;
the route setting unit is used for setting the information of the starting point, the finishing point and the obstacle position of the unmanned aerial vehicle flying in the area to be monitored, and setting a route path of the unmanned aerial vehicle from the starting point to the finishing point in the area to be monitored;
the identifying and dividing unit is used for taking a plurality of points on the route path as identifying points at fixed acquisition intervals from the starting point;
the spectrum acquisition unit, used for making the unmanned aerial vehicle fly along the route path and continuously acquire spectral images of the region to be monitored, the spectral images taken in order forming the spectral image sequence, in which one spectral image is acquired at each identification point position;
the identification extraction unit is used for acquiring a sequence formed by all the spectrum images at all the identification point positions in the spectrum image sequence as an identification sequence;
the pest identification unit is used for acquiring pest areas of each spectrum image in the identification sequence according to an RX algorithm;
a pest correction unit for correcting the position of the pest area on the reference image by the recognition sequence;
and the map marking unit is used for marking the positions of the insect pest areas on the corrected reference image on the electronic map of the forest land to be monitored.
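The unit decomposition above maps naturally onto a small class whose methods correspond one-to-one to the listed units; the skeleton below shows only this wiring, with every method body left as a stub, and the class and method names are illustrative.

```python
class ForestPestSurveySystem:
    """Skeleton mirroring the units listed above; every method body is a stub."""

    def acquire_map(self, woodland):                          # map acquisition unit
        raise NotImplementedError

    def set_route(self, region, start, end, obstacles):       # route setting unit
        raise NotImplementedError

    def sample_identification_points(self, route, interval_m):  # identifying and dividing unit
        raise NotImplementedError

    def collect_spectra(self, route):                         # spectrum acquisition unit
        raise NotImplementedError

    def extract_identification_sequence(self, image_sequence, id_points):  # identification extraction unit
        raise NotImplementedError

    def detect_pest_areas(self, identification_sequence):     # pest identification unit (RX algorithm)
        raise NotImplementedError

    def correct_reference(self, reference_image, identification_sequence, pest_areas):  # pest correction unit
        raise NotImplementedError

    def mark_map(self, electronic_map, corrected_reference):  # map marking unit
        raise NotImplementedError
```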
The beneficial effects of the invention are as follows: the method for remotely monitoring and investigating forestry pests by unmanned aerial vehicle can identify the offsets between pest areas and the identification points on the unmanned aerial vehicle route, identify the accurate relative positions of the pest areas, and guarantee the relative accuracy of the pest position coordinates. By correcting the offset distances, the strong interference with accurate pest-area identification that arises when plants in the reference image change locally in their natural state can be removed, improving pest-area identification accuracy; and because the processing uses only the current local image, the whole image need not be identified, which greatly improves identification speed and reduces computation cost. The difference between the positions of non-pest areas and pest areas is judged in a balanced way, pest areas can be intelligently added to and deleted from the reference image according to that difference, the pest area positions are accurately adjusted, and the influence of some non-pest areas produced when plants change locally in their natural state is reduced.
Drawings
The above and other features of the present invention will become more apparent from the detailed description of the embodiments thereof given in conjunction with the accompanying drawings, in which like reference characters designate like or similar elements, and it is apparent that the drawings in the following description are merely some examples of the present invention, and other drawings may be obtained from these drawings without inventive effort to those of ordinary skill in the art, in which:
FIG. 1 is a flow chart of a method for remotely monitoring and investigating forestry pests by an unmanned aerial vehicle;
fig. 2 is a block diagram of a system for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle;
FIG. 3 is an electronic map of a woodland to be monitored in an embodiment of the present invention;
fig. 4 is a map labeling position of a pest area in an electronic map of a forest land to be monitored after identification according to an embodiment of the present invention.
Detailed Description
The conception, specific structure, and technical effects produced by the present invention will be clearly and completely described below with reference to the embodiments and the drawings to fully understand the objects, aspects, and effects of the present invention. It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The same reference numbers will be used throughout the drawings to refer to the same or like parts.
Referring to fig. 1, a flowchart of a method for remotely monitoring and investigating forest pests by using an unmanned aerial vehicle according to the present invention will be described, and a preferred embodiment of a method for remotely monitoring and investigating forest pests by using an unmanned aerial vehicle according to an embodiment of the present invention will be described in detail with reference to fig. 1. It should be emphasized that the following description is merely exemplary in nature and is in no way intended to limit the scope of the invention or its applications.
S1: acquire an electronic map of the forest land to be monitored as the region to be monitored;
S2: set the starting point, end point and obstacle position information for the unmanned aerial vehicle flight in the region to be monitored, and set a route path for the unmanned aerial vehicle from the starting point to the end point within the region to be monitored;
S3: starting from the starting point, take a number of points on the route path at a fixed acquisition interval as identification points;
S4: the unmanned aerial vehicle flies along the route path and continuously acquires spectral images of the area to be monitored; the spectral images, taken in order, form the spectral image sequence, and within this sequence one spectral image is acquired at each identification point position (because the narrow field of view of the spectrometer carried by the unmanned aerial vehicle limits the ground coverage of each spectral image, spectral images within the spectrometer's field of view must be collected continuously during the flight to form a spectral image data set, and only after the images in this data set are stitched into a whole spectral image can the area to be monitored be covered effectively);
S5: take the sequence formed by all the spectral images at the identification point positions within the spectral image sequence as the identification sequence;
S6: obtain the pest areas of each spectral image in the identification sequence with the RX algorithm;
S7: correct the positions of the pest areas on the reference image using the identification sequence;
S8: mark the positions of the pest areas of the corrected reference image on the electronic map of the forest land to be monitored.
Further, in S1, the electronic map consists of discrete data, acquired by satellites and unmanned aerial vehicles, of ground features and phenomena with determined coordinates and attributes in a given coordinate system.
Further, in S2, the route path of the unmanned aerial vehicle from the starting point to the end point within the area to be monitored is set with any one of the A* algorithm, a K-shortest-path algorithm, the Dijkstra algorithm, the APF algorithm (APF: Artificial Potential Field) or the SAA algorithm (SAA: Simulated Annealing Algorithm).
The starting point is the take-off position of the unmanned aerial vehicle in the area to be monitored (generally its current position), the end point is the position where the unmanned aerial vehicle's flight ends in the area to be monitored, and the obstacle position information gives the obstacle positions in the area to be monitored; the unmanned aerial vehicle must keep a preset obstacle safety distance (2-5 m) from each obstacle position.
Further, initial parameters of the unmanned aerial vehicle are set: the field angle of the spectrum imaging system (15-25 degrees), the route distance (8-35 m), the flying height (15-55 m), the flying speed (10-20 m/s), the boundary safety distance (2-5 m) and the obstacle safety distance (2-5 m).
Preferably, the field angle of the spectrum imaging system is set to 16 degrees, the flying height is 30m, and the flying speed is 13m/s.
Further, in S3, the fixed acquisition interval is 8 to 35m.
Further, in S4, continuously collecting spectral images of the area to be monitored means that the spectrometer carried by the unmanned aerial vehicle collects spectral images of the area to be monitored while the vehicle flies along the route path.
Further, in S6, the method for acquiring the pest areas of each spectrum image in the identification sequence according to the RX algorithm includes: and (3) inputting the spectral reflectivity of each pixel point in the spectral image into an RX algorithm (Reed-Xiaoli hyperspectral target detection algorithm) to identify the insect pest area at the corresponding position of the area to be monitored in the spectral image.
Remote sensing monitoring of diseases and pests works by measuring changes in the chlorophyll content of plants; the spectral reflectance of chlorophyll has distinctive features and varies with wavelength. Plant reflectance is very low in the 0.5-0.7 μm visible band, because green plants absorb the radiant energy in this band, and rises significantly in the 0.7-0.9 μm near-infrared band. As chlorophyll in diseased or pest-infested plants gradually decreases, light absorption weakens, visible reflectance rises markedly, and infrared reflectance, especially in the near-infrared band, falls markedly.
Preferably, the method for acquiring the pest areas of each spectral image in the identification sequence according to the RX algorithm is as follows: pest areas at the corresponding positions in the unmanned aerial vehicle operation area are obtained by the RX anomaly detection method disclosed in patent publication No. CN107347849A.
The pest area is an area formed by pest pixel points, or the interior of an edge line formed by pest pixel points. A pest pixel point is a pixel in the spectral image whose visible-light reflectance is higher than the mean visible-light reflectance of all pixels in the spectral image, or whose near-infrared spectral reflectance is lower than the mean near-infrared spectral reflectance of all pixels in the spectral image.
Further, in S7, the reference image is the whole spectral image of the area to be monitored obtained by stitching the spectral images acquired in steps S1 to S6 during the previous time interval (typically 7-30 days), with the positions of the pest areas identified in step S6 marked at the corresponding positions of the area to be monitored; spectral image stitching comprises geometric correction, image preprocessing, image registration and image fusion.
Preferably, in S7, the reference image is a spectrum image of the area to be monitored obtained by an onboard full spectrum multi-mode imaging spectrometer or a spectrum image obtained by a hyperspectral remote sensing satellite, the pest area of the spectrum image is identified according to an RX algorithm, and the pest area is marked on the area to be monitored at a position corresponding to the pest area of the spectrum image.
Further, the unmanned aerial vehicle is a coaxial twin-rotor unmanned aerial vehicle, a miniature rotor unmanned aerial vehicle or a multi-rotor unmanned aerial vehicle carrying a spectrometer; preferably, the spectrometer is a GaiaField-mini spectrometer, a Specim AFX series hyperspectral camera or an ATH9020 hyperspectral imager.
Further, in S7, the method of correcting the position of the pest area on the reference image by the recognition sequence includes the steps of:
Record the identification sequence as RLocal and take each spectral image in RLocal as an identification partition, where RLocal = {RL(i)}, i is the serial number of the identification partition, i ∈ [1, N1], N1 is the number of identification partitions, and RL(i) is the i-th identification partition in the sequence RLocal;
within the value range of i, take the identification point corresponding to RL(i) as P1(i), and calculate the Euclidean distance from the geometric center of gravity of each pest area in RL(i) to P1(i); the mean of all these Euclidean distances within RL(i) is RLmean(i). Take the point on the reference image at the position corresponding to the geometric center of gravity, among the pest areas in RL(i), with the largest distance to P1(i) as the telecentric point P2(i), and the point on the reference image at the position corresponding to the geometric center of gravity with the smallest distance to P1(i) as the near-center point P3(i);
on the reference image, screen out the pest areas whose geometric centers of gravity are at a distance from P2(i) smaller than RLmean(i), and record the set they form as RLSUB(i); and/or, among those pest areas, screen out the ones whose mean near-infrared spectral reflectance over all their pixels is lower than mean(R), the mean near-infrared spectral reflectance of all pixels of the pest area in RL(i) corresponding to the telecentric point P2(i), and record the set they form as RLSUB(i);
let N2 be the number of elements in RLSUB(i), RLSUB(i, j) be the geometric center of gravity of the j-th pest area in RLSUB(i), and j be the serial number of elements in RLSUB(i), j ∈ [1, N2];
if N2 = 0, RL(i) is marked as an identification partition that needs no correction; if N2 > 0, the position of the pest area at the position on the reference image corresponding to RL(i) is corrected within the value range of i.
This scheme can identify the offsets between the pest areas and the identification points on the unmanned aerial vehicle route, identify the accurate relative positions of the pest areas, and guarantee the relative accuracy of the pest position coordinates.
Further, the method for correcting the position of the pest area at the position on the reference image corresponding to RL(i), within the value range of i, comprises the following steps:
within the value range of j, calculate the mean of the distances from the telecentric point P2(i) to each RLSUB(i, j) in RLSUB(i), recorded as the telecentric distance AD; calculate the mean of the distances from the near-center point P3(i) to each RLSUB(i, j), recorded as the near-center distance BD; record the point on the reference image corresponding to the RLSUB(i, j) closest to the telecentric point P2(i) as PF; record the point on the reference image corresponding to the RLSUB(i, j) closest to the near-center point P3(i) as PN;
when AD ≥ BD, take the direction from the telecentric point P2(i) to PF as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PF by the telecentric distance AD in the adjustment direction;
when AD < BD, take the direction from the near-center point P3(i) to PN as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PN by the near-center distance BD in the adjustment direction.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if, within correction range A, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P2(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if, within correction range B, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P3(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if the corresponding area on RL(i) within correction range A contains no pest area other than the pest area corresponding to P2(i), the pest areas within correction range A on the reference image are deleted.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if the corresponding area on RL(i) within correction range B contains no pest area other than the pest area corresponding to P3(i), the pest areas within correction range B on the reference image are deleted.
In this technical scheme, the telecentric distance AD and the near-center distance BD are offset distances of the geometric centers of gravity of the pest areas, measured with the identification point as the reference. Correcting these offsets removes the strong interference with accurate pest-area identification that arises when plants in the reference image change locally in their natural state, improving pest-area identification accuracy; and because the processing uses only the current local image, the whole image does not need to be identified, which greatly improves identification speed and reduces computation cost.
To reduce the influence of some non-pest areas produced when plants change locally in their natural state, and to further improve the accuracy of pest-area positioning and correction, the invention provides the following preferred scheme:
preferably, the method for correcting the position of the pest area on the reference image within the value range of i comprises the following steps:
calculating pest position deviation index dev (i) of RL (i), wherein the specific method comprises the following steps:
dev(i) is given by the formula shown in the original only as an image (Figure BDA0004046301620000121), expressed through a degree-of-deviation function, itself shown only as an image (Figure BDA0004046301620000122), of the following quantities:
|P2(i) - RLSUB(i, j)|, the distance from the telecentric point P2(i) to RLSUB(i, j);
|P3(i) - RLSUB(i, j)|, the distance from the near-center point P3(i) to RLSUB(i, j);
Calculate the pest position deviation index dev(i) of every RL(i) within the value range of i, compute the mean of all dev(i) as meandev, and mark every identification partition RL(i) with dev(i) ≥ meandev as an identification partition to be optimized;
when RL(i) is an identification partition to be optimized, calculate, within the value range of j, the mean of the distances from the telecentric point P2(i) to each RLSUB(i, j) in RLSUB(i), recorded as the telecentric distance AD; calculate the mean of the distances from the near-center point P3(i) to each RLSUB(i, j), recorded as the near-center distance BD; record the point on the reference image corresponding to the RLSUB(i, j) closest to the telecentric point P2(i) as PF; record the point on the reference image corresponding to the RLSUB(i, j) closest to the near-center point P3(i) as PN;
when AD ≥ BD, take the direction from the telecentric point P2(i) to PF as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PF by the telecentric distance AD in the adjustment direction;
when AD < BD, take the direction from the near-center point P3(i) to PN as the adjustment direction, and move the pest area on the reference image whose geometric center of gravity is closest to PN by the near-center distance BD in the adjustment direction.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if, within correction range A, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P2(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if, within correction range B, the corresponding area on RL(i) contains pest areas other than the pest area corresponding to P3(i), these pest areas on RL(i) are copied to their corresponding positions on the reference image.
Preferably, when AD ≥ BD, the range on the reference image centered at P2(i) with radius equal to the line from P2(i) to PF is taken as correction range A; if the corresponding area on RL(i) within correction range A contains no pest area other than the pest area corresponding to P2(i), the pest areas within correction range A on the reference image are deleted.
Preferably, when AD < BD, the range on the reference image centered at P3(i) with radius equal to the line from P3(i) to PN is taken as correction range B; if the corresponding area on RL(i) within correction range B contains no pest area other than the pest area corresponding to P3(i), the pest areas within correction range B on the reference image are deleted.
In this preferred scheme, the pest position deviation index measures how far the telecentric point and the near-center point of the pest positions deviate from the identification point, so the difference between the positions of non-pest areas and pest areas is judged in a balanced way; according to this difference, pest areas can be intelligently added to and deleted from the reference image, the pest area positions can be accurately adjusted, and the influence of some non-pest areas produced when plants change locally in their natural state is reduced.
The system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle provided by the embodiment of the invention, as shown in fig. 2, comprises: a processor, a memory, and a computer program stored in the memory and executable on the processor. When executing the computer program, the processor implements the steps of the method embodiment described above; the processor executes the computer program in the following units of the system:
the map acquisition unit is used for acquiring an electronic map of the forest land to be monitored as a region to be monitored;
the route setting unit is used for setting the information of the starting point, the finishing point and the obstacle position of the unmanned aerial vehicle flying in the area to be monitored, and setting a route path of the unmanned aerial vehicle from the starting point to the finishing point in the area to be monitored;
the identifying and dividing unit is used for taking a plurality of points on the route path as identifying points at fixed acquisition intervals from the starting point;
the spectrum acquisition unit, used for making the unmanned aerial vehicle fly along the route path and continuously acquire spectral images of the region to be monitored, the spectral images taken in order forming the spectral image sequence, in which one spectral image is acquired at each identification point position;
The identification extraction unit is used for acquiring a sequence formed by all the spectrum images at all the identification point positions in the spectrum image sequence as an identification sequence;
the pest identification unit is used for acquiring pest areas of each spectrum image in the identification sequence according to an RX algorithm;
a pest correction unit for correcting the position of the pest area on the reference image by the recognition sequence;
and the map marking unit is used for marking the positions of the insect pest areas on the corrected reference image on the electronic map of the forest land to be monitored.
The system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle comprises a processor, a memory and a computer program; when the processor executes the computer program, the steps of the above method embodiment are realized. The system can run on computing devices such as desktop computers, notebook computers, palm computers and cloud data centers; a runnable system may include, but is not limited to, a processor, a memory and a server cluster.
It will be appreciated by those skilled in the art that this example merely illustrates the system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle and does not limit it; the system may include more or fewer components than in the example, or combine certain components, or use different components, and may, for example, further include input/output devices, network access devices, buses and the like.
Preferably, for the system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle provided by the embodiment of the invention, fig. 3 shows the electronic map of the forest land to be monitored in the embodiment, and fig. 4 shows the map-marked positions of the pest areas in the electronic map of the forest land to be monitored after identification by the system. Repeated tests show that the recognition rate of this embodiment for forestry pest areas reaches more than 95%, and comparison with manual identification shows that the positions of the pest areas are highly accurate.
The processor may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor or any conventional processor. The processor is the control center of the system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle remote sensing, and connects the various parts of the whole system through various interfaces and lines.
The memory can be used to store the computer program and/or modules, and the processor implements the various functions of the method for remotely monitoring and investigating forestry pests by unmanned aerial vehicle remote sensing by running or executing the computer program and/or modules stored in the memory and calling the data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the data storage area may store data created according to use (such as audio data, a phonebook, etc.). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
Although the present invention has been described in considerable detail with reference to several described embodiments, it is not intended to be limited to any such detail or embodiment or to any particular embodiment, and the description should be construed broadly so as to effectively cover the intended scope of the invention. Furthermore, the foregoing describes the invention in terms of embodiments foreseen by the inventors for the purpose of providing a useful description; insubstantial modifications of the invention not presently foreseen may nonetheless represent equivalents thereof.

Claims (5)

1. A method for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle, which is characterized by comprising the following steps:
S1: acquiring an electronic map of the forest land to be monitored as the area to be monitored;
S2: setting the starting point, end point and obstacle position information of the unmanned aerial vehicle in the area to be monitored, and setting a route path of the unmanned aerial vehicle from the starting point to the end point in the area to be monitored;
S3: starting from the starting point, taking a plurality of points on the route path at a fixed acquisition interval as identification points;
S4: flying the unmanned aerial vehicle along the route path and continuously acquiring spectral images of the area to be monitored, the spectral images in acquisition order forming a spectral image sequence in which one spectral image is acquired at each identification point position;
S5: acquiring, as the identification sequence, the sequence formed by the spectral images at all identification point positions in the spectral image sequence;
S6: obtaining the pest areas of each spectral image in the identification sequence according to the RX algorithm;
S7: correcting the positions of the pest areas on the reference image through the identification sequence;
S8: marking the positions of the pest areas on the corrected reference image on the electronic map of the forest land to be monitored;
wherein in S7, the method of correcting the positions of the pest areas on the reference image through the identification sequence comprises the following steps:
recording the identification sequence as RLoceal and taking each spectral image in RLoceal as an identification partition, where RLoceal = { RL(i) }, i is the index of the identification partition, i ∈ [1, N1], N1 is the number of identification partitions, and RL(i) is the i-th identification partition in the sequence RLoceal;
within the value range of i, taking the identification point corresponding to RL(i) as P1(i) and calculating the Euclidean distance from the geometric center-of-gravity point of each pest area in RL(i) to P1(i), the mean value of all these Euclidean distances in RL(i) being RLmean(i); selecting, as the telecentric point P2(i), the point on the reference image at the position corresponding to the geometric center-of-gravity point in RL(i) with the largest distance to P1(i), and selecting, as the near-center point P3(i), the point on the reference image at the position corresponding to the geometric center-of-gravity point in RL(i) with the smallest distance to P1(i);
screening out, from the reference image, the pest areas whose geometric center-of-gravity points are at a distance smaller than RLmean(i) from P2(i), the set formed by these pest areas being recorded as RLSUB(i); and/or screening out, from the reference image, the pest areas whose geometric center-of-gravity points are at a distance smaller than RLmean(i) from P2(i) and whose mean near-infrared band spectral reflectance mean(R) over all their pixel points is lower than the mean near-infrared band spectral reflectance mean(R) of all pixel points of the pest area corresponding to the telecentric point P2(i) in RL(i), the set formed by these pest areas being recorded as RLSUB(i);
taking N2 as the number of elements in RLSUB(i), where RLSUB(i, j) is the geometric center-of-gravity point of the j-th pest area in RLSUB(i), j is the index of the elements in RLSUB(i), and j ∈ [1, N2];
if N2 = 0, RL(i) is recorded as an identification partition requiring no correction; if N2 > 0, the position of the pest area at the position corresponding to RL(i) on the reference image is corrected within the value range of i;
the method for correcting, within the value range of i, the position of the pest area at the position corresponding to RL(i) on the reference image comprises the following steps:
within the value range of j, calculating the mean value of the distances from the telecentric point P2(i) to each RLSUB(i, j) in RLSUB(i) and recording it as the telecentric distance AD; calculating the mean value of the distances from the near-center point P3(i) to each RLSUB(i, j) and recording it as the near-center distance BD; recording as PF the point on the reference image corresponding to the RLSUB(i, j) closest to the telecentric point P2(i); recording as PN the point on the reference image corresponding to the RLSUB(i, j) closest to the near-center point P3(i);
when AD is greater than or equal to BD, the direction from the telecentric point P2(i) to PF is taken as the adjustment direction, and the pest area on the reference image whose geometric center-of-gravity point is closest to PF is moved by the telecentric distance AD in the adjustment direction;
when AD < BD, the direction from the near-center point P3(i) to PN is taken as the adjustment direction, and the pest area on the reference image whose geometric center-of-gravity point is closest to PN is moved by the near-center distance BD in the adjustment direction.
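As a reading aid for the S7 correction procedure of claim 1, the following is a hedged Python sketch of the correction applied to a single identification partition RL(i). It assumes the partition image and the reference image share one coordinate system, represents pest areas only by their geometric center-of-gravity points, and implements only the first RLSUB(i) screening criterion (centroid distance to P2(i) smaller than RLmean(i)); the data structures and the function name are illustrative, not from the patent.

```python
import numpy as np

def correct_partition(p1, rl_centroids, ref_pest_areas):
    """
    p1             : (2,) array, identification point P1(i)
    rl_centroids   : (M, 2) array, geometric center-of-gravity points of pest areas in RL(i)
    ref_pest_areas : list of dicts {"centroid": (2,) array, "offset": (2,) array}
                     pest areas on the reference image; "offset" accumulates corrections
    """
    if len(rl_centroids) == 0 or len(ref_pest_areas) == 0:
        return ref_pest_areas                       # nothing to correct

    d_to_p1 = np.linalg.norm(rl_centroids - p1, axis=1)
    rl_mean = d_to_p1.mean()                        # RLmean(i)
    p2 = rl_centroids[np.argmax(d_to_p1)]           # telecentric point P2(i)
    p3 = rl_centroids[np.argmin(d_to_p1)]           # near-center point P3(i)

    ref_centroids = np.array([a["centroid"] for a in ref_pest_areas])
    d_ref_to_p2 = np.linalg.norm(ref_centroids - p2, axis=1)
    rlsub_idx = np.where(d_ref_to_p2 < rl_mean)[0]  # RLSUB(i): screened pest areas
    if rlsub_idx.size == 0:
        return ref_pest_areas                       # N2 == 0: no correction needed

    rlsub = ref_centroids[rlsub_idx]
    ad = np.linalg.norm(rlsub - p2, axis=1).mean()  # telecentric distance AD
    bd = np.linalg.norm(rlsub - p3, axis=1).mean()  # near-center distance BD
    pf = rlsub[np.argmin(np.linalg.norm(rlsub - p2, axis=1))]  # PF
    pn = rlsub[np.argmin(np.linalg.norm(rlsub - p3, axis=1))]  # PN

    if ad >= bd:
        direction = (pf - p2) / (np.linalg.norm(pf - p2) + 1e-12)
        target = int(np.argmin(np.linalg.norm(ref_centroids - pf, axis=1)))
        ref_pest_areas[target]["offset"] = ref_pest_areas[target]["offset"] + direction * ad
    else:
        direction = (pn - p3) / (np.linalg.norm(pn - p3) + 1e-12)
        target = int(np.argmin(np.linalg.norm(ref_centroids - pn, axis=1)))
        ref_pest_areas[target]["offset"] = ref_pest_areas[target]["offset"] + direction * bd
    return ref_pest_areas
```

Accumulating the correction as an offset on the nearest reference pest area mirrors the claim's "move ... by the telecentric distance AD in the adjustment direction" step; the NIR-reflectance variant of the RLSUB(i) screening would add one extra comparison per candidate area.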
2. The method for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle according to claim 1, wherein in S6 the method for acquiring the pest areas of each spectral image in the identification sequence according to the RX algorithm is as follows: the spectral reflectance of each pixel point in the spectral image is input into the RX algorithm to identify the pest area at the position corresponding to the area to be monitored in the spectral image; the pest area is the area formed by pest pixel points, or the inner area of the edge line formed by pest pixel points; a pest pixel point is a pixel point in the spectral image whose visible-light reflectance is higher than the mean visible-light reflectance of all pixel points in the spectral image, or a pixel point whose near-infrared band spectral reflectance is lower than the mean near-infrared band spectral reflectance of all pixel points in the spectral image.
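Claim 2 names the RX algorithm but does not fix a particular formulation. The sketch below uses the standard global RX score (the Mahalanobis distance of each pixel spectrum to the image's background mean and covariance) together with the pest-pixel rule stated in the claim; the function names and the (H, W, B) cube layout are assumptions for illustration only.

```python
import numpy as np

def rx_scores(cube):
    """
    Global RX anomaly score for each pixel of a hyperspectral cube.
    cube: (H, W, B) array of per-band spectral reflectance.
    Returns an (H, W) array of Mahalanobis distances to the background statistics.
    """
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(float)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(b)   # regularised background covariance
    diff = x - mu
    scores = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
    return scores.reshape(h, w)

def pest_pixel_mask(visible, nir):
    """
    Pest-pixel rule from claim 2: visible reflectance above the image mean,
    OR near-infrared reflectance below the image mean.
    visible, nir: (H, W) arrays of band reflectance.
    """
    return (visible > visible.mean()) | (nir < nir.mean())
```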
3. The method for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle according to claim 1, wherein the unmanned aerial vehicle is a coaxial double-rotor unmanned aerial vehicle, a miniature rotor unmanned aerial vehicle or a multi-rotor unmanned aerial vehicle carrying a spectrometer; the spectrometer is a GaiaField-mini spectrometer, a Specim AFX series hyperspectral camera or an ATH9020 hyperspectral imager.
4. The method for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle according to claim 1, wherein the method for correcting the position of the pest area on the reference image within the value range of i is replaced by:
calculating the pest position deviation index dev(i) of RL(i), the specific method being as follows:
[formula for dev(i): given in the original only as an image, FDA0004231778950000021]
wherein the degree-of-deviation function is given in the original only as an image (FDA0004231778950000031);
|P2(i) − RLSUB(i, j)| is the distance from the telecentric point P2(i) to RLSUB(i, j);
|P3(i) − RLSUB(i, j)| is the distance from the near-center point P3(i) to RLSUB(i, j);
calculating the pest position deviation index dev(i) of every RL(i) within the value range of i, calculating the mean value of all dev(i) as meandev, and recording every identification partition RL(i) with dev(i) greater than or equal to meandev as an identification partition to be optimized;
when RL(i) is an identification partition to be optimized, calculating, within the value range of j, the mean value of the distances from the telecentric point P2(i) to each RLSUB(i, j) in RLSUB(i) and recording it as the telecentric distance AD; calculating the mean value of the distances from the near-center point P3(i) to each RLSUB(i, j) and recording it as the near-center distance BD; recording as PF the point on the reference image corresponding to the RLSUB(i, j) closest to the telecentric point P2(i); recording as PN the point on the reference image corresponding to the RLSUB(i, j) closest to the near-center point P3(i);
when AD is greater than or equal to BD, the direction from the telecentric point P2(i) to PF is taken as the adjustment direction, and the pest area on the reference image whose geometric center-of-gravity point is closest to PF is moved by the telecentric distance AD in the adjustment direction;
when AD < BD, the direction from the near-center point P3(i) to PN is taken as the adjustment direction, and the pest area on the reference image whose geometric center-of-gravity point is closest to PN is moved by the near-center distance BD in the adjustment direction.
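The deviation function in claim 4 is given only as an image in the original and is not reproduced here. The sketch below therefore assumes the dev(i) values have already been computed by that function and only illustrates the meandev thresholding that selects the identification partitions to be optimized; each selected partition would then receive the same AD/BD correction sketched after claim 1. The function name is an assumption for illustration.

```python
import numpy as np

def partitions_to_optimize(dev_values):
    """
    dev_values: dict {i: dev(i)} of pest-position deviation indices, computed
    with the patent's deviation function (given only as an image in the original).
    Returns the indices of the identification partitions with dev(i) >= meandev,
    i.e. the partitions marked "to be optimized".
    """
    meandev = float(np.mean(list(dev_values.values())))
    return [i for i, d in dev_values.items() if d >= meandev]
```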
5. A system for remotely monitoring and investigating forestry pests by unmanned aerial vehicle remote sensing, characterized in that the system comprises: a processor, a memory, and a computer program stored in the memory and running on the processor, wherein the processor, when executing the computer program, implements the steps of the method for remotely monitoring and investigating forestry pests by using an unmanned aerial vehicle according to any one of claims 1 to 4, and the system runs in a computing device such as a desktop computer, a notebook computer, a palm computer or a cloud data center.
CN202310029641.9A 2023-01-09 2023-01-09 Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring Active CN115909113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310029641.9A CN115909113B (en) 2023-01-09 2023-01-09 Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310029641.9A CN115909113B (en) 2023-01-09 2023-01-09 Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring

Publications (2)

Publication Number Publication Date
CN115909113A CN115909113A (en) 2023-04-04
CN115909113B true CN115909113B (en) 2023-06-16

Family

ID=86497082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310029641.9A Active CN115909113B (en) 2023-01-09 2023-01-09 Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring

Country Status (1)

Country Link
CN (1) CN115909113B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115662A (en) * 2023-09-25 2023-11-24 中国科学院空天信息创新研究院 Jujube tree spider mite pest identification method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103364781A (en) * 2012-04-11 2013-10-23 南京财经大学 Remote sensing data and geographical information system-based grainfield ground reference point screening method
CN115311588A (en) * 2022-09-16 2022-11-08 航天宏图信息技术股份有限公司 Pine wood nematode disease stumpage detection method and device based on unmanned aerial vehicle remote sensing image
WO2022257139A1 (en) * 2021-06-11 2022-12-15 深圳市大疆创新科技有限公司 Plant state determination method, terminal, and computer-readable storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7653218B1 (en) * 2006-05-02 2010-01-26 Orbimage Si Opco, Inc. Semi-automatic extraction of linear features from image data
CN103456011A (en) * 2013-09-02 2013-12-18 杭州电子科技大学 Improved hyperspectral RX abnormal detection method by utilization of complementary information
CN104266982B (en) * 2014-09-04 2017-03-15 浙江托普仪器有限公司 A kind of large area insect pest quantifies monitoring system
CN106915462A (en) * 2017-02-14 2017-07-04 福建兴宇信息科技有限公司 Forestry pests & diseases intelligent identifying system based on multi-source image information
CN107347849B (en) * 2017-07-18 2020-09-29 河海大学 Intelligent spraying system based on hyperspectral real-time detection technology
CN108693119B (en) * 2018-04-20 2020-09-25 北京麦飞科技有限公司 Intelligent pest and disease damage investigation and printing system based on unmanned aerial vehicle hyperspectral remote sensing
US11744168B2 (en) * 2019-11-29 2023-09-05 Soilmetrix, Inc. Enhanced management zones for precision agriculture
CN110472525B (en) * 2019-07-26 2021-05-07 浙江工业大学 Noise detection method for time series remote sensing vegetation index
CN113378912B (en) * 2021-06-08 2023-05-12 长光卫星技术股份有限公司 Forest illegal reclamation land block detection method based on deep learning target detection
CN114694023A (en) * 2022-03-18 2022-07-01 河南农业大学 Cotton aphid severity grading method based on spectral index data reconstruction
CN114936971A (en) * 2022-06-08 2022-08-23 浙江理工大学 Unmanned aerial vehicle remote sensing multispectral image splicing method and system for water area
CN115131683B (en) * 2022-08-25 2022-12-09 金乡县林业保护和发展服务中心(金乡县湿地保护中心、金乡县野生动植物保护中心、金乡县国有白洼林场) Forestry information identification method based on high-resolution remote sensing image

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103364781A (en) * 2012-04-11 2013-10-23 南京财经大学 Remote sensing data and geographical information system-based grainfield ground reference point screening method
WO2022257139A1 (en) * 2021-06-11 2022-12-15 深圳市大疆创新科技有限公司 Plant state determination method, terminal, and computer-readable storage medium
CN115311588A (en) * 2022-09-16 2022-11-08 航天宏图信息技术股份有限公司 Pine wood nematode disease stumpage detection method and device based on unmanned aerial vehicle remote sensing image

Also Published As

Publication number Publication date
CN115909113A (en) 2023-04-04

Similar Documents

Publication Publication Date Title
US10614562B2 (en) Inventory, growth, and risk prediction using image processing
Osco et al. A CNN approach to simultaneously count plants and detect plantation-rows from UAV imagery
Feng et al. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping
Iqbal et al. Gray level co-occurrence matrix (GLCM) texture based crop classification using low altitude remote sensing platforms
Calou et al. The use of UAVs in monitoring yellow sigatoka in banana
Jiménez-Brenes et al. Automatic UAV-based detection of Cynodon dactylon for site-specific vineyard management
US20170177938A1 (en) Automated detection of nitrogen deficiency in crop
JP5560157B2 (en) Spectral information extraction device
CN115909113B (en) Method for investigating forestry harmful organisms through unmanned aerial vehicle remote sensing monitoring
Ramesh et al. Detection of rows in agricultural crop images acquired by remote sensing from a UAV
WO2020208641A1 (en) Recurrent pattern image classification and registration
CA3138812A1 (en) Automatic crop classification system and method
Ferro et al. Technologies and innovative methods for precision viticulture: a comprehensive review
Khuzaimah et al. Application and potential of drone technology in oil palm plantation: Potential and limitations
Suresh Kumar et al. Selective fruit harvesting: Research, trends and developments towards fruit detection and localization–A review
CN117193347B (en) Unmanned aerial vehicle flight height control method and device, electronic equipment and storage medium
Zou et al. The fusion of satellite and unmanned aerial vehicle (UAV) imagery for improving classification performance
Anuar et al. Remote sensing for detection of ganoderma disease and bagworm infestation in oil palm
Badeka et al. Navigation route mapping for harvesting robots in vineyards using UAV-based remote sensing
Adão et al. UAS-based hyperspectral sensing methodology for continuous monitoring and early detection of vineyard anomalies
Marin et al. Individual Olive Tree Detection in RGB Images
Gromova Weed detection in UAV images of cereal crops with instance segmentation
Popescu et al. Orchard monitoring based on unmanned aerial vehicles and image processing by artificial neural networks: a systematic review
Kim et al. Deep Learning Performance Comparison Using Multispectral Images and Vegetation Index for Farmland Classification
Yu et al. Advancements in Utilizing Image-Analysis Technology for Crop-Yield Estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant