CN117745536B - Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles

Info

Publication number: CN117745536B (grant); earlier publication: CN117745536A
Application number: CN202311793744.9A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: fire, unmanned aerial, wire, aerial vehicle, area
Inventors: 李兴东, 王远朋, 岳妍, 孙龙, 林传营, 黄启超
Current and original assignee: Northeast Forestry University
Events: application filed by Northeast Forestry University; priority to CN202311793744.9A; publication of CN117745536A; application granted; publication of CN117745536B
Legal status: Active (granted)

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 40/00 — Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/10 — Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A 40/28 — Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Landscapes

  • Image Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A forest fire large-scale fire wire splicing method and system based on multiple unmanned aerial vehicles, relating to the field of large-scale forest fire monitoring. The invention addresses the problems that, when multiple unmanned aerial vehicles are used for monitoring, each vehicle monitors only its own region, data fusion among the vehicles is lacking, and unavoidable errors exist among the sensors carried by each vehicle. The key technical points are as follows: fire scene edge data are collected in blocks to obtain fire scene temperature infrared images containing time information, fire scene point clouds, and the coordinates of the inertial navigation system in the world coordinate system; the fire wire is extracted from the acquired fire scene infrared images; loss of the real fire wire caused by crown occlusion is detected so that the occluded real fire wire can be restored; local features of the fire wires are identified, each fire wire is simplified into a series of ordered feature regions, feature regions of two fire wires are matched, and the fire wire segments represented by the matched feature regions are compared for similarity to obtain homonymous fire wire segments and homonymous fire points; the Euclidean distance between all homonymous fire wire segments in the world coordinate system is defined as the error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses, thereby adjusting the positions of all observed fire wires and achieving fire wire splicing.

Description

Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles
Technical Field
The invention relates to the field of large-scale forest fire monitoring and research, in particular to a method and a system for splicing large-scale fire wires of forest fires based on multiple unmanned aerial vehicles.
Background
Once a forest fire occurs, the damage is enormous. Obtaining complete fire scene information in time and suppressing the fire at its key locations can reduce the destruction of forest resources and save human lives and property.
A single unmanned aerial vehicle has low monitoring efficiency and limited reliability, and can hardly meet the demand of comprehensive monitoring of a large-scale fire. The main advantages of multi-unmanned-aerial-vehicle collaborative monitoring of forest fires are as follows:
(1) When a fire occurs, collaborative monitoring by multiple unmanned aerial vehicles allows a wider range to be observed and covered at the same time, realizing comprehensive and fine monitoring of a large-scale fire and acquiring more accurate fire scene information;
(2) Multi-unmanned-aerial-vehicle collaborative monitoring allows fire scene information to be obtained rapidly, so that emergency response measures can be taken in time and the fire can be controlled effectively;
(3) The fire scene environment is complex and personnel are easily injured during fire wire inspection; using multiple unmanned aerial vehicles for collaborative inspection and monitoring keeps personnel out of dangerous areas and reduces the risk of casualties;
(4) The multi-unmanned-aerial-vehicle fire wire splicing technology, by carrying multiple sensors simultaneously, can cover a wider area and improve fire monitoring efficiency.
In a large-scale fire scenario, the monitoring range of a single unmanned aerial vehicle is limited; at the present stage a single vehicle cannot monitor a large-scale fire accurately and efficiently, nor cope with the complex fire scene environment. Collaborative monitoring by multiple unmanned aerial vehicles can effectively alleviate this problem.
However, when multiple unmanned aerial vehicles are used for monitoring, each vehicle monitors only its own region, data fusion among the vehicles is lacking, and unavoidable errors exist among the sensors carried by each vehicle. It is therefore highly desirable to provide a multi-unmanned-aerial-vehicle fire wire splicing technology that fuses the data acquired by the individual vehicles into a complete fire information map.
Prior art CN116222571A, an unmanned aerial vehicle cluster track planning method focusing on the forest fire edge (application CN202310129615.3), discloses a trajectory planning method for an unmanned aerial vehicle cluster that focuses on the forest fire edge. It first obtains an edge map of the forest fire after edge expansion and computes the convex polygon of the scanning range; the distance from each discrete point in the fire area to every side of that convex polygon is obtained, and the shortest such distance is used as the importance index of the point. An iso-importance contour map is drawn from this index, different areas of the map are weighted, and the spacing between adjacent scanning tracks of the unmanned aerial vehicle in each area is determined from the area weight, which raises the side overlap rate of scanning in the edge area and thus improves the precision of the aerial images. A multi-unmanned-aerial-vehicle scanning range allocation function and a three-dimensional track cost function are then constructed, and a particle swarm algorithm together with a genetic simulated annealing algorithm is used to solve for the optimal position of the range-segmentation rays and the optimal three-dimensional trajectory. This prior art concerns how to photograph the fire wire through unmanned aerial vehicle trajectory planning; it is a trajectory planning method and does not involve data processing. It does not describe how to fuse the data acquired by multiple unmanned aerial vehicles into a complete fire information map, nor how to reduce the errors of the sensors mounted on each vehicle.
Disclosure of Invention
The invention aims to solve the technical problems that:
The invention aims to solve the following problems: at the present stage a single unmanned aerial vehicle cannot monitor a large-scale fire accurately and efficiently, and even when multiple unmanned aerial vehicles are used, each vehicle monitors only its own region, data fusion among the vehicles is lacking, and unavoidable errors exist among the sensors carried by each vehicle. The invention therefore provides a forest fire large-scale fire wire splicing method and system based on multiple unmanned aerial vehicles.
The invention adopts the technical scheme for solving the technical problems:
The invention provides a forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles, which splices multiple fire wires by optimizing the poses of the unmanned aerial vehicles and comprises the following steps:
Step one: the method comprises the steps that a plurality of unmanned aerial vehicles carrying infrared cameras, laser radars and inertial navigation systems are used for carrying out block acquisition on fire scene edge data, and a fire scene temperature infrared image containing time information, a fire scene point cloud and coordinates of the inertial navigation systems under a world coordinate system are obtained;
step two: extracting a fire wire by utilizing the fire field infrared image obtained in the first step;
step three: detecting the loss of a real fire wire caused by the shielding of a crown, and restoring the shielded real fire wire;
Step four: converting the live wire from a camera coordinate system to a laser radar coordinate system to obtain a three-dimensional live wire, identifying local characteristics of the live wire, simplifying the live wire into a series of ordered characteristic region representations, carrying out characteristic region matching on two live wires, and carrying out similarity comparison on live wire fragments represented by the matched characteristic regions to obtain homonymous live wire fragments and homonymous fire points;
Step five: converting the fire wire observed by each unmanned aerial vehicle into the world coordinate system according to the pose of each vehicle in the world coordinate system provided by the inertial navigation system; the Euclidean distance between all homonymous fire wire segments in the world coordinate system is defined as the error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses [T_1, …, T_n], so that the positions of all observed fire wires are adjusted and fire wire splicing is achieved.
The invention provides a forest fire large-scale live wire splicing method based on multiple unmanned aerial vehicles, which comprises the following specific technical means:
Multiple unmanned aerial vehicles carrying infrared cameras, laser radars and inertial navigation systems collect fire scene edge data in blocks, obtaining fire scene temperature infrared images containing time information, fire scene point clouds and the coordinates of the inertial navigation system in the world coordinate system. Each pixel of the fire scene infrared image carries temperature information; the data obtained during monitoring can be transmitted to a ground station in real time, where data processing is performed;
In the invention, fire scene data are acquired mainly by multiple unmanned aerial vehicles in blocks, as shown in fig. 2. Each unmanned aerial vehicle collects, in its own block, the fire scene area containing part of the fire wire; for each vehicle, the three-dimensional coordinates are obtained through inertial navigation and the fire scene infrared image through the infrared camera.
The devices have time differences during data acquisition; fire wire splicing errors caused by mismatched data times are avoided by time stamp alignment. The time stamp alignment method is as follows: taking the time of a mobile phone as the reference, the difference between each device's time and the reference is recorded, and the acquisition times of each device's data are shifted by that difference so that all data refer to the same time.
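For illustration only, a minimal Python sketch of the time stamp alignment described above; the record layout, field names and offset values are assumptions, not part of the patent:

```python
from datetime import timedelta

def align_timestamps(records, device_offsets):
    """Shift each record's timestamp onto the reference (phone) clock.

    records: list of dicts with 'device', 'timestamp' (datetime) and payload fields.
    device_offsets: dict mapping device id -> timedelta (device clock minus
                    reference clock), measured once before takeoff.
    """
    aligned = []
    for rec in records:
        offset = device_offsets[rec["device"]]
        aligned.append(dict(rec, timestamp=rec["timestamp"] - offset))
    return aligned

# Hypothetical offsets: the lidar clock runs 2.5 s ahead of the phone reference,
# the infrared camera 0.8 s ahead.
offsets = {"ir_camera": timedelta(seconds=0.8), "lidar": timedelta(seconds=2.5)}
```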
Step two: extracting a fire wire by utilizing the fire field infrared image obtained in the first step;
In the invention, for the captured fire scene infrared image, the fire wire area is located rapidly; the edge fire points are then found by pixel-by-pixel identification within the marked areas and connected into the fire wire. The specific process is as follows:
Step 2.1: apply contrast stretching to the collected fire scene infrared image to enhance the contrast between the fire area and the surrounding environment;
Step 2.2: convert the contrast-stretched infrared image into a gray image;
Step 2.3: traverse the gray image in units of n×n pixel regions with a sliding step of n in both directions. For each n×n region, only the outermost ring of pixels is examined; taking the pixel value at the upper-left corner of the region as the reference, the difference between each peripheral pixel value and the reference is computed, and a difference threshold interval of (-110, 110) is set. If any pixel difference falls outside this interval, the n×n region is judged to contain both a burning area and an unburned area, i.e., a fire wire is present, and the region is marked;
Step 2.4: among all marked regions, if a marked region is not connected to any other marked region, the burning area inside it is treated as noise and the region is unmarked;
Step 2.5: denote by c the mean pixel value of all pixels in the marked regions. Within a marked region, if a pixel's value is greater than c and its four neighboring pixels contain both values greater than c and values smaller than c, that pixel is a fire scene edge fire point; all pixels in the marked regions are traversed in this way to find the fire scene edge fire points;
Step 2.6: connect the fire scene edge fire points with Catmull-Rom curve fitting to obtain the fire wire;
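For illustration, a minimal NumPy sketch of the block scan and edge-point test of steps 2.3–2.5; the block size n = 16, the helper names and the array layout are assumptions, while the (-110, 110) interval and the neighbourhood rule come from the text above:

```python
import numpy as np

def mark_fire_wire_blocks(gray, n=16, lo=-110, hi=110):
    """Mark n x n blocks whose outer ring contains pixels far from the upper-left reference."""
    h, w = gray.shape
    marked = np.zeros((h // n, w // n), dtype=bool)
    for bi in range(h // n):
        for bj in range(w // n):
            block = gray[bi * n:(bi + 1) * n, bj * n:(bj + 1) * n].astype(np.int32)
            ring = np.concatenate([block[0, :], block[-1, :], block[:, 0], block[:, -1]])
            diff = ring - int(block[0, 0])          # reference: upper-left pixel of the block
            marked[bi, bj] = np.any((diff <= lo) | (diff >= hi))
    return marked

def edge_fire_points(gray, marked, n=16):
    """Within marked blocks, a pixel is an edge fire point if it is brighter than the
    marked-region mean c and its 4-neighbourhood has values on both sides of c."""
    mask = np.kron(marked.astype(np.uint8), np.ones((n, n), dtype=np.uint8)).astype(bool)
    ys, xs = np.where(mask)
    c = gray[ys, xs].mean()
    points = []
    for y, x in zip(ys, xs):
        if not (0 < y < gray.shape[0] - 1 and 0 < x < gray.shape[1] - 1):
            continue
        nb = [gray[y - 1, x], gray[y + 1, x], gray[y, x - 1], gray[y, x + 1]]
        if gray[y, x] > c and min(nb) < c < max(nb):
            points.append((x, y))
    return points
```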
step three: detecting the loss of a real fire wire caused by the shielding of a crown, and restoring the shielded real fire wire;
In the invention, the complex fire scene environment is fully considered: when an occluding object is present, the real fire wire cannot be observed (as shown in fig. 4). The fire wire in the occluded area is filled in by combining the coordinate change of the fire points on the two sides of the occluded area relative to the previous image with a deep-learning prediction of the fire wire in the occluded area. The specific process is as follows:
Step 3.1: obtain the temperature value of every fire point on the fire wire from the infrared image, and set an appropriate threshold interval for the fire scene edge temperature based on the temperature difference between the fire scene edge and the fire scene interior. If the temperatures of all fire points on the fire wire lie inside the interval, there is no occlusion; if the fire point temperatures on a fire wire segment lie outside the interval, that segment is the boundary between the fire scene interior and the occluding object, and the real fire wire position is occluded;
Step 3.2: if the real fire wire is occluded at time t, connect the fire points at the two ends of the occluded area with a straight line. If the angle between this line and the horizontal direction of the pixel coordinate system is no greater than 45°, the ordinate of the fire wire at time t−Δt is taken as the reference and the change of the ordinate of the fire wire in the occluded area over Δt is computed; if the angle is greater than 45°, the abscissa is taken as the reference and the change of the abscissa over Δt is computed. The real fire wire is restored (fire wire compensation) from this change. Here Δt is the shooting interval, and at time t−Δt the fire wire is not occluded. The following fire wire completion steps take the case of an angle no greater than 45° as the example (when the angle exceeds 45°, the ordinate is simply replaced by the abscissa);
Step 3.3: register the fire wire image at time t−Δt onto the fire wire image at time t, and take 30 fire points on each outer side of the occluded area, i.e., 60 fire points in total, each corresponding to the fire point in the same pixel column at time t−Δt. Compute according to formula (1) the mean of the ordinate differences of the corresponding fire points of the two fire wires in the same columns and denote it d_a, where y_i^t is the ordinate of the fire point in the i-th column at time t and y_i^{t−Δt} is the ordinate of the fire point in the i-th column at time t−Δt;
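The body of formula (1) did not survive extraction; based on the definitions above, a plausible reconstruction is given below (the symbol names y_i are assumptions of this sketch):

```latex
% Reconstruction of formula (1): mean ordinate shift of the 60 fire points
% (30 on each outer side of the occluded area) between times t and t - \Delta t.
d_a = \frac{1}{60}\sum_{i=1}^{60}\left(y_i^{t} - y_i^{t-\Delta t}\right) \tag{1}
```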
Step 3.4: collect fire scene terrain variables (elevation, slope, aspect, combustible type) and weather variables (wind speed, wind direction, temperature, humidity) together with unoccluded fire wire images as a data set.
The data are acquired as follows:
elevation: the unmanned aerial vehicle collects point cloud data, or a terrain file in tif format is downloaded;
slope and aspect: the elevation data are converted into a DEM-format file and exported with ArcGIS;
combustible type: three types (grass, shrub, fallen leaves), collected on site;
wind speed and wind direction: global wind data are measured with an anemometer or obtained from a meteorological website; local wind data are obtained with WindNinja software;
temperature and humidity: obtained with a miniature weather station or from a meteorological website;
Step 3.5: construct a convolutional neural network model that takes the fire wire, weather variables and terrain variables at the current time as input and predicts the fire wire at the next time, and train and validate it with the data set;
Step 3.6: with the trained convolutional neural network model, input the fire wire image, weather variables and terrain variables at time t−Δt to obtain the predicted fire wire at time t;
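The patent does not specify the network architecture; the following PyTorch sketch only illustrates one way such a predictor could be organized (the channel counts, layer sizes and input encoding are assumptions): the fire wire at time t−Δt is rasterized as a mask channel and stacked with terrain and weather channels, and the network outputs a mask for the predicted fire wire at time t.

```python
import torch
import torch.nn as nn

class FireWirePredictor(nn.Module):
    """Predict the next-time fire wire mask from the current mask plus terrain
    channels (elevation, slope, aspect, fuel type) and weather channels (wind
    speed, wind direction, temperature, humidity) broadcast over the image grid:
    1 + 4 + 4 = 9 input channels (an assumed encoding)."""
    def __init__(self, in_channels: int = 9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),   # logits of the predicted fire wire mask
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Usage sketch: a batch of 9-channel 256x256 rasters -> predicted mask logits.
model = FireWirePredictor()
x = torch.randn(2, 9, 256, 256)
pred_logits = model(x)                                   # shape (2, 1, 256, 256)
loss = nn.BCEWithLogitsLoss()(pred_logits, torch.zeros_like(pred_logits))
```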
Step 3.7: compute, according to formula (2), the mean of the ordinate differences between the fire wire at time t and the predicted fire wire at time t for the 30 fire points on each outer side of the occluded area, and denote it d_b, where y_i^t is the ordinate of the fire point in the i-th column at time t and ŷ_i^t is the ordinate of the fire point in the i-th column of the predicted fire wire at time t;
Step 3.8: compute Δd_i according to formula (3), where d_i is the ordinate difference between the fire wire at time t and the fire wire at time t−Δt for the fire point in the i-th column; for the occluded area of the fire wire at time t, shift the ordinate of the corresponding column fire point of the fire wire at time t−Δt by Δd_i to obtain the fire points that fill in the occluded area;
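The body of formula (2) likewise did not survive extraction; from the definitions above a plausible reconstruction is given below (the symbols y_i and \hat{y}_i are assumptions). The exact form of formula (3), which combines d_a, d_b and d_i into the fill-in offset Δd_i, cannot be recovered from the text and is therefore not reconstructed here.

```latex
% Reconstruction of formula (2): mean ordinate gap between the observed fire wire
% at time t and the CNN-predicted fire wire at time t, over the same 60 points.
d_b = \frac{1}{60}\sum_{i=1}^{60}\left(y_i^{t} - \hat{y}_i^{\,t}\right) \tag{2}
```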
Step 3.9: connect the filled-in fire points with Catmull-Rom curve fitting to obtain the fire wire;
Step four: converting the live wire from a camera coordinate system to a laser radar coordinate system to obtain a three-dimensional live wire, identifying local characteristics of the live wire, simplifying the live wire into a series of ordered characteristic region representations, carrying out characteristic region matching on two live wires, and carrying out similarity comparison on live wire fragments represented by the matched characteristic regions to obtain homonymous live wire fragments and homonymous fire points;
Step 4.1: let the coordinates of a fire point observed by the i-th unmanned aerial vehicle in the infrared image be (u, v); the coordinates of this fire point in the lidar coordinate system are obtained by formula (4), where (R_CL)_i and (t_CL)_i are the rotation matrix and translation vector between the infrared camera coordinate system and the lidar coordinate system of the i-th unmanned aerial vehicle, and M_i is the intrinsic matrix of the infrared camera of the i-th unmanned aerial vehicle.
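The body of formula (4) is missing; a standard pinhole back-projection consistent with the definitions above would be the following (the direction of (R_CL)_i and the depth Z_c, supplied in practice by the registered lidar point cloud, are assumptions of this sketch):

```latex
% Assumed reconstruction of formula (4): pixel (u, v) back-projected into the
% lidar frame, with Z_c the depth of the fire point in the camera frame.
P_{L} = (R_{CL})_i^{-1}\!\left( Z_c\, M_i^{-1}\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} - (t_{CL})_i \right) \tag{4}
```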
Step 4.2: after the fire wire is transferred from the camera coordinate system to the lidar coordinate system, a series of discrete three-dimensional fire points is obtained; the fire points are ordered clockwise and denoted P_1, P_2, …, P_n, and a cubic B-spline curve fit turns them into the fire wire;
Step 4.3: detect the slope of the fire wire segment between every pair of adjacent fire points. Taking fire points P_{i-1}, P_i, P_{i+1} as an example, if the slopes of the two fire wire segments P_{i-1}~P_i and P_i~P_{i+1} in the XY plane differ (which appears as a turning point of the fire wire), the fire wire segment formed by P_{i-1}~P_{i+1} is called a feature region of the fire wire and P_i is called the feature point of that region; the fire wire is thereby simplified into a series of ordered feature regions;
Step 4.4: given that P_{i-1}~P_{i+1} is a feature region, project this fire wire segment onto the XY, XZ and YZ planes of the coordinate system; in each coordinate plane, connect P_{i-1} with P_i and P_i with P_{i+1} by straight lines, and record the angle between the two projected lines as a feature angle of the feature region, so that each feature region has three feature angles;
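A small NumPy sketch of the feature-region and feature-angle computation of steps 4.3–4.4 (the function names and the slope tolerance are assumptions for illustration):

```python
import numpy as np

def projected_angle(p_prev, p_mid, p_next, axes):
    """Angle (degrees) at p_mid between the two segments, after projecting the
    three 3D points onto the coordinate plane spanned by the given axes."""
    a = (p_prev - p_mid)[list(axes)]
    b = (p_next - p_mid)[list(axes)]
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def feature_regions(points, slope_tol=1e-3):
    """points: (n, 3) array of clockwise-ordered fire points.
    Returns (index of feature point P_i, feature angles on XY, XZ, YZ planes)."""
    regions = []
    for i in range(1, len(points) - 1):
        p0, p1, p2 = points[i - 1], points[i], points[i + 1]
        s1 = (p1[1] - p0[1]) / (p1[0] - p0[0] + 1e-12)   # XY-plane slopes
        s2 = (p2[1] - p1[1]) / (p2[0] - p1[0] + 1e-12)
        if abs(s1 - s2) > slope_tol:                     # turning point -> feature region
            angles = tuple(projected_angle(p0, p1, p2, ax)
                           for ax in ((0, 1), (0, 2), (1, 2)))  # XY, XZ, YZ
            regions.append((i, angles))
    return regions
```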
Step 4.5: for two adjacent unmanned aerial vehicles, compare the two fire wires L_1 and L_2 acquired at the same time. For a feature region in L_1 and a feature region in L_2, compute the feature angle differences on the three corresponding planes; an angle difference threshold is set, and if none of the three feature angle differences exceeds the threshold, the two feature regions form a matched feature region pair.
Step 4.6: denote by l_1 the fire wire segment connecting all matched feature regions in L_1 and by l_2 the fire wire segment connecting all matched feature regions in L_2. Project l_1 and l_2 onto the XY, XZ and YZ planes; in each plane, overlay the start points and end points of the two segments, compute the Fréchet distance between them, and choose an appropriate distance threshold. If the Fréchet distance is smaller than the threshold in all three planes, the two segments are fire wires of the same photographed area and are called homonymous fire wire segments; otherwise the matched feature regions at the two ends of the segments are mismatched, the matching relation between the end feature regions of l_1 and l_2 is removed, and the remaining fire wire segments are compared again until two homonymous fire wire segments satisfying the condition are obtained. The corresponding feature points on the two homonymous fire wire segments are called homonymous fire points;
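The patent does not give an algorithm for the Fréchet distance; the discrete Fréchet distance below (a standard dynamic-programming formulation, used here only as a stand-in, with the start/end overlay step omitted) illustrates the comparison of step 4.6:

```python
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Frechet distance between two polylines P and Q, given as (n, 2)
    arrays of points already projected onto one coordinate plane."""
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)  # pairwise distances
    ca = np.full((n, m), -1.0)
    ca[0, 0] = d[0, 0]
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]), d[i, j])
    return ca[-1, -1]

def is_homonymous(seg1, seg2, dist_threshold):
    """seg1, seg2: (n, 3) fire wire segments; compare their XY, XZ, YZ projections."""
    planes = ((0, 1), (0, 2), (1, 2))
    return all(discrete_frechet(seg1[:, ax], seg2[:, ax]) < dist_threshold for ax in planes)
```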
Step 4.7: compare the fire wires photographed at the same time by every pair of adjacent unmanned aerial vehicles according to steps 4.3 to 4.6 to find the homonymous fire wires and homonymous fire points;
Step five: converting the fire wire observed by each unmanned aerial vehicle into the world coordinate system according to the pose of each vehicle in the world coordinate system provided by the inertial navigation system; the Euclidean distance between all homonymous fire wire segments in the world coordinate system is defined as the error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses [T_1, …, T_n], so that the positions of all observed fire wires are adjusted and fire wire splicing is achieved.
Step 5.1: according to the pose relation between the lidar and the inertial navigation system on each unmanned aerial vehicle and the initial pose of each vehicle provided by the inertial navigation system, convert the local fire wires observed by the N unmanned aerial vehicles into the global coordinate system.
The coordinates in the inertial coordinate system of a fire point given in the lidar coordinate system are obtained through formula (5), where (R_IL)_i and (t_IL)_i are the rotation matrix and translation vector from the lidar to the inertial navigation system of the i-th unmanned aerial vehicle.
The coordinates of the fire point in the world coordinate system are obtained by formula (6), where (R_WI)_i and (t_WI)_i constitute the pose of the i-th unmanned aerial vehicle in the world coordinate system.
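The bodies of formulas (5) and (6) are missing; the rigid-body transforms implied by the definitions above are presumably:

```latex
% Assumed reconstruction of formulas (5) and (6): lidar -> inertial -> world.
P_{I} = (R_{IL})_i\, P_{L} + (t_{IL})_i \tag{5}
P_{W} = (R_{WI})_i\, P_{I} + (t_{WI})_i \tag{6}
```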
Step 5.2: define the Euclidean distance between all homonymous fire wire segments in the world coordinate system as the global error, and construct the mathematical model shown in formula (7) to describe global fire wire splicing as an unmanned aerial vehicle pose optimization problem, where N is the number of unmanned aerial vehicles, T_i is the pose of the i-th unmanned aerial vehicle, n is the number of homonymous fire points on the homonymous fire wire segments observed by the i-th and (i+1)-th unmanned aerial vehicles, and P_ij is the coordinate of the j-th homonymous fire point observed by the i-th unmanned aerial vehicle.
Step 5.3: express the poses with Lie algebra; the objective function is given by formula (8), where ξ_i represents the (Lie-algebra) pose of the i-th unmanned aerial vehicle and p_ij represents the coordinate of the j-th homonymous fire point observed by the i-th unmanned aerial vehicle;
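The bodies of formulas (7) and (8) are missing. Given the description, the natural model is to minimize the squared distances between homonymous fire points observed by adjacent vehicles over the vehicle poses; the reconstruction below, including the use of ξ for the Lie-algebra pose (rendered as "epsilon" in the extracted text), is an assumption:

```latex
% Assumed reconstruction of formula (7): pose optimization over T_1, ..., T_N.
\min_{T_1,\dots,T_N}\; \sum_{i=1}^{N-1}\sum_{j=1}^{n}
  \left\| T_i P_{ij} - T_{i+1} P_{(i+1)j} \right\|_2^{2} \tag{7}

% Assumed reconstruction of formula (8): the same objective with Lie-algebra poses.
\min_{\xi_1,\dots,\xi_N}\; \sum_{i=1}^{N-1}\sum_{j=1}^{n}
  \left\| \exp(\xi_i^{\wedge})\, p_{ij} - \exp(\xi_{i+1}^{\wedge})\, p_{(i+1)j} \right\|_2^{2} \tag{8}
```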
Step 5.4: iterate the unmanned aerial vehicle poses with a genetic algorithm until a set of poses is found for which the minimum value of the objective function is smaller than a given threshold ε_L. Here ε_L = 0.1·n_all (in metres), where n_all is the total number of homonymous fire point pairs, so that the threshold corresponds to an average distance between homonymous fire points of less than 0.1 m. The genetic algorithm itself is within the scope of the prior art.
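Since the patent leaves the genetic algorithm to the prior art, the sketch below only shows one simple way such a search over the stacked pose parameters could be run (the 6-DoF rotation-vector parameterization, population settings and helper names are assumptions, not the patent's implementation):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def apply_pose(params, pts):
    """params: (6,) = [rx, ry, rz, tx, ty, tz] (rotation vector + translation)."""
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    return pts @ R.T + params[3:]

def total_error(all_params, homonymous_pairs):
    """all_params: (N, 6) poses; homonymous_pairs[i] = (pts_i, pts_ip1), the homonymous
    fire points observed by UAV i and UAV i+1 (each an (n, 3) array)."""
    err = 0.0
    for i, (a, b) in enumerate(homonymous_pairs):
        err += np.sum(np.linalg.norm(apply_pose(all_params[i], a) -
                                     apply_pose(all_params[i + 1], b), axis=1) ** 2)
    return err

def genetic_search(init_params, pairs, eps_L, pop=60, gens=500, sigma=0.05, rng=None):
    """Minimal GA: elitist selection, averaging crossover, Gaussian mutation."""
    rng = rng or np.random.default_rng(0)
    population = init_params[None] + sigma * rng.standard_normal((pop, *init_params.shape))
    for _ in range(gens):
        scores = np.array([total_error(p, pairs) for p in population])
        order = np.argsort(scores)
        if scores[order[0]] < eps_L:
            return population[order[0]]
        elite = population[order[:pop // 4]]                              # selection
        parents = elite[rng.integers(0, len(elite), (pop, 2))]
        children = (parents[:, 0] + parents[:, 1]) / 2                    # crossover
        population = children + sigma * rng.standard_normal(children.shape)  # mutation
    return population[np.argmin([total_error(p, pairs) for p in population])]
```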
Step 5.5: based on the optimized unmanned aerial vehicle poses, apply the transformation of formula (6) to the local fire wire observed by each unmanned aerial vehicle, realizing global splicing of the fire wires.
Compared with the prior art, the invention has the beneficial effects that:
According to the forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles, when a large-scale fire occurs, multiple unmanned aerial vehicles monitor the outer contour of the fire scene in separate regions, and the information acquired by all vehicles is then fused to obtain a complete global fire wire. The invention establishes an objective function as the constraint and corrects the observed fire wires by optimizing the pose of each unmanned aerial vehicle, so that large-scale fire wire splicing is realized accurately and the evolution of the fire scene is shown intuitively;
The invention fully considers the complexity of the fire scene environment: for the occluded part of a fire wire, the occluded fire wire is restored by combining the real fire wire spreading distance with deep-learning prediction. In view of the characteristics of fire wires, local feature regions of the fire wires are identified, corresponding feature regions of two fire wires are found from the geometric characteristics of the regions, and homonymous fire wire segments and homonymous fire points are obtained from the shape similarity of the fire wire segments.
The multi-unmanned-aerial-vehicle fire wire splicing technology provided by the invention can fuse the data acquired by multiple unmanned aerial vehicles into a complete fire information map. By analyzing this map, the scale, position, combustion state and other conditions of the fire can be comprehensively understood, helping researchers make timely and accurate analyses.
The invention fuses the fire scene edge data monitored by multiple unmanned aerial vehicles into a complete three-dimensional fire wire in the world coordinate system (the fire scene edge fire wires measured by the vehicles are spliced together into a complete three-dimensional fire wire). Each unmanned aerial vehicle carries an infrared camera, a lidar and an inertial navigation system and obtains fire scene temperature infrared images containing time information, fire scene point clouds and the coordinates of the inertial navigation system in the world coordinate system. Through extrinsic calibration between the infrared camera and the lidar and between the lidar and the inertial navigation system, the fire wire is transferred from the two-dimensional pixel coordinate system to the three-dimensional lidar coordinate system and then to the world coordinate system. The invention identifies a technical problem not addressed in the prior art, provides an effective solution, and realizes global splicing of fire wires to present the evolution of the fire scene intuitively, so that a large-scale fire is monitored accurately and efficiently.
Drawings
Fig. 1 is a flow chart (technical roadmap) of the forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles; Fig. 2 is a schematic view of multiple unmanned aerial vehicles photographing the fire scene; Fig. 3 is a diagram of the n×n pixel-region search for the fire wire area;
Fig. 4 is a schematic diagram of fire wire occlusion; Fig. 5 shows the result of optimizing FARSITE-simulated fire wires with the multi-unmanned-aerial-vehicle optimization method of the invention: (a) the three observed fire wires before optimization, (b) the spliced result after optimization; Fig. 6 is a simulated fire wire obtained with the FARSITE module in FlamMap; Fig. 7 is the result of optimizing the FARSITE-simulated fire wire with the multi-unmanned-aerial-vehicle optimization method.
Detailed Description
In the description of the present invention, it should be noted that the terms mentioned in the embodiments are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "a first", "a second" or "a third" may explicitly or implicitly include one or more such features.
In order that the above objects, features and advantages of the present invention will be readily understood, a more particular description of the invention will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings.
The specific embodiment I is as follows: as shown in fig. 1, the invention provides a forest fire large-scale live wire splicing method based on multiple unmanned aerial vehicles, and a flow chart (technical route diagram) of the method is shown in fig. 1, and mainly comprises the following steps:
Step one: the method comprises the steps that a plurality of unmanned aerial vehicles carrying infrared cameras, laser radars and inertial navigation systems are used for carrying out block acquisition on fire scene edge data, and a fire scene temperature infrared image containing time information, a fire scene point cloud and coordinates of the inertial navigation systems under a world coordinate system are obtained;
step two: extracting a fire wire by utilizing the fire field infrared image obtained in the first step;
step three: detecting the loss of a real fire wire caused by the shielding of a crown, and restoring the shielded real fire wire;
Step four: converting the live wire from a camera coordinate system to a laser radar coordinate system to obtain a three-dimensional live wire, identifying local characteristics of the live wire, simplifying the live wire into a series of ordered characteristic region representations, carrying out characteristic region matching on two live wires, and carrying out similarity comparison on live wire fragments represented by the matched characteristic regions to obtain homonymous live wire fragments and homonymous fire points;
Step five: converting the fire wire observed by each unmanned aerial vehicle into the world coordinate system according to the pose of each vehicle in the world coordinate system provided by the inertial navigation system; the Euclidean distance between all homonymous fire wire segments in the world coordinate system is defined as the error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses [T_1, …, T_n], so that the positions of all observed fire wires are adjusted and fire wire splicing is achieved.
Because the fire scene environment is complex, the unmanned aerial vehicles must have good anti-interference performance and stability. Multiple unmanned aerial vehicles carrying infrared cameras, laser radars, inertial navigation systems, industrial personal computers and other equipment fly to 25-30 meters above their designated fire scene areas and hover there, collecting infrared images and point clouds of the fire scene edge area. The extrinsic calibration between the infrared camera, the lidar and the inertial navigation system and the intrinsic calibration of the infrared camera are completed before the unmanned aerial vehicles take off.
The devices have time differences during data acquisition; fire wire splicing errors caused by mismatched data times are avoided by time stamp alignment. The time stamp alignment method is as follows: taking the time of a mobile phone as the reference, the difference between each device's time and the reference is recorded, and the acquisition times of each device's data are shifted by that difference so that all data refer to the same time.
And a specific embodiment II: the specific process of live wire extraction in the second step is as follows:
Step 2.1: apply contrast stretching to the collected fire scene infrared image to enhance the contrast between the fire area and the surrounding environment;
Step 2.2: convert the contrast-stretched infrared image into a gray image;
Step 2.3: traverse the gray image in units of n×n pixel regions with a sliding step of n in both directions. For each n×n region, only the outermost ring of pixels is examined; taking the pixel value at the upper-left corner of the region as the reference, the difference between each peripheral pixel value and the reference is computed, and a difference threshold interval of (-110, 110) is set. If any pixel difference falls outside this interval, the n×n region is judged to contain both a burning area and an unburned area, i.e., a fire wire is present, and the region is marked;
Step 2.4: among all marked regions, if a marked region is not connected to any other marked region, the burning area inside it is treated as noise and the region is unmarked;
Step 2.5: denote by c the mean pixel value of all pixels in the marked regions. Within a marked region, if a pixel's value is greater than c and its four neighboring pixels contain both values greater than c and values smaller than c, that pixel is a fire scene edge fire point; all pixels in the marked regions are traversed in this way to find the fire scene edge fire points;
Step 2.6: connect the fire scene edge fire points with Catmull-Rom curve fitting to obtain the fire wire;
And a third specific embodiment: the method for detecting the loss of the real fire wire caused by the shielding of the tree crown comprises the following specific processes of:
Step 3.1: obtain the temperature value of every fire point on the fire wire from the infrared image, and set an appropriate threshold interval for the fire scene edge temperature based on the temperature difference between the fire scene edge and the fire scene interior. If the temperatures of all fire points on the fire wire lie inside the interval, there is no occlusion; if the fire point temperatures on a fire wire segment lie outside the interval, that segment is the boundary between the fire scene interior and the occluding object, and the real fire wire position is occluded;
Step 3.2: if the real fire wire is occluded at time t, connect the fire points at the two ends of the occluded area with a straight line. If the angle between this line and the horizontal direction of the pixel coordinate system is no greater than 45°, the ordinate of the fire wire at time t−Δt (Δt is the shooting interval, and the fire wire is not occluded at time t−Δt) is taken as the reference and the change of the ordinate of the fire wire in the occluded area over Δt is computed, so as to restore the real fire wire; if the angle is greater than 45°, the abscissa is taken as the reference. The following fire wire completion steps take the case of an angle no greater than 45° as the example.
Step 3.3: register the fire wire image at time t−Δt onto the fire wire image at time t, and take 30 fire points on each outer side of the occluded area, i.e., 60 fire points in total, each corresponding to the fire point in the same pixel column at time t−Δt. Compute according to formula (1) the mean of the ordinate differences of the corresponding fire points of the two fire wires in the same columns and denote it d_a, where y_i^t is the ordinate of the fire point in the i-th column at time t and y_i^{t−Δt} is the ordinate of the fire point in the i-th column at time t−Δt;
Step 3.4: collect fire scene terrain variables (elevation, slope, aspect, combustible type) and weather variables (wind speed, wind direction, temperature, humidity) together with unoccluded fire wire images as a data set.
The data are acquired as follows:
elevation: the unmanned aerial vehicle collects point cloud data, or a terrain file in tif format is downloaded;
slope and aspect: the elevation data are converted into a DEM-format file and exported with ArcGIS;
combustible type: three types (grass, shrub, fallen leaves), collected on site;
wind speed and wind direction: global wind data are measured with an anemometer or obtained from a meteorological website; local wind data are obtained with WindNinja software;
temperature and humidity: obtained with a miniature weather station or from a meteorological website;
Step 3.5: construct a convolutional neural network model that takes the fire wire, weather variables and terrain variables at the current time as input and predicts the fire wire at the next time, and train and validate it with the data set;
Step 3.6: with the trained convolutional neural network model, input the fire wire image, weather variables and terrain variables at time t−Δt to obtain the predicted fire wire at time t;
Step 3.7: compute, according to formula (2), the mean of the ordinate differences between the fire wire at time t and the predicted fire wire at time t for the 30 fire points on each outer side of the occluded area, and denote it d_b, where y_i^t is the ordinate of the fire point in the i-th column at time t and ŷ_i^t is the ordinate of the fire point in the i-th column of the predicted fire wire at time t;
Step 3.8: compute Δd_i according to formula (3), where d_i is the ordinate difference between the fire wire at time t and the fire wire at time t−Δt for the fire point in the i-th column; for the occluded area of the fire wire at time t, shift the ordinate of the corresponding column fire point of the fire wire at time t−Δt by Δd_i to obtain the fire points that fill in the occluded area;
Step 3.9: connect the filled-in fire points with Catmull-Rom curve fitting to obtain the fire wire;
And a specific embodiment IV: converting a live wire from a camera coordinate system to a laser radar coordinate system to obtain a three-dimensional live wire, identifying local characteristics of the live wire, simplifying the live wire into a series of ordered characteristic region representations, carrying out characteristic region matching on two live wires, and carrying out similarity comparison on live wire fragments represented by the matched characteristic regions to obtain identical-name live wire fragments and identical-name fire points, wherein the specific process comprises the following steps:
Step 4.1: let the coordinates of a fire point observed by the i-th unmanned aerial vehicle in the infrared image be (u, v); the coordinates of this fire point in the lidar coordinate system are obtained by formula (4), where (R_CL)_i and (t_CL)_i are the rotation matrix and translation vector between the infrared camera coordinate system and the lidar coordinate system of the i-th unmanned aerial vehicle, and M_i is the intrinsic matrix of the infrared camera of the i-th unmanned aerial vehicle.
Step 4.2: after the fire wire is transferred from the camera coordinate system to the lidar coordinate system, a series of discrete three-dimensional fire points is obtained; the fire points are ordered clockwise and denoted P_1, P_2, …, P_n, and a cubic B-spline curve fit turns them into the fire wire;
Step 4.3: detect the slope of the fire wire segment between every pair of adjacent fire points. Taking fire points P_{i-1}, P_i, P_{i+1} as an example, if the slopes of the two fire wire segments P_{i-1}~P_i and P_i~P_{i+1} in the XY plane differ (which appears as a turning point of the fire wire), the fire wire segment formed by P_{i-1}~P_{i+1} is called a feature region of the fire wire and P_i is called the feature point of that region; the fire wire is thereby simplified into a series of ordered feature regions;
Step 4.4: given that P_{i-1}~P_{i+1} is a feature region, project this fire wire segment onto the XY, XZ and YZ planes of the coordinate system; in each coordinate plane, connect P_{i-1} with P_i and P_i with P_{i+1} by straight lines, and record the angle between the two projected lines as a feature angle of the feature region, so that each feature region has three feature angles;
Step 4.5: for two adjacent unmanned aerial vehicles, compare the two fire wires L_1 and L_2 acquired at the same time. For a feature region in L_1 and a feature region in L_2, compute the feature angle differences on the three corresponding planes; an angle difference threshold is set, and if none of the three feature angle differences exceeds the threshold, the two feature regions form a matched feature region pair.
Step 4.6: denote by l_1 the fire wire segment connecting all matched feature regions in L_1 and by l_2 the fire wire segment connecting all matched feature regions in L_2. Project l_1 and l_2 onto the XY, XZ and YZ planes; in each plane, overlay the start points and end points of the two segments, compute the Fréchet distance between them, and choose an appropriate distance threshold. If the Fréchet distance is smaller than the threshold in all three planes, the two segments are fire wires of the same photographed area and are called homonymous fire wire segments; otherwise the matched feature regions at the two ends of the segments are mismatched, the matching relation between the end feature regions of l_1 and l_2 is removed, and the remaining fire wire segments are compared again until two homonymous fire wire segments satisfying the condition are obtained. The corresponding feature points on the two homonymous fire wire segments are called homonymous fire points;
Step 4.7: compare the fire wires photographed at the same time by every pair of adjacent unmanned aerial vehicles according to steps 4.3 to 4.6 to find the homonymous fire wires and homonymous fire points;
Fifth embodiment: the fire wire observed by each unmanned aerial vehicle is converted into the world coordinate system according to the pose of each vehicle in the world coordinate system provided by the inertial navigation system; the Euclidean distance between all homonymous fire wire segments in the world coordinate system is defined as the error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses [T_1, …, T_n], thereby adjusting the positions of all observed fire wires. The specific process of achieving fire wire splicing is as follows:
Step 5.1: according to the pose relation between the lidar and the inertial navigation system on each unmanned aerial vehicle and the initial pose of each vehicle provided by the inertial navigation system, convert the local fire wires observed by the N unmanned aerial vehicles into the global coordinate system.
The coordinates in the inertial coordinate system of a fire point given in the lidar coordinate system are obtained through formula (5), where (R_IL)_i and (t_IL)_i are the rotation matrix and translation vector from the lidar to the inertial navigation system of the i-th unmanned aerial vehicle.
The coordinates of the fire point in the world coordinate system are obtained by formula (6), where (R_WI)_i and (t_WI)_i constitute the pose of the i-th unmanned aerial vehicle in the world coordinate system.
Step 5.2: define the Euclidean distance between all homonymous fire wire segments in the world coordinate system as the global error, and construct the mathematical model shown in formula (7) to describe global fire wire splicing as an unmanned aerial vehicle pose optimization problem, where N is the number of unmanned aerial vehicles, T_i is the pose of the i-th unmanned aerial vehicle, n is the number of homonymous fire points on the homonymous fire wire segments observed by the i-th and (i+1)-th unmanned aerial vehicles, and P_ij is the coordinate of the j-th homonymous fire point observed by the i-th unmanned aerial vehicle.
Step 5.3: express the poses with Lie algebra; the objective function is given by formula (8), where ξ_i represents the (Lie-algebra) pose of the i-th unmanned aerial vehicle and p_ij represents the coordinate of the j-th homonymous fire point observed by the i-th unmanned aerial vehicle.
Step 5.4: iterate the unmanned aerial vehicle poses with a genetic algorithm until a set of poses is found for which the minimum value of the objective function is smaller than a given threshold ε_L, where ε_L = 0.1·n_all (in metres), n_all is the total number of homonymous fire point pairs, and 0.1 means that the average distance between homonymous fire points is smaller than 0.1 m.
Step 5.5: based on the optimized unmanned aerial vehicle poses, apply the transformation of formula (6) to the local fire wire observed by each unmanned aerial vehicle, realizing global splicing of the fire wires.
The invention splices the fire scene edge fire wires measured by multiple unmanned aerial vehicles into a complete (three-dimensional) fire wire. Each unmanned aerial vehicle carries at least three sensors with different functions: an infrared camera, a lidar and an inertial navigation system; the infrared camera captures the fire scene infrared image, the lidar obtains the fire scene point cloud, and the inertial navigation system provides the coordinates in the world coordinate system. Extrinsic calibration is performed between the infrared camera and the lidar and between the lidar and the inertial navigation system, so that the fire wire is converted from two dimensions to three dimensions and then transferred into the world coordinate system. Verification of the multi-unmanned-aerial-vehicle fire wire splicing method shows that the data acquired by multiple vehicles can be fully fused into a complete and comprehensive fire information map; by analyzing this map, the scale, position, combustion state and other conditions of the fire can be comprehensively understood, helping researchers make timely and accurate analyses.
In order to fully verify the technical effects of the application, the inventors carried out multiple verifications of the proposed technical scheme:
Simulation experiment 1: fig. 5 shows the result of optimizing FARSITE-simulated fire wires with the multi-unmanned-aerial-vehicle optimization method of the invention, verifying its effectiveness. The specific process is as follows. First, a simulated fire wire is obtained with the FARSITE command line version and the coordinates of each fire point are exported; the corresponding elevation of each fire point is derived from the terrain elevation file. Using the existing transformation matrix, the fire points are transferred from the pixel coordinate system to the UTM (Universal Transverse Mercator) coordinate system and then from the UTM coordinate system to world coordinates to obtain the longitude and latitude of each fire point; combined with the elevations obtained before, this gives the coordinates of the fire points in the world coordinate system. A cubic B-spline curve is fitted to the fire points to obtain the fire wire, which is divided into three segments L_1, L_2, L_3, and a corresponding unmanned aerial vehicle pose T_1, T_2, T_3 is defined for each segment, i.e., unmanned aerial vehicle 1 observes fire wire L_1 at pose T_1, vehicle 2 observes L_2 at pose T_2 and vehicle 3 observes L_3 at pose T_3. A random disturbance is added to each vehicle's pose and its observed fire wire to represent the sensor positioning errors among the vehicles, giving poses T_1′, T_2′, T_3′ and observed fire wires L_1′, L_2′, L_3′. For the disturbed poses and observed fire wires, an objective function of the Euclidean distances between all homonymous fire points of the observed fire wires is constructed, and a genetic algorithm is used to optimize all vehicle poses simultaneously so as to minimize the objective function, yielding optimized poses T_1″, T_2″, T_3″ and optimized observed fire wires L_1″, L_2″, L_3″. The average Euclidean distance between all homonymous fire points of the optimized observed fire wires L_1″, L_2″, L_3″ is 0.07 m, which fully meets the precision requirement of large-scale forest fire wire monitoring. The unmanned aerial vehicle pose parameters before and after optimization are listed in Tables 1 and 2, with the poses expressed as quaternions (w + xi + yj + zk, where i, j, k are imaginary units).
TABLE 1 pose parameters of unmanned aerial vehicle before optimization
TABLE 2 optimized unmanned aerial vehicle pose parameters
Simulation experiment II:
Fig. 7 shows the result of optimizing FARSITE-simulated fire wires with the multi-unmanned-aerial-vehicle optimization method of the invention, verifying its effectiveness. The specific process is as follows. First, a simulated fire wire is obtained with the FARSITE module in FlamMap (as shown in fig. 6) and the coordinates of each fire point are exported; the corresponding elevation of each fire point is derived from the terrain elevation file. Using the existing transformation matrix, the fire points are transferred from the pixel coordinate system to the UTM (Universal Transverse Mercator) coordinate system and then from the UTM coordinate system to world coordinates to obtain the longitude and latitude of each fire point; combined with the elevations obtained before, this gives the coordinates of the fire points in the world coordinate system. A cubic B-spline curve is fitted to the fire points to obtain the fire wire, which is divided into three segments L_1, L_2, L_3, and a corresponding unmanned aerial vehicle pose T_1, T_2, T_3 is defined for each segment, i.e., unmanned aerial vehicle 1 observes fire wire L_1 at pose T_1, vehicle 2 observes L_2 at pose T_2 and vehicle 3 observes L_3 at pose T_3. A random disturbance is added to each vehicle's pose and its observed fire wire to represent the sensor positioning errors among the vehicles, giving poses T_1′, T_2′, T_3′ and observed fire wires L_1′, L_2′, L_3′. For the disturbed poses and observed fire wires, an objective function of the Euclidean distances between all homonymous fire points of the observed fire wires is constructed, and a genetic algorithm is used to optimize all vehicle poses simultaneously so as to minimize the objective function, yielding optimized poses T_1″, T_2″, T_3″ and optimized observed fire wires L_1″, L_2″, L_3″. The average Euclidean distance between all homonymous fire points of the optimized observed fire wires L_1″, L_2″, L_3″ is 0.06 m, which fully meets the precision requirement of large-scale forest fire wire monitoring. The unmanned aerial vehicle pose parameters before and after optimization are listed in Tables 3 and 4, with the poses expressed as quaternions (w + xi + yj + zk, where i, j, k are imaginary units).
TABLE 3 pose parameters of unmanned aerial vehicle before optimization
TABLE 4 optimized unmanned aerial vehicle pose parameters
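Both experiments rely on the pixel → UTM → longitude/latitude chain described above. The snippet below sketches that chain with pyproj; the affine pixel-to-UTM parameters (origin_e, origin_n, pixel_size) and the UTM zone EPSG:32650 are placeholders standing in for the landscape-file metadata and conversion matrix mentioned in the text, and the sample pixel indices and elevations are invented for illustration.

```python
import numpy as np
from pyproj import Transformer

# Assumed affine mapping from raster pixel indices to UTM metres (north-up raster):
origin_e, origin_n = 433_000.0, 5_060_000.0   # UTM easting/northing of the upper-left corner (assumed)
pixel_size = 30.0                             # cell size in metres (assumed)

def pixel_to_utm(cols, rows):
    easting = origin_e + pixel_size * np.asarray(cols, dtype=float)
    northing = origin_n - pixel_size * np.asarray(rows, dtype=float)
    return easting, northing

# UTM zone 50N -> WGS84 longitude/latitude; the zone itself is an assumption.
to_wgs84 = Transformer.from_crs("EPSG:32650", "EPSG:4326", always_xy=True)

cols = [120, 121, 123]          # fire-point pixel columns (illustrative)
rows = [340, 338, 335]          # fire-point pixel rows (illustrative)
elev = [612.0, 615.5, 618.2]    # elevations read from the terrain file (illustrative)

easting, northing = pixel_to_utm(cols, rows)
lon, lat = to_wgs84.transform(easting, northing)
fire_points_world = np.c_[lon, lat, elev]   # (longitude, latitude, elevation) per fire point
print(fire_points_world)
```

The world-coordinate fire points produced this way are what the cubic B-spline fit and the segmentation into L1, L2, L3 operate on.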
The invention provides a forest fire large-scale fire wire splicing method (algorithm) based on multiple unmanned aerial vehicles; this algorithm is the underlying technical core of the invention, and a variety of products can be derived from it.
Using the method (algorithm) provided by the invention, a forest fire large-scale fire wire splicing system based on multiple unmanned aerial vehicles can be developed in a programming language. The system has program modules corresponding to the steps of the technical scheme and, when running, executes the steps of the forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles.
The computer program of the developed system (software) is stored on a computer-readable storage medium and is configured to implement the steps of the forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles when called by a processor; that is, the invention can also be embodied on a carrier as a computer program product.
Various embodiments of the systems described herein may be implemented in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may be implemented in one or more computer programs executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
The computer programs (also referred to as programs, software, software applications, or code) of the invention include machine instructions for a programmable processor and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
It should be appreciated that the flows shown above may be reordered and that steps may be added or deleted. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solution disclosed in the present application can be achieved. Those skilled in the art may make various changes and modifications without departing from the spirit and scope of the present application, and such changes and modifications fall within the protection scope of the present application. Although the present disclosure is set out above, its scope is not limited thereto.

Claims (7)

1. A forest fire large-scale live wire splicing method based on multiple unmanned aerial vehicles is characterized by comprising the following steps:
Step one: the method comprises the steps that a plurality of unmanned aerial vehicles carrying infrared cameras, laser radars and inertial navigation systems are used for carrying out block acquisition on fire scene edge data, and a fire scene temperature infrared image containing time information, a fire scene point cloud and coordinates of the inertial navigation systems under a world coordinate system are obtained;
step two: extracting a fire wire by utilizing the fire field infrared image obtained in the first step;
step three: detecting the loss of a real fire wire caused by the shielding of a crown, and restoring the shielded real fire wire;
Step four: converting the live wire from a camera coordinate system to a laser radar coordinate system to obtain a three-dimensional live wire, identifying local characteristics of the live wire, simplifying the live wire into a series of ordered characteristic region representations, carrying out characteristic region matching on two live wires, and carrying out similarity comparison on live wire fragments represented by the matched characteristic regions to obtain homonymous live wire fragments and homonymous fire points;
Step five: according to the pose of each unmanned aerial vehicle in the world coordinate system provided by the inertial navigation system, respectively converting the observation fire wire of each unmanned aerial vehicle into the world coordinate system; the Euclidean distance of all same-name fire wire segments in the world coordinate system is defined as an error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses [T1 … Tn], so that the positions of all observation fire wires are adjusted to realize fire wire splicing;
In the third step, the real live wire segment loss caused by crown shielding is detected, and the specific process of restoring the shielded real live wire is as follows:
Step 3.1: obtaining the temperature value of each fire point on the fire wire from the infrared image, and setting an appropriate threshold interval for the fire scene edge temperature according to the temperature difference between the fire scene edge and the interior of the fire scene; if the temperatures of all fire points on the fire wire lie within this interval, there is no shielding; if the temperature of a fire point on a fire wire segment is not within the interval, that fire wire segment is the boundary line between the interior of the fire scene and a shielding object, and the real fire wire position is shielded;
Step 3.2: if the real fire wire is shielded at time t, for the shielded area of the real fire wire, connecting the two end fire points of the shielded area with a straight line;
if the included angle between this straight line and the horizontal direction of the pixel coordinate system is less than or equal to 45 degrees, taking the ordinate of the fire wire at time t−Δt as the reference and calculating the change in the ordinate of the fire wire of the shielded area over the time Δt;
if the included angle between this straight line and the horizontal direction of the pixel coordinate system is greater than 45 degrees, taking the abscissa of the fire wire at time t−Δt as the reference and calculating the change in the abscissa of the fire wire of the shielded area over the time Δt;
thereby restoring the real fire wire;
in the following, the calculation is described taking the ordinate as the reference, i.e., for the case where the included angle with the horizontal direction is less than or equal to 45 degrees;
Δt is the shooting time interval, and there is no shielded fire wire area at time t−Δt;
Step 3.3: marking the fire wire image at time t−Δt in the fire wire image at time t; taking 30 fire points on each of the two outer sides of the shielded area, together with the 60 fire points in the same columns at time t−Δt that correspond to them; calculating, according to formula (1), the mean of the ordinate differences of the corresponding fire points in the same columns of the two fire wires, and denoting it d_a,
d_a = (1/60) · Σ_{i=1}^{60} ( y_i^t − y_i^{t−Δt} )        (1)
where y_i^t is the ordinate of the fire point in the i-th column at time t, and y_i^{t−Δt} is the ordinate of the fire point in the i-th column at time t−Δt;
Step 3.4: collecting fire scene topography variables and weather variables and using them, together with unshielded fire wire images, as a data set; the fire scene topography variables include elevation, slope, slope aspect, and combustible type; the weather variables include wind speed, wind direction, temperature, and humidity;
Step 3.5: constructing a convolutional neural network model that takes the fire wire, weather variables, and topography variables at the current time as inputs and predicts the fire wire at the next time, and training and validating it with the data set;
Step 3.6: inputting the fire wire image, weather variables, and topography variables at time t−Δt into the trained convolutional neural network model to obtain the predicted fire wire at time t;
Step 3.7: according to formula (2), calculating the mean of the ordinate differences between the fire wire at time t and the predicted fire wire at time t over the 30 fire points on each of the two outer sides of the shielded area, and denoting it d_b,
d_b = (1/60) · Σ_{i=1}^{60} ( y_i^t − ŷ_i^t )        (2)
where y_i^t is the ordinate of the fire point in the i-th column at time t, and ŷ_i^t is the ordinate of the fire point in the i-th column of the predicted fire wire at time t;
Calculating Δd_i according to formula (3); for the shielded area of the fire wire at time t, shifting the ordinate of the corresponding column fire point of the fire wire at time t−Δt by Δd_i to obtain the fire points that fill the shielded area (an illustrative sketch of this bookkeeping follows this claim);
where d_i is the difference between the ordinates of the i-th column fire point of the fire wire at time t and of the fire wire at time t−Δt,
Step 3.8: connecting the supplemented fire points by the Catmull-Rom curve fitting method to obtain the restored fire wire.
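For orientation only, the sketch below walks through the bookkeeping of steps 3.3–3.7 on plain per-column ordinate arrays: it computes the flank statistics d_a and d_b as described above, and then fills the shielded columns. Because the claim text does not reproduce formula (3) itself, the fill rule used here (the CNN-predicted displacement corrected by the flank bias d_a − d_b) is only one plausible reading, flagged as such in the comments; it is not the patented formula.

```python
import numpy as np

def mean_ordinate_gap(y_ref, y_cmp, cols):
    """Mean ordinate difference over the given columns (cf. formulas (1) and (2))."""
    return float(np.mean(y_ref[cols] - y_cmp[cols]))

def fill_shielded(y_t, y_prev, y_pred, shielded_cols, flank=30):
    """y_t, y_prev, y_pred: per-column ordinates of the fire wire at time t, at time
    t - dt, and of the CNN-predicted fire wire at time t; shielded_cols: columns
    where y_t is unreliable because the crown shields the real fire wire."""
    lo, hi = min(shielded_cols), max(shielded_cols)
    flank_cols = np.r_[lo - flank:lo, hi + 1:hi + 1 + flank]   # 30 columns on each outer side

    d_a = mean_ordinate_gap(y_t, y_prev, flank_cols)   # observed advance on the flanks, formula (1)
    d_b = mean_ordinate_gap(y_t, y_pred, flank_cols)   # prediction bias on the flanks, formula (2)

    y_filled = y_t.astype(float).copy()
    for i in shielded_cols:
        # Hypothetical stand-in for formula (3): start from the t - dt ordinate, apply the
        # CNN-predicted displacement, and correct it by the flank bias (d_a - d_b vanishes
        # when the prediction matches the visible flanks exactly).
        delta_d = (y_pred[i] - y_prev[i]) + (d_a - d_b)
        y_filled[i] = y_prev[i] + delta_d
    return y_filled
```

The filled fire points would then be joined with Catmull-Rom fitting as in step 3.8.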
2. The method for splicing forest fire large-scale fire wires based on multiple unmanned aerial vehicles according to claim 1, wherein in the first step, for each unmanned aerial vehicle, the unmanned aerial vehicle flies above a designated fire scene edge area, and a fire scene infrared image containing time information, a fire scene point cloud and coordinates of the inertial navigation system in a world coordinate system are obtained through an airborne inertial navigation system, an infrared camera and a laser radar, wherein each pixel point in the fire scene infrared image is provided with temperature information.
3. The method for splicing forest fire large-scale fire wires based on multiple unmanned aerial vehicles according to claim 1 or 2, wherein in the second step the specific process of extracting the fire wire is as follows:
Step 2.1: carrying out contrast stretching on the collected fire scene infrared images to enhance the contrast between the fire scene area and the surrounding environment;
Step 2.2: converting the contrast-stretched infrared image into a grayscale image;
Step 2.3: performing traversal detection on the grayscale image in units of n×n pixel regions, with a sliding step length of n; for each detected n×n pixel region, selecting only the outermost pixels of the region, taking the pixel value at the upper-left corner of the region as the reference, and calculating the difference between each peripheral pixel value and the reference; the threshold interval of the difference is set to (−110, 110); if a pixel difference exceeds this interval, the n×n region is judged to contain both a burning area and an unburned area, i.e., a fire wire exists, and the region is marked (a sketch of this block test follows this claim);
Step 2.4: among all marked regions, if a marked region is not connected to any other marked region, the burning area within that region is regarded as noise and the region is no longer marked;
Step 2.5: denoting the mean pixel value of all marked regions as c; within a marked region, if the value of a pixel is greater than c and its four neighbouring pixels contain both values greater than c and values smaller than c, that pixel is a fire scene edge fire point; all pixels in the marked regions are traversed and detected in this way to find the fire scene edge points;
Step 2.6: connecting the fire scene edge fire points by the Catmull-Rom curve fitting method to obtain the fire wire.
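A compact NumPy sketch of the block test and edge-point test in steps 2.3 and 2.5 is given below for illustration; the block size n = 8 and the input image are placeholders, and the connectivity-based noise removal of step 2.4 is deliberately omitted to keep the sketch short.

```python
import numpy as np

def mark_fire_line_blocks(gray, n=8, lo=-110, hi=110):
    """Mark n x n blocks whose peripheral pixels differ from the top-left
    reference pixel by more than the (lo, hi) interval (step 2.3)."""
    h, w = gray.shape
    marked = np.zeros((h // n, w // n), dtype=bool)
    for bi in range(h // n):
        for bj in range(w // n):
            block = gray[bi * n:(bi + 1) * n, bj * n:(bj + 1) * n].astype(np.int32)
            ref = block[0, 0]
            border = np.concatenate([block[0, :], block[-1, :], block[1:-1, 0], block[1:-1, -1]])
            if np.any((border - ref < lo) | (border - ref > hi)):
                marked[bi, bj] = True
    return marked

def edge_points(gray, marked, n=8):
    """Inside marked blocks, a pixel is a fire scene edge point when it exceeds the
    mean value c of all marked blocks and its 4-neighbourhood contains values both
    above and below c (step 2.5)."""
    ys, xs = np.where(marked)
    if len(ys) == 0:
        return []
    c = np.mean([gray[i * n:(i + 1) * n, j * n:(j + 1) * n] for i, j in zip(ys, xs)])
    points = []
    for bi, bj in zip(ys, xs):
        for r in range(bi * n, (bi + 1) * n):
            for col in range(bj * n, (bj + 1) * n):
                if not (0 < r < gray.shape[0] - 1 and 0 < col < gray.shape[1] - 1):
                    continue
                neigh = gray[[r - 1, r + 1, r, r], [col, col, col - 1, col + 1]]
                if gray[r, col] > c and neigh.max() > c and neigh.min() < c:
                    points.append((r, col))
    return points
```

The resulting edge points would then be ordered and joined with Catmull-Rom fitting as in step 2.6.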
4. The method for splicing forest fire large-scale fire wires based on multiple unmanned aerial vehicles, wherein in the fourth step the fire wire is converted from the camera coordinate system to the laser radar coordinate system to obtain a three-dimensional fire wire, the local characteristics of the fire wire are identified, the fire wire is simplified into a series of ordered characteristic region representations, characteristic region matching is performed on two fire wires, and the fire wire segments represented by the matched characteristic regions are compared for similarity; the specific process of obtaining same-name fire wire segments and same-name fire points is as follows:
Step 4.1: assuming that the coordinates of a fire point observed by the i-th unmanned aerial vehicle in the infrared image are (u, v), the coordinates of the fire point in the radar coordinate system are obtained by formula (4),
where (R_CL)_i and (t_LC)_i are the rotation matrix and translation vector between the infrared camera coordinate system and the radar coordinate system of the i-th unmanned aerial vehicle, and M_i is the intrinsic parameter matrix of the infrared camera on the i-th unmanned aerial vehicle;
Step 4.2: after the fire wire is converted from the camera coordinate system to the laser radar coordinate system, a series of discrete three-dimensional fire points is obtained; the fire points are ordered clockwise as P_1, P_2, …, P_n and are fitted into the fire wire by the cubic B-spline curve fitting method;
Step 4.3: detecting the slope of the fire wire segments between every three adjacent fire points; taking fire points P_{i−1}, P_i, P_{i+1} as an example, if the slopes of the two fire wire segments P_{i−1}~P_i and P_i~P_{i+1} in the XY plane are different, the fire wire segment formed by P_{i−1}~P_{i+1} is called a characteristic region of the fire wire and P_i is called the characteristic point of that characteristic region; the fire wire is thus simplified into a series of ordered characteristic regions;
Step 4.4: taking the known characteristic region P_{i−1}~P_{i+1}, projecting the fire wire segment onto the XY, XZ and YZ planes of the coordinate system respectively; in each coordinate plane, connecting the two points P_{i−1} and P_i and the two points P_i and P_{i+1} with straight lines, and recording the included angle between the two projected straight lines as a characteristic angle of the characteristic region, so that one characteristic region has three characteristic angles;
Step 4.5: for two fire wires L_1 and L_2 collected at the same time by two adjacent unmanned aerial vehicles, comparing the characteristic regions of the two fire wires; the differences between the characteristic angles of a characteristic region in L_1 and those of a characteristic region in L_2 in the three corresponding planes are computed, and an angle-difference threshold is set; if none of the three characteristic-angle differences exceeds the threshold, the two characteristic regions are a matched characteristic region pair (a sketch of this matching follows this claim);
Step 4.6: marking the fire wire segment that connects all matched characteristic regions in L_1 as l_1 and the fire wire segment that connects all matched characteristic regions in L_2 as l_2; mapping l_1 and l_2 onto the XY, XZ and YZ planes respectively; in each plane, making the start points and end points of the two fire wire segments coincide respectively, then calculating the Fréchet distance between the two segments and selecting an appropriate distance threshold; if the Fréchet distances in all three planes are smaller than the distance threshold, the two segments are fire wires corresponding to the same shooting area, i.e., same-name fire wires; if the requirement is not met, the characteristic regions at the two ends of the fire wire segments are mismatched, so the matching relation between the characteristic regions at the two ends of l_1 and l_2 is removed and the remaining fire wire segments are compared until two same-name fire wire segments meeting the condition are obtained; the corresponding characteristic points on the two same-name fire wire segments are the same-name fire points;
Step 4.7: comparing the fire wires captured at the same time by every two adjacent unmanned aerial vehicles according to steps 4.3 to 4.6 to find the same-name fire wires and same-name fire points.
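The sketch below illustrates steps 4.3–4.5: slope-change detection to pick out characteristic regions, the three characteristic angles from the XY, XZ and YZ projections, and angle-threshold matching between two fire wires. The 5-degree threshold and the tolerance on the slope comparison are assumptions, and the Fréchet-distance verification of step 4.6 is not included.

```python
import numpy as np

def characteristic_regions(points, tol=1e-3):
    """Triples (P[i-1], P[i], P[i+1]) whose two XY-plane segments have different
    slopes are kept as characteristic regions (step 4.3)."""
    regions = []
    for i in range(1, len(points) - 1):
        a, b, c = points[i - 1], points[i], points[i + 1]
        s1 = np.arctan2(b[1] - a[1], b[0] - a[0])   # direction of P(i-1)~Pi in XY
        s2 = np.arctan2(c[1] - b[1], c[0] - b[0])   # direction of Pi~P(i+1) in XY
        if abs(s1 - s2) > tol:
            regions.append((a, b, c))
    return regions

def characteristic_angles(region):
    """Three characteristic angles of one region: the angle between the two
    projected segments in the XY, XZ and YZ planes (step 4.4)."""
    a, b, c = region
    angles = []
    for axes in ((0, 1), (0, 2), (1, 2)):           # XY, XZ, YZ projections
        u = np.array([b[k] - a[k] for k in axes])
        v = np.array([c[k] - b[k] for k in axes])
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        angles.append(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return np.array(angles)

def match_regions(regs1, regs2, angle_thresh_deg=5.0):
    """Pair regions of two fire wires whose three characteristic angles all
    differ by less than the angle threshold (step 4.5)."""
    ang1 = [characteristic_angles(r) for r in regs1]
    ang2 = [characteristic_angles(r) for r in regs2]
    pairs = []
    for i, a1 in enumerate(ang1):
        for j, a2 in enumerate(ang2):
            if np.all(np.abs(a1 - a2) < angle_thresh_deg):
                pairs.append((i, j))
    return pairs
```

Matched region pairs would then be verified with the Fréchet-distance test of step 4.6 before the shared characteristic points are accepted as same-name fire points.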
5. The method for splicing forest fire large-scale fire wires based on multiple unmanned aerial vehicles according to claim 4, wherein in the fifth step the observation fire wire of each unmanned aerial vehicle is converted into the world coordinate system according to the pose of that unmanned aerial vehicle in the world coordinate system provided by its inertial navigation system; the Euclidean distance of all same-name fire wire segments in the world coordinate system is defined as an error, and the overall error is minimized by searching for a set of optimal unmanned aerial vehicle poses [T1 … Tn], so that the positions of all observation fire wires are adjusted; the specific process of realizing fire wire splicing is as follows:
Step 5.1: converting the local fire wires observed by the N unmanned aerial vehicle terminals into the global coordinate system according to the pose relation between the laser radar and the inertial navigation system of each unmanned aerial vehicle terminal and the initial pose of each unmanned aerial vehicle provided by the inertial navigation system;
the coordinates in the inertial coordinate system of a fire point given in the laser radar coordinate system are obtained by formula (5):
P_I = (R_IL)_i · P_L + (t_IL)_i        (5)
where (R_IL)_i and (t_IL)_i are the rotation matrix and translation vector from the laser radar coordinate system to the inertial navigation coordinate system of the i-th unmanned aerial vehicle;
the coordinates of the fire point in the world coordinate system are obtained by formula (6):
P_W = (R_WI)_i · P_I + (t_WI)_i        (6)
where (R_WI)_i and (t_WI)_i constitute the pose of the i-th unmanned aerial vehicle in the world coordinate system;
Step 5.2: defining the Euclidean distances of all same-name fire wire segments in the world coordinate system as the global error, and constructing the mathematical model shown in formula (7) to describe global fire wire splicing as an unmanned aerial vehicle pose optimization problem;
where n represents the number of unmanned aerial vehicles, T_i is the pose of the i-th unmanned aerial vehicle, N represents the number of same-name fire points on the same-name fire wire segments observed by the i-th and the (i+1)-th unmanned aerial vehicles, and p_ij represents the coordinates of the j-th same-name fire point observed by the i-th unmanned aerial vehicle;
Step 5.3: expressing the pose using the Lie algebra, the objective function is written as formula (8),
where ε_i represents the pose of the i-th unmanned aerial vehicle and p_ij represents the coordinates of the j-th same-name fire point observed by the i-th unmanned aerial vehicle;
Step 5.4: continuously iterating the unmanned aerial vehicle poses with a genetic algorithm to find a set of poses for which the value of the objective function is smaller than a given threshold ε_L; ε_L is set to 0.1·n_all (in metres), where n_all is the total number of same-name fire point pairs, so that under this threshold the average distance between each pair of same-name fire points is smaller than 0.1 m (an illustrative sketch of the error evaluation follows this claim);
Step 5.5: applying the transformation shown in formula (6) to the local fire wire observed by each unmanned aerial vehicle based on its optimized pose, thereby realizing global splicing of the fire wire.
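To make the error being minimized in steps 5.2–5.4 concrete, the sketch below strings together the lidar → inertial → world transforms of formulas (5) and (6) and sums the Euclidean distances between same-name fire points of adjacent unmanned aerial vehicles. Parameterizing the correction poses with Euler angles instead of the Lie algebra of formula (8), and leaving out the genetic-algorithm search loop, are simplifications made only for brevity.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def params_to_pose(params):
    """6-vector [roll, pitch, yaw, tx, ty, tz] -> (rotation matrix, translation vector)."""
    return R.from_euler("xyz", params[:3]).as_matrix(), np.asarray(params[3:], dtype=float)

def lidar_to_world(p_lidar, R_IL, t_IL, R_WI, t_WI):
    """Formula (5) followed by formula (6): lidar frame -> inertial frame -> world frame."""
    p_inertial = p_lidar @ R_IL.T + t_IL
    return p_inertial @ R_WI.T + t_WI

def global_error(pose_params, same_name_pairs, lidar_to_ins_calib):
    """Sum of Euclidean distances between same-name fire points of adjacent UAVs in the
    world frame -- the quantity driven below the threshold in step 5.4.

    pose_params:        list of 6-vectors, one candidate world pose per UAV
    same_name_pairs:    same_name_pairs[i] = (points from UAV i, matching points from UAV i+1),
                        both given in the respective lidar frames, shape (n, 3)
    lidar_to_ins_calib: lidar_to_ins_calib[i] = (R_IL, t_IL) for UAV i
    """
    error = 0.0
    for i, (pts_i, pts_next) in enumerate(same_name_pairs):
        R_i, t_i = params_to_pose(pose_params[i])
        R_j, t_j = params_to_pose(pose_params[i + 1])
        world_i = lidar_to_world(pts_i, *lidar_to_ins_calib[i], R_i, t_i)
        world_j = lidar_to_world(pts_next, *lidar_to_ins_calib[i + 1], R_j, t_j)
        error += np.linalg.norm(world_i - world_j, axis=1).sum()
    return error
```

An evolutionary optimizer would repeatedly call global_error with candidate pose_params until the value drops below the threshold ε_L = 0.1·n_all described in step 5.4.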
6. A forest fire large-scale fire wire splicing system based on multiple unmanned aerial vehicles, characterized in that: the system has program modules corresponding to the steps of the method of any one of claims 1 to 5, and executes the steps of the forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles.
7. A computer-readable storage medium, characterized in that: the computer-readable storage medium stores a computer program configured to implement the steps of the forest fire large-scale fire wire splicing method based on multiple unmanned aerial vehicles according to any one of claims 1 to 5 when called by a processor.
CN202311793744.9A 2023-12-25 2023-12-25 Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles Active CN117745536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311793744.9A CN117745536B (en) 2023-12-25 2023-12-25 Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311793744.9A CN117745536B (en) 2023-12-25 2023-12-25 Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles

Publications (2)

Publication Number Publication Date
CN117745536A CN117745536A (en) 2024-03-22
CN117745536B true CN117745536B (en) 2024-06-11

Family

ID=90281183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311793744.9A Active CN117745536B (en) 2023-12-25 2023-12-25 Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles

Country Status (1)

Country Link
CN (1) CN117745536B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108763811A (en) * 2018-06-08 2018-11-06 中国科学技术大学 Dynamic data drives forest fire appealing prediction technique
CN112464819A (en) * 2020-11-27 2021-03-09 清华大学 Forest fire spreading data assimilation method and device based on unmanned aerial vehicle video
CN115115595A (en) * 2022-06-30 2022-09-27 东北林业大学 Real-time calibration method of airborne laser radar and infrared camera for forest fire monitoring
CN115661245A (en) * 2022-10-24 2023-01-31 东北林业大学 Large-scale live wire instantaneous positioning method based on unmanned aerial vehicle
KR102511408B1 (en) * 2022-11-24 2023-03-20 김병준 Apparatus and method for precision detection and response forest fire based on edge artificial intelligence
CN116310898A (en) * 2023-02-28 2023-06-23 武汉理工大学 Forest fire spread prediction method and system based on neural network and Huygens principle

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9989965B2 (en) * 2015-08-20 2018-06-05 Motionloft, Inc. Object detection and analysis via unmanned aerial vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on forest fire point positioning technology with rotor unmanned aerial vehicles; He Cheng; Zhang Siyu; Yao Shuren; Bulletin of Surveying and Mapping; 2014-12-25 (No. 12); full text *

Also Published As

Publication number Publication date
CN117745536A (en) 2024-03-22

Similar Documents

Publication Publication Date Title
CA2994508C (en) Vegetation management for power line corridor monitoring using computer vision
CA2994511C (en) Condition detection using image processing
CN109636848B (en) Unmanned aerial vehicle-based oil and gas pipeline inspection method
CN111860205B (en) Forest fire evaluation method based on multisource remote sensing images and grids and storage medium
Kim et al. Forest fire monitoring system based on aerial image
Yandouzi et al. Review on forest fires detection and prediction using deep learning and drones
CN112132144B (en) Unmanned aerial vehicle air line ground collision risk assessment method based on remote sensing image
WO2018136517A1 (en) Augmented/virtual mapping system
CN115240093B (en) Automatic power transmission channel inspection method based on visible light and laser radar point cloud fusion
CN113537180B (en) Tree obstacle identification method and device, computer equipment and storage medium
Zhang et al. MMFNet: Forest fire smoke detection using multiscale convergence coordinated pyramid network with mixed attention and fast-robust NMS
CN117876874A (en) Forest fire detection and positioning method and system based on high-point monitoring video
CN114648709A (en) Method and equipment for determining image difference information
CN116206223A (en) Fire detection method and system based on unmanned aerial vehicle edge calculation
CN116258980A (en) Unmanned aerial vehicle distributed photovoltaic power station inspection method based on vision
Dong et al. Real-time survivor detection in UAV thermal imagery based on deep learning
Avola et al. Automatic estimation of optimal UAV flight parameters for real-time wide areas monitoring
CN117745536B (en) Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles
CN117710874A (en) Fire disaster identification method, device, equipment and storage medium for target area
CN113673288A (en) Idle parking space detection method and device, computer equipment and storage medium
Li et al. Intelligent detection method with 3D ranging for external force damage monitoring of power transmission lines
CN116543322B (en) Intelligent property routing inspection method based on community potential safety hazards
Karaali Detection of tilted electricity poles using image processing and computer vision techniques
Tian Effective image enhancement and fast object detection for improved UAV applications
Ali Visual localisation of electricity pylons for power line inspection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant