CN116820128B - Automatic patrol system for realizing large forest - Google Patents

Automatic patrol system for realizing large forest

Info

Publication number
CN116820128B
CN116820128B (application CN202310769777.3A)
Authority
CN
China
Prior art keywords
area
unmanned aerial
aerial vehicle
fire
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310769777.3A
Other languages
Chinese (zh)
Other versions
CN116820128A (en)
Inventor
刘世明
周小燕
王泽征
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huimingjie Technology Co ltd
Original Assignee
Shenzhen Huimingjie Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huimingjie Technology Co ltd filed Critical Shenzhen Huimingjie Technology Co ltd
Priority to CN202310769777.3A priority Critical patent/CN116820128B/en
Publication of CN116820128A publication Critical patent/CN116820128A/en
Application granted granted Critical
Publication of CN116820128B publication Critical patent/CN116820128B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/28Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)

Abstract

The invention relates to an automatic patrol system for realizing a large forest, comprising a main unmanned aerial vehicle, an auxiliary unmanned aerial vehicle and a cloud platform. The main unmanned aerial vehicle and the auxiliary unmanned aerial vehicle each carry a camera module, an infrared thermal imaging module and a temperature and humidity detection module, and the main unmanned aerial vehicle uploads the collected image data and temperature and humidity data to the cloud platform. The cloud platform processes the infrared thermal image and the visible light image to identify a fire occurrence area within a grid, predicts the fire area at the moment the auxiliary unmanned aerial vehicle arrives, and from this derives the flight height and flight path of the auxiliary unmanned aerial vehicle; the auxiliary unmanned aerial vehicle then flies to the center point of the fire area according to that flight height and path. Through this cooperation between the main and auxiliary unmanned aerial vehicles, the scheme does not interrupt the main unmanned aerial vehicle's patrol of the other grid areas, allows several fire occurrence areas to be monitored continuously, and improves the fire patrol efficiency and fire patrol capability for a large forest.

Description

Automatic patrol system for realizing large forest
Technical Field
The invention relates to the technical field of unmanned aerial vehicle patrol, in particular to an automatic patrol system for realizing a large forest.
Background
For a long time, the traditional manual forest fire inspection mode has suffered from limited coverage, high labor intensity, high risk during investigation and evidence collection, and high labor cost.
Unmanned aerial vehicles offer advantages such as high efficiency, wide coverage and flexible command and dispatch. By shooting and transmitting patrol images in real time, they allow patrol personnel to grasp the fire situation of a forest area more intuitively and clearly, effectively improving fire patrol and early-warning capability for large forests.
Traditional unmanned aerial vehicle forest patrol is single-machine patrol: one unmanned aerial vehicle flies at a default flight height, shoots images of the forest below while flying, and identifies fire features in those images to judge whether a fire has broken out. When a fire area is found, the unmanned aerial vehicle hovers above it, shoots images of the fire scene and uploads them to the cloud monitoring platform, thereby realizing patrol and early warning of forest fires.
This patrol mode has an obvious defect: once the unmanned aerial vehicle finds a fire area it hovers over that area and can no longer patrol other areas. Because several different areas of a large forest may catch fire at the same time, fires in other areas go undetected while the unmanned aerial vehicle stays over one fire area. When facing a large forest, this traditional patrol mode is therefore deficient in both patrol efficiency and patrol capability, and needs to be improved.
Disclosure of Invention
In view of the above, the invention provides an automatic patrol system for a large forest, so as to solve the technical problem that the existing unmanned aerial vehicle patrol mode cannot continue patrolling other areas while it identifies and continuously monitors a fire area.
The technical scheme for solving the technical problems is as follows:
The automatic patrol system for realizing the large forest comprises a main unmanned aerial vehicle, an auxiliary unmanned aerial vehicle and a cloud platform, wherein the main unmanned aerial vehicle and the auxiliary unmanned aerial vehicle each carry a camera module, an infrared thermal imaging module and a temperature and humidity detection module. The patrol area is divided into a plurality of grids of equal area, and the main unmanned aerial vehicle flies through the grids in sequence at a first flight height along a spiral progressive path. When flying to the center point of each grid, the main unmanned aerial vehicle shoots a visible light image and an infrared thermal image at the same time; the two images coincide, and at the first flight height the images shot by the main unmanned aerial vehicle cover the whole range of the grid. While shooting the images, the main unmanned aerial vehicle also collects environmental temperature and humidity data, and it uploads the collected image data and temperature and humidity data to the cloud platform. The cloud platform processes the infrared thermal image and the visible light image to identify a fire occurrence area in the grid. When a fire occurrence area is identified, the cloud platform processes the visible light image to obtain information on the degree of withering and yellowing of the forest, predicts the fire spread rate from the area of the fire occurrence area, the degree of withering and yellowing of the forest and the environmental temperature and humidity, and, from the time the auxiliary unmanned aerial vehicle needs to fly to the center point of the fire occurrence area, obtains the predicted fire area at the moment the auxiliary unmanned aerial vehicle arrives there. The cloud platform determines a second flight height from the preset image shooting proportion of the auxiliary unmanned aerial vehicle, the predicted fire area and the first flight height, generates the flight line of the auxiliary unmanned aerial vehicle from the geographic coordinates of the auxiliary unmanned aerial vehicle's take-off point, the geographic coordinates of the center point of the fire area and the second flight height, and transmits the flight line information to the auxiliary unmanned aerial vehicle. The auxiliary unmanned aerial vehicle flies to the center point of the fire occurrence area along this flight line, continuously shoots visible light images and infrared thermal images of the fire area, and uploads the image data to the cloud platform.
As a preferable scheme: the cloud platform comprises a data storage unit, an image processing unit, an alarm unit, a calculation unit, a prediction unit, a line generation unit and a data transmission unit; the calculation unit scans all pixel points in the infrared thermal image to obtain the brightness value of each pixel point, compares the brightness value of each pixel point with a preset brightness value, and judges the area formed by the pixel points whose brightness values exceed the preset brightness value to be an abnormally bright area; the image processing unit sharpens the infrared thermal image to improve its contrast and then extracts the edge contour of the abnormally bright area, and the area enclosed by the edge contour is the abnormal high-temperature area; when the abnormal high-temperature area is identified, the image processing unit generates a rectangular area enclosing the abnormal high-temperature area, and identifies fire features within a rectangular sampling area of the same coverage as that rectangular area in the visible light image shot at the same time, so as to judge whether a fire has broken out in the abnormal high-temperature area.
As a preferable scheme: the rectangular area is generated by drawing a first vertical line and a second vertical line through the leftmost point A and the rightmost point B of the edge of the abnormal high-temperature area respectively, drawing a first horizontal line and a second horizontal line through the uppermost point C and the lowermost point D of the edge of the abnormal high-temperature area respectively, and enclosing the rectangular area with the two vertical lines and the two horizontal lines.
As a preferable scheme: when a fire is judged to have broken out in the abnormal high-temperature area, the abnormal high-temperature area is marked as a fire area. The calculation unit scans the pixel points contained in the fire area to count their number, obtains the total number of pixel points in the grid image and the forest area covered by the grid image, and calculates the actual area of the fire area as the ratio of the number of pixel points in the fire area to the total number of pixel points in the grid image multiplied by the forest area covered by the grid image; this is the fire area, defined as F0. Meanwhile, the image processing unit randomly divides a certain number of square regions of equal size out of the visible light image and scans and extracts the color value of every pixel point in each square region; the calculation unit counts the number of pixel points whose color values fall in the yellow-green and yellow color value intervals in each square region, calculates the ratio of that number to the number of all pixel points in the square region, defines the ratio as the withered-yellow ratio, and takes the average of the withered-yellow ratios of all the square regions as the average withered-yellow ratio. The calculation unit compares the average withered-yellow ratio with several preset withered-yellow ratio intervals, each of which corresponds to one assignment, namely the withered-yellow ratio assignment, defined as H. Meanwhile, the calculation unit obtains from the data storage unit the environmental temperature and humidity data detected by the main unmanned aerial vehicle, compares the environmental temperature value with several preset temperature value intervals to determine which interval the current environmental temperature falls in, each temperature value interval corresponding to one assignment, namely the temperature assignment, defined as T; it likewise compares the environmental humidity value with several preset humidity value intervals to determine which interval the current environmental humidity falls in, each humidity value interval corresponding to one assignment, namely the humidity assignment, defined as W. The prediction unit is preset with a calculation model for predicting the fire spread rate; the calculation unit inputs the withered-yellow ratio assignment H, the temperature assignment T and the humidity assignment W into the calculation model, which outputs the corresponding fire spread rate prediction value S, and the prediction unit calculates the predicted fire area F1 at the moment the auxiliary unmanned aerial vehicle reaches the center point of the fire area from the current actual fire area F0, the fire spread rate prediction value S and the flight time t needed by the auxiliary unmanned aerial vehicle to reach the center point of the fire area, where F1 = F0 × S × t.
As a preferable scheme: the specific calculation formula in the calculation model is S = a × H + b × T + c × W, where a, b and c are weight coefficients.
As a preferable scheme: the flight time of the auxiliary unmanned aerial vehicle is t = L/V, where L is the straight-line distance between the center point of the fire area and the take-off point of the auxiliary unmanned aerial vehicle, and V is the flight speed of the auxiliary unmanned aerial vehicle.
As a preferable scheme: the second flight height is determined as follows: the calculation unit is preset with an image shooting proportion k of the auxiliary unmanned aerial vehicle; after the predicted fire area is calculated, the picture area that the auxiliary unmanned aerial vehicle should actually shoot is determined from the predicted fire area and the shooting proportion, and the second flight height is then calculated from that picture area, the actual area of the grid and the first flight height.
As a preferable scheme: the calculation formula of the second flight height is H2 = H1 × (F1 ÷ k) / G, where H2 represents the second flight height, H1 represents the first flight height, F1 is the predicted fire area, k is the image shooting proportion, and G is the actual area of a single grid.
Compared with the prior art, the technical scheme of the application has the following beneficial technical effects: when the main unmanned aerial vehicle finds a forest fire, it does not need to stay over the grid area where the fire is located; after collecting the images and environmental data it continues along its given flight line to the next grid, so the main unmanned aerial vehicle advances without interruption. The cloud platform locates the fire from the data uploaded by the main unmanned aerial vehicle, predicts how the fire will spread, automatically generates the flight line of an auxiliary unmanned aerial vehicle and navigates it to the fire occurrence area. A matching number of auxiliary unmanned aerial vehicles can be dispatched according to the number of fire occurrence areas, each hovering above its fire occurrence area to continuously shoot and upload on-site images. Through the cooperation of the main unmanned aerial vehicle and the auxiliary unmanned aerial vehicles, the patrol of the other grid areas is not affected, each fire occurrence area can be monitored continuously, and the fire patrol efficiency and fire patrol capability for a large forest are improved.
Drawings
Fig. 1 is a schematic block diagram of the patrol system in the present embodiment;
Fig. 2 is a schematic diagram of a patrol area grid and a main unmanned aerial vehicle flight path in the present embodiment;
FIG. 3 is a schematic illustration of a fire area in an image taken by a primary drone;
FIG. 4 is a schematic diagram of a predicted fire area.
Detailed Description
Referring to fig. 1, an automatic patrol system for realizing a large forest comprises a main unmanned aerial vehicle, a secondary unmanned aerial vehicle and a cloud platform.
The main unmanned aerial vehicle and the auxiliary unmanned aerial vehicle in this embodiment each comprise a micro control module, and further comprise a data acquisition module, an obstacle avoidance sensor, an attitude sensor, a positioning and navigation module, a wireless communication module, a motor, a storage module and a power module, all connected with the micro control module. Each unmanned aerial vehicle in this embodiment carries a camera module, an infrared thermal imaging module and a temperature and humidity detection module, whose outputs are connected with the input of the data acquisition module.
The obstacle avoidance sensor is used for sensing obstacles in the surrounding environment of the unmanned aerial vehicle, feeding back sensed obstacle information to the micro control module, and controlling the unmanned aerial vehicle to avoid the obstacle in the flight process by the micro control module; the attitude sensor is used for detecting the flight attitude of the unmanned aerial vehicle and feeding back attitude information to the micro-control module, and the micro-control module controls the flight attitude of the unmanned aerial vehicle in the flight process; the positioning navigation module is used for receiving the flight path information sent by the cloud platform, generating navigation data, transmitting the navigation data to the micro control module, and controlling a power motor of the unmanned aerial vehicle according to the navigation data by the micro control module so as to enable the unmanned aerial vehicle to fly according to the flight path; the wireless communication module is used for communication and data transmission between the unmanned aerial vehicle and the cloud platform.
The camera module is used for shooting visible light images of forests below the unmanned aerial vehicle and outputting visible light image data to the data acquisition module; the infrared thermal imaging module is used for shooting a forest infrared thermal image below the unmanned aerial vehicle and outputting infrared thermal image data to the data acquisition module; the temperature and humidity detection module is used for detecting the air temperature and humidity of the area where the unmanned aerial vehicle is located and outputting environmental temperature and humidity data to the data acquisition module.
The cloud platform in this embodiment comprises a data storage unit, an image processing unit, an alarm unit, a calculation unit, a prediction unit, a line generation unit and a data transmission unit.
The working principle of the system is as follows:
referring to fig. 2, the patrol area is divided into a grid array (the solid square frame in the figure is the boundary of the patrol area, and the cells divided by dotted lines form the grid array). Every grid has the same length, width and area G, and the geographic coordinates of the center point of each grid are determined from the geographic position, length, width and area of the patrol area.
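As an illustration of how such grid-center coordinates might be derived, the short Python sketch below divides a rectangular patrol area into a column-by-row grid and returns the geographic coordinates of each grid center. The function name, parameter names and the flat-earth approximation are assumptions made only for illustration and are not taken from the patent.

```python
# Minimal sketch (not from the patent): compute grid-center coordinates for a
# rectangular patrol area, assuming the area is small enough that a flat-earth
# approximation of longitude/latitude offsets is acceptable.
import math

def grid_centers(origin_lat, origin_lon, width_m, height_m, n_cols, n_rows):
    """Return a list of (lat, lon) center points for an n_cols x n_rows grid."""
    m_per_deg_lat = 111_320.0                                # approx. metres per degree of latitude
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(origin_lat))
    cell_w = width_m / n_cols                                # grid cell width in metres
    cell_h = height_m / n_rows                               # grid cell height in metres
    centers = []
    for row in range(n_rows):
        for col in range(n_cols):
            dx = (col + 0.5) * cell_w                        # metres east of the origin corner
            dy = (row + 0.5) * cell_h                        # metres north of the origin corner
            centers.append((origin_lat + dy / m_per_deg_lat,
                            origin_lon + dx / m_per_deg_lon))
    return centers
```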
The flying height of the main unmanned aerial vehicle is set to be a first flying height, and the flying route of the main unmanned aerial vehicle is set to be a spiral progressive path, as shown by an arrow solid line in fig. 2.
The main unmanned aerial vehicle flies at the first flight height. When it reaches the center point of a grid, the micro control module controls the camera module and the infrared thermal imaging module to shoot the visible light image and the infrared thermal image of the forest below at the same time; at this height the camera module can just cover the whole range of the current grid in the visible light image, and the infrared thermal imaging module can just cover the whole range of the current grid in the infrared thermal image. The visible light image and the infrared thermal image shot at the same time coincide.
When the main unmanned aerial vehicle flies to the center point of a grid, the camera module shoots a visible light image of the current grid, the infrared thermal imaging module shoots an infrared thermal image of the current grid, and the data acquisition module simultaneously acquires the temperature and humidity data output by the temperature and humidity detection module. The data acquisition module transmits the collected visible light image data, infrared thermal image data and temperature and humidity data to the micro control module, which packages the data and uploads them to the cloud platform through the wireless communication module, where they are stored in the data storage unit.
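The per-grid payload uploaded by the main unmanned aerial vehicle could be represented roughly as below; the class and field names are illustrative assumptions, not identifiers used by the patent.

```python
# Illustrative sketch of the per-grid data packet the main drone uploads;
# names and types are assumptions, not part of the patent.
from dataclasses import dataclass

@dataclass
class GridSample:
    grid_id: int                 # index of the grid whose center was just reached
    center_lat: float            # geographic coordinates of the grid center
    center_lon: float
    visible_image: bytes         # visible-light image covering the whole grid
    thermal_image: bytes         # coinciding infrared thermal image
    temperature_c: float         # ambient temperature from the detection module
    humidity_pct: float          # ambient relative humidity
    timestamp: float             # acquisition time (seconds since epoch)
```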
The main unmanned aerial vehicle then continues along its flight line to the next grid, and the shooting and data acquisition process is repeated when it arrives there.
The cloud platform processes the data uploaded by the main unmanned aerial vehicle as follows: it obtains the visible light image and the infrared thermal image shot at the same time from the data storage unit and processes the infrared thermal image to identify whether an abnormal high-temperature area appears in it.
Specifically: because parts of the infrared thermal image with higher temperature appear brighter, the abnormal high-temperature area is identified as follows. The calculation unit scans all pixel points in the infrared thermal image to obtain the brightness value of each pixel point, compares each brightness value with a preset brightness value, and judges the area formed by pixel points whose brightness exceeds the preset value to be an abnormally bright area; the image processing unit then sharpens the infrared thermal image to improve its contrast and extracts the edge contour of the abnormally bright area, and the area enclosed by the edge contour is the abnormal high-temperature area.
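A minimal sketch of the brightness-threshold step, assuming the thermal image is available as a 2-D array of per-pixel brightness values; the function name and threshold handling are illustrative assumptions rather than code from the patent.

```python
import numpy as np

def abnormal_bright_mask(thermal_gray: np.ndarray, preset_brightness: float) -> np.ndarray:
    """Return a boolean mask of pixels whose brightness exceeds the preset value."""
    # Pixels brighter than the preset brightness are candidates for the
    # abnormally bright (potential high-temperature) area.
    return thermal_gray > preset_brightness
```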
As shown in fig. 3, the area surrounded by the curve is the abnormal high temperature area obtained after the treatment.
Referring to fig. 3, the image processing unit draws a first vertical line and a second vertical line through the leftmost point A and the rightmost point B of the edge of the abnormal high-temperature area respectively, and draws a first horizontal line and a second horizontal line through the uppermost point C and the lowermost point D of the edge respectively; the two vertical lines and the two horizontal lines enclose a rectangular area, which gives the coverage of the rectangular area, and the center point of the rectangular area is defined as the center point of the abnormal high-temperature area.
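Continuing the sketch above, the bounding rectangle through points A, B, C and D and its center can be read directly off the mask; this is an assumed implementation for illustration, not code from the patent.

```python
import numpy as np

def bounding_rectangle(mask: np.ndarray):
    """Return the row range, column range and center of the masked area."""
    rows, cols = np.nonzero(mask)           # coordinates of abnormally bright pixels
    top, bottom = rows.min(), rows.max()    # points C (uppermost) and D (lowermost)
    left, right = cols.min(), cols.max()    # points A (leftmost) and B (rightmost)
    center = ((top + bottom) / 2.0, (left + right) / 2.0)
    return (top, bottom), (left, right), center
```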
When an abnormal high-temperature area is identified, a rectangular sampling area with the same coverage as the rectangular area is selected from the visible light image shot at the same time, and fire features (flame and smoke) within the rectangular sampling area are identified to judge whether a fire has broken out in the abnormal high-temperature area.
When a fire is judged to have broken out in the abnormal high-temperature area, the area is marked as a fire area and the alarm unit sends out alarm information. The calculation unit scans the pixel points contained in the fire area to count their number, obtains the total number of pixel points in the grid image and the forest area covered by the grid image, and calculates the actual area of the fire area as the ratio of the number of pixel points in the fire area to the total number of pixel points in the grid image multiplied by the forest area covered by the grid image; the current actual area of the fire area is defined as F0.
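The actual fire area F0 follows from the pixel ratio described above; the sketch below assumes the fire mask and the forest area covered by one grid image are known, and is offered only as an illustration.

```python
import numpy as np

def fire_area_f0(fire_mask: np.ndarray, grid_area_m2: float) -> float:
    """F0 = (fire pixels / total pixels) * forest area covered by the grid image."""
    fire_pixels = int(np.count_nonzero(fire_mask))
    total_pixels = fire_mask.size
    return (fire_pixels / total_pixels) * grid_area_m2
```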
Meanwhile, the image processing unit randomly divides a certain number of square regions of equal size out of the visible light image and scans and extracts the color value of every pixel point in each square region; the calculation unit counts the number of pixel points whose color values fall in the yellow-green and yellow color value intervals in each square region, calculates the ratio of that number to the number of all pixel points in the square region, and defines this ratio as the withered-yellow ratio; the average of the withered-yellow ratios of all square regions is the average withered-yellow ratio. The calculation unit compares the average withered-yellow ratio with several preset withered-yellow ratio intervals, each of which corresponds to one assignment, namely the withered-yellow ratio assignment, defined as H.
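The average withered-yellow ratio and its assignment H could be computed roughly as follows; the HSV hue band used to pick out yellow-green and yellow pixels and the interval table are assumptions for illustration only, since the patent does not specify them. The same interval lookup could be reused for the temperature assignment T and the humidity assignment W described next.

```python
import numpy as np

def average_withered_ratio(square_regions_hsv, hue_lo=20, hue_hi=70):
    """Mean fraction of pixels whose hue falls in an assumed yellow/yellow-green band."""
    ratios = []
    for region in square_regions_hsv:               # each region: H x W x 3 HSV array
        hue = region[..., 0]
        withered = (hue >= hue_lo) & (hue <= hue_hi)
        ratios.append(np.count_nonzero(withered) / hue.size)
    return float(np.mean(ratios))

def assign_from_intervals(value, intervals):
    """Map a value to its preset assignment, e.g. [(0.0, 0.3, 1), (0.3, 0.6, 2), (0.6, 1.01, 3)]."""
    for lo, hi, assignment in intervals:
        if lo <= value < hi:
            return assignment
    return intervals[-1][2]
```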
Meanwhile, the calculation unit obtains from the data storage unit the environmental temperature and humidity data detected by the main unmanned aerial vehicle, and compares the environmental temperature value with several preset temperature value intervals to determine which interval the current environmental temperature falls in; each temperature value interval corresponds to one assignment, namely the temperature assignment, defined as T. Likewise, it compares the environmental humidity value with several preset humidity value intervals to determine which interval the current environmental humidity falls in; each humidity value interval corresponds to one assignment, namely the humidity assignment, defined as W.
The prediction unit is preset with a calculation model for predicting the fire spread rate; the withered-yellow ratio assignment H, the temperature assignment T and the humidity assignment W are input into the calculation model, and the calculation model outputs the corresponding fire spread rate prediction value S.
The specific calculation formula in the calculation model is S = a × H + b × T + c × W, where a, b and c are weight coefficients. Reference values of a, b and c can be determined in advance through a number of simulation tests and written into the formula.
The calculation unit passes the values H, T and W to the prediction unit, which calculates the fire spread rate prediction value S; the prediction unit then calculates the predicted fire area F1 at the moment the auxiliary unmanned aerial vehicle reaches the center point of the fire area from the current actual fire area F0, the fire spread rate prediction value S and the flight time t of the auxiliary unmanned aerial vehicle to the center point of the fire area: F1 = F0 × S × t.
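Putting the assignments together, the sketch below follows the formulas S = a × H + b × T + c × W and F1 = F0 × S × t exactly as stated above; the weight values are placeholders, and the flight time argument is the t = L/V defined a few paragraphs below, not the temperature assignment T.

```python
def spread_rate(h: float, t_assign: float, w: float,
                a: float = 0.5, b: float = 0.3, c: float = 0.2) -> float:
    """Fire spread rate prediction S = a*H + b*T + c*W (weights are placeholder values)."""
    return a * h + b * t_assign + c * w

def predicted_fire_area(f0: float, s: float, flight_time: float) -> float:
    """Predicted fire area when the auxiliary drone arrives: F1 = F0 * S * t."""
    return f0 * s * flight_time
```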
As shown in fig. 4, the predicted area of the fire region after it has spread for time t is F1.
The flight time t=l/V of the secondary unmanned aerial vehicle, where L is the linear distance between the center point of the fire area and the departure point of the secondary unmanned aerial vehicle, and V is the flight speed of the secondary unmanned aerial vehicle (assuming constant speed flight).
It should be noted that the flight height of the auxiliary unmanned aerial vehicle, i.e. the second flight height, must be determined so that the auxiliary unmanned aerial vehicle captures the full view of the fire area while still being close enough to it that the image is as clear as possible.
In this embodiment, the second flight height is determined as follows: the calculation unit is preset with an image shooting proportion k of the auxiliary unmanned aerial vehicle, i.e. the proportion of the predicted fire area to the actual area covered by the shot picture (for example, k = 0.5 means the predicted fire area occupies half of the actual area covered by the picture shot by the auxiliary unmanned aerial vehicle). Once the predicted fire area has been calculated, the picture area that the auxiliary unmanned aerial vehicle should actually shoot is determined from the predicted fire area and the shooting proportion, and the second flight height is then calculated from that picture area, the actual area of the grid and the first flight height.
The calculation formula is: H2 = H1 × (F1 ÷ k) / G, where H2 represents the second flight height, H1 represents the first flight height, F1 is the predicted fire area, k is the image shooting proportion, and G is the actual area of a single grid.
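A sketch of the flight-time and second-flight-height computation following t = L/V and H2 = H1 × (F1 ÷ k) / G; it applies the patent's linear height-to-area formula exactly as stated and adds no optical model of its own.

```python
def flight_time(distance_m: float, speed_m_s: float) -> float:
    """t = L / V, assuming constant-speed flight to the fire-area center."""
    return distance_m / speed_m_s

def second_flight_height(h1: float, f1: float, k: float, grid_area: float) -> float:
    """H2 = H1 * (F1 / k) / G, where F1/k is the picture area the auxiliary drone should cover."""
    picture_area = f1 / k
    return h1 * picture_area / grid_area
```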
The calculation unit outputs the calculated second flight height to the line generation unit, and the line generation unit generates the flight line of the auxiliary unmanned aerial vehicle from the geographic coordinates of the auxiliary unmanned aerial vehicle's take-off point, the geographic coordinates of the center point of the fire area (for convenience of analysis, this embodiment assumes the center point of the fire area does not move as the fire spreads) and the second flight height; the flight line information comprises the flight path and the flight height.
The data transmission unit transmits the flight line data of the auxiliary unmanned aerial vehicle to its positioning and navigation module, and the auxiliary unmanned aerial vehicle then takes off and flies to the center point of the fire area along the flight line. The auxiliary unmanned aerial vehicle shoots visible light images and infrared thermal images of the fire area and uploads the image data to the cloud platform, and monitoring personnel can view the images of the fire scene through the cloud platform.
Because the auxiliary unmanned aerial vehicle flies at the calculated height, the ratio of the fire area to the picture area in the image shot when it arrives over the fire area is close to the preset shooting proportion, so the shot image contains the full view of the fire area while retaining sufficient clarity and detail. Unlike the traditional single-machine patrol unmanned aerial vehicle, it does not need to fly at a default height and then, after finding a fire area, hover above it and adjust its height repeatedly before shooting in order to capture the full view. The traditional patrol mode clearly has to spend extra time adjusting height during shooting, whereas this scheme saves that adjustment time and completes image capture more quickly and efficiently.
According to the scheme, when the main unmanned aerial vehicle finds a forest fire it does not need to stay over the grid area where the fire is located; after collecting the images and environmental data it continues along its established flight line to the next grid, so the main unmanned aerial vehicle advances without interruption. The cloud platform locates the fire from the data uploaded by the main unmanned aerial vehicle, predicts how the fire will spread, automatically generates the flight line of an auxiliary unmanned aerial vehicle and navigates it to the fire occurrence area. A matching number of auxiliary unmanned aerial vehicles can be dispatched according to the number of fire occurrence areas, each hovering above its fire occurrence area to continuously shoot and upload on-site images. Through the cooperation of the main unmanned aerial vehicle and the auxiliary unmanned aerial vehicles, the patrol of the other grid areas is not affected, each fire occurrence area can be monitored continuously, and the fire patrol efficiency and fire patrol capability of a large forest are improved.
The foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the invention are intended to be included within the scope of the invention.

Claims (5)

1. An automatic patrol system for realizing a large forest, characterized in that: it comprises a main unmanned aerial vehicle, an auxiliary unmanned aerial vehicle and a cloud platform, wherein the main unmanned aerial vehicle and the auxiliary unmanned aerial vehicle each carry a camera module, an infrared thermal imaging module and a temperature and humidity detection module; a patrol area is divided into a plurality of grids of equal area, the main unmanned aerial vehicle flies through the grids in sequence at a first flight height along a spiral progressive path, the main unmanned aerial vehicle shoots a visible light image and an infrared thermal image at the same time when flying to the center point of each grid, the visible light image and the infrared thermal image shot at the same time coincide, at the first flight height the image shot by the main unmanned aerial vehicle covers the whole range of the grid, the main unmanned aerial vehicle collects environmental temperature and humidity data while shooting the images, and the main unmanned aerial vehicle uploads the collected image data and temperature and humidity data to the cloud platform; the cloud platform processes the infrared thermal image and the visible light image to identify a fire occurrence area in the grid; when a fire occurrence area is identified, the cloud platform processes the visible light image to obtain information on the degree of withering and yellowing of the forest, predicts the fire spread rate from the area of the fire occurrence area, the degree of withering and yellowing of the forest and the environmental temperature and humidity, and obtains, from the time the auxiliary unmanned aerial vehicle needs to fly to the center point of the fire occurrence area, the predicted fire area at the moment the auxiliary unmanned aerial vehicle arrives at that center point; the cloud platform determines a second flight height from a preset image shooting proportion of the auxiliary unmanned aerial vehicle, the predicted fire area and the first flight height, generates a flight line of the auxiliary unmanned aerial vehicle from the geographic coordinates of the auxiliary unmanned aerial vehicle's take-off point, the geographic coordinates of the center point of the fire area and the second flight height, and transmits the flight line information to the auxiliary unmanned aerial vehicle; the auxiliary unmanned aerial vehicle flies to the center point of the fire occurrence area along the flight line, continuously shoots visible light images and infrared thermal images of the fire area, and uploads the image data to the cloud platform; the cloud platform comprises a data storage unit, an image processing unit, an alarm unit, a calculation unit, a prediction unit, a line generation unit and a data transmission unit; the calculation unit scans all pixel points in the infrared thermal image to obtain the brightness value of each pixel point, compares the brightness value of each pixel point with a preset brightness value, and judges the area formed by the pixel points whose brightness values exceed the preset brightness value to be an abnormally bright area; the image processing unit sharpens the infrared thermal image to improve its contrast and then extracts the edge contour of the abnormally bright area, and the area enclosed by the edge contour is the abnormal high-temperature area; when the abnormal high-temperature area is identified, the image processing unit generates a rectangular area enclosing the abnormal high-temperature area, and identifies fire features within a rectangular sampling area of the same coverage as the rectangular area in the visible light image shot at the same time, so as to judge whether a fire has broken out in the abnormal high-temperature area; when a fire is judged to have broken out in the abnormal high-temperature area, the abnormal high-temperature area is marked as a fire area, the calculation unit scans the pixel points contained in the fire area to count their number, obtains the total number of pixel points in the grid image and the forest area covered by the grid image, and calculates the actual area of the fire area as the ratio of the number of pixel points in the fire area to the total number of pixel points in the grid image multiplied by the forest area covered by the grid image, this being the fire area, defined as F0; meanwhile, the image processing unit randomly divides a certain number of square regions of equal size out of the visible light image and scans and extracts the color value of every pixel point in each square region; the calculation unit counts the number of pixel points whose color values fall in the yellow-green and yellow color value intervals in each square region, calculates the ratio of that number to the number of all pixel points in the square region, defines the ratio as the withered-yellow ratio, and takes the average of the withered-yellow ratios of all the square regions as the average withered-yellow ratio; the calculation unit compares the average withered-yellow ratio with several preset withered-yellow ratio intervals, each of which corresponds to one assignment, namely the withered-yellow ratio assignment, defined as H; meanwhile, the calculation unit obtains from the data storage unit the environmental temperature and humidity data detected by the main unmanned aerial vehicle, compares the environmental temperature value with several preset temperature value intervals to determine which interval the current environmental temperature falls in, each temperature value interval corresponding to one assignment, namely the temperature assignment, defined as T; the calculation unit compares the environmental humidity value with several preset humidity value intervals to determine which interval the current environmental humidity falls in, each humidity value interval corresponding to one assignment, namely the humidity assignment, defined as W; the prediction unit is preset with a calculation model for predicting the fire spread rate, the calculation unit inputs the withered-yellow ratio assignment H, the temperature assignment T and the humidity assignment W into the calculation model, the calculation model outputs the corresponding fire spread rate prediction value S, and the prediction unit calculates the predicted fire area F1 at the moment the auxiliary unmanned aerial vehicle reaches the center point of the fire area from the current actual fire area F0, the fire spread rate prediction value S and the flight time t needed by the auxiliary unmanned aerial vehicle to reach the center point of the fire area, wherein F1 = F0 × S × t; the specific calculation formula in the calculation model is S = a × H + b × T + c × W, where a, b and c are weight coefficients.
2. The automatic patrol system for realizing the large forest according to claim 1, wherein: the rectangular area is generated by drawing a first vertical line and a second vertical line through the leftmost point A and the rightmost point B of the edge of the abnormal high-temperature area respectively, drawing a first horizontal line and a second horizontal line through the uppermost point C and the lowermost point D of the edge of the abnormal high-temperature area respectively, and enclosing the rectangular area with the two vertical lines and the two horizontal lines.
3. The automatic patrol system for realizing the large forest according to claim 1, wherein: the flight time of the auxiliary unmanned aerial vehicle is t = L/V, where L is the straight-line distance between the center point of the fire area and the take-off point of the auxiliary unmanned aerial vehicle, and V is the flight speed of the auxiliary unmanned aerial vehicle.
4. The automatic patrol system for realizing the large forest according to claim 1, wherein: the second flight height is determined as follows: the calculation unit is preset with an image shooting proportion k of the auxiliary unmanned aerial vehicle; after the predicted fire area is calculated, the picture area that the auxiliary unmanned aerial vehicle should actually shoot is determined from the predicted fire area and the shooting proportion, and the second flight height is then calculated from that picture area, the actual area of the grid and the first flight height.
5. The automatic patrol system for realizing the large forest according to claim 1, wherein: the calculation formula of the second flight height is H2 = H1 × (F1 ÷ k) / G, where H2 represents the second flight height, H1 represents the first flight height, F1 is the predicted fire area, k is the image shooting proportion, and G is the actual area of a single grid.
CN202310769777.3A 2023-06-27 2023-06-27 Automatic patrol system for realizing large forest Active CN116820128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310769777.3A CN116820128B (en) 2023-06-27 2023-06-27 Automatic patrol system for realizing large forest

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310769777.3A CN116820128B (en) 2023-06-27 2023-06-27 Automatic patrol system for realizing large forest

Publications (2)

Publication Number Publication Date
CN116820128A CN116820128A (en) 2023-09-29
CN116820128B (en) 2024-06-07

Family

ID=88114101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310769777.3A Active CN116820128B (en) 2023-06-27 2023-06-27 Automatic patrol system for realizing large forest

Country Status (1)

Country Link
CN (1) CN116820128B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117516627B (en) * 2023-11-10 2024-05-07 鄄城县自然资源和规划局 Regional vegetation state monitoring system based on unmanned aerial vehicle data

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106448019A (en) * 2016-11-14 2017-02-22 徐志勇 Unmanned aerial vehicle monitoring system for monitoring forest fire in real time
CN108416963A (en) * 2018-05-04 2018-08-17 湖北民族学院 Forest Fire Alarm method and system based on deep learning
CN111311865A (en) * 2020-02-25 2020-06-19 东北农业大学 Forest fire prevention unmanned aerial vehicle platform based on carry on thermal imager
CN111508181A (en) * 2020-04-28 2020-08-07 江苏理工学院 Forest fire prevention system based on multiple unmanned aerial vehicles and method thereof
CN112435427A (en) * 2020-11-12 2021-03-02 光谷技术股份公司 Forest fire monitoring system and method
CN113086189A (en) * 2021-04-02 2021-07-09 重庆万重山智能科技有限公司 Unmanned aerial vehicle and forest fire monitoring system based on unmanned aerial vehicle
KR102398978B1 (en) * 2021-04-29 2022-05-18 주식회사 제노코 FOREST FIRE MONITORING AND EXTINGUISHING SYSTEM USING UAVs AND AN AIRSHIP
CN115546987A (en) * 2022-09-19 2022-12-30 天立泰科技股份有限公司 Forest fire prevention monitoring and patrolling system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110122245A1 (en) * 2009-11-23 2011-05-26 Ashok Kumar Sinha FOREST FIRE CONTROL SYSTEMS (FFiCS) WITH SCANNER AND OPTICAL /INFRARED RADIATION DETECTOR (SOIRD) AND OPTIONALLY ALSO INCLUDING A SCANNER WITH ACCURATE LOCATION CALCULATOR (SALC) AND A SUPER-EFFICIENT SATELLITE/WIRELESS ANTENNA SYSTEM (SSWAS)
BR102021016894A2 (en) * 2021-08-26 2023-03-07 André Augusto Ceballos Melo METHOD AND SYSTEM OF ARTIFICIAL INTELLIGENCE AND SWARRM INTELLIGENCE IN SIMULATED ENVIRONMENTS FOR DRONES AND AUTONOMOUS ROBOTS FOR FOREST FIRE SUPPRESSION.

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106448019A (en) * 2016-11-14 2017-02-22 徐志勇 Unmanned aerial vehicle monitoring system for monitoring forest fire in real time
CN108416963A (en) * 2018-05-04 2018-08-17 湖北民族学院 Forest Fire Alarm method and system based on deep learning
CN111311865A (en) * 2020-02-25 2020-06-19 东北农业大学 Forest fire prevention unmanned aerial vehicle platform based on carry on thermal imager
CN111508181A (en) * 2020-04-28 2020-08-07 江苏理工学院 Forest fire prevention system based on multiple unmanned aerial vehicles and method thereof
CN112435427A (en) * 2020-11-12 2021-03-02 光谷技术股份公司 Forest fire monitoring system and method
CN113086189A (en) * 2021-04-02 2021-07-09 重庆万重山智能科技有限公司 Unmanned aerial vehicle and forest fire monitoring system based on unmanned aerial vehicle
KR102398978B1 (en) * 2021-04-29 2022-05-18 주식회사 제노코 FOREST FIRE MONITORING AND EXTINGUISHING SYSTEM USING UAVs AND AN AIRSHIP
CN115546987A (en) * 2022-09-19 2022-12-30 天立泰科技股份有限公司 Forest fire prevention monitoring and patrolling system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application Analysis of Unmanned Aerial Vehicles in Forest Fire Prevention; 王振师, 周宇飞, 李小川, 吴泽鹏, 岑棓琛, 曾宇; Forestry and Environmental Science; 2016-02-20 (No. 01); full text *

Also Published As

Publication number Publication date
CN116820128A (en) 2023-09-29

Similar Documents

Publication Publication Date Title
CN108496129B (en) Aircraft-based facility detection method and control equipment
Sherstjuk et al. Forest fire-fighting monitoring system based on UAV team and remote sensing
CN116820128B (en) Automatic patrol system for realizing large forest
JP7045030B2 (en) Inspection system, inspection method, server equipment, and program
CN111951510A (en) Forestry fire prevention intelligence patrols and examines monitoring early warning system based on big data
CN106054928B (en) A kind of full region fire generation measuring method based on unmanned plane network
KR101895811B1 (en) A high performance large coverage surveillance system
CN111932813B (en) Unmanned aerial vehicle forest fire reconnaissance system based on edge calculation and working method
CN111508181A (en) Forest fire prevention system based on multiple unmanned aerial vehicles and method thereof
CN115797873B (en) Crowd density detection method, system, equipment, storage medium and robot
KR102298063B1 (en) Smart security drone system linked to cctv
CN112802287B (en) Power transmission line forest fire monitoring, early warning and positioning system and method thereof
CN106781187A (en) Scene of fire burning things which may cause a fire disaster area positioning method based on multidimentional system FM models
CN114283548A (en) Fire continuous monitoring method and system for unmanned aerial vehicle
CN112365763B (en) Unmanned aerial vehicle inspection training method and system for power equipment
CN114200958A (en) Automatic inspection system and method for photovoltaic power generation equipment
CN107045805B (en) Method and system for monitoring small aircraft and airborne objects
KR20220142865A (en) System and method for monitoring unmanned-office using mobile robot
CN108765841A (en) A kind of forest fire protection inspection system and visiting method
KR101865835B1 (en) Monitoring system for a flying object
KR20130126300A (en) Forest fire detection system and forest fire detection method using the same, and aerial robot for forest fire detection
CN113938609B (en) Regional monitoring method, device and equipment
KR101868082B1 (en) Fire detection system in extensive area
CN115248590A (en) Routing inspection robot navigation method applied to natural gas power plant
CN114558267A (en) Industrial scene fire prevention and control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant