CN110660187B - Forest fire alarm monitoring system based on edge calculation - Google Patents

Forest fire alarm monitoring system based on edge calculation

Info

Publication number
CN110660187B
Authority
CN
China
Prior art keywords
fire
forest
area
neural network
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910770603.2A
Other languages
Chinese (zh)
Other versions
CN110660187A (en)
Inventor
贾琳
刘毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Terminus Technology Co Ltd
Original Assignee
Chongqing Terminus Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Terminus Technology Co Ltd filed Critical Chongqing Terminus Technology Co Ltd
Priority to CN201910770603.2A priority Critical patent/CN110660187B/en
Publication of CN110660187A publication Critical patent/CN110660187A/en
Application granted granted Critical
Publication of CN110660187B publication Critical patent/CN110660187B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/005 Fire alarms; Alarms responsive to explosion for forest fires, e.g. detecting fires spread over a large or outdoors area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/94 Hardware or software architectures specially adapted for image or video understanding
    • G06V 10/95 Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/188 Vegetation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B 17/125 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30181 Earth observation
    • G06T 2207/30188 Vegetation; Agriculture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30232 Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Alarm Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a forest fire alarm monitoring system based on edge calculation, which comprises: an unmanned aerial vehicle, configured to acquire a plurality of consecutive forest image frames and perform recognition computation on them so as to obtain multivariate arrays <A, C(xc, yc), T> of the characteristic values of a plurality of fire areas; and a server, configured to verify whether a fire phenomenon exists according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas and, if so, to analyze the fire condition parameters of the fire area according to those arrays. The invention solves the problems that, owing to poor communication signal coverage in forest regions, the transmission of aerial forest images returned by the unmanned aerial vehicle is slow and signal loss and interruption occur frequently; the unmanned aerial vehicle only needs to transmit the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the fire areas back to the server, which, compared with returning complete aerial forest images, saves communication load and reduces the communication requirement.

Description

Forest fire alarm monitoring system based on edge calculation
Technical Field
The invention relates to the technical field of fire alarm monitoring, in particular to a forest fire alarm monitoring system based on edge calculation.
Background
Early detection of forest fires is essential for suppressing the spread of disasters and minimizing losses. In modern forest fire alarm monitoring systems, forest images are captured by unmanned aerial vehicles and fire areas are found through image recognition and computation, which expands the monitoring range and improves the timeliness of monitoring.
At present, the image recognition and computation needed to detect a forest fire area are complex and require considerable computing resources. Typically, the unmanned aerial vehicle sends aerial forest images to a server in real time and the server performs the image recognition computation. However, communication signal coverage in forest areas is generally poor, the transmission of aerial forest images from the unmanned aerial vehicle is slow, and signal loss and interruption occur frequently, which hinders the image recognition and computation on the server.
Edge calculation means deploying equipment with data acquisition, computation and storage capabilities close to the data source so that information processing, analysis and recognition are provided nearby. Edge calculation relieves the data communication load between front-end devices and the back-end server, offers a new approach to the communication limitations of forest fire alarm monitoring, and solves the problems that, owing to poor communication signal coverage in forest regions, the transmission of aerial forest images returned by the unmanned aerial vehicle is slow and signal loss and interruption occur frequently.
Disclosure of Invention
(I) Objects of the invention
In order to overcome at least one defect of the prior art, the invention provides a forest fire alarm monitoring system and method based on edge calculation, which solve the problems that, owing to poor communication signal coverage in forest regions, the transmission of aerial forest images returned by the unmanned aerial vehicle is slow and signal loss and interruption occur frequently.
(II) technical scheme
As a first aspect of the invention, the invention discloses a forest fire alarm monitoring system based on edge calculation, which comprises:
an unmanned aerial vehicle, used for acquiring a plurality of consecutive forest image frames and carrying out recognition computation on them so as to obtain multivariate arrays <A, C(xc, yc), T> of the characteristic values of a plurality of fire areas; wherein A is the area value of the fire area, C(xc, yc) is the coordinates of the center of gravity of the fire area, and T is the acquisition time of the forest image frame;
a server, used for verifying whether a fire phenomenon exists according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas and, if so, analyzing the fire condition parameters of the fire area according to those multivariate arrays.
In a possible implementation, the drone is configured to convert the forest image frame into a forest image frame component map composed of hue H, saturation S and brightness I components, and then obtain the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas from the forest image frame component map.
In a possible implementation, the drone is configured to compare the saturation S with a saturation S distinguishing threshold and the brightness I with a brightness I distinguishing threshold, and then identify whether the fire area exists in the forest image frame component map.
In a possible embodiment, the drone is configured to calculate the feature value of the fire area when the fire area is identified to be present in the forest image frame component map.
In one possible embodiment, the characteristic values of the fire area include: the area value A of the fire area and the coordinates C(xc, yc) of the center of gravity of the area.
In a possible implementation, the drone is configured to generate the multivariate array <A, C(xc, yc), T> of the characteristic values of the fire area according to the characteristic values of the fire area and the acquisition time of the forest image frame.
In a possible embodiment, the drone is further configured, when a fire area is identified in the forest image frame component map, to identify the fire areas of the consecutive forest image frames and to compute the characteristic values of those fire areas and the multivariate arrays <A, C(xc, yc), T> of those characteristic values.
In one possible embodiment, the server includes a verification module; the verification module is used for calculating each index of the fire area according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas, and for verifying whether a fire phenomenon exists according to the indexes of the fire area.
In a possible embodiment, the indexes of the fire area include: the area change rate, the center-of-gravity movement rate, and the change rate of the center-of-gravity movement direction.
In a possible implementation, the server further includes a neural network evaluation module; the neural network evaluation module is used for evaluating, when the verification module verifies that a fire phenomenon exists, the fire condition parameters of the fire area according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas.
The invention also provides a forest fire alarm monitoring method based on edge calculation, which comprises the following steps:
acquiring a plurality of consecutive forest image frames by an unmanned aerial vehicle, and carrying out recognition computation on them so as to obtain multivariate arrays <A, C(xc, yc), T> of the characteristic values of a plurality of fire areas; wherein A is the area value of the fire area, C(xc, yc) is the coordinates of the center of gravity of the fire area, and T is the acquisition time of the forest image frame;
uploading the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas to a server;
verifying, by the server, whether a fire phenomenon exists according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas and, if so, analyzing the fire condition parameters of the fire area according to those multivariate arrays.
In one possible implementation, the forest image frame is converted into a forest image frame component map composed of hue H, saturation S and brightness I components, and the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas are obtained from the forest image frame component map.
In a possible implementation manner, the saturation S is compared with a saturation S distinguishing threshold, and the brightness I is compared with a brightness I distinguishing threshold, so as to identify whether the fire area exists in the forest image frame component map.
In one possible implementation, when the fire area exists in the forest image frame component map, the characteristic value of the fire area is calculated.
In one possible embodiment, the characteristic values of the fire area include: the area value A of the fire area and the coordinates C(xc, yc) of the center of gravity of the area.
In a possible implementation, the multivariate array <A, C(xc, yc), T> of the characteristic values of the fire area is generated according to the characteristic values of the fire area and the acquisition time of the forest image frame.
In one possible embodiment, when a fire area exists in the forest image frame component map, the fire areas of the consecutive forest image frames are identified, and the characteristic values of those fire areas and the multivariate arrays <A, C(xc, yc), T> of those characteristic values are calculated.
In a possible embodiment, a verification step is performed by the server; in the verification step, each index of the fire area is calculated according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas, and whether a fire phenomenon exists is verified according to the indexes of the fire area.
In a possible embodiment, the indexes of the fire area include: the area change rate, the center-of-gravity movement rate, and the change rate of the center-of-gravity movement direction.
In a possible embodiment, a neural network evaluation step is performed by the server; in the neural network evaluation step, when the verification step verifies that a fire phenomenon exists, the fire condition parameters of the fire area are evaluated according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas.
(III) Advantageous effects
The forest fire alarm monitoring system based on edge calculation provided by the invention uses the forest image frames acquired by the unmanned aerial vehicle to identify fire areas and computes, on board, the characteristic values of the fire areas and their multivariate arrays <A, C(xc, yc), T>, so that the server verifies from the arrays <A, C(xc, yc), T> whether a fire phenomenon exists, analyzes the fire condition parameters of the fire area, and raises an alarm using those parameters. This solves the problems that, owing to poor communication signal coverage in forest regions, the transmission of aerial forest images returned by the unmanned aerial vehicle is slow and signal loss and interruption occur frequently; the unmanned aerial vehicle only needs to transmit the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the fire areas back to the server, which, compared with returning complete aerial forest images, saves communication load, reduces the communication requirement, still reliably maintains accurate recognition and analysis of forest fires, and improves the timeliness of forest fire alarm monitoring.
Drawings
The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining and illustrating the present invention and should not be construed as limiting the scope of the present invention.
FIG. 1 is a schematic structural diagram of a forest fire alarm monitoring system based on edge calculation according to the present invention;
fig. 2 is a schematic structural diagram of a server of the forest fire alarm monitoring system based on edge calculation provided by the invention.
Detailed Description
In order to make the implementation objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be described in more detail below with reference to the accompanying drawings in the embodiments of the present invention.
It should be noted that: in the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, not all, of the embodiments of the present invention, and the embodiments of the present application and the features in the embodiments may be combined with each other provided there is no conflict. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description, do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, and are not to be considered limiting of the scope of the invention.
A first embodiment of the forest fire alarm monitoring system based on edge calculation according to the present invention is described in detail below with reference to fig. 1. As shown in fig. 1, the forest fire alarm monitoring system provided by this embodiment mainly includes: unmanned aerial vehicle and server.
The unmanned aerial vehicle is used for acquiring a plurality of consecutive forest image frames and carrying out recognition computation on them so as to obtain multivariate arrays <A, C(xc, yc), T> of the characteristic values of a plurality of fire areas. The unmanned aerial vehicle acquires the consecutive forest image frames by aerial photography, so that the fire areas are identified and marked from those frames.
The server is used for verifying whether a fire phenomenon exists according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas and, if so, analyzing the fire condition parameters of the fire area according to those multivariate arrays. When the server verifies that a fire phenomenon exists, it raises an alarm using the fire condition parameters it has obtained, so that corresponding measures can be taken according to the fire condition parameters and losses can be effectively reduced.
The unmanned aerial vehicle is used for converting the forest image frame into a forest image frame component map composed of hue H, saturation S and brightness I components. The pixel colors of the forest image frame are converted from (R, G, B) values into hue H, saturation S and brightness I components, so that the forest image frame is converted into a forest image frame component map composed of the hue H, saturation S and brightness I components.
The RGB color space is expressed in terms of the three physical primary colors; its physical meaning is clear and it suits the way display devices work, but it does not match human visual perception well. The HSI color space is based on the human visual system and describes colors by hue H, saturation S and brightness I, which is more convenient for color processing and recognition. The HSI color space and the RGB color space have a definite conversion relation. Assuming the primary colors have been normalized, i.e., R, G, B ∈ [0, 1], the conversion is:
[Equation images in the original give the conversion formulas for the hue H, saturation S and brightness I components in terms of R, G and B.]
the unmanned aerial vehicle is used for comparing the saturation S with a saturation S distinguishing threshold value and the brightness I with a brightness I distinguishing threshold value, and then identifying whether the fire area exists in the forest image frame component diagram or not. Since the saturation S and the brightness I of the fire area are significantly different from those of the normal forest area, a saturation S discrimination threshold and a brightness I discrimination threshold may be set. The saturation S of the forest image frame component map may be compared with a saturation S distinguishing threshold, the brightness I of the forest image frame component map may be compared with a brightness I distinguishing threshold, and the forest image frame component map area defined when the saturation S meets the saturation S distinguishing threshold and the brightness I meets the brightness I distinguishing threshold is a fire area; otherwise, no fire zone exists.
The unmanned aerial vehicle is used for calculating the characteristic values of the fire area when a fire area is identified in the forest image frame component map. The combustion condition of the fire area can be known from its characteristic values.
Wherein the characteristic values of the fire area include: the area value A of the fire area and the coordinates C(xc, yc) of the center of gravity of the area. The target area images are extracted from two adjacent forest image frames (one of which is the frame in which the fire area was detected) as Gi and Gi+1, and the two frames are then subtracted (values smaller than 0 are set to 0) to generate an image GD(x, y):
GD(x, y) = Gi - Gi+1
A Wiener filter is applied to smooth and denoise GD(x, y) and remove isolated-point noise. The fire area of image Gi+1 is then determined using the following method:
1) Take the image GD(x, y) as the initial set of detection target pixels.
2) The new set of pixels is taken as the starting point for detection.
3) A clustering condition is determined as a pixel growth criterion.
(1) Adjacent to the seed point.
(2) The brightness value of the pixel is larger than T; a value of about T = 0.9 is suitable.
(3) Define R(x, y) as the maximum gradient magnitude over all directions in image Gi+1. Inside the fire area this value is generally small, so requiring R(x, y) to be less than 0.1 effectively detects the fire area. R(x, y) is defined as
R(x, y) = max(|Gi+1(x, y+1) - Gi+1(x, y)|, |Gi+1(x, y-1) - Gi+1(x, y)|, |Gi+1(x+1, y) - Gi+1(x, y)|, |Gi+1(x-1, y) - Gi+1(x, y)|, |Gi+1(x+1, y+1) - Gi+1(x, y)|, |Gi+1(x+1, y-1) - Gi+1(x, y)|, |Gi+1(x-1, y+1) - Gi+1(x, y)|, |Gi+1(x-1, y-1) - Gi+1(x, y)|)
4) Search the whole image, find the pixels that satisfy step 3), and add them to the pixel set. Generate a new pixel set and check whether it has grown; if so, return to step 2), otherwise go to step 5).
5) Generate the image G'2 from the pixels in the pixel set.
The area value A of the fire area is calculated as follows: the area value A is defined as the number of points equal to 1 in the extracted binary target image:
A = Σx Σy G'2(x, y), summed over all pixels (x, y) of the m × n binary image G'2
Considering the change in the position of the center of gravity, the center-of-gravity coordinates of the target image are defined; the center-of-gravity coordinates C(xc, yc) of the binary image G'2 are then:
xc = (1/A) Σx Σy x · G'2(x, y),  yc = (1/A) Σx Σy y · G'2(x, y)
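As an illustration of the procedure just described (frame differencing, seeded region growing with the brightness and gradient criteria, then area and center-of-gravity computation), a minimal sketch follows. The seeding from the filtered difference image and the 8-neighbourhood traversal are simplifying assumptions; the thresholds mirror the T ≈ 0.9 and R(x, y) < 0.1 values quoted above.

```python
import numpy as np
from collections import deque

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def grow_fire_region(g_next, seed_mask, t_brightness=0.9, r_max=0.1):
    """Region growing on the grayscale frame g_next (values in [0, 1]).

    seed_mask marks the starting pixel set, e.g. the non-zero pixels of the
    Wiener-filtered difference image GD = max(G_i - G_{i+1}, 0).  A pixel is
    added when it is 8-adjacent to the region, its brightness exceeds
    t_brightness and its maximum neighbour gradient R(x, y) is below r_max.
    """
    h, w = g_next.shape
    region = np.zeros((h, w), dtype=bool)
    queue = deque(zip(*np.nonzero(seed_mask)))
    while queue:
        x, y = queue.popleft()
        if region[x, y]:
            continue
        region[x, y] = True
        for dx, dy in OFFSETS:
            nx, ny = x + dx, y + dy
            if not (0 <= nx < h and 0 <= ny < w) or region[nx, ny]:
                continue
            grads = [abs(g_next[nx + ox, ny + oy] - g_next[nx, ny])
                     for ox, oy in OFFSETS
                     if 0 <= nx + ox < h and 0 <= ny + oy < w]
            if g_next[nx, ny] > t_brightness and max(grads, default=1.0) < r_max:
                queue.append((nx, ny))
    return region

def area_and_centroid(region):
    """Area value A (pixel count) and center of gravity C(xc, yc) of a binary region."""
    a = int(region.sum())
    if a == 0:
        return 0, (0.0, 0.0)
    xs, ys = np.nonzero(region)
    return a, (float(xs.mean()), float(ys.mean()))
```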
the unmanned aerial vehicle is used for generating a multi-element array of the characteristic values of the fire area according to the characteristic values of the fire area and the acquisition time of the forest image frame<A,C(xc,yc),T>. The time parameter T can be added to the characteristic value of the fire area to form a multi-element array representing the characteristic value of the fire area<A,C(xc,yc),T>And the time parameter T is the acquisition time of the forest image frame.
Wherein the unmanned aerial vehicle is further configured, when a fire area is identified in the forest image frame component map, to identify the fire areas of the plurality of consecutive forest image frames and to compute the characteristic values of those fire areas and the multivariate arrays <A, C(xc, yc), T> of those characteristic values. When the unmanned aerial vehicle identifies a fire area, it identifies, for the plurality of consecutive forest image frames, the fire area in each frame and calculates its characteristic values, thereby forming the characteristic values of the plurality of fire areas and generating the multivariate arrays <A, C(xc, yc), T>. The unmanned aerial vehicle uploads the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the fire areas of the plurality of consecutive forest image frames to the server.
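To illustrate the bandwidth argument, the following sketch packs the per-frame results into multivariate arrays <A, C(xc, yc), T> and serializes them for upload; the JSON encoding and the field names are illustrative choices, not part of the patent.

```python
import json
import time

def make_fire_tuple(area, centroid, capture_time=None):
    """Build one multivariate array <A, C(xc, yc), T> for a detected fire area."""
    xc, yc = centroid
    t = capture_time if capture_time is not None else time.time()
    return {"A": area, "C": [xc, yc], "T": t}

def build_upload_payload(per_frame_results):
    """per_frame_results: list of (area, centroid, capture_time), one per image frame.

    The serialized payload is a few dozen bytes per frame, versus megabytes for
    a full aerial image, which is the communication saving argued above.
    """
    return json.dumps([make_fire_tuple(a, c, t) for a, c, t in per_frame_results])
```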
As shown in fig. 2, wherein the server includes: the device comprises a verification module and a neural network evaluation module.
The verification module is used for calculating each index of the fire area according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas, and for verifying whether a fire phenomenon exists according to the indexes of the fire area. The verification module verifies whether a fire phenomenon really exists so as to prevent false alarms. The verification module judges that no fire phenomenon exists when any one of the indexes does not fall within its preset interval; it judges that a fire has occurred when all indexes are verified to fall within their preset intervals, and it then sends the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas to the neural network evaluation module.
Wherein the indexes of the fire area include: the area change rate, the center-of-gravity movement rate, and the change rate of the center-of-gravity movement direction. From the above, for an image of size m × n, the area change rate is:
[Equation image: the area change rate, computed from the change in the fire-area value A between consecutive frames and normalized by the image size m × n.]
calculating the displacement offset CDL of the barycentric coordinate of each frame imagei
[Equation image: CDLi, the displacement of the center-of-gravity coordinates between consecutive frames.]
The center-of-gravity coordinate displacement offset CDLi is the center-of-gravity movement rate.
The change rate of the center-of-gravity movement direction may be taken as the relative movement rate of the center of gravity in the horizontal and vertical directions. Let I1(x, y) and I2(x, y) denote two adjacent frames of the forest image, both of size M × N, let (X1, Y1) and (X2, Y2) denote their centers of gravity, and let Vx and Vy denote the relative movement rates of the center of gravity in the horizontal and vertical directions respectively; then:
[Equation images: Vx and Vy, the relative movement rates of the center of gravity in the horizontal and vertical directions, normalized by the image dimensions M and N.]
The relative movement rates of the center of gravity satisfy Vx, Vy ∈ [Vc1, Vc2], where Vc1 and Vc2 are the lower and upper limits of the relative movement rate respectively, and Vc1, Vc2 ∈ (0, 1).
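A sketch of the verification logic follows. Because the index formulas are published only as equation images, the normalizations shown (area change over m × n, Euclidean centroid displacement, displacement divided by the image dimensions) are assumptions, and the preset intervals are illustrative placeholders.

```python
import math

def verify_fire(fire_tuples, img_w, img_h,
                area_rate_range=(0.001, 0.5),
                cdl_range=(0.0, 30.0),
                v_range=(0.01, 0.3)):
    """fire_tuples: time-ordered list of dicts {"A": area, "C": (xc, yc), "T": t}.

    Computes, between consecutive frames, the area change rate, the
    center-of-gravity movement rate CDL_i and the relative movement rates
    Vx, Vy, and confirms a fire only if every index falls inside its preset
    interval.  All interval bounds here are illustrative placeholders.
    """
    for prev, cur in zip(fire_tuples, fire_tuples[1:]):
        area_rate = abs(cur["A"] - prev["A"]) / float(img_w * img_h)
        dx = cur["C"][0] - prev["C"][0]
        dy = cur["C"][1] - prev["C"][1]
        cdl = math.hypot(dx, dy)          # center-of-gravity movement rate
        vx = abs(dx) / img_w              # relative movement rate, horizontal
        vy = abs(dy) / img_h              # relative movement rate, vertical
        in_range = (
            area_rate_range[0] <= area_rate <= area_rate_range[1]
            and cdl_range[0] <= cdl <= cdl_range[1]
            and v_range[0] <= vx <= v_range[1]
            and v_range[0] <= vy <= v_range[1]
        )
        if not in_range:                  # any out-of-range index: no fire confirmed
            return False
    return len(fire_tuples) > 1
```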
The neural network evaluation module is used for evaluating, when the verification module verifies that a fire phenomenon exists, the fire condition parameters of the fire area according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas. The fire condition parameters reflect the severity of the forest fire, so that the current fire situation can be learned in time and firefighting can be carried out in time.
Specifically, the plurality of multivariate arrays <A, C(xc, yc), T> are combined in time order to generate an input feature vector, and the input feature vector is fed into the BP neural network model of the neural network evaluation module.
The BP neural network model is trained in advance. Sample input feature vectors, generated by merging the multivariate arrays <A, C(xc, yc), T> extracted from sample forest image frames, are fed into the BP neural network; after each round of training it is judged whether the fire condition parameters output by the BP neural network model match the fire condition parameters actually corresponding to the sample forest image frames, i.e., whether the deviation between them is less than or equal to a preset allowable deviation. If so, iteration stops; if not, the BP neural network model performs back-propagation. Learning is repeated and the weights between the neurons are continually adjusted until the deviation between the fire condition parameters output for the sample input feature vectors and the fire condition parameters actually corresponding to the sample forest image frames is less than or equal to the preset allowable deviation, at which point the training of the BP neural network model is complete.
After training, when the verification module verifies that a fire phenomenon exists, the plurality of multivariate arrays <A, C(xc, yc), T> are combined in time order into an input feature vector and fed into the BP neural network model of the neural network evaluation module, which outputs the fire condition parameters. A fire condition parameter may be, for example, the grade of the forest fire, so that a corresponding fire-fighting plan can be initiated according to the grade.
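The BP neural network is described only at the level above, so the following sketch is a generic one-hidden-layer back-propagation regressor over the time-ordered feature vector; the layer sizes, learning rate and the encoding of the fire grade as a single output are assumptions, not details taken from the patent.

```python
import numpy as np

class BPFireGradeModel:
    """Minimal one-hidden-layer back-propagation regressor (a sketch, not the patented model)."""

    def __init__(self, n_in, n_hidden=16, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 0.1, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, x):
        self.h = np.tanh(x @ self.w1 + self.b1)   # hidden-layer activations
        return self.h @ self.w2 + self.b2          # predicted fire grade

    def train(self, feats, grades, tol=0.1, max_epochs=5000):
        """feats: (n_samples, n_in) time-ordered merged <A, C(xc, yc), T> vectors."""
        y = np.asarray(grades, dtype=float).reshape(-1, 1)
        for _ in range(max_epochs):
            pred = self.forward(feats)
            err = pred - y
            if np.max(np.abs(err)) <= tol:         # deviation within the allowable deviation
                break
            d2 = err / len(feats)                          # output-layer error term
            d1 = (d2 @ self.w2.T) * (1.0 - self.h ** 2)    # back-propagated hidden error
            self.w2 -= self.lr * self.h.T @ d2
            self.b2 -= self.lr * d2.sum(axis=0)
            self.w1 -= self.lr * feats.T @ d1
            self.b1 -= self.lr * d1.sum(axis=0)

# Usage sketch:
# model = BPFireGradeModel(n_in=sample_vectors.shape[1])
# model.train(sample_vectors, sample_grades)               # offline training on labeled samples
# grade = float(model.forward(new_vector.reshape(1, -1)))  # evaluate a verified fire
```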
The invention also provides a forest fire alarm monitoring method based on edge calculation, which comprises the following steps:
acquiring a plurality of consecutive forest image frames by an unmanned aerial vehicle, and carrying out recognition computation on them so as to obtain multivariate arrays <A, C(xc, yc), T> of the characteristic values of a plurality of fire areas; wherein A is the area value of the fire area, C(xc, yc) is the coordinates of the center of gravity of the fire area, and T is the acquisition time of the forest image frame;
uploading the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas to a server;
by passingThe server is used for generating a multivariate array according to the characteristic values of the plurality of fire areas<A,C(xc,yc),T>Verifying whether the fire phenomenon exists or not, if so, according to the multivariate arrays of the characteristic values of the plurality of fire areas<A,C(xc,yc),T>And analyzing the fire condition parameters of the fire area.
In one possible implementation, the forest image frame is converted into a forest image frame component map composed of hue H, saturation S and brightness I components, and the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas are obtained from the forest image frame component map.
In a possible implementation manner, the saturation S is compared with a saturation S distinguishing threshold, and the brightness I is compared with a brightness I distinguishing threshold, so as to identify whether the fire area exists in the forest image frame component map.
In one possible implementation, when the fire area exists in the forest image frame component map, the characteristic value of the fire area is calculated.
In one possible embodiment, the characteristic values of the fire area include: the area value A of the fire area and the coordinates C(xc, yc) of the center of gravity of the area.
In one possible implementation, the multivariate array <A, C(xc, yc), T> of the characteristic values of the fire area is generated according to the characteristic values of the fire area and the acquisition time of the forest image frame.
In one possible embodiment, when a fire area exists in the forest image frame component map, the fire areas of the consecutive forest image frames are identified, and the characteristic values of those fire areas and the multivariate arrays <A, C(xc, yc), T> of those characteristic values are calculated.
In a possible embodiment, a verification step is performed by the server; in the verification step, each index of the fire area is calculated according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas, and whether a fire phenomenon exists is verified according to the indexes of the fire area.
In a possible embodiment, the indexes of the fire area include: the area change rate, the center-of-gravity movement rate, and the change rate of the center-of-gravity movement direction.
In a possible embodiment, a neural network evaluation step is performed by the server; in the neural network evaluation step, when the verification step verifies that a fire phenomenon exists, the fire condition parameters of the fire area are evaluated according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas.
The forest image frames acquired by the unmanned aerial vehicle are used to identify fire areas, and the characteristic values of the fire areas and the multivariate arrays <A, C(xc, yc), T> of those characteristic values are computed, whereby the server verifies from the arrays <A, C(xc, yc), T> whether a fire phenomenon exists, analyzes the fire condition parameters of the fire area, and raises an alarm using those parameters. This solves the problems that, owing to poor communication signal coverage in forest regions, the transmission of aerial forest images returned by the unmanned aerial vehicle is slow and signal loss and interruption occur frequently; the unmanned aerial vehicle only needs to transmit the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the fire areas back to the server, which, compared with returning complete aerial forest images, saves communication load and reduces the communication requirement.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (4)

1. A forest fire alarm monitoring system based on edge calculation is characterized by comprising:
an unmanned aerial vehicle, used for acquiring a plurality of consecutive forest image frames and carrying out recognition computation on them so as to obtain multivariate arrays <A, C(xc, yc), T> of the characteristic values of a plurality of fire areas; wherein A represents the area value of the fire area, C(xc, yc) represents the coordinates of the center of gravity of the fire area, and T represents the acquisition time of the forest image frame;
a server, used for verifying whether a fire phenomenon exists according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas and, if so, analyzing the fire condition parameters of the fire area according to those multivariate arrays; when the server verifies that a fire phenomenon exists, the server raises an alarm using the fire condition parameters it has obtained;
the unmanned aerial vehicle is used for converting the forest image frame into a forest image frame component diagram consisting of hue H, saturation S and brightness I components; converting the pixel colors of the forest image frame from a red R value, a green G value and a blue B value into hue H, saturation S and brightness I components, thereby converting the forest image frame into a forest image frame component map consisting of the hue H, the saturation S and the brightness I components;
the HSI color space and the RGB color space have a definite conversion relation; when the primary colors are normalized, i.e., R, G, B ∈ [0, 1], the conversion relation is:
[Equation images in the original give the conversion formulas for the hue H, saturation S and brightness I components in terms of R, G and B.]
the unmanned aerial vehicle is further used, when a fire area is identified in the forest image frame component map, for identifying the fire areas of the plurality of consecutive forest image frames and computing the characteristic values of those fire areas and the multivariate arrays <A, C(xc, yc), T> of those characteristic values; when the unmanned aerial vehicle identifies a fire area, it identifies, for the plurality of consecutive forest image frames, the fire area in each frame and calculates its characteristic values, thereby forming the characteristic values of the plurality of fire areas and generating the multivariate arrays <A, C(xc, yc), T>; the unmanned aerial vehicle uploads the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the fire areas of the plurality of consecutive forest image frames to the server;
wherein the server comprises: the device comprises a verification module and a neural network evaluation module;
the verification module is used for calculating each index of the fire area according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas and for verifying whether a fire phenomenon exists according to the indexes of the fire area; the verification module verifies whether a fire phenomenon really exists so as to prevent false alarms; the verification module judges that no fire phenomenon exists when any one of the indexes does not fall within its preset interval; it judges that a fire has occurred when all indexes are verified to fall within their preset intervals, and it then sends the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas to the neural network evaluation module;
wherein the indexes of the fire area include: the area change rate, the center-of-gravity movement rate, and the change rate of the center-of-gravity movement direction; thus, for an image of size m × n, the area change rate is:
[Equation image: the area change rate, computed from the change in the fire-area value A between consecutive frames and normalized by the image size m × n.]
the displacement offset CDLi of the center-of-gravity coordinates is calculated for each frame:
[Equation image: CDLi, the displacement of the center-of-gravity coordinates between consecutive frames.]
the center-of-gravity coordinate displacement offset CDLi is the center-of-gravity movement rate;
the change rate of the center-of-gravity movement direction is the relative movement rate of the center of gravity in the horizontal and vertical directions; let I1(x, y) and I2(x, y) denote two adjacent frames of the forest image, both of size M × N, let (X1, Y1) and (X2, Y2) denote their centers of gravity, and let Vx and Vy denote the relative movement rates of the center of gravity in the horizontal and vertical directions respectively; then:
[Equation images: Vx and Vy, the relative movement rates of the center of gravity in the horizontal and vertical directions, normalized by the image dimensions M and N.]
the relative movement rates of the center of gravity satisfy Vx, Vy ∈ [Vc1, Vc2], where Vc1 and Vc2 are the lower and upper limits of the relative movement rate respectively, and Vc1, Vc2 ∈ (0, 1).
2. The forest fire alarm monitoring system of claim 1, wherein the drone is configured to compare the saturation S with a saturation S distinguishing threshold and the brightness I with a brightness I distinguishing threshold so as to identify whether a fire area exists in the forest image frame component map; the region of the forest image frame component map in which the saturation S meets the saturation S distinguishing threshold and the brightness I meets the brightness I distinguishing threshold is a fire area; otherwise, no fire area exists.
3. A forest fire alarm monitoring system as claimed in claim 2, wherein the characteristic values of the fire area include: the area value A of the fire area and the coordinates C(xc, yc) of the center of gravity of the area; the target area images Gi and Gi+1 are extracted from two adjacent forest image frames, and the two frames are then subtracted to generate an image GD(x, y):
GD(x, y) = Gi - Gi+1
GD(x, y) is smoothed and denoised with a Wiener filter to remove isolated-point noise; the fire area of image Gi+1 is then determined using the following method:
1) with the image GD (x, y) as a start detection target pixel set;
2) taking the new pixel set as a detection starting point;
3) determining a clustering condition as a pixel growth criterion;
(1) adjacent to the seed point;
(2) the brightness value of the pixel point is larger than 0.9;
(3) define R(x, y) as the maximum gradient magnitude over all directions in image Gi+1; inside the fire area this value is small, so setting R(x, y) to be less than 0.1 effectively detects the fire area; wherein R(x, y) is defined as
R(x, y) = max(|Gi+1(x, y+1) - Gi+1(x, y)|, |Gi+1(x, y-1) - Gi+1(x, y)|, |Gi+1(x+1, y) - Gi+1(x, y)|, |Gi+1(x-1, y) - Gi+1(x, y)|, |Gi+1(x+1, y+1) - Gi+1(x, y)|, |Gi+1(x+1, y-1) - Gi+1(x, y)|, |Gi+1(x-1, y+1) - Gi+1(x, y)|, |Gi+1(x-1, y-1) - Gi+1(x, y)|);
4) Searching the whole image, finding out the pixels which accord with the step 3), and adding the pixels into the pixel set; generating a new pixel set, judging whether the pixel set is increased, if so, returning to the step 2), and if not, executing the step 5);
5) generate the image G'2 from the pixels in the pixel set;
for an image of size m × n, the area value A of the fire area is calculated, defined as the number of points equal to 1 in the extracted binary target image:
A = Σx Σy G'2(x, y), summed over all pixels (x, y) of the m × n binary image G'2
considering the change in the position of the center of gravity, the center-of-gravity coordinates of the target image are defined; the center-of-gravity coordinates C(xc, yc) of the binary image G'2 are then:
xc = (1/A) Σx Σy x · G'2(x, y),  yc = (1/A) Σx Σy y · G'2(x, y)
4. The forest fire alarm monitoring system of claim 3, wherein the neural network evaluation module is configured to evaluate, when the verification module verifies that a fire phenomenon exists, the fire condition parameters of the fire area according to the multivariate arrays <A, C(xc, yc), T> of the characteristic values of the plurality of fire areas; specifically, the plurality of multivariate arrays <A, C(xc, yc), T> are combined in time order to generate an input feature vector, which is fed into the BP neural network model of the neural network evaluation module;
the BP neural network model is trained in advance: sample input feature vectors generated by merging the multivariate arrays <A, C(xc, yc), T> extracted from sample forest image frames are fed into the BP neural network; after each round of training it is judged whether the fire condition parameters output by the BP neural network model match the fire condition parameters actually corresponding to the sample forest image frames, i.e., whether the deviation between them is less than or equal to a preset allowable deviation; if so, iteration stops, and if not, the BP neural network model performs back-propagation; learning is repeated and the weights between the neurons are continually adjusted until the deviation between the fire condition parameters output for the sample input feature vectors and the fire condition parameters actually corresponding to the sample forest image frames is less than or equal to the preset allowable deviation, at which point the training of the BP neural network model is complete;
after training, when the verification module verifies that a fire phenomenon exists, the plurality of multivariate arrays <A, C(xc, yc), T> are combined in time order into an input feature vector and fed into the BP neural network model of the neural network evaluation module, which outputs the fire condition parameters; the fire condition parameters include the grade of the forest fire, so that a corresponding fire-extinguishing plan is initiated according to the grade.
CN201910770603.2A 2019-08-20 2019-08-20 Forest fire alarm monitoring system based on edge calculation Active CN110660187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910770603.2A CN110660187B (en) 2019-08-20 2019-08-20 Forest fire alarm monitoring system based on edge calculation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910770603.2A CN110660187B (en) 2019-08-20 2019-08-20 Forest fire alarm monitoring system based on edge calculation

Publications (2)

Publication Number Publication Date
CN110660187A CN110660187A (en) 2020-01-07
CN110660187B (en) 2021-01-01

Family

ID=69037503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910770603.2A Active CN110660187B (en) 2019-08-20 2019-08-20 Forest fire alarm monitoring system based on edge calculation

Country Status (1)

Country Link
CN (1) CN110660187B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111899452A (en) * 2020-08-04 2020-11-06 成都云图睿视科技有限公司 Forest fire prevention early warning system based on edge calculation
CN115526896A (en) * 2021-07-19 2022-12-27 中核利华消防工程有限公司 Fire prevention and control method and device, electronic equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2689809A1 (en) * 2012-07-24 2014-01-29 The Boeing Company Wildfire arrest and prevention system
CN103971114A (en) * 2014-04-23 2014-08-06 天津航天中为数据***科技有限公司 Forest fire detection method based on aerial remote sensing
CN106955437A (en) * 2017-03-24 2017-07-18 西安旋飞电子科技有限公司 A kind of fire-fighting unmanned plane
CN206773866U (en) * 2017-01-23 2017-12-19 无锡觅睿恪科技有限公司 Fire alarm early warning unmanned plane
CN110047241A (en) * 2019-04-27 2019-07-23 刘秀萍 A kind of forest fire unmanned plane cruise monitoring system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2689809A1 (en) * 2012-07-24 2014-01-29 The Boeing Company Wildfire arrest and prevention system
CN103971114A (en) * 2014-04-23 2014-08-06 天津航天中为数据***科技有限公司 Forest fire detection method based on aerial remote sensing
CN206773866U (en) * 2017-01-23 2017-12-19 无锡觅睿恪科技有限公司 Fire alarm early warning unmanned plane
CN106955437A (en) * 2017-03-24 2017-07-18 西安旋飞电子科技有限公司 A kind of fire-fighting unmanned plane
CN110047241A (en) * 2019-04-27 2019-07-23 刘秀萍 A kind of forest fire unmanned plane cruise monitoring system

Also Published As

Publication number Publication date
CN110660187A (en) 2020-01-07

Similar Documents

Publication Publication Date Title
EP3869459B1 (en) Target object identification method and apparatus, storage medium and electronic apparatus
CN109522793B (en) Method for detecting and identifying abnormal behaviors of multiple persons based on machine vision
CN115691026B (en) Intelligent early warning monitoring management method for forest fire prevention
EP3648448B1 (en) Target feature extraction method and device, and application system
CN106600888B (en) Automatic forest fire detection method and system
Liao et al. A localized approach to abandoned luggage detection with foreground-mask sampling
CN107944359A (en) Flame detecting method based on video
CN112016457A (en) Driver distraction and dangerous driving behavior recognition method, device and storage medium
CN110837784A (en) Examination room peeping cheating detection system based on human head characteristics
EP3036714B1 (en) Unstructured road boundary detection
CN107798279B (en) Face living body detection method and device
CN101715111B (en) Method for automatically searching abandoned object in video monitoring
CN103761529A (en) Open fire detection method and system based on multicolor models and rectangular features
CN108596087B (en) Driving fatigue degree detection regression model based on double-network result
CN106339657B (en) Crop straw burning monitoring method based on monitor video, device
CN106096604A (en) Multi-spectrum fusion detection method based on unmanned platform
CN110660187B (en) Forest fire alarm monitoring system based on edge calculation
CN111062303A (en) Image processing method, system and computer storage medium
CN112101260B (en) Method, device, equipment and storage medium for identifying safety belt of operator
US20220366570A1 (en) Object tracking device and object tracking method
JP2019106193A (en) Information processing device, information processing program and information processing method
KR102514301B1 (en) Device for identifying the situaton of object&#39;s conduct using sensor fusion
CN113312965A (en) Method and system for detecting unknown face spoofing attack living body
CN107491714B (en) Intelligent robot and target object identification method and device thereof
KR101343623B1 (en) adaptive color detection method, face detection method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant