CN111986436A - Comprehensive flame detection method based on ultraviolet and deep neural networks - Google Patents


Publication number
CN111986436A
CN111986436A (application CN202010908392.7A)
Authority
CN
China
Prior art keywords
flame
ultraviolet
neural network
image
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010908392.7A
Other languages
Chinese (zh)
Other versions
CN111986436B (en)
Inventor
王思维
范峥荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Zhima Technology Co ltd
Original Assignee
Chengdu Zhima Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Zhima Technology Co ltd filed Critical Chengdu Zhima Technology Co ltd
Priority to CN202010908392.7A
Publication of CN111986436A
Application granted
Publication of CN111986436B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00: Fire alarms; Alarms responsive to explosion
    • G08B17/12: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Fire-Detection Mechanisms (AREA)
  • Fire Alarms (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

The invention provides a comprehensive flame detection system based on ultraviolet sensing and a deep neural network, comprising an ultraviolet detector, a camera, a controller and an alarm device. The data output end of the ultraviolet detector is connected with the first data input end of the controller, the trigger end of the controller is connected with the trigger end of the camera, the data output end of the camera is connected with the second data input end of the controller, and the alarm output end of the controller is connected with the alarm input end of the alarm device. The controller judges whether the data collected by the ultraviolet detector meets a trigger condition; if so, it triggers the camera to collect image data, judges the condition of the surrounding environment from that data, and raises an alarm through the alarm device. The invention can also judge the spreading of a fire and alarm accordingly.

Description

Comprehensive flame detection method based on ultraviolet and deep neural networks
Technical Field
The invention relates to the field of deep-learning target detection, in particular to a comprehensive flame detection method based on ultraviolet sensing and a deep neural network.
Background
Fire has played an important role in the development of society, but it is a double-edged sword. To reduce the impact of fire on life, property and the economy, fire detection technology is essential for confining a fire while it is still small. With the development of electronic technology, many fire detection devices have emerged; by detection principle they can be classified into smoke-sensitive, temperature-sensitive, photosensitive and image-based fire detectors.
1. A temperature-sensing fire detector is a sensor that responds to changes in ambient temperature and converts the temperature signal into an electrical signal. A smoke-sensing fire detector responds to smoke particles in the ambient air, converting changes in smoke concentration at the detection site into an electrical signal.
2. In general, temperature-sensing detectors are more reliable than smoke detectors and place fewer demands on the environment, but they respond slowly to an incipient fire and are therefore unsuitable for places where smoke or dust may be generated. Smoke-sensing detectors can catch a dangerous situation at the earliest stage of a fire, but they are only suitable for places where a fire produces heavy smoke or where smoldering is likely, and are severely limited in environments with fast ventilation or routinely heavy smoke. Both traditional types also suffer from high false-alarm rates and long response times.
3. Photosensitive flame detectors are generally divided into ultraviolet and infrared detectors; traditional configurations include single-ultraviolet, single-infrared, dual-infrared and triple-infrared detectors. Ultraviolet detectors respond quickly and are insensitive to people and hot objects, but suffer from background noise and are easily disturbed by lightning, electric arcs and the like; infrared detectors are easily disturbed by hot objects, people, sunlight and the like. Single-ultraviolet and single-infrared detection is therefore prone to false alarms.
4. An image-based fire detector analyzes the environmental images captured by a camera to judge whether a fire exists in the current environment. Several image-processing algorithms for fire detection already exist, but most are tied to fixed scenes, making it difficult to raise the detection rate and lower the false-detection rate at the same time.
Disclosure of Invention
The invention aims to at least solve the technical problems in the prior art, and particularly creatively provides a comprehensive flame detection method based on an ultraviolet and deep neural network.
In order to achieve the above object, the present invention provides a comprehensive flame detection system based on an ultraviolet and deep neural network, comprising an ultraviolet detector, a camera, a controller and an alarm device;
the data output end of the ultraviolet detector is connected with the first data input end of the controller, the trigger end of the controller is connected with the trigger end of the camera, the data output end of the camera is connected with the second data input end of the controller, and the alarm output end of the controller is connected with the alarm input end of the alarm device;
the controller judges whether the data collected by the ultraviolet detector meets the triggering condition or not according to the data collected by the ultraviolet detector, if so, the controller triggers the camera to collect the data, and the controller judges the environmental condition of the controller according to the data collected by the camera and gives an alarm through the alarm device.
The invention also discloses a comprehensive flame detection method based on the ultraviolet deep neural network, which comprises the following steps:
s1, acquiring image data of a plurality of imaging modes under the environment of flame detection, wherein the image data comprise a sample image with flame and a sample image without fire;
s2, processing the flame image data set based on Yolo _ v4, training a model, and constructing a deep neural network model for identifying flames;
s3, detecting the ultraviolet spectrum in the solar blind zone wave band in the flame radiation under the monitored environment in real time through an ultraviolet sensor, and triggering a camera to capture the environment condition when the flame occurs in the environment by a controller to obtain the current environment image data;
s4, recognizing the flame condition in the image data through the deep neural network model, further confirming the fire possibly detected by the ultraviolet detector, outputting the flame area and the position of the fire source, and calculating the spreading speed of the flame by comparing the flame areas of two consecutive frames;
in a preferred embodiment of the present invention, the S2 includes:
s2-1, preprocessing the acquired flame image data, classifying images with or without flames, marking the positions of fire sources in the images with the flames to obtain flame images with fire source marks, and dividing an image data set into a training set and a testing set;
s2-2, presetting the hyper-parameters of Yolo_v4, such as the learning rate and the number of iterations, and inputting the training set into the neural network for classification training;
s2-3, calculating the Loss, Precision, Recall, F1, AP and mAP values after iterative training according to the formulas, setting thresholds for Loss and mAP, and keeping the weight file of the neural network when the final training result meets the thresholds;
s2-4, if the training result does not meet the threshold condition, retraining again by means of increasing the data set, resetting the hyper-parameter and the like until the training result meets the threshold;
and S2-5, inputting the test set into the neural network model trained in the S2-3, calculating the detection precision of the neural network model on the test set, if the detection precision meets a set value, indicating that the training is successful, and if the detection precision does not meet the set value, changing parameters to carry out iterative training again.
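The retrain-until-thresholds loop of steps S2-3 and S2-4 can be sketched as follows. This is a minimal illustration, not the patent's implementation: `train_one_round` is a hypothetical stand-in for a full Yolo_v4 training run, and the threshold values are invented for the example.

```python
def train_until_thresholds(train_one_round, loss_max=1.0, map_min=0.9, max_rounds=5):
    """Repeat training (S2-4: e.g. with more data or new hyper-parameters)
    until both the Loss and mAP thresholds of S2-3 are met, then return
    the weight file to keep."""
    for round_idx in range(max_rounds):
        loss, m_ap, weights = train_one_round(round_idx)
        if loss <= loss_max and m_ap >= map_min:
            return weights  # S2-3: thresholds met, keep these weights
    return None  # thresholds never met within max_rounds

# Toy stand-in whose metrics improve each round.
def fake_round(i):
    return 2.0 - 0.6 * i, 0.75 + 0.1 * i, "weights_round_%d" % i

print(train_until_thresholds(fake_round))  # weights_round_2
```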
In a preferred embodiment of the present invention, the S3 includes:
s3-1, detecting the ultraviolet spectrum in the solar-blind band of flame radiation through an ultraviolet detector, sampling the environment once every 30 ms, and setting a counter to record the number of consecutive abnormal detections;
s3-2, when detecting that the ultraviolet signal in the environment does not reach the set threshold value, the environment is normal, the ultraviolet detector returns first data, and the counter is emptied;
s3-3, when detecting that the ultraviolet signal in the environment reaches a set threshold, starting to count, when the counting times reach a set value, returning second data by the ultraviolet detector, and starting the camera to take a snapshot;
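The S3 trigger logic above amounts to a debounced counter: below-threshold readings clear the counter (S3-2), at-or-above-threshold readings increment it, and reaching the set count triggers the camera (S3-3). A minimal sketch, with illustrative threshold and count values that are not specified by the patent:

```python
FIRST_DATA, SECOND_DATA = 0, 1  # return codes: environment normal / trigger camera

def uv_trigger(samples, uv_threshold=100, count_needed=10):
    """Return SECOND_DATA once `count_needed` consecutive samples reach
    `uv_threshold`, otherwise FIRST_DATA."""
    counter = 0
    for s in samples:
        if s < uv_threshold:
            counter = 0  # S3-2: normal reading, empty the counter
        else:
            counter += 1  # abnormal reading, keep counting
            if counter >= count_needed:
                return SECOND_DATA  # S3-3: start the camera snapshot
    return FIRST_DATA

print(uv_trigger([150] * 10))                        # 1 (trigger)
print(uv_trigger([150] * 9 + [50] + [150] * 9))      # 0 (run of 10 never reached)
```

Because one normal 30 ms sample resets the counter, brief disturbances such as an arc flash do not trigger the camera.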
in a preferred embodiment of the present invention, the S4 includes:
s4-1, when the ultraviolet detector triggers the camera to capture flame image data of the current environment, sending the image into a feature extraction network, and outputting feature maps of three scales;
s4-2, inputting the characteristic diagram into the target detection layer, and detecting the position of the fire source on the characteristic diagrams of three scales;
s4-3, if a fire is detected in the image, the network outputs a flame prediction bounding box, including the position of its upper-left corner in the image and the length and width of the box; when flame information is detected in ten consecutive images, a fire is confirmed and alarm information is output to the alarm device.
S4-4, calculating the flame area size in the image through the flame prediction bounding box information output by the neural network, comparing the flame area size in the previous frame of image input to the neural network, and calculating the flame spreading speed.
$$\mathrm{Area}_k = \sum_{i=1}^{N} L_i \times W_i$$
$$\mathrm{Rate} = \frac{\mathrm{Area}_k - \mathrm{Area}_{k-1}}{\mathrm{time}}$$
Wherein Area refers to the total flame area in the image, N refers to the number of fire sources in the image, L and W refer to the length and width of the flame prediction bounding box of each fire source, Rate refers to the spreading speed of the flame, time refers to the interval between two frame inputs, subscript k refers to the current frame input to the neural network, and k-1 refers to the previous frame. If the spreading speed Rate is greater than or equal to a preset first spreading-speed threshold, the alarm device sends out third-level alarm information; if Rate is greater than or equal to a preset second spreading-speed threshold (greater than the first), the alarm device sends out second-level alarm information; if Rate is greater than or equal to a preset third spreading-speed threshold (greater than the second), the alarm device sends out first-level alarm information. The first-level alarm is a sound-and-light alarm; the second-level alarm is a light alarm; the third-level alarm is a sound alarm.
Area_k represents the total flame area in the current frame image;
L_i represents the length of the flame prediction bounding box of the i-th fire source in the image;
W_i represents the width of the flame prediction bounding box of the i-th fire source in the image;
Area_{k-1} represents the total flame area in the previous frame image;
time represents the time difference between the acquisition of the two frames.
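The area and spreading-speed formulas and the three-tier alarm can be sketched as below. Only the two formulas come from the patent; the box representation, the threshold values and the numeric alarm codes are illustrative assumptions.

```python
def flame_area(boxes):
    """Area_k = sum over the N fire sources of L_i * W_i."""
    return sum(l * w for (l, w) in boxes)

def spread_rate(area_k, area_prev, dt):
    """Rate = (Area_k - Area_{k-1}) / time."""
    return (area_k - area_prev) / dt

def alarm_level(rate, t1=10.0, t2=20.0, t3=30.0):
    """Three-tier alarm: 3 = sound, 2 = light, 1 = sound-and-light.
    Thresholds t1 < t2 < t3 are illustrative, not from the patent."""
    if rate >= t3:
        return 1
    if rate >= t2:
        return 2
    if rate >= t1:
        return 3
    return 0  # below every threshold: no spreading alarm

area_k = flame_area([(4, 5), (2, 3)])   # two fire sources: 20 + 6 = 26
rate = spread_rate(area_k, 6, 0.5)      # (26 - 6) / 0.5 = 40.0
print(area_k, rate, alarm_level(rate))  # 26 40.0 1
```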
in conclusion, due to the adoption of the technical scheme, the ultraviolet detector has the advantages of quick response, insensitivity to human and high-temperature objects and the like; the Yolo _ v4 neural network algorithm has good detection precision and detection speed. The invention relates to a fire detection system designed by combining an ultraviolet detector and a Yolo _ v4 neural network algorithm, which is characterized in that the ultraviolet detector is used for detecting an ultraviolet spectrum with the wavelength of 180-240 nm in a solar blind area wave band in the environment, a Yolo _ v4 neural network model is used for further confirming the detected possible fire, the area of the fire is obtained, and the spreading tendency of the fire is calculated by comparing the areas of flame of front and rear frames. The method can avoid the situations of false detection, missed detection and the like of the traditional temperature, smoke and light sensation sensor in some complex scenes. And aiming at the condition that the traditional fire detector can not detect the fire spreading speed at present, the invention adds the function of measuring the fire spreading trend on the basis of fire detection, and can also monitor whether the flame condition is out of control or not in some scenes with flames, so that firefighters can master more detailed fire information.
The invention detects the fire by combining the ultraviolet fire detector with the Yolo _ v4 neural network algorithm, and compared with the traditional smoke, temperature and light sensing fire detectors, the invention has stronger robustness, reduces the probability of false detection and also improves the detection speed. Compared with the traditional fire detector, the fire detector can measure the spreading speed of the fire in real time during detection, and can provide more fire information for the outside.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of the system of the present invention.
FIG. 2 is a flow chart of the method of the present invention.
FIG. 3 is a flow chart of neural network detection in accordance with the present invention.
Fig. 4 is an image of an environment in which the present invention is used for testing.
FIG. 5 is a diagram of the neural network of the present invention identifying flame effects.
FIG. 6 is a graph of the flame location, area and propagation velocity of the neural network of the present invention identifying the flame output.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
As shown in fig. 1, the ultraviolet detector monitors flame ultraviolet spectrum signals in the environment in real time; when flame is indicated, it triggers the camera to capture an image of the environment, the captured image information is input into the Yolo_v4 neural network for detection, and a warning is given when a fire is detected.
As shown in fig. 2, the system monitors the ultraviolet spectrum in the environment in real time through the ultraviolet detector. When an ultraviolet spectrum characteristic of flame is detected, the ultraviolet detector triggers the camera to capture the environment, and the captured image information is passed to the Yolo_v4 neural network, which detects whether flame is present in the image. If flame is detected, the network outputs a flame prediction box, from which the position and area of the flame in the image are obtained; a fire is then confirmed by checking whether flame information is detected in ten consecutive frames. If a fire does exist, the spreading speed of the flame is calculated from the flame areas detected in the two most recent frames and the time interval between their inputs to the network, and finally the position, area and spreading speed of the flame are sent to the alarm device.
Yolo _ v4 neural network model training
1. Production of data sets
a) Arrange a camera in the environment whose fire condition needs monitoring, and manually simulate various flame conditions in that environment.
b) Capture large numbers of images of the environment with the camera, collecting environmental image data with no flame, with flame, and with multiple fire sources, ensuring the different classes of images are roughly equal in number.
c) Divide the images into flame and no-flame classes, mark the position and area of the flame in each flame image with a labeling tool, and generate the text files required for training, including image names and detection categories.
d) And dividing the marked image data into a training set and a testing set for the following neural network model training and performance testing.
2. Training and testing network models
a) Preset the hyper-parameters of Yolo_v4 training, such as the learning rate and the number of iterations, and input the training set into the neural network for classification training;
b) Calculate the Loss, Precision, Recall, F1, AP and mAP values after iterative training according to the formulas, set thresholds for Loss and mAP, and keep the weight file of the neural network when the final training result meets the thresholds;
$$\mathrm{Precision} = \frac{T_1}{T_1 + F_1}$$
$$\mathrm{Recall} = \frac{T_1}{T_1 + F_2}$$
$$F1 = \frac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$$
$$\mathrm{AP} = \int_0^1 \mathrm{Precision}\; d\,\mathrm{Recall}$$
wherein T_1 refers to the number of images in which flame is correctly identified; F_1 refers to the number of images in which flame is wrongly identified; F_2 refers to the number of flame images that are not identified; mAP denotes the AP calculated for each category and then averaged over the categories.
c) If the training result does not meet the threshold condition, retraining again by means of increasing the data set, resetting the hyper-parameter and the like until the training result meets the threshold;
d) Input the test set into the trained neural network model and calculate its detection precision on the test set; if the precision meets the set value, the training is successful, and if not, change the parameters and carry out iterative training again.
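Using the counts defined in b), the image-level metrics can be computed as follows; the function names are mine, but the formulas are the standard precision, recall and F1 that the text describes.

```python
def precision(t1, f1):
    """Correctly identified flame images over all images flagged as flame."""
    return t1 / (t1 + f1)

def recall(t1, f2):
    """Correctly identified flame images over all actual flame images."""
    return t1 / (t1 + f2)

def f1_score(t1, f1, f2):
    """Harmonic mean of precision and recall."""
    p, r = precision(t1, f1), recall(t1, f2)
    return 2 * p * r / (p + r)

# Example: 80 correct detections, 20 false alarms, 20 missed flames.
print(precision(80, 20), recall(80, 20), f1_score(80, 20, 20))
```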
the main process of the system is as follows:
1. environmental monitoring
Flame contains both visible radiation and ultraviolet radiation, invisible to the human eye, with wavelengths distributed over 105-380 nm; a burning substance emits strong ultraviolet radiation. An ultraviolet phototube is therefore used to detect the ultraviolet spectrum signal in the environment. When a sufficient ultraviolet signal appears, the phototube generates a current, which a capacitor charge-discharge and shaping circuit converts into pulse signals sent to the controller. The controller counts the pulse signal, reading it every 30 ms; if the count exceeds ten consecutive times, flame is present in the environment, and the camera is triggered to capture image information of the environment and pass it to the Yolo_v4 neural network.
2. Identifying flames in an environment
As shown in fig. 3, after a frame of environmental image is input into the neural network, the image first enters the feature extraction layer of the Yolo_v4 network, which outputs feature maps at three scales for target detection at different scales; the three feature maps are then input into the target detection layer to detect whether flame exists in the image.
3. Outputting the position of the fire source, calculating the area and the spreading speed of the flame
As shown in figs. 4-6, when flame is detected in the environment, the Yolo_v4 neural network outputs a flame prediction box, which includes the position of the upper-left corner of the flame, the length and width of the prediction box, and the detection confidence of the flame target.
The area of the flame prediction box is used as the predicted flame area; it is then compared with the flame area in the previous frame input to the neural network, and the flame spreading speed is calculated.
$$\mathrm{Area}_k = \sum_{i=1}^{N} L_i \times W_i$$
$$\mathrm{Rate} = \frac{\mathrm{Area}_k - \mathrm{Area}_{k-1}}{\mathrm{time}}$$
Wherein Area refers to the total flame area in the image, N refers to the number of fire sources in the image, L and W refer to the length and width of the flame prediction bounding box of each fire source, Rate refers to the spreading speed of the flame, time refers to the interval between two frame inputs, subscript k refers to the current frame input to the neural network, and k-1 refers to the previous frame.
If the spreading speed of the flame is greater than or equal to the preset spreading-speed threshold, the controller sends an alarm signal to the alarm device to warn the staff.
4. Accuracy of judging fire
If flames are detected in ten consecutive frames input into the neural network, a fire is judged to have occurred, and the device sends the flame information in the current environment, including the flame position, area and spreading speed, to the alarm device.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (5)

1. A comprehensive flame detection system based on an ultraviolet and deep neural network is characterized by comprising an ultraviolet detector, a camera, a controller and an alarm device;
the data output end of the ultraviolet detector is connected with the first data input end of the controller, the trigger end of the controller is connected with the trigger end of the camera, the data output end of the camera is connected with the second data input end of the controller, and the alarm output end of the controller is connected with the alarm input end of the alarm device;
the controller judges whether the data collected by the ultraviolet detector meets the triggering condition or not according to the data collected by the ultraviolet detector, if so, the controller triggers the camera to collect the data, and the controller judges the environmental condition of the controller according to the data collected by the camera and gives an alarm through the alarm device.
2. A comprehensive flame detection method based on an ultraviolet deep neural network is characterized by comprising the following steps:
s1, acquiring image data of a plurality of imaging modes under the environment of flame detection, wherein the image data comprise a sample image with flame and a sample image without fire;
s2, processing the flame image data set based on Yolo _ v4, training a model, and constructing a deep neural network model for identifying flames;
s3, detecting the ultraviolet spectrum in the solar blind zone wave band in the flame radiation under the monitored environment in real time through an ultraviolet detector, and triggering a camera to capture the environment condition when the flame occurs in the environment by a controller to obtain the current environment image data;
s4, recognizing the flame condition in the image data through the deep neural network model, further confirming the fire possibly detected by the ultraviolet detector, outputting the area of the flame and the position of the fire source, calculating the spreading speed of the flame through the comparison of the areas of the flame of the front frame and the flame of the rear frame, and giving an alarm.
3. The integrated flame detection method based on the ultraviolet deep neural network of claim 2, wherein the S2 includes:
s2-1, preprocessing the acquired flame image data, classifying images with or without flames, marking the positions of fire sources in the images with the flames to obtain flame images with fire source marks, and dividing an image data set into a training set and a testing set;
s2-2, presetting hyper-parameters of Yolo _ v4, and inputting the training set into a neural network for classification training;
s2-3, after iterative training, when the final result of the training meets a threshold value, retaining the weight file of the neural network;
s2-4, if the training result does not meet the threshold condition, retraining by increasing the data set and/or resetting the hyper-parameters until the training result meets the threshold;
and S2-5, inputting the test set into the neural network model trained in S2-3, and calculating the detection precision of the model on the test set; if the precision meets a set value, the training is successful, and if not, changing the parameters and carrying out iterative training again.
4. The integrated flame detection method based on the ultraviolet deep neural network of claim 2, wherein the S3 includes:
s3-1, detecting an ultraviolet spectrum in a solar blind area wave band in flame radiation under an environment through an ultraviolet detector, measuring the ultraviolet spectrum in the environment once every 30ms through an ultraviolet sensor, and setting a counter to record the environment abnormity detection times;
s3-2, when detecting that the ultraviolet signal in the environment does not reach the set threshold value, the environment is normal, the ultraviolet detector returns first data, and the counter is emptied;
and S3-3, when the ultraviolet signal in the environment is detected to reach a set threshold value, counting is started, and when the counting times reach a set value, the ultraviolet detector returns second data and starts the camera to take a snapshot.
5. The integrated flame detection method based on the ultraviolet deep neural network of claim 2, wherein the S4 includes:
s4-1, when the ultraviolet detector triggers the camera to capture flame image data of the current environment, sending the image into a feature extraction network, and outputting feature maps of three scales;
s4-2, inputting the characteristic diagram into the target detection layer, and detecting the position of the fire source on the characteristic diagrams of three scales;
s4-3, if a fire is detected in the image, the network outputs a flame prediction bounding box, including the position of its upper-left corner in the image and/or the length and width of the box; when flame information is detected in ten consecutive images, a fire is confirmed and alarm information is output to the alarm device;
s4-4, calculating the flame area size in the image through flame prediction bounding box information output by the neural network, comparing the flame area size in the previous frame of image input to the neural network, and calculating the flame spreading speed;
Area = Σ_{i=1}^{N} L_i × W_i

Rate = (Area_k − Area_{k−1}) / Area_{k−1}
wherein Area is the total flame area in the image, N is the number of fire sources in the image, L and W are the length and width of the flame prediction bounding box of each fire source, Rate is the flame spreading rate, the subscript k denotes the current frame input to the neural network, and k−1 denotes the previous frame input to the neural network.
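The S4-4 computation can be sketched directly from the variable definitions above; normalising the spread rate by the previous frame's area is an assumption, since the formula figures in the published text are not legible:

```python
# Sketch of S4-4: total flame area as the sum of predicted-box areas over
# the N fire sources, and spread rate as the relative change in area
# between the previous frame (k-1) and the current frame (k).
def flame_area(boxes):
    """boxes: list of (L, W) pairs, one per detected fire source."""
    return sum(length * width for length, width in boxes)

def spread_rate(area_k, area_k_prev):
    """Relative growth of the flame area between consecutive frames."""
    return (area_k - area_k_prev) / area_k_prev
```

For instance, two boxes of 10×5 and 4×2 give a total area of 58; if the previous frame's area was 50, the flame area has grown by 16% in one frame interval.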
CN202010908392.7A 2020-09-02 2020-09-02 Comprehensive flame detection method based on ultraviolet and deep neural networks Active CN111986436B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010908392.7A CN111986436B (en) 2020-09-02 2020-09-02 Comprehensive flame detection method based on ultraviolet and deep neural networks

Publications (2)

Publication Number Publication Date
CN111986436A true CN111986436A (en) 2020-11-24
CN111986436B CN111986436B (en) 2022-12-13

Family

ID=73448170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010908392.7A Active CN111986436B (en) 2020-09-02 2020-09-02 Comprehensive flame detection method based on ultraviolet and deep neural networks

Country Status (1)

Country Link
CN (1) CN111986436B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465119A (en) * 2020-12-08 2021-03-09 武汉理工光科股份有限公司 Fire-fighting dangerous case early warning method and device based on deep learning
CN112907885A (en) * 2021-01-12 2021-06-04 中国计量大学 Distributed centralized household image fire alarm system and method based on SCNN
CN113139272A (en) * 2020-12-11 2021-07-20 西安天和防务技术股份有限公司 Forest fire spreading prediction method, device, equipment and storage medium
CN114998843A (en) * 2022-08-04 2022-09-02 深圳市海清视讯科技有限公司 Fire detection method and related device
CN116109599A (en) * 2023-02-17 2023-05-12 湖北清江水电开发有限责任公司 Carbon brush sparking monitoring system of generator
CN116109599B (en) * 2023-02-17 2024-06-11 湖北清江水电开发有限责任公司 Carbon brush sparking monitoring system of generator

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101236684A (en) * 2008-03-06 2008-08-06 中国人民解放军第二炮兵装备研究院第三研究所 Fire disaster detector and detection positioning method
CN101315667A (en) * 2008-07-04 2008-12-03 南京航空航天大学 Multi-characteristic synthetic recognition method for outdoor early fire disaster
US20090060260A1 (en) * 2007-08-29 2009-03-05 Billy Hou Intelligent image smoke/flame sensor and detection system
CN101840571A (en) * 2010-03-30 2010-09-22 杭州电子科技大学 Flame detection method based on video image
CN102819926A (en) * 2012-08-24 2012-12-12 华南农业大学 Fire monitoring and warning method on basis of unmanned aerial vehicle
CN103337123A (en) * 2013-06-17 2013-10-02 西安石油大学 System and method for oil field united station fire disaster pre-warning system based on video images
CN104021644A (en) * 2014-06-13 2014-09-03 淄博安益矿用设备有限公司 False triggering prevention device and method for ultraviolet flame sensor
CN104199117A (en) * 2014-08-20 2014-12-10 湖北捷讯光电有限公司 Infrared ultraviolet composite flame detector and detection method thereof
US9459142B1 (en) * 2015-09-10 2016-10-04 General Monitors, Inc. Flame detectors and testing methods
CN106251568A (en) * 2016-10-26 2016-12-21 山东大学 A kind of fire alarm safety-protection system based on ultraviolet and image detecting technique and method
CN107301653A (en) * 2017-05-18 2017-10-27 常州市武进区半导体照明应用技术研究院 Video image fire disaster flame detection method based on BP neural network
CN107941331A (en) * 2017-12-27 2018-04-20 四川大学 The energy-probe design method of ArF excimer laser Light Energies uniformity detection
JP2019028957A (en) * 2017-08-04 2019-02-21 モリタ宮田工業株式会社 Fire disaster identification apparatus
CN110263622A (en) * 2019-05-07 2019-09-20 平安科技(深圳)有限公司 Train fire monitoring method, apparatus, terminal and storage medium
CN110516609A (en) * 2019-08-28 2019-11-29 南京邮电大学 A kind of fire video detection and method for early warning based on image multiple features fusion
CN110660052A (en) * 2019-09-23 2020-01-07 武汉科技大学 Hot-rolled strip steel surface defect detection method based on deep learning
CN110852347A (en) * 2019-10-17 2020-02-28 天津大学 Fire detection method using improved YOLO v3
CN111062350A (en) * 2019-12-23 2020-04-24 合肥众兴智讯科技有限公司 Artificial intelligence based firework identification algorithm
CN111091072A (en) * 2019-11-29 2020-05-01 河海大学 YOLOv 3-based flame and dense smoke detection method
CN111105587A (en) * 2019-12-31 2020-05-05 广州思瑞智能科技有限公司 Intelligent flame detection method and device, detector and storage medium
CN111179279A (en) * 2019-12-20 2020-05-19 成都指码科技有限公司 Comprehensive flame detection method based on ultraviolet and binocular vision
CN111275918A (en) * 2020-03-05 2020-06-12 深圳市君利信达科技有限公司 Flame detection analysis early warning system
CN111401148A (en) * 2020-02-27 2020-07-10 江苏大学 Road multi-target detection method based on improved multilevel YO L Ov3
CN111415345A (en) * 2020-03-20 2020-07-14 山东文多网络科技有限公司 Transformer substation ultraviolet image intelligent inspection algorithm and device based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG, Wei et al.: "Improved YOLO v3 fire detection algorithm embedding DenseNet structure and dilated convolution modules", Journal of Tianjin University (Science and Technology), vol. 53, no. 09, 30 September 2020 (2020-09-30), pages 976-983 *

Also Published As

Publication number Publication date
CN111986436B (en) 2022-12-13

Similar Documents

Publication Publication Date Title
CN111986436B (en) Comprehensive flame detection method based on ultraviolet and deep neural networks
CN106448023B (en) Fire smoke alarm with storage function
US8294567B1 (en) Method and system for fire detection
CN104318688B (en) A kind of multisensor fire alarm method based on data fusion
CN106251568A (en) A kind of fire alarm safety-protection system based on ultraviolet and image detecting technique and method
CN112002095A (en) Fire early warning method in mine tunnel
CN113299035A (en) Fire identification method and system based on artificial intelligence and binocular vision
CN113516820B (en) Fire early warning method and fire early warning system
EP3361459A1 (en) Method, apparatus and system for passive infrared sensor framework
CN110675591A (en) Anti-interference photoelectric smoke detection method and detection module
Baek et al. Intelligent multi-sensor detection system for monitoring indoor building fires
CN114120171A (en) Fire smoke detection method, device and equipment based on video frame and storage medium
CN209433517U (en) It is a kind of based on more flame images and the fire identification warning device for combining criterion
JP4111660B2 (en) Fire detection equipment
Wang et al. Forest fire detection system based on Fuzzy Kalman filter
CN114255562A (en) Wisdom fire control early warning system based on thing networking
CN113362560B (en) Photoelectric smoke sensing detection method for accurately identifying fire smoke
Ho et al. Nighttime fire smoke detection system based on machine vision
JPH07200961A (en) Fire alarm system for early detection of fire
CN113516091B (en) Method for identifying electric spark image of transformer substation
CN111882800B (en) Fire-fighting early warning method and system based on multi-dimensional data linkage
EP3029600A1 (en) Thermal image object detection
CN117636565B (en) Multispectral flame detection system based on spectral feature data fusion
KR20230075647A (en) Ensemble supervised learning-based fire situation detection method
KR20060020507A (en) An exhaust smoke recognition and alarm device and method using picture image analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 109, 1st Floor, Building 2, No. 11, Tianying Road, High tech Zone, Chengdu, Sichuan 611700

Applicant after: Chengdu Shidao Information Technology Co.,Ltd.

Address before: 611731 floor 2, No. 4, Xinhang Road, West Park, high tech Zone (West Zone), Chengdu, Sichuan

Applicant before: CHENGDU ZHIMA TECHNOLOGY CO.,LTD.

GR01 Patent grant