CN114913323A - Method for detecting open fire at night in charging pile area - Google Patents

Method for detecting open fire at night in charging pile area

Info

Publication number
CN114913323A
CN114913323A (application CN202210828781.8A)
Authority
CN
China
Prior art keywords
area
flame
open fire
suspected
flame area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210828781.8A
Other languages
Chinese (zh)
Other versions
CN114913323B (en)
Inventor
梁帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Prophet Big Data Co ltd
Original Assignee
Dongguan Prophet Big Data Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Prophet Big Data Co ltd filed Critical Dongguan Prophet Big Data Co ltd
Priority to CN202210828781.8A priority Critical patent/CN114913323B/en
Publication of CN114913323A publication Critical patent/CN114913323A/en
Application granted granted Critical
Publication of CN114913323B publication Critical patent/CN114913323B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L53/00Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
    • B60L53/30Constructional details of charging stations
    • B60L53/31Charging columns specially adapted for electric vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L53/00Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
    • B60L53/60Monitoring or controlling charging stations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00Fire alarms; Alarms responsive to explosion
    • G08B17/12Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/70Energy storage systems for electromobility, e.g. batteries
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/7072Electromobility specific charging systems or methods for batteries, ultracapacitors, supercapacitors or double-layer capacitors
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
    • Y02T90/10Technologies relating to charging of electric vehicles
    • Y02T90/12Electric charging stations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Artificial Intelligence (AREA)
  • Power Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Evolutionary Biology (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

The invention relates to the technical field of charging pile monitoring and provides a method for detecting open fire at night in a charging pile area, which comprises the following steps: step one, marking light-emitting areas such as indicator lights in the scene and dividing the scene into a no-light area and a light area; step two, collecting nighttime flame images, marking the flame areas and establishing a suspected flame area detection model; step three, detecting suspected flame areas appearing in the scene by applying the nighttime suspected flame area detection model to the no-light area of the scene. According to the method for detecting open fire at night in the charging pile area disclosed by the invention, open fire pictures are captured through a monitoring camera and image recognition, and interference such as site lighting and vehicle headlights is carefully excluded, so that the open fire detection precision can be effectively improved; meanwhile, combined with an automatic early warning mode, workers can be reminded to check and handle the warning picture, effectively reducing misjudgments caused by vehicle lights in the area at night.

Description

Charging pile area night open fire detection method
Technical Field
The invention belongs to the technical field of charging pile monitoring, and particularly relates to a method for detecting open fire at night in a charging pile area.
Background
Driven by policies such as energy conservation, emission reduction and carbon peaking, the electric vehicle industry has developed rapidly, which inevitably brings new safety problems. In recent years, charging safety accidents of electric vehicles have been frequent, and a burning electric vehicle can easily and quickly ignite other electric vehicles, causing a larger fire accident.
Most of the prior art uses a monitoring camera and image recognition to capture open fire pictures and remind workers to check and handle the warning picture, but in a night scene this detection technology is easily disturbed by site lighting and vehicle headlights, so the open fire detection precision is poor; an open fire detection method capable of eliminating light interference is therefore needed.
Disclosure of Invention
The embodiment of the invention provides a method for detecting open fire at night in a charging pile area, in which open fire pictures are captured through a monitoring camera and image recognition and interference such as site lighting and vehicle headlights is carefully excluded, so that the open fire detection precision can be effectively improved; meanwhile, combined with an automatic early warning mode, workers can be reminded to check and handle the warning picture, effectively reducing misjudgments caused by vehicle lights in the area at night.
The embodiment of the invention is realized in such a way that a method for detecting the open fire at night in a charging pile area comprises the following steps:
step one, marking light-emitting areas such as indicator lights in the scene, and dividing the scene into a no-light area and a light area;
step two, collecting nighttime flame images, marking the flame areas, and establishing a suspected flame area detection model;
step three, detecting a suspected flame area appearing in the scene by using a night suspected flame area detection model for the no-light area in the scene;
step four, when a continuous flame area is detected in the monitoring video, detecting vehicles appearing in the scene by using a trained vehicle detection model, and judging whether the continuous flame area is an open flame area;
step five, when a suspected open fire area is detected in the monitoring video, further judging whether open fire exists or not, and sending early warning information when open fire exists;
and step six, when open fire is detected in the scene, extracting open fire information and personnel information, comprehensively judging whether the personnel are suspected fire personnel, and sending early warning of the fire personnel to a relevant management department for processing in real time.
In embodiment 1, the suspected flame area detection model in step two is obtained by training a YOLO-based model on the labeled data.
In embodiment 1, detecting, in step three, suspected flame areas appearing in the scene by applying the nighttime suspected flame area detection model to the no-light area of the scene comprises:
detecting a suspected flame area and judging whether it is a flame area or a false flame area;
if it is a flame area, judging whether a continuous flame area exists.
In embodiment 1, the detailed steps for detecting and judging whether a suspected flame area is a flame area or a false flame area are as follows:
when a suspected flame region is detected in a frame of the monitoring video, its detection box being described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the suspected flame region detection box, a size score is calculated for the suspected flame region from these quantities together with the set first, second, third and fourth judgment thresholds, which bound the position and size of the detection box;
when the size score is greater than the set fifth judgment threshold, the suspected flame area is judged to be a flame area, otherwise it is judged to be a false flame area.
In embodiment 1, the detailed steps for judging whether a flame area has a continuous flame area are as follows:
when a flame area is detected in a frame of the monitoring video, a flame mark score is added to that frame image, and a frame image in which no flame area is detected is given a mark score of 0;
a flame persistence score is calculated from the flame regions detected over a set number of consecutive frame images starting from that frame;
when the flame persistence score is greater than the set sixth judgment threshold, a continuous flame area is judged to exist starting from that frame, otherwise no continuous flame area exists.
In embodiment 1, the detailed steps of step four are as follows:
when a vehicle is detected within the scene, its detection box being described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the vehicle detection box, a vehicle-flame correlation score is calculated;
when the vehicle-flame correlation score is greater than the set seventh judgment threshold, the vehicle is judged to be associated with the flame, otherwise no associated vehicle is judged to exist;
the vehicle associated with the flame is marked and its features are extracted, and the detection box of the marked vehicle is obtained in a set number of subsequent frame images;
when the approximate vehicle speed is greater than the set eighth judgment threshold, the marked vehicle is judged to have displacement, otherwise the vehicle is judged not to move;
when displacement of the marked vehicle is detected, a companion score is calculated over those subsequent frame images; when the companion score is greater than the set ninth judgment threshold, the continuous flame area is judged to be a non-open-fire area, otherwise it is judged to be a suspected open fire area;
when the marked vehicle is detected to have no displacement, the continuous flame area is judged to be a suspected open fire area;
when no vehicle is associated with the flame, the continuous flame area is judged to be a suspected open fire area;
and when no vehicle is detected in the scene, the continuous flame area is judged to be a suspected open fire area.
In embodiment 1, the detailed steps of step five are as follows:
when a suspected open fire region is detected in a frame of the monitoring video, an open fire detection area is set, the top-left fixed-point abscissa, top-left fixed-point ordinate, width and height of its detection box being derived from the suspected open fire region using a first correction constant obtained by training on historical data together with the width W of the monitoring picture, a second correction constant obtained by training on historical data, and a third correction constant obtained by training on historical data together with the height H of the monitoring picture;
the gray value of each pixel in the part of the open fire detection area outside the suspected open fire region is extracted; when the gray value of a pixel is greater than the set tenth judgment threshold, a bright-spot mark score is added to that pixel, and the pixels carrying a bright-spot mark score form a bright-spot set;
the pixels in the bright-spot set are clustered by pixel coordinates using a K-means-based clustering model; for any cluster O in the clustering result, when the number of elements of O and the cluster radius of O satisfy the set eleventh, twelfth and thirteenth judgment thresholds, O is judged to be a spark region, and the number of spark regions in the clustering result is recorded;
when the sum of the spark-region counts over a set number of consecutive frames starting from that frame is greater than the set fourteenth judgment threshold, open fire is judged to exist in the suspected open fire area, and an open fire early warning is sent to the relevant management department in real time.
In embodiment 1, the vehicle detection model used in step four is obtained by training a YOLO-based model on the labeled images.
In embodiment 1, the detailed steps of step six are as follows: when open fire is detected in the scene, the open fire information is extracted, and the k consecutive frame images preceding the appearance of the open fire are extracted; person information is extracted from these images, described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the person box;
when the position of a person relative to the open fire satisfies the set fifteenth judgment threshold, the person is judged to be near the open fire area, and the person's elbow coordinates, wrist coordinates, knee coordinates, hip coordinates and head coordinates are obtained; a person fire-setting score is then calculated for each frame from these keypoints, where k is the number of frames, using a fourth correction constant obtained by training on historical data, a fifth correction constant obtained by training on historical data, and the set sixteenth and seventeenth judgment thresholds;
when no person is near the open fire area, the frame is given the default person fire-setting score;
the sum of the person fire-setting scores over the k consecutive frame images before the open fire appears is calculated; when this sum is greater than the set eighteenth judgment threshold, the person is judged to be a suspected fire-setting person, and a fire-setting person early warning is sent to the relevant management department for handling in real time.
A charging pile area night open fire detection system comprises:
the monitoring module is used for acquiring video information of a target area;
the processing module is used for receiving the information collected by the monitoring module and analyzing and processing the information;
and the communication early warning module is used for receiving the information of the processing module and carrying out communication early warning operation.
The invention has the beneficial effects that: open fire pictures are captured through a monitoring camera and image recognition, and interference such as site lighting and vehicle headlights is carefully excluded, which can effectively improve the open fire detection precision; meanwhile, combined with an automatic early warning mode, workers can be reminded to check and handle the warning picture, effectively reducing misjudgments caused by vehicle lights in the area at night.
Drawings
FIG. 1 is a block diagram of the method of the present invention;
FIG. 2 is a system block diagram of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1-2, in this scheme open fire pictures are captured through a monitoring camera and image recognition, interference such as site lighting and vehicle headlights is carefully excluded, and the open fire detection precision can be effectively improved; meanwhile, combined with an automatic early warning mode, workers can be reminded to check and handle the warning picture, effectively reducing misjudgments caused by vehicle lights in the area at night.
Example one
A method for detecting a nighttime open fire in a charging pile area comprises the following steps:
step one, marking light-emitting areas such as indicator lights in the scene, and dividing the scene into a no-light area and a light area;
step two, collecting nighttime flame images, marking the flame areas, and establishing a suspected flame area detection model;
step three, detecting a suspected flame area appearing in the scene by using a night suspected flame area detection model for the no-light area in the scene;
step four, when a continuous flame area is detected in the monitoring video, detecting vehicles appearing in the scene by using a trained vehicle detection model, and judging whether the continuous flame area is an open flame area;
step five, when a suspected open fire area is detected in the monitoring video, further judging whether open fire exists or not, and sending early warning information when open fire exists;
and step six, when open fire is detected in the scene, extracting open fire information and personnel information, comprehensively judging whether the personnel are suspected fire personnel, and sending early warning of the fire personnel to a relevant management department for processing in real time.
The suspected flame area detection model in step two is obtained by training a YOLO-based model on the labeled data.
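By way of a non-limiting sketch, the training and inference of step two could be implemented roughly as follows, assuming the ultralytics YOLO package; the pretrained weight file, the dataset configuration file night_flame.yaml and the hyperparameters are illustrative assumptions rather than values taken from this disclosure.

```python
# Hypothetical dataset config and weights; the disclosure only states that a
# YOLO-based model is trained on labeled nighttime flame images.
from ultralytics import YOLO

def train_suspected_flame_model(data_yaml="night_flame.yaml", epochs=100):
    """Train a YOLO detector on labeled nighttime flame images (step two)."""
    model = YOLO("yolov8n.pt")          # pretrained backbone, assumed choice
    model.train(data=data_yaml, epochs=epochs, imgsz=640)
    return model

def detect_suspected_flames(model, frame):
    """Return suspected flame boxes (x, y, w, h) for one video frame."""
    results = model(frame, verbose=False)[0]
    boxes = []
    for x1, y1, x2, y2 in results.boxes.xyxy.tolist():
        boxes.append((x1, y1, x2 - x1, y2 - y1))  # convert to (x, y, w, h)
    return boxes
```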
Example two
In step three, detecting suspected flame areas appearing in the scene by applying the nighttime suspected flame area detection model to the no-light area of the scene comprises the following steps:
detecting a suspected flame area and judging whether it is a flame area or a false flame area;
if it is a flame area, judging whether a continuous flame area exists.
Example three
The detailed steps for detecting and judging whether a suspected flame area is a flame area or a false flame area are as follows:
when a suspected flame region is detected in a frame of the monitoring video, its detection box being described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the suspected flame region detection box, a size score is calculated for the suspected flame region from these quantities together with the set first, second, third and fourth judgment thresholds, which bound the position and size of the detection box;
when the size score is greater than the set fifth judgment threshold, the suspected flame area is judged to be a flame area, otherwise it is judged to be a false flame area.
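A minimal sketch of the judgment in Example three is given below; since the size-score formula appears only as a figure in the original text, the score_box function is a stand-in (normalized box area), and every threshold value is an assumption.

```python
# Stand-in size score: normalized box area; not the patent's actual formula.
def score_box(x, y, w, h, frame_w, frame_h):
    return (w * h) / float(frame_w * frame_h)

def is_flame_area(box, frame_w, frame_h, t1, t2, t3, t4, t5):
    """Judge a suspected flame box (x, y, w, h) as flame (True) or false flame.

    t1..t4 stand for the first to fourth judgment thresholds bounding the box
    position and size; t5 is the fifth judgment threshold on the size score.
    All threshold values are assumptions, not taken from the disclosure.
    """
    x, y, w, h = box
    if not (x >= t1 and y >= t2 and w >= t3 and h >= t4):
        return False                      # box fails the position/size bounds
    return score_box(x, y, w, h, frame_w, frame_h) > t5
```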
Example four
The detailed steps for judging whether a flame area has a continuous flame area are as follows:
when a flame area is detected in a frame of the monitoring video, a flame mark score is added to that frame image, and a frame image in which no flame area is detected is given a mark score of 0;
a flame persistence score is calculated from the flame regions detected over a set number of consecutive frame images starting from that frame;
when the flame persistence score is greater than the set sixth judgment threshold, a continuous flame area is judged to exist starting from that frame, otherwise no continuous flame area exists.
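The continuous-flame test of Example four can be sketched as follows; the encoding of the per-frame flame mark score as 1/0 and the threshold value are assumptions standing in for the quantities shown only as figures in the original.

```python
def has_continuous_flame(flame_flags, m, t6):
    """Decide whether a continuous flame area exists.

    flame_flags: list of per-frame booleans (flame detected or not), starting
    at the frame in which the flame area first appeared.
    m: assumed length of the persistence window in frames.
    t6: stand-in for the sixth judgment threshold.
    """
    window = flame_flags[:m]
    persistence_score = sum(1 if flag else 0 for flag in window)
    return persistence_score > t6
```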
Example five
The detailed steps of step four are as follows:
when a vehicle is detected within the scene, its detection box being described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the vehicle detection box, a vehicle-flame correlation score is calculated;
when the vehicle-flame correlation score is greater than the set seventh judgment threshold, the vehicle is judged to be associated with the flame, otherwise no associated vehicle is judged to exist;
the vehicle associated with the flame is marked and its features are extracted, and the detection box of the marked vehicle is obtained in a set number of subsequent frame images;
when the approximate vehicle speed is greater than the set eighth judgment threshold, the marked vehicle is judged to have displacement, otherwise the vehicle is judged not to move;
when displacement of the marked vehicle is detected, a companion score is calculated over those subsequent frame images; when the companion score is greater than the set ninth judgment threshold, the continuous flame area is judged to be a non-open-fire area, otherwise it is judged to be a suspected open fire area;
when the marked vehicle is detected to have no displacement, the continuous flame area is judged to be a suspected open fire area;
when no vehicle is associated with the flame, the continuous flame area is judged to be a suspected open fire area;
and when no vehicle is detected in the scene, the continuous flame area is judged to be a suspected open fire area.
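A hedged sketch of step four follows; the vehicle-flame correlation score and the companion score are approximated here by box overlap (IoU), which is an assumption standing in for the formulas given only as figures in the original, and the thresholds t7, t8 and t9 (seventh to ninth judgment thresholds) are illustrative values.

```python
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def classify_continuous_flame(flame_box, vehicle_tracks, t7, t8, t9):
    """vehicle_tracks: {vehicle_id: list of (x, y, w, h) over subsequent frames}."""
    if not vehicle_tracks:
        return "suspected_open_fire"          # no vehicle detected in the scene
    for boxes in vehicle_tracks.values():
        if iou(boxes[0], flame_box) <= t7:    # vehicle not associated with flame
            continue
        # approximate speed: displacement of the box centre across the track
        (x0, y0, w0, h0), (x1, y1, w1, h1) = boxes[0], boxes[-1]
        speed = ((x1 + w1 / 2 - x0 - w0 / 2) ** 2 +
                 (y1 + h1 / 2 - y0 - h0 / 2) ** 2) ** 0.5 / max(len(boxes), 1)
        if speed <= t8:
            return "suspected_open_fire"      # associated vehicle did not move
        # companion score: how often the flame keeps overlapping the moving car
        companion = sum(iou(b, flame_box) > t7 for b in boxes) / len(boxes)
        return "not_open_fire" if companion > t9 else "suspected_open_fire"
    return "suspected_open_fire"              # no associated vehicle found
```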
Example six
The detailed steps of step five are as follows:
when a suspected open fire region is detected in a frame of the monitoring video, an open fire detection area is set, the top-left fixed-point abscissa, top-left fixed-point ordinate, width and height of its detection box being derived from the suspected open fire region using a first correction constant obtained by training on historical data together with the width W of the monitoring picture, a second correction constant obtained by training on historical data, and a third correction constant obtained by training on historical data together with the height H of the monitoring picture;
the gray value of each pixel in the part of the open fire detection area outside the suspected open fire region is extracted; when the gray value of a pixel is greater than the set tenth judgment threshold, a bright-spot mark score is added to that pixel, and the pixels carrying a bright-spot mark score form a bright-spot set;
the pixels in the bright-spot set are clustered by pixel coordinates using a K-means-based clustering model; for any cluster O in the clustering result, when the number of elements of O and the cluster radius of O satisfy the set eleventh, twelfth and thirteenth judgment thresholds, O is judged to be a spark region, and the number of spark regions in the clustering result is recorded;
when the sum of the spark-region counts over a set number of consecutive frames starting from that frame is greater than the set fourteenth judgment threshold, open fire is judged to exist in the suspected open fire area, and an open fire early warning is sent to the relevant management department in real time.
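The bright-spot clustering of Example six can be sketched as follows, assuming scikit-learn's KMeans; the correction constants, the number of clusters and every threshold are illustrative assumptions, and the spark-region criterion (bounds on cluster size and radius) paraphrases the text rather than reproducing the exact formulas, which appear only as figures.

```python
import numpy as np
from sklearn.cluster import KMeans

def count_spark_regions(gray, fire_box, frame_w, frame_h,
                        c_w=0.1, c_h=0.1, t10=200,
                        t11=5, t12=200, t13=30.0, k=8):
    """Count spark regions around a suspected open fire box in one gray frame."""
    x, y, w, h = fire_box
    # enlarge the suspected open fire box into the open fire detection area
    dx, dy = c_w * frame_w, c_h * frame_h
    x0, y0 = int(max(0, x - dx)), int(max(0, y - dy))
    x1, y1 = int(min(frame_w, x + w + dx)), int(min(frame_h, y + h + dy))
    region = gray[y0:y1, x0:x1]
    ys, xs = np.nonzero(region > t10)                  # bright-spot pixels
    # keep only pixels outside the suspected open fire box itself
    gx, gy = xs + x0, ys + y0
    outside = ~((gx >= x) & (gx <= x + w) & (gy >= y) & (gy <= y + h))
    pts = np.column_stack([gx[outside], gy[outside]]).astype(float)
    if len(pts) < k:
        return 0
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(pts)
    sparks = 0
    for c in range(k):
        cluster = pts[labels == c]
        radius = np.linalg.norm(cluster - cluster.mean(axis=0), axis=1).max()
        if t11 <= len(cluster) <= t12 and radius <= t13:
            sparks += 1                                # small, compact cluster
    return sparks
```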
The vehicle detection model used in step four is obtained by training a YOLO-based model on the labeled images.
Example seven
The detailed steps of step six are as follows: when open fire is detected in the scene, the open fire information is extracted, and the k consecutive frame images preceding the appearance of the open fire are extracted; person information is extracted from these images, described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the person box;
when the position of a person relative to the open fire satisfies the set fifteenth judgment threshold, the person is judged to be near the open fire area, and the person's elbow coordinates, wrist coordinates, knee coordinates, hip coordinates and head coordinates are obtained; a person fire-setting score is then calculated for each frame from these keypoints, where k is the number of frames, using a fourth correction constant obtained by training on historical data, a fifth correction constant obtained by training on historical data, and the set sixteenth and seventeenth judgment thresholds;
when no person is near the open fire area, the frame is given the default person fire-setting score;
the sum of the person fire-setting scores over the k consecutive frame images before the open fire appears is calculated; when this sum is greater than the set eighteenth judgment threshold, the person is judged to be a suspected fire-setting person, and a fire-setting person early warning is sent to the relevant management department for handling in real time.
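A non-limiting sketch of the person fire-setting score of Example seven follows; the per-frame score here is a simple wrist-to-fire distance term standing in for the formula that appears only as a figure, the keypoints are assumed to come from a separate pose estimator, and the thresholds t15 and t18 are illustrative.

```python
def fire_setting_score(fire_box, person_frames, t15=150.0, t18=5.0):
    """Decide whether a person is a suspected fire-setting person.

    person_frames: one entry per frame before the fire, either None (no person
    near the fire area) or a dict like {"box": (x, y, w, h), "wrist": (x, y)}
    produced by an assumed pose estimator.
    """
    fx, fy, fw, fh = fire_box
    fire_centre = (fx + fw / 2.0, fy + fh / 2.0)
    total = 0.0
    for frame in person_frames:                 # the k frames before the fire
        if frame is None:
            continue                            # default score for this frame
        wx, wy = frame["wrist"]
        dist = ((wx - fire_centre[0]) ** 2 + (wy - fire_centre[1]) ** 2) ** 0.5
        if dist < t15:                          # person is near the fire area
            total += (t15 - dist) / t15         # closer hand -> higher score
    return total > t18                          # suspected fire-setting person
```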
Example eight
A charging pile area night open fire detection system comprises:
the monitoring module is used for acquiring video information of a target area; the monitoring module consists of a plurality of cameras arranged in the area where charging piles are concentrated, and blind spots in the monitored area are reduced through the multi-angle cameras;
the processing module is used for receiving the information collected by the monitoring module and analyzing and processing the information, and a storage module for storing model information is arranged in the processing module;
and the communication early warning module is used for receiving the information of the processing module and carrying out communication early warning operation.
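The three modules of Example eight could be organised roughly as below; the camera indices, alert endpoint and message format are illustrative assumptions.

```python
import cv2
import requests

class MonitoringModule:
    """Acquires video frames from one or more cameras (assumed indices)."""
    def __init__(self, camera_ids=(0,)):
        self.captures = [cv2.VideoCapture(i) for i in camera_ids]
    def frames(self):
        for cap in self.captures:
            ok, frame = cap.read()
            if ok:
                yield frame

class ProcessingModule:
    """Receives frames and runs the detection / analysis pipeline."""
    def __init__(self, detector):
        self.detector = detector          # e.g. the trained flame detector
    def analyse(self, frame):
        return self.detector(frame)       # detections or an alarm decision

class CommunicationModule:
    """Sends early warning messages; endpoint is a placeholder."""
    def __init__(self, endpoint="http://example.invalid/alert"):
        self.endpoint = endpoint
    def warn(self, message):
        requests.post(self.endpoint, json={"alert": message}, timeout=5)
```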
It should be understood that, although the steps in the flowcharts of the embodiments of the present invention are shown in sequence as indicated by the arrows, they are not necessarily executed in that sequence. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least a portion of the steps in the various embodiments may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (9)

1. A charging pile area night open fire detection method is characterized by comprising the following steps:
step one, marking a light emitting area of an indicator light in the scene, and dividing the scene into a no-light area and a light area;
step two, collecting nighttime flame images, marking the flame areas, and establishing a suspected flame area detection model;
step three, detecting a suspected flame area appearing in the scene by using a night suspected flame area detection model for the no-light area in the scene;
step four, when a continuous flame area is detected in the monitoring video, detecting vehicles appearing in the scene by using a trained vehicle detection model, and judging whether the continuous flame area is an open flame area;
step five, when a suspected open fire area is detected in the monitoring video, further judging whether open fire exists or not, and sending early warning information when open fire exists;
and step six, when open fire is detected in the scene, extracting open fire information and personnel information, comprehensively judging whether the personnel are suspected fire personnel, and sending early warning of the fire personnel to a relevant management department for processing in real time.
2. The method according to claim 1, wherein the suspected flame area detection model in step two is obtained by training a YOLO-based model on the labeled data.
3. The method for detecting the nighttime open fire in the charging pile area according to claim 1, wherein, in step three, detecting suspected flame areas appearing in the scene by applying the nighttime suspected flame area detection model to the no-light area of the scene comprises the following steps:
detecting a suspected flame area and judging whether it is a flame area or a false flame area;
if it is a flame area, judging whether a continuous flame area exists.
4. The method for detecting the nighttime open fire in the charging pile area according to claim 3, wherein the detailed steps of detecting and judging whether a suspected flame area is a flame area or a false flame area are as follows:
when a suspected flame region is detected in a frame of the monitoring video, its detection box being described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the suspected flame region detection box, a size score is calculated for the suspected flame region from these quantities together with the set first, second, third and fourth judgment thresholds, which bound the position and size of the detection box;
when the size score is greater than the set fifth judgment threshold, the suspected flame area is judged to be a flame area, otherwise it is judged to be a false flame area.
5. The method for detecting the nighttime open fire in the charging pile area according to claim 4, wherein the detailed steps of judging whether a flame area has a continuous flame area are as follows:
when a flame area is detected in a frame of the monitoring video, a flame mark score is added to that frame image, and a frame image in which no flame area is detected is given a mark score of 0;
a flame persistence score is calculated from the flame regions detected over a set number of consecutive frame images starting from that frame;
when the flame persistence score is greater than the set sixth judgment threshold, a continuous flame area is judged to exist starting from that frame, otherwise no continuous flame area exists.
6. The method for detecting the nighttime open fire in the charging pile area according to claim 1, wherein the detailed steps of step four are as follows:
when a vehicle is detected within the scene, its detection box being described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the vehicle detection box, a vehicle-flame correlation score is calculated;
when the vehicle-flame correlation score is greater than the set seventh judgment threshold, the vehicle is judged to be associated with the flame, otherwise no associated vehicle is judged to exist;
the vehicle associated with the flame is marked and its features are extracted, and the detection box of the marked vehicle is obtained in a set number of subsequent frame images;
when the approximate vehicle speed is greater than the set eighth judgment threshold, the marked vehicle is judged to have displacement, otherwise the vehicle is judged not to move;
when displacement of the marked vehicle is detected, a companion score is calculated over those subsequent frame images; when the companion score is greater than the set ninth judgment threshold, the continuous flame area is judged to be a non-open-fire area, otherwise it is judged to be a suspected open fire area;
when the marked vehicle is detected to have no displacement, the continuous flame area is judged to be a suspected open fire area;
when no vehicle is associated with the flame, the continuous flame area is judged to be a suspected open fire area;
and when no vehicle is detected in the scene, the continuous flame area is judged to be a suspected open fire area.
7. The method for detecting the night open fire in the charging pile area according to claim 1, characterized in that the detailed steps of step five are as follows:
when a suspected open fire region is detected in a frame of the monitoring video, an open fire detection area is set, the top-left fixed-point abscissa, top-left fixed-point ordinate, width and height of its detection box being derived from the suspected open fire region using a first correction constant obtained by training on historical data together with the width W of the monitoring picture, a second correction constant obtained by training on historical data, and a third correction constant obtained by training on historical data together with the height H of the monitoring picture;
the gray value of each pixel in the part of the open fire detection area outside the suspected open fire region is extracted; when the gray value of a pixel is greater than the set tenth judgment threshold, a bright-spot mark score is added to that pixel, and the pixels carrying a bright-spot mark score form a bright-spot set;
the pixels in the bright-spot set are clustered by pixel coordinates using a K-means-based clustering model; for any cluster O in the clustering result, when the number of elements of O and the cluster radius of O satisfy the set eleventh, twelfth and thirteenth judgment thresholds, O is judged to be a spark region, and the number of spark regions in the clustering result is recorded;
when the sum of the spark-region counts over a set number of consecutive frames starting from that frame is greater than the set fourteenth judgment threshold, open fire is judged to exist in the suspected open fire area, and an open fire early warning is sent to the relevant management department in real time.
8. The method for detecting the nighttime open fire in the charging pile area according to claim 1, wherein the vehicle detection model used in step four is obtained by training a YOLO-based model on the labeled images.
9. The method for detecting the nighttime open fire in the charging pile area according to claim 1, wherein the detailed steps of step six are as follows: when open fire is detected in the scene, the open fire information is extracted, and the k consecutive frame images preceding the appearance of the open fire are extracted; person information is extracted from these images, described by the top-left fixed-point abscissa, the top-left fixed-point ordinate, the width and the height of the person box;
when the position of a person relative to the open fire satisfies the set fifteenth judgment threshold, the person is judged to be near the open fire area, and the person's elbow coordinates, wrist coordinates, knee coordinates, hip coordinates and head coordinates are obtained; a person fire-setting score is then calculated for each frame from these keypoints, where k is the number of frames, using a fourth correction constant obtained by training on historical data, a fifth correction constant obtained by training on historical data, and the set sixteenth and seventeenth judgment thresholds;
when no person is near the open fire area, the frame is given the default person fire-setting score;
the sum of the person fire-setting scores over the k consecutive frame images before the open fire appears is calculated; when this sum is greater than the set eighteenth judgment threshold, the person is judged to be a suspected fire-setting person, and a fire-setting person early warning is sent to the relevant management department for handling in real time.
CN202210828781.8A 2022-07-15 2022-07-15 Charging pile area night open fire detection method Active CN114913323B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210828781.8A CN114913323B (en) 2022-07-15 2022-07-15 Charging pile area night open fire detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210828781.8A CN114913323B (en) 2022-07-15 2022-07-15 Charging pile area night open fire detection method

Publications (2)

Publication Number Publication Date
CN114913323A true CN114913323A (en) 2022-08-16
CN114913323B CN114913323B (en) 2022-11-15

Family

ID=82772537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210828781.8A Active CN114913323B (en) 2022-07-15 2022-07-15 Charging pile area night open fire detection method

Country Status (1)

Country Link
CN (1) CN114913323B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116259167A (en) * 2023-03-14 2023-06-13 东莞先知大数据有限公司 Charging pile area high-temperature risk early warning method, device, equipment and medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100119210A1 (en) * 2007-05-21 2010-05-13 Mitsubishi Electric Corporation Image difference detection method and apparatus, scene change detection method and apparatus, and image difference value detection method and apparatus
CN101872526A (en) * 2010-06-01 2010-10-27 重庆市海普软件产业有限公司 Smoke and fire intelligent identification method based on programmable photographing technology
CN103761529A (en) * 2013-12-31 2014-04-30 北京大学 Open fire detection method and system based on multicolor models and rectangular features
WO2017161747A1 (en) * 2016-03-25 2017-09-28 乐视控股(北京)有限公司 Charging post control system, multifunctional charging post and electric vehicle
US20170274789A1 (en) * 2016-03-25 2017-09-28 Le Holdings (Beijing) Co., Ltd. Charging pile control system, multi-functional charging pile and electric vehicle
US20180376305A1 (en) * 2017-06-23 2018-12-27 Veniam, Inc. Methods and systems for detecting anomalies and forecasting optimizations to improve smart city or region infrastructure management using networks of autonomous vehicles
CN110667435A (en) * 2019-09-26 2020-01-10 武汉客车制造股份有限公司 Fire monitoring and early warning system and method for new energy automobile power battery
CN111626188A (en) * 2020-05-26 2020-09-04 西南大学 Indoor uncontrollable open fire monitoring method and system
CN215537956U (en) * 2021-03-29 2022-01-18 国网重庆市电力公司永川供电分公司 Electric vehicle charging station fire automatic alarm fire extinguishing system
US20220041076A1 (en) * 2020-08-05 2022-02-10 BluWave Inc. Systems and methods for adaptive optimization for electric vehicle fleet charging
CN114394100A (en) * 2022-01-12 2022-04-26 深圳力维智联技术有限公司 Unmanned prowl car control system and unmanned car
WO2022098365A1 (en) * 2020-11-05 2022-05-12 GRID20/20, Inc. Fire mitigation and downed conductor detection systems and methods
CN114475305A (en) * 2021-12-03 2022-05-13 上海众石信息科技有限公司 High-safety intelligent charging shed and use method thereof

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100119210A1 (en) * 2007-05-21 2010-05-13 Mitsubishi Electric Corporation Image difference detection method and apparatus, scene change detection method and apparatus, and image difference value detection method and apparatus
CN101872526A (en) * 2010-06-01 2010-10-27 重庆市海普软件产业有限公司 Smoke and fire intelligent identification method based on programmable photographing technology
CN103761529A (en) * 2013-12-31 2014-04-30 北京大学 Open fire detection method and system based on multicolor models and rectangular features
WO2017161747A1 (en) * 2016-03-25 2017-09-28 乐视控股(北京)有限公司 Charging post control system, multifunctional charging post and electric vehicle
US20170274789A1 (en) * 2016-03-25 2017-09-28 Le Holdings (Beijing) Co., Ltd. Charging pile control system, multi-functional charging pile and electric vehicle
US20180376305A1 (en) * 2017-06-23 2018-12-27 Veniam, Inc. Methods and systems for detecting anomalies and forecasting optimizations to improve smart city or region infrastructure management using networks of autonomous vehicles
CN110667435A (en) * 2019-09-26 2020-01-10 武汉客车制造股份有限公司 Fire monitoring and early warning system and method for new energy automobile power battery
CN111626188A (en) * 2020-05-26 2020-09-04 西南大学 Indoor uncontrollable open fire monitoring method and system
US20220041076A1 (en) * 2020-08-05 2022-02-10 BluWave Inc. Systems and methods for adaptive optimization for electric vehicle fleet charging
WO2022098365A1 (en) * 2020-11-05 2022-05-12 GRID20/20, Inc. Fire mitigation and downed conductor detection systems and methods
CN215537956U (en) * 2021-03-29 2022-01-18 国网重庆市电力公司永川供电分公司 Electric vehicle charging station fire automatic alarm fire extinguishing system
CN114475305A (en) * 2021-12-03 2022-05-13 上海众石信息科技有限公司 High-safety intelligent charging shed and use method thereof
CN114394100A (en) * 2022-01-12 2022-04-26 深圳力维智联技术有限公司 Unmanned prowl car control system and unmanned car

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王旭 (Wang Xu): "Investigation and Research on Safety Problems of Electric Vehicle Charging Piles at Schools", Industry Technology Innovation *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116259167A (en) * 2023-03-14 2023-06-13 东莞先知大数据有限公司 Charging pile area high-temperature risk early warning method, device, equipment and medium
CN116259167B (en) * 2023-03-14 2023-11-21 东莞先知大数据有限公司 Charging pile area high-temperature risk early warning method, device, equipment and medium

Also Published As

Publication number Publication date
CN114913323B (en) 2022-11-15

Similar Documents

Publication Publication Date Title
CN106650620B (en) A kind of target person identification method for tracing using unmanned plane monitoring
CN111445524B (en) Scene understanding-based construction site worker unsafe behavior identification method
CN113516076B (en) Attention mechanism improvement-based lightweight YOLO v4 safety protection detection method
CN106295551A (en) A kind of personal security cap wear condition real-time detection method based on video analysis
CN106446926A (en) Transformer station worker helmet wear detection method based on video analysis
CN110136172B (en) Detection method for wearing of underground protective equipment of miners
CN108416968A (en) Fire alarm method and apparatus
CN104504369A (en) Wearing condition detection method for safety helmets
CN106128022A (en) A kind of wisdom gold eyeball identification violent action alarm method and device
CN111325048B (en) Personnel gathering detection method and device
CN113743256B (en) Intelligent early warning method and device for site safety
CN111428617A (en) Video image-based distribution network violation maintenance behavior identification method and system
CN112396658A (en) Indoor personnel positioning method and positioning system based on video
CN103996203A (en) Method and device for detecting whether face in image is sheltered
CN112434669B (en) Human body behavior detection method and system based on multi-information fusion
CN114913323B (en) Charging pile area night open fire detection method
CN111062373A (en) Hoisting process danger identification method and system based on deep learning
CN111079722A (en) Hoisting process personnel safety monitoring method and system
CN113506416A (en) Engineering abnormity early warning method and system based on intelligent visual analysis
CN101859376A (en) Fish-eye camera-based human detection system
CN114359712A (en) Safety violation analysis system based on unmanned aerial vehicle inspection
CN113537019A (en) Detection method for identifying wearing of safety helmet of transformer substation personnel based on key points
CN112377265A (en) Rock burst alarm method based on image recognition acceleration characteristics
KR101560810B1 (en) Space controled method and apparatus for using template image
CN113554682B (en) Target tracking-based safety helmet detection method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: Building 7, No. 124 Dongbao Road, Dongcheng Street, Dongguan City, Guangdong Province, 523015

Patentee after: Guangdong Prophet Big Data Co.,Ltd.

Country or region after: China

Address before: Room 102, Building 7, No. 124, Dongbao Road, Dongcheng Street, Dongguan City, Guangdong Province, 523015

Patentee before: Dongguan prophet big data Co.,Ltd.

Country or region before: China