CN114782903A - ESN-based highway river-crossing grand bridge group fog recognition and early warning method - Google Patents


Info

Publication number
CN114782903A
Authority
CN
China
Prior art keywords
fog
image
weather
data
sensing
Prior art date
Legal status
Pending
Application number
CN202210438430.6A
Other languages
Chinese (zh)
Inventor
王孜健
张昱
申全军
陈亮
么新鹏
樊兆董
Current Assignee
Shandong Expressway Group Innovation Research Institute
Shandong Transportation Institute
Original Assignee
Shandong Expressway Group Innovation Research Institute
Shandong Transportation Institute
Priority date
Filing date
Publication date
Application filed by Shandong Expressway Group Innovation Research Institute and Shandong Transportation Institute
Priority to CN202210438430.6A
Publication of CN114782903A

Classifications

    • G06F ELECTRIC DIGITAL DATA PROCESSING: G06F18/00 Pattern recognition; G06F18/20 Analysing; G06F18/24 Classification techniques
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS: G06N20/00 Machine learning; G06N3/00 Computing arrangements based on biological models; G06N3/02 Neural networks; G06N3/08 Learning methods
    • G06Q ICT SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES: G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"; G06Q50/10 Services; G06Q50/26 Government or public services
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL: G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection
    • G08B SIGNALLING OR CALLING SYSTEMS; ALARM SYSTEMS: G08B21/10 Alarms for ensuring the safety of persons responsive to calamitous events
    • G16Y ICT SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]: G16Y10/40 Transportation; G16Y20/10 Information sensed or collected by the things relating to the environment or location; G16Y40/10 Detection; Monitoring; G16Y40/20 Analytics; Diagnosis
    • H04Q SELECTING: H04Q9/00 Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station
    • G06T2207/00 Indexing scheme for image analysis or image enhancement: G06T2207/10016 Video; Image sequence; G06T2207/20081 Training; Learning; G06T2207/30168 Image quality inspection; G06T2207/30192 Weather; Meteorology; G06T2207/30232 Surveillance; G06T2207/30236 Traffic on road, railway or crossing
    • H04Q2209/00 Arrangements in telecontrol or telemetry systems: H04Q2209/20 using a distributed architecture; H04Q2209/40 using a wireless architecture


Abstract

The invention discloses an ESN-based method for identifying and early-warning cluster fog on a highway river-crossing grand bridge, belonging to the technical field of highway cluster-fog identification. It addresses the technical problem of identifying cluster fog along the line in a highway river-crossing grand-bridge area and improving cluster-fog prediction accuracy. The technical scheme comprises the following steps. Deploying equipment: cluster-fog sensing equipment and edge-computing equipment are installed on the highway river-crossing grand bridge; the sensing equipment senses and transmits visibility and image data at each point of the bridge, and the edge-computing equipment analyzes and processes multi-source heterogeneous meteorological data and predicts the probability of cluster fog occurring. Judging cluster fog. Sensing cluster fog. Predicting cluster fog: from the real-time meteorological data sensed along the bridge, a cluster-fog prediction probability value is obtained with an echo state network training method.

Description

ESN-based highway river-crossing grand bridge group fog recognition and early warning method
Technical Field
The invention relates to the field of highway cluster-fog identification, and in particular to an ESN-based method for identifying and early-warning cluster fog on a highway river-crossing grand bridge.
Background
Fog is a common weather phenomenon; it is strongly regional, reduces visibility, changes quickly, and is difficult to predict, so it is extremely hazardous to highway traffic safety. Cluster fog differs from wide-area advection fog and radiation fog: atmospheric visibility varies over a large range, and the visibility monitored inside a cluster-fog area rises and falls in repeated jumps. Because of the unique climate of a river-crossing basin, air humidity is high; in seasons with a large day-night temperature difference, water evaporates into the air during the day, and after the temperature drops at night the water vapor readily condenses into fog. Different formation mechanisms affect traffic operations differently, which makes cluster fog very difficult to identify and predict.
At present, cluster fog is usually identified and warned with visibility meters, which are of two types, transmission and scattering. The transmission type measures the extinction coefficient of the atmosphere and converts it, through an empirical formula, into the distance visible to the naked eye. The scattering type determines the visible distance by measuring the intensity of light scattered in a volume of air by gas molecules, aerosol particles, droplets, and the like. These instruments have low prediction accuracy for regional foggy weather; the facilities are constrained by hardware conditions and unreasonable spatial distribution, and their capability for analyzing observation data is insufficient, so they cannot identify cluster fog along the line of a highway river-crossing bridge area or predict the cluster-fog trend.
Therefore, how to identify cluster fog along the line in a highway river-crossing bridge area and improve cluster-fog prediction accuracy is a technical problem that urgently needs to be solved.
Disclosure of Invention
The invention provides an ESN-based method for identifying and early-warning cluster fog on a highway river-crossing grand bridge, which solves the problem of identifying cluster fog along the line in a highway river-crossing bridge area and improving cluster-fog prediction accuracy.
The technical task of the invention is achieved in the following way. The ESN-based method for identifying and early-warning cluster fog on a highway river-crossing grand bridge specifically comprises the following steps:
deploying equipment: cluster-fog sensing equipment and edge-computing equipment are installed on the highway river-crossing grand bridge; the cluster-fog sensing equipment senses and transmits visibility and image data at each point of the bridge, and the edge-computing equipment analyzes and processes multi-source heterogeneous meteorological data and predicts the probability of cluster fog occurring;
judging cluster fog: the video images collected by several adjacent cluster-fog sensing devices are used for discrimination; moving targets are removed with image-recognition techniques and the background picture is kept; the features of each pixel in the image are represented by Gaussian models, and whether fog has formed is judged by analyzing the change in contrast and blur between the current image and the previous frame;
sensing cluster fog: one weather-sensing master station and several weather-sensing substations are deployed in a distributed structure, communicating over a wireless ad-hoc Zigbee network, so that the fog situation along the highway river-crossing grand-bridge area is effectively sensed;
predicting cluster fog: from the real-time meteorological data sensed along the bridge, a cluster-fog prediction probability value is obtained with an echo state network training method.
Preferably, the cluster-fog sensing equipment comprises a radar-video integrated unit, a Zigbee (TI-CC2530) transceiver module, a weather-sensing master station, and several weather-sensing substations. The master station consists of master-station weather sensors: a visibility meter and sensors for air temperature, humidity, wind speed, wind direction, road-surface temperature, and precipitation. Each substation consists of substation weather sensors: a backscatter visibility meter and a road-surface temperature sensor;
the edge-computing equipment comprises a fog-end roadside terminal (RSU communication unit) and an edge-computing terminal; the edge-computing terminal collects and analyzes the meteorological data of the master station and the substations along the highway river-crossing grand bridge.
Preferably, data transmission between the edge-computing terminal and the cluster-fog sensing equipment proceeds as follows:
the edge-computing terminal sends a start-transmission weather command to the weather sensors over RS485, and the sensors return the collected weather information as weather messages; each message exchanged between the edge-computing terminal and a weather sensor is 296 bytes and contains the station number, observation time, longitude and latitude, altitude, air temperature, relative humidity, precipitation, wind speed, wind direction, road temperature, visibility, road-surface state, and atmospheric pressure;
the edge-computing terminal sends a start-transmission command to the radar-video integrated unit over TCP/IP, and the unit returns its radar-video data as a visual information stream; the radar-video data comprise the visual picture, vehicle type, vehicle count, traffic volume, and lane occupancy;
the edge-computing terminal sends control information to the Zigbee transceiver module over UART; on receiving it, the module starts the Zigbee network and checks whether a Zigbee network set up by other transceiver modules exists nearby:
if so, it joins the network, and returns to the edge-computing terminal the networking information of the module and the data collected by the other nodes;
after receiving the feedback from the weather sensors, the radar-video integrated unit, and the TI-CC2530, the edge-computing terminal reports all state information of the node to the cloud platform through the fog-end roadside terminal.
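As an illustration of the fixed-length message exchange described above, the following Python sketch packs and unpacks a 296-byte weather message. The field list follows the text, but the byte sizes, field order, and padding are assumptions for illustration, since the patent does not specify the binary layout.

```python
import struct

# Hypothetical layout for the 296-byte weather message: 16-byte station id,
# 14-byte timestamp, 11 double-precision numeric fields, 1-byte road-surface
# state, and padding so the total is exactly 296 bytes.
WEATHER_FMT = "<16s14s11dB177x"
assert struct.calcsize(WEATHER_FMT) == 296

FLOAT_FIELDS = ("longitude", "latitude", "altitude", "air_temp", "rel_humidity",
                "precipitation", "wind_speed", "wind_dir", "road_temp",
                "visibility", "pressure")

def pack_message(station: str, obs_time: str, values: dict, road_state: int) -> bytes:
    """Pack one weather report into a fixed 296-byte message."""
    floats = [float(values[k]) for k in FLOAT_FIELDS]
    return struct.pack(WEATHER_FMT, station.encode(), obs_time.encode(),
                       *floats, road_state)

def unpack_message(buf: bytes):
    """Inverse of pack_message: recover station, time, numeric fields, road state."""
    fields = struct.unpack(WEATHER_FMT, buf)
    station = fields[0].rstrip(b"\0").decode()
    obs_time = fields[1].rstrip(b"\0").decode()
    values = dict(zip(FLOAT_FIELDS, fields[2:13]))
    return station, obs_time, values, fields[13]
```

A fixed-size record like this is convenient over RS485 because the receiver can frame messages purely by byte count.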
Preferably, the edge-computing terminal and the fog-end roadside terminal communicate over TCP/IP; an IoT data SIM card is installed in the roadside terminal, through which the edge-computing terminal accesses the Internet over 4G and uploads the collected information to the cloud platform;
the edge-computing terminal is powered by 220 V AC; the fog-end roadside terminal and the radar-video integrated unit are powered by a 24 V DC circuit, the weather sensors by 12 V DC, and the Zigbee transceiver module by 5 V DC; the antenna of the Zigbee transceiver module is connected to a power-amplifier circuit and needs no separate supply.
Preferably, the weather-sensing master station schedules the data resources of all sensing substations on the highway river-crossing grand bridge and controls substation data transmission over the Zigbee network; combining the weather, radar, and visual information reported by each substation with the ESN algorithm, the master station judges whether cluster fog is present and then outputs early-warning information. The specific steps are as follows:
after the weather-sensing master station is powered on, the edge-computing terminal (MEC) is initialized;
after each weather sensor is powered on, the master-station data are acquired, including visibility, air temperature and humidity, wind speed and direction, precipitation, road-surface temperature, and road-surface state; at the same time it is checked whether the weather-sensing substations have acquired data:
if a substation has not acquired data, the ESN algorithm on the edge-computing terminal judges only whether cluster fog is present at the master station; the substation data comprise visibility and road-surface temperature;
if a substation has acquired data, the data are transmitted over the Zigbee network and verified:
if verification succeeds, the ESN algorithm on the edge-computing terminal judges whether cluster fog is present at the master station and at each substation:
if no cluster fog is found, the master-station and substation data are acquired again;
if cluster fog is present, an early warning is output and the next working cycle begins.
Preferably, the weather-sensing substation drives its weather sensors to acquire data normally, establishes a Zigbee network node, receives commands from the master station, and transmits the requested data to the master station as messages over the Zigbee network. The specific steps are as follows:
after the substation is powered on, the Arm-platform control program starts, Zigbee child-node communication is established, data from the visibility sensor and the road-surface temperature sensor are acquired, and the data are preprocessed;
it is then confirmed whether the master station has issued a command to acquire the substation's data:
if a receive command from the master station is obtained, the substation transmits its data to the master station over the Zigbee network;
if no receive command is obtained, the substation stores the data locally;
after the substation data have been sent, completion of the transmission is confirmed:
if the transmission is complete, the next working cycle begins;
if not, the data are retransmitted; whether or not the retransmission succeeds, the next working cycle begins after it completes.
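The substation duty cycle described above (acquire, transmit on command or store locally, retransmit once, then continue regardless) can be sketched as follows; all the callables are hypothetical stand-ins for the real sensor and Zigbee operations.

```python
def substation_cycle(acquire, got_command, transmit, store_locally):
    """One duty cycle of the weather-sensing substation logic.

    acquire()       -> sensor readings (visibility, road-surface temperature)
    got_command()   -> True if the master station requested the data
    transmit(data)  -> True on successful Zigbee transmission
    store_locally(data) -> persist data when no command arrived
    """
    data = acquire()
    if not got_command():
        store_locally(data)          # no master-station command: keep data locally
        return "stored"
    if transmit(data):
        return "sent"                # transmission confirmed complete
    transmit(data)                   # one retransmission; continue either way
    return "retried"
```

The key design point mirrored here is that a failed retransmission does not block the loop: the substation always proceeds to the next cycle so fresh data keep flowing.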
Preferably, cluster fog is judged as follows:
image data preprocessing: the video images captured by the camera of the radar-video integrated unit are examined and checked, duplicate information is deleted, image errors caused by camera-spot problems are eliminated, and the image storage format is unified to jpg;
time-series imaging: frames are extracted from the selected video at a 5 s step; every five frames form one time-series image group, each stored in its own folder, the folder being named with the stake (chainage) number plus the time;
determining the background image: the measure of the different brightness levels between the brightest white and the darkest black of the light and dark regions of an image is its contrast, i.e., the gray-level contrast of the image. Specifically: the image is converted to grayscale, the differences between adjacent pixels are computed and summed to obtain the image contrast; images in which fog formed at different times of day are learned with a machine-learning classification algorithm to determine the fog-judgment threshold for each time of day, and this threshold is compared with the contrast of the current image to judge whether fog is present;
judging whether fog is present: the image blur is obtained as the inverse of the image sharpness; the image gray values are convolved with a Laplacian mask and the standard deviation of the response is computed. When the standard deviation is below a threshold, the moment is considered foggy; when it is above the threshold, fog-free. The threshold is obtained by training, in a machine-learning manner, on the blur values of many foggy and fog-free pictures, and it differs at different times of day;
judging whether the fog is cluster fog: as soon as the camera of one radar-video unit judges fog, cluster-fog judgment starts immediately, jointly considering the cameras of the five adjacent radar-video units ahead and behind:
when the cameras of more than four adjacent radar-video units judge fog, a fog early warning is started;
when the cameras of more than two adjacent radar-video units detect a fog-free condition, the fog is judged to be cluster fog; its flow direction is analyzed and predicted, and a cluster-fog early warning is started. The flow direction and speed of the cluster fog are predicted from the order in which the cameras detect fog: since the cameras are arranged in a fixed direction at fixed spacing, if one camera detects fog and the cameras to its south then detect fog in sequence, the cluster fog is judged to flow from north to south, and its speed is computed from the spacing between the cameras.
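A minimal numpy sketch of the two measures used above: the gray-level contrast as the sum of adjacent-pixel differences, and the blur judged from the standard deviation of the Laplacian-mask response. The threshold is assumed to be supplied externally (the patent learns it per time of day from labeled pictures).

```python
import numpy as np

def contrast(gray):
    """Gray-level contrast: sum of absolute differences between adjacent pixels."""
    g = gray.astype(float)
    return np.abs(np.diff(g, axis=0)).sum() + np.abs(np.diff(g, axis=1)).sum()

def laplacian_std(gray):
    """Standard deviation of the response to the 4-neighbour Laplacian mask."""
    g = gray.astype(float)
    # valid convolution with [[0,1,0],[1,-4,1],[0,1,0]] written as shifted sums
    resp = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
            - 4.0 * g[1:-1, 1:-1])
    return resp.std()

def is_foggy(gray, blur_threshold):
    """Fog flattens the scene: weak edges give a small Laplacian response."""
    return laplacian_std(gray) < blur_threshold
```

A fog-free scene with sharp edges produces a large Laplacian standard deviation, while a fog-whitened, nearly uniform image produces a value close to zero, which is exactly the comparison the threshold test makes.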
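The direction-and-speed estimate from the detection order can be sketched as follows. The assumptions here, not stated numerically in the text, are that camera indices increase from north to south and that adjacent cameras are evenly spaced.

```python
from datetime import datetime, timedelta

def fog_drift(detections, spacing_m):
    """Estimate cluster-fog drift from camera detections.

    detections: list of (camera_index, detection_time); indices are assumed
                to increase from north to south along the bridge.
    spacing_m:  distance in metres between adjacent cameras.
    Returns (direction, speed in m/s).
    """
    ordered = sorted(detections, key=lambda d: d[1])   # order by detection time
    (i0, t0), (i1, t1) = ordered[0], ordered[-1]
    distance = abs(i1 - i0) * spacing_m
    elapsed = (t1 - t0).total_seconds()
    direction = "north-to-south" if i1 > i0 else "south-to-north"
    return direction, distance / elapsed
```

This mirrors the text's logic: fog appearing at successively more southerly cameras implies north-to-south flow, and the spacing divided by the detection-time gap gives the drift speed.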
Preferably, the background image is determined as follows:
the first frame f_1 of the time-series images is taken as the initial background, and K Gaussian models are used to represent the features of each pixel f_1(x, y); the mean μ_j and the standard deviation σ_j of each Gaussian model are determined from the distribution of the image gray-level histogram, and the gray value of each pixel is represented by the superposition of the K Gaussian distribution functions;
starting from the second frame f_i(x, y), i > 1, each pixel is tested for membership in the background, i.e., whether the following holds:

|f_i(x, y) − μ_j| ≤ 2.5 σ_j;

if it holds, the pixel is a background point;
if not, the pixel is a foreground point;
the background-model weights in the image are then updated: all Gaussian models are sorted so that those with large weight and small standard deviation come first, the list is truncated after K models, and the background-point model of the image is updated iteratively by:

w_(k,i) = (1 − α) · w_(k,i−1) + α · M_(k,i);

wherein M_(k,i) = 1 for a background point, M_(k,i) = 0 for a foreground point, and α is the learning rate;
based on the time-series images of the background picture, the background picture is divided into four regions, and time-variation curves of the contrast of the four regions are established;
a fog-monitoring model of the time-series images is established on the basis of the background image, and the real-time images transmitted by the cameras are judged with the fog-monitoring model.
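A simplified single-Gaussian-per-pixel version of the background update above can be sketched in numpy. The per-pixel sorting and truncation of K Gaussians is omitted for brevity, and the mean/variance update rule is a standard background-modelling assumption, not taken verbatim from the patent.

```python
import numpy as np

ALPHA = 0.05        # learning rate α
MATCH_SIGMA = 2.5   # background-membership threshold in standard deviations

def update_background(frame, mu, sigma, w):
    """One update step of a single-Gaussian-per-pixel background model.

    frame, mu, sigma, w: 2-D float arrays of the same shape.
    Returns (foreground_mask, mu, sigma, w) after the update.
    """
    match = np.abs(frame - mu) <= MATCH_SIGMA * sigma   # |f - μ| ≤ 2.5 σ test
    m = match.astype(float)                             # M = 1 background, 0 foreground
    w_new = (1.0 - ALPHA) * w + ALPHA * m               # w = (1-α) w + α M
    # refresh mean and variance only where the pixel matched the background
    mu_new = np.where(match, (1.0 - ALPHA) * mu + ALPHA * frame, mu)
    var_new = np.where(match,
                       (1.0 - ALPHA) * sigma**2 + ALPHA * (frame - mu)**2,
                       sigma**2)
    return ~match, mu_new, np.sqrt(var_new), w_new
```

Pixels that keep matching the model gain weight and track slow illumination changes, while pixels hit by a moving vehicle fail the 2.5σ test and are flagged as foreground, which is how the moving targets are removed before the fog judgment.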
Preferably, the change rule of the time-series images is analyzed on the basis of the background image using four indicators:
first, the contrast X1 of the current image: when cluster fog bursts, the contrast of the pixel values in the background image drops sharply;
second, the blur X2 of the current image: when cluster fog bursts, more than 75% of the area of the whole image is blurred;
third, the contrast X3 of the previous frame in the time series: when cluster fog bursts, the amplitude drop from X3 to X1 exceeds 300% in three of the four regions;
fourth, the blur X4 of the previous frame in the time series: in the blur identification before the cluster fog bursts, less than 25% of the area of the whole image is blurred.
Preferably, the cluster-fog prediction value is obtained with the echo-state-network training method as follows:
network initialization:
the input weights W_in and the reservoir state weights W_res are initialized randomly; the values of W_res are adjusted to select a suitable spectral radius, and W_back is set to 0, ignoring the feedback from the output layer to the reservoir. The ESN training process is embodied mainly in the dynamic adjustment of the output weights W_out: after the reservoir size and the other weights have been randomly initialized, the output weights are trained by linear regression or machine learning.
A network structure with L input nodes, M reservoir neurons, and N output nodes is constructed; the reservoir state update and the network output are expressed as:

u(n) = f(W_res · u(n−1) + W_in · x(n) + W_back · y(n−1)); formula (1)

y(n) = f_out(W_out · u(n)); formula (2)

wherein x(n) ∈ C^(L×1) is the input sequence of the network at time n; u(n) ∈ C^(M×1) is the state of the reservoir neurons at time n; W_res ∈ C^(M×M) is the reservoir connection-weight matrix; W_in ∈ C^(M×L) is the connection-weight matrix from the input nodes to the reservoir neurons; W_out ∈ C^(N×M) is the connection-weight matrix from the reservoir neurons to the output nodes; W_back ∈ C^(M×N) is the feedback connection-weight matrix from the output nodes to the reservoir neurons; y(n) ∈ C^(N×1) is the target output when the network input is x(n); f(·) and f_out(·) are the activation function of the reservoir neurons and the readout function of the output layer, respectively; f(·) is usually a nonlinear function, and f_out(·) reads out the output nodes of the network nonlinearly;
collecting the reservoir states and the target output matrix: at each time n (n = 1, 2, …, L), the input data x(n) are fed into the reservoir of the ESN and the reservoir state is updated by formula (1) to obtain the new state u(n), and the new states are collected. Starting from time t_0, the reservoir states u(n) and the target outputs are collected; when n = L this yields the M × (L − t_0 + 1) reservoir state-update matrix U = [u(t_0), u(t_0 + 1), …, u(L)] and the N × (L − t_0 + 1) target output matrix T = [y_t(t_0), y_t(t_0 + 1), …, y_t(L)], where x(n) denotes the L-dimensional input vector;
training the output weights W_out: the output weights of the network are trained from the training samples (x(n), y_t(n)) input at time n and the reservoir states u(n). The reservoir state u(n) is taken to be linearly related to the network output y(n), i.e., the readout function of the output layer is the identity, f_out(·) = 1, so that the actual network output y(n) approximates the desired output y_t(n) of the data sample:

y_t(n) ≈ y(n) = W_out · u(n); (3)

the desired output weights W_out are those that minimize the mean-square error of the system:

W_out = argmin ||W_out U − T||^2; (4)

solving formulas (3) and (4) by linear regression yields the output weights of the network:

W_out = T · U^†; (5)

where U^† denotes the pseudo-inverse (Moore-Penrose inverse) of the matrix U;
direct pseudo-inverse calculation limits the reservoir size M or the number of training samples; this limitation can be avoided by solving a system of normal equations, with the formula:
W_out U U^T = T U^T; (6)
U and T are the reservoir state update matrix and the target output matrix defined above, and the superscript T denotes the matrix transpose;
solving this equation gives:
W_out = T U^T (U U^T)^(−1); (7)
the superscript −1 at the upper right denotes the matrix inverse;
as can be seen from formula (7), T U^T ∈ C^(N×M) and U U^T ∈ C^(M×M) are independent of the training sample length (L−t_0+1) and can be accumulated incrementally as the training data pass through the reservoir.
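The incremental accumulation noted above can be sketched in a few lines: the products U·U^T and T·U^T of formula (7) are built up one time step at a time, so the full state matrix U never needs to be stored. The NumPy sketch below uses random stand-in reservoir states and targets; the sizes and the small ridge term are illustrative assumptions, not values from the method:

```python
import numpy as np

M, N = 50, 1                      # reservoir size, output size (illustrative)
rng = np.random.default_rng(0)
A = np.zeros((M, M))              # accumulates U U^T
B = np.zeros((N, M))              # accumulates T U^T
for n in range(200):              # stream of (u(n), y_t(n)) pairs
    u = rng.standard_normal(M)    # stand-in reservoir state at time n
    y = rng.standard_normal(N)    # stand-in target output y_t(n)
    A += np.outer(u, u)
    B += np.outer(y, u)
# Formula (7): W_out = T U^T (U U^T)^-1, with a tiny ridge for stability
W_out = B @ np.linalg.inv(A + 1e-8 * np.eye(M))
```

Because A and B have fixed sizes M×M and N×M, the memory footprint stays constant however long the training stream runs.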
The ESN-based highway river-crossing grand bridge group fog recognition and early warning method has the following advantages:
(I) According to the needs of traffic safety control for cluster fog on highway river-crossing bridge groups, the technical advantages of wireless self-organizing Zigbee network communication, multi-point image recognition, echo state networks, and the like are fully integrated, ensuring the stability, reliability, efficiency, and practicality of cluster fog prediction for highway river-crossing bridges;
(II) The fog condition of the highway is identified with image processing technology: judgment is made from the video data collected by several adjacent camera lenses, moving targets are removed, and the background picture is kept; the features of each pixel in the image are represented by a Gaussian model, and the changes in contrast and blur between the current image and the previous image are analyzed to judge whether cluster fog has occurred, which effectively improves the accuracy of cluster fog identification;
(III) The cluster fog sensing equipment is deployed in a distributed structure: a wireless self-organizing Zigbee network is built so that meteorological data with multi-point, multi-source, heterogeneous characteristics are collected, organized, and communicated in a networked manner through the edge computing terminal, and the collected multi-source heterogeneous data are classified, improving the reliability and real-time performance of data transmission. The distributed structure means that, to observe how the fog changes along a highway river-crossing grand bridge, one weather sensing master station and several weather sensing substations are configured, realizing both fog sensing and refined sensing. The weather sensing master station is installed near a power source, requires a 220 V AC supply, and transmits wirelessly, using a SIM card where possible; the weather sensing substations are powered by solar panels and communicate with the master station over the Zigbee network. The master station requires a ground foundation and an erected pole on the bridge, while the substations can be fixed to gantries and lamp posts with hoops. The master station and substations are laid out in a distributed structure and jointly collect weather environment information: the master station uses an integrated weather station to collect all data, while each substation is equipped with a visibility sensor and a remote-sensing road surface state sensor and is deployed along the roadside of the highway. One weather sensing master station is installed every 10 kilometers; its parameters are shown
in the following table:
(The weather sensing master station parameter table is provided as an image in the original publication.)
one weather sensing substation is installed every 500 m; the parameters of the weather sensing substation are shown in the following table:
Serial number | In-situ sensor | Measuring range | Resolution | Accuracy
1 | Backscatter visibility meter | 10–1000 m | 1 m | ±20%
2 | Road surface temperature sensor | −40 to +80 °C | 0.1 °C | ±1 °C
(IV) The classical training method of the echo state network is applied to the short-term prediction of cluster fog on highway grand bridge groups, and the ESN prediction algorithm runs on the edge computing terminal, which improves the accuracy of the prediction information and increases system availability.
Drawings
The invention is further described below with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of the distributed route layout of the cluster fog sensing devices;
FIG. 2 is a block diagram of a cluster fog sensing device and an edge computing device;
FIG. 3 is a flow chart of a method for identifying and early warning the mass fog of a river-crossing grand bridge of a highway based on ESN;
FIG. 4 is a flow chart of a weather sensing master station working process;
FIG. 5 is a flow chart of the working process of the weather sensing substation;
fig. 6 is a schematic diagram of an ESN topology of an echo state network.
Detailed Description
The method for identifying and warning the fog of the highway river-crossing grand bridge group based on the ESN is described in detail below by referring to the attached drawings and specific embodiments of the specification.
Example 1:
the invention discloses an ESN-based highway river-crossing grand bridge group fog recognition and early warning method, which comprises the following steps:
S1, arranging the equipment: cluster fog sensing equipment and edge computing equipment are installed on the highway river-crossing grand bridge; the cluster fog sensing equipment is used for sensing and transmitting the visibility and image data at each point of the highway river-crossing grand bridge, and the edge computing equipment is used for analyzing and processing multi-source heterogeneous meteorological data and predicting the probability of cluster fog;
S2, judging cluster fog: judgment is made from the video images collected by several adjacent cluster fog sensing devices; moving targets are removed with image recognition technology and the background pictures are retained; the features of each pixel in the image are represented by a Gaussian model, and the change values of contrast and blur between the current image and the previous frame are analyzed to judge whether cluster fog has occurred;
S3, sensing cluster fog: a weather sensing master station and a plurality of weather sensing substations are deployed in a distributed structure, as shown in FIG. 1; they communicate through a wireless self-organizing Zigbee network to effectively sense the fog conditions along the highway in the river-crossing grand bridge area;
S4, predicting cluster fog: the predicted probability value of cluster fog is obtained with the echo state network training method from the real-time sensed meteorological data along the highway river-crossing grand bridge.
As shown in FIG. 2, the cluster fog sensing equipment in this embodiment includes a radar-vision all-in-one machine, a Zigbee (TI-CC2530) transceiver module, a weather sensing master station, and a plurality of weather sensing substations; the weather sensing master station consists of master station weather sensors, including a visibility meter, an air temperature sensor, a humidity sensor, a wind speed sensor, a wind direction sensor, a road surface temperature sensor, and a precipitation sensor; the weather sensing substation consists of substation weather sensors, including a backscatter visibility meter and a road surface temperature sensor; the edge computing equipment comprises a fog-end drive test terminal (RSU communication unit) and an edge computing terminal, the latter being used to collect and analyze the meteorological data of the master station and the substations along the highway river-crossing grand bridge.
In this embodiment, the data transmission between the edge computing terminal and the cluster fog sensing equipment in step S1 is specifically as follows:
S101, the edge computing terminal sends a start-transmission weather command to the weather sensors via RS485 communication, and the weather sensors send the various collected weather information to the edge computing terminal in the form of weather messages; the message transmitted between the edge computing terminal and the meteorological sensors is 296 bytes long and contains the station number, observation time, longitude and latitude, altitude, air temperature, relative humidity, precipitation, wind speed, wind direction, road temperature, visibility, road surface state, and atmospheric pressure value;
S102, the edge computing terminal sends a start-data-transmission visual command to the radar-vision all-in-one machine using the TCP/IP transmission protocol, and the radar-vision all-in-one machine sends the acquired radar-vision data to the edge computing terminal as a visual information stream; the radar-vision data in the radar-vision all-in-one machine include visual pictures, vehicle types, vehicle numbers, and lane-occupancy traffic volume information;
S103, the edge computing terminal sends control information to the Zigbee transceiver module via UART communication; after receiving the information, the Zigbee transceiver module starts the Zigbee network and detects whether a Zigbee network deployed by other Zigbee transceiver modules exists nearby:
if so, networking is carried out, and the networking information of the Zigbee transceiver module and the data collected by the other nodes are returned to the edge computing terminal;
S104, after receiving the feedback information of the meteorological sensors, the radar-vision all-in-one unit, and the TI-CC2530, the edge computing terminal reports all state information of the node to the cloud platform through the fog-end drive test terminal.
The edge computing terminal and the fog-end drive test terminal in this embodiment communicate over the TCP/IP protocol; an IoT data SIM card is installed in the fog-end drive test terminal, through which the edge computing terminal accesses the Internet over 4G and uploads the various collected information to the cloud platform; the edge computing terminal is powered by 220 V AC, the fog-end drive test terminal and the radar-vision all-in-one machine are powered by a 24 V DC circuit, the meteorological sensors by 12 V DC, and the Zigbee transceiver module by 5 V DC; the antenna of the Zigbee transceiver module is connected to the power amplifier circuit and does not need a separate supply.
As shown in FIG. 4, the weather sensing master station in this embodiment is used to schedule the data resources of each sensing substation of the highway river-crossing grand bridge and to control substation data transmission through the Zigbee network; combining the ESN algorithm with the weather, radar, and visual information reported by each substation, the weather sensing master station judges whether cluster fog exists and then outputs early warning information; the specific steps are as follows:
(1) after the weather sensing master station is powered on, the edge computing terminal (MEC) is initialized;
(2) after each meteorological sensor is powered on, the weather sensing master station data are acquired, including visibility, air temperature and humidity, wind speed and direction, precipitation, road surface temperature, and road surface state data; at the same time it is judged whether the weather sensing substations have acquired data:
firstly, if the weather sensing substation does not acquire data, an ESN algorithm on the edge computing terminal only judges whether cluster fog exists at the weather sensing main station; the weather sensing substation data comprises visibility and road surface temperature;
secondly, if the weather sensing substation acquires the data, executing the step (3);
(3) and realizing data transmission through a Zigbee network, and carrying out data verification:
if the verification is correct, executing the step (4);
(4) and judging whether cluster mist exists at the positions of the weather sensing main station and the weather sensing substations by an ESN algorithm on the edge computing terminal:
firstly, if the cluster fog is not found, acquiring data of the weather sensing main station and the weather sensing substation again;
and secondly, if the mist exists, outputting an early warning and skipping to the step (1).
As shown in fig. 5, the weather sensing substation in this embodiment is configured to drive a weather sensor to normally work to acquire data, establish a Zigbee network node, receive a command from a weather sensing master station, and transmit data required in the command to the weather sensing master station in the form of a message through a Zigbee network; the method comprises the following specific steps:
(I) after the weather sensing substation is powered on, the ARM platform control program is started, Zigbee child-node communication is established, the data of the visibility sensor and the road surface temperature sensor are acquired, and data preprocessing is performed;
(II) determining whether the weather sensing main station acquires a weather sensing substation data command:
if a receiving command of the weather sensing main station is obtained, the weather sensing substation transmits data to the weather sensing main station through a Zigbee network;
if the receiving command of the weather sensing main station is not acquired, data storage is locally carried out on the weather sensing substation;
and (III) after the data of the weather sensing substation are sent, whether transmission is finished is confirmed:
firstly, if the transmission is finished, the next working cycle is carried out;
if not, then retransmission is carried out; whether the retransmission correctly transmits the data or not, the next working cycle is carried out after the retransmission is finished.
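The send/confirm/retransmit cycle of steps (II)–(III) can be sketched as a small helper. `send` here is a hypothetical callback standing in for one Zigbee transmission attempt, and the single-retry-then-continue policy mirrors the workflow described above:

```python
def transmit_with_retry(send, payload, max_retries=1):
    """Attempt to send payload; on failure retransmit up to max_retries
    times, then move on to the next working cycle regardless of the
    retransmission outcome."""
    if send(payload):
        return True
    for _ in range(max_retries):
        if send(payload):
            return True
    return False  # the next working cycle proceeds either way
```

Returning a status rather than raising keeps the control loop running, matching the behaviour of entering the next working cycle whether or not the retransmission succeeds.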
As shown in FIG. 3, the cluster fog judgment in step S3 in this embodiment is specifically as follows:
S301, image data preprocessing: the video image data captured by the cameras of the radar-vision all-in-one machines are examined and checked, duplicate information is deleted, image errors caused by camera lens spots are eliminated, and the image storage format is unified to jpg;
S302, time-series imaging: pictures are extracted from the selected video at a 5 s step; every five pictures form one time-series image group and are stored in their own folder, named in the format stake number + time;
S303, determining the background image: image contrast, i.e. the gray-level contrast of the image, measures the different brightness levels between the brightest white and the darkest black of the light and dark areas in the image; specifically, the image is converted to grayscale, the pixel differences between adjacent pixels are computed, and their sum gives the image contrast; images in which fog appears at different times of day are learned by a machine-learning classification algorithm to determine the fog judgment threshold at each time, and this threshold is compared with the contrast of the image at the current moment to judge whether fog exists;
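The adjacent-pixel contrast measure of S303 can be sketched as follows. The threshold is assumed to come from the machine-learning step and is passed in as a parameter; the function names and the exact difference norm are illustrative assumptions:

```python
import numpy as np

def gray_contrast(gray):
    """Sum of absolute differences between horizontally and vertically
    adjacent pixels of a grayscale image, as a simple contrast measure."""
    g = gray.astype(np.float64)
    return np.abs(np.diff(g, axis=1)).sum() + np.abs(np.diff(g, axis=0)).sum()

def is_foggy_by_contrast(gray, threshold):
    # Fog flattens the scene, so contrast falls below the learned,
    # time-of-day-dependent threshold.
    return gray_contrast(gray) < threshold
```

A uniform (fog-flattened) frame scores zero contrast, while a textured scene scores high, which is exactly the separation the learned threshold exploits.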
S304, judging whether fog exists: the image sharpness is calculated and the image blur obtained by inverse calculation; a convolution with a Laplacian mask is applied to the image gray values and the standard deviation of the response is computed; when the standard deviation of the image is below the threshold, the moment is considered foggy; when it is above the threshold, the moment is considered fog-free; the threshold is obtained by training on the blur values of various foggy pictures via machine learning, and the thresholds differ at different times of day;
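The Laplacian-based blur check of S304 can be sketched without any imaging library by applying the standard 4-neighbour Laplacian mask directly; the threshold is again assumed to be learned elsewhere:

```python
import numpy as np

def laplacian_std(gray):
    """Standard deviation of the 4-neighbour Laplacian response; low
    values indicate a blurred (possibly foggy) frame."""
    g = gray.astype(np.float64)
    lap = (-4.0 * g
           + np.roll(g, 1, axis=0) + np.roll(g, -1, axis=0)
           + np.roll(g, 1, axis=1) + np.roll(g, -1, axis=1))
    return lap[1:-1, 1:-1].std()  # drop the wrap-around border

def is_foggy_by_blur(gray, threshold):
    return laplacian_std(gray) < threshold
```

Sharp edges produce large Laplacian responses, so the statistic collapses toward zero as fog blurs the frame.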
S305, judging whether the fog is cluster fog: when the camera of one radar-vision all-in-one machine judges that fog exists, cluster fog judgment is started immediately, and the cameras of the five adjacent radar-vision all-in-one machines in front and behind are judged comprehensively:
firstly, when the cameras of more than four adjacent radar-vision all-in-one machines judge that fog exists, the fog early warning is started;
secondly, when the cameras of more than two adjacent radar-vision all-in-one machines detect a fog-free condition, it is judged to be cluster fog, the flow direction of the cluster fog is analyzed and predicted, and the cluster fog early warning is started; meanwhile, the flow direction and speed of the cluster fog are predicted from the order in which the cameras detect the fog: because the cameras are arranged in a specific direction at specific spacings, if one camera detects fog and the cameras south of it then detect fog in sequence, the cluster fog can be judged to flow from north to south, and the flow speed of the cluster fog is calculated from the spacing between the cameras.
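The five-camera vote and the flow-direction estimate can be sketched as below. The category names, the assumption that camera index 0 is the northernmost unit, and the equal spacing are illustrative; only the more-than-four / more-than-two thresholds follow the text:

```python
import numpy as np

def classify_fog(flags):
    """flags: fog/no-fog booleans from the five adjacent cameras.
    'fog' when more than four cameras see fog; 'cluster_fog' when fog
    is seen but more than two cameras report clear air; else no warning."""
    foggy = sum(flags)
    if foggy > 4:
        return "fog"
    if foggy > 0 and (len(flags) - foggy) > 2:
        return "cluster_fog"
    return "no_warning"

def flow_direction_and_speed(first_detect_times, spacing_m):
    """Cameras are assumed equally spaced north-to-south; the order of
    first fog detections gives the drift direction, and distance over
    elapsed time gives the speed in m/s."""
    t = np.asarray(first_detect_times, dtype=float)
    direction = "north_to_south" if t[0] < t[-1] else "south_to_north"
    speed = spacing_m * (len(t) - 1) / abs(t[-1] - t[0])
    return direction, speed
```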
The background image is determined in step S303 of this embodiment as follows:
S30301, the first frame f_1 of the time-series images is taken as the initial background, and K Gaussian models are used to represent the features of each pixel f_1(x, y) of the image; meanwhile, the mean μ_j and standard deviation σ_j of each Gaussian model are determined from the distribution of the image gray-level histogram, and the gray value of each pixel is represented by the superposition of the K Gaussian distribution functions;
S30302, starting from the second frame image f_i(x, y), i > 1, it is estimated whether each pixel belongs to the background, i.e. whether the following inequality holds:
|f_i(x, y) − μ_j| ≤ 2.5σ_j
if yes, the pixel point is a background point;
if not, the pixel point is taken as a foreground point;
S30303, updating the background model weights in the image: all Gaussian models are sorted, with the models of large weight and small standard deviation placed first, and the list is truncated after K; iterating in turn, the model weights of the background points are updated by the formula:
w_{k,i} = (1 − α) w_{k,i−1} + α M_{k,i}
wherein M_{k,i} = 1 for background points, M_{k,i} = 0 for foreground points, and α is the learning rate;
S30304, based on the time-series images of the background image, the background picture is divided into four regions, and time-variation curves of the contrast of the four regions are established;
S30305, a fog monitoring model of the time-series images is established based on the background image, and the real-time images transmitted by the cameras are judged with the fog monitoring model.
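Steps S30301–S30303 amount to a per-pixel Gaussian mixture background model. A minimal single-pixel sketch in the Stauffer–Grimson style is shown below; the learning rate and the exact mean/variance update rule are illustrative assumptions beyond what the text specifies:

```python
import numpy as np

def update_gaussian_background(pixel, mu, sigma, w, alpha=0.05):
    """One update step for the K Gaussians of a single pixel.
    A pixel matches model j (background) when |pixel - mu_j| <= 2.5 sigma_j;
    matched models gain weight (M_{k,i} = 1), unmatched models decay."""
    match = np.abs(pixel - mu) <= 2.5 * sigma        # M_{k,i} per model
    w = (1.0 - alpha) * w + alpha * match            # weight update formula
    rho = alpha * match
    mu = (1.0 - rho) * mu + rho * pixel              # pull matched means in
    sigma = np.sqrt((1.0 - rho) * sigma**2 + rho * (pixel - mu) ** 2)
    order = np.argsort(-w / (sigma + 1e-9))          # big weight, small sigma first
    return mu[order], sigma[order], w[order] / w.sum(), bool(match.any())
```

The sort-and-truncate ordering mirrors S30303: models with large weight and small standard deviation are kept at the front as the most credible background components.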
In S30305 of this embodiment, the analysis of the change rules of the time-series images based on the background image is specifically as follows:
firstly, the contrast X1 of the current image: when cluster fog bursts, the contrast of the pixel values relative to the background image drops sharply;
secondly, the blur X2 of the current image: this identifies the current degree of blur; when cluster fog bursts, more than 75% of the whole image is a blurred area;
thirdly, the contrast X3 of the previous frame in the time series: when cluster fog bursts, X1 drops by more than 300% relative to X3 in three of the four regions;
and fourthly, the blur X4 of the previous frame in the time series: in blur identification, before the cluster fog bursts, less than 25% of the whole image is a blurred area.
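The four indicators can be combined into a single rule-based check. The thresholds below (0.75, 0.25, a four-fold contrast collapse standing in for the stated "more than 300%" drop, and three of four regions) are a direct but illustrative reading of the rules above:

```python
def fog_burst(x1, x2, x3, x4, region_drops):
    """x1/x3: current/previous contrast; x2/x4: current/previous blurred
    fraction of the frame; region_drops: booleans flagging a sharp
    contrast drop in each of the four regions."""
    contrast_collapse = x1 < x3 / 4.0      # "drops by more than 300%"
    blur_now = x2 > 0.75                   # >75% of the frame blurred now
    blur_before = x4 < 0.25                # <25% blurred in the previous frame
    regions_hit = sum(region_drops) >= 3   # three of the four regions
    return contrast_collapse and blur_now and blur_before and regions_hit
```

Requiring all four conditions at once makes the detector conservative: a single blurred frame or a local contrast dip does not by itself trigger a cluster fog alarm.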
In this embodiment, obtaining the cluster fog prediction probability value with the echo state network training method in step S4 is specifically as follows:
S401, randomly initialize the input weights W_in and the reservoir state weights W_res, adjust the W_res values to select an appropriate spectral radius, and set W_back to 0, ignoring the feedback from the output layer to the reservoir.
S402, a network structure with L input nodes, M reservoir neurons, and N output nodes is constructed; the reservoir state update and network output expressions are:
u(n) = f(W_res u(n−1) + W_in x(n) + W_back y(n−1)); formula (1)
y(n) = f_out(W_out u(n)); formula (2)
wherein x(n) ∈ C^(L×1) represents the input sequence of the network at time n; u(n) ∈ C^(M×1) represents the state of the reservoir neurons at time n; W_res ∈ C^(M×M) represents the internal connection weight matrix of the reservoir; W_in ∈ C^(M×L) represents the connection weight matrix from the network input nodes to the reservoir neuron nodes; W_out ∈ C^(N×M) represents the connection weight matrix from the reservoir neuron nodes to the output nodes; W_back ∈ C^(M×N) represents the feedback connection weight matrix from the output layer nodes to the reservoir neuron nodes; y(n) ∈ C^(N×1) represents the target output when the network input is x(n); f(·) and f_out(·) represent the activation function of the reservoir neurons and the readout function of the output layer, respectively; f(·) is typically a nonlinear function, and f_out(·) performs a nonlinear readout at the output nodes of the network;
S403, collecting the reservoir states and the target output matrix: at each time n (n = 1, 2, …, L), the input data x(n) are fed into the reservoir of the ESN, and the reservoir state is updated according to formula (1) to obtain a new state vector u(n); the new states are collected; starting from time t_0, the reservoir state matrix and the target output matrix are collected, so that when n = L one obtains the reservoir state matrix U = [u(t_0), u(t_0+1), …, u(L)] of dimension M×(L−t_0+1) and the target output matrix T = [y_t(t_0), y_t(t_0+1), …, y_t(L)] of dimension N×(L−t_0+1); x(n) denotes an L-dimensional input vector, U denotes the M×(L−t_0+1)-dimensional reservoir state update matrix, and T denotes the N×(L−t_0+1)-dimensional target output matrix;
S404, training the output weights W_out: the output weights W_out of the network are trained from the training samples (x(n), y_t(n)) input at time n and the reservoir states u(n); the reservoir state u(n) is set to be in a linear relation with the network output y(n), i.e. the readout function of the output layer is f_out(·) = 1, so that the actual output y(n) of the network approximates the desired output y_t(n) of the data samples, namely:
y_t(n) ≈ y(n) = W_out u(n); (3)
the desired output weights W_out of the network are those that minimize the mean square error of the system, namely:
W_out = argmin ||W_out U − T||²; (4)
the output weights W_out of the network can be obtained by solving formula (3) and formula (4) with a linear regression method:
W_out = T U^†; (5)
wherein U^† denotes the pseudo-inverse operation on the matrix U;
S405, direct pseudo-inverse calculation has high numerical stability, but a reservoir matrix U of excessively high dimension requires large storage overhead, which limits the reservoir size M or the number of training samples; this limitation can be avoided by solving a system of normal equations, with the formula:
W_out U U^T = T U^T; (6)
U and T are the reservoir state update matrix and the target output matrix defined above, and the superscript T denotes the matrix transpose;
solving this equation gives:
W_out = T U^T (U U^T)^(−1); (7)
the superscript −1 at the upper right denotes the matrix inverse;
as can be seen from formula (7), T U^T ∈ C^(N×M) and U U^T ∈ C^(M×M) are independent of the training sample length (L−t_0+1) and can be computed incrementally as the training data pass through the reservoir; therefore the space and time complexity of solving formulas (3)–(7) is independent of (L−t_0+1). Compared with direct pseudo-inverse calculation this approach has lower numerical stability, but using T U^T (U U^T)^(−1) in place of the pseudo-inverse U^† achieves lower computational complexity.
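Steps S401–S405 can be sketched end to end with NumPy. The stand-in inputs and targets, the reservoir size, the spectral radius of 0.9, the washout length, and the small ridge term are all illustrative assumptions; only the state update (1), the linear readout (2), and the normal-equation solution (7) come from the text:

```python
import numpy as np

rng = np.random.default_rng(42)
L_in, M, N = 3, 60, 1                      # input, reservoir, output sizes
W_in = rng.uniform(-0.5, 0.5, (M, L_in))
W_res = rng.uniform(-0.5, 0.5, (M, M))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))   # scale spectral radius to 0.9
# W_back = 0: output feedback ignored, as in step S401

T_len, t0 = 300, 50                        # t0 is a washout discarding the transient
X = rng.standard_normal((T_len, L_in))     # stand-in meteorological inputs
Y = X.sum(axis=1, keepdims=True)           # stand-in prediction target
u = np.zeros(M)
states, targets = [], []
for n in range(T_len):
    u = np.tanh(W_res @ u + W_in @ X[n])   # formula (1) with f = tanh
    if n >= t0:
        states.append(u.copy())
        targets.append(Y[n])
U = np.array(states).T                     # M x (T_len - t0) state matrix
Tm = np.array(targets).T                   # N x (T_len - t0) target matrix
# Formula (7): W_out = T U^T (U U^T)^-1, with a tiny ridge for stability
W_out = Tm @ U.T @ np.linalg.inv(U @ U.T + 1e-6 * np.eye(M))
pred = W_out @ U                           # formula (2) with identity readout
```

In the method itself, X would carry the sensed visibility and weather series and the readout would yield the cluster fog probability; the random data here merely exercise the training pipeline.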
Taking the meteorological data of the master station and substations of a highway river-crossing bridge collected on site as an example, a correlation analysis is performed; the weather sensing master station data are shown in Table 1, and the weather sensing substation data are shown in Table 2.
TABLE 1 weather sensing Master station data
(Table 1 is provided as an image in the original publication.)
TABLE 2 weather sensing substation data
(Table 2 is provided as an image in the original publication.)
Comparing the data in Tables 1 and 2, it can be seen that the meteorological data of the master station and substation 1 differ at a 500 m separation. For example, the road surface temperature at the master station is about 14 °C while that at substation 1 is about 13 °C; the wind speed at the master station is nonzero, while at substation 1 it is essentially 0.
Using the ESN algorithm and the meteorological data collected by the system, the predicted probability of cluster fog occurrence is compared with whether cluster fog actually occurred under the real weather conditions, as shown in Table 3 below:
TABLE 3 comparison of cloud prediction with real weather conditions
(Table 3 is provided as an image in the original publication.)
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An ESN-based method for identifying and early warning mass fog of a river-crossing grand bridge of a highway is characterized by comprising the following steps:
laying equipment: installing cluster fog sensing equipment and edge computing equipment on the highway river-crossing grand bridge, wherein the cluster fog sensing equipment is used for sensing and transmitting visibility and image data of each point of the highway river-crossing grand bridge; the edge computing equipment is used for analyzing and processing multi-source heterogeneous meteorological data and predicting the probability of the cluster fog;
judging the cluster fog: distinguishing video images collected by a plurality of adjacent group fog sensing devices, removing moving targets by using an image recognition technology, and reserving background pictures; the characteristics of each pixel point in the image are represented by a Gaussian model, and the contrast of the current image and the previous frame of image and the change value of the ambiguity are analyzed to judge whether the fog is generated or not;
sensing the cluster fog: a weather sensing main station and a plurality of weather sensing substations are distributed by applying a distributed structure, and the weather sensing main station and the weather sensing substations are communicated through a wireless self-organizing Zigbee network, so that the mist condition of the lines in the river-crossing grand bridge area of the expressway can be effectively sensed;
predicting the cloud: and obtaining the predicted probability value of the foggy mass by utilizing an echo state network training method according to real-time sensing meteorological data along the highway river-crossing grand bridge.
2. The ESN-based highway river-crossing grand bridge cluster fog identifying and early warning method according to claim 1, wherein the cluster fog sensing equipment comprises a radar-vision all-in-one machine, a Zigbee transceiver module, a weather sensing master station, and a plurality of weather sensing substations; the weather sensing master station consists of master station weather sensors, including a visibility meter, an air temperature sensor, a humidity sensor, a wind speed sensor, a wind direction sensor, a road surface temperature sensor, and a precipitation sensor; the weather sensing substation consists of substation weather sensors, including a backscatter visibility meter and a road surface temperature sensor;
the edge computing equipment comprises a fog end drive test terminal and an edge computing terminal, and the edge computing terminal is used for collecting and analyzing meteorological data of a main station and a plurality of substations along the highway river-crossing grand bridge.
3. The ESN-based mass fog identifying and early warning method for the highway river-crossing grand bridge, according to claim 2, is characterized in that the data transmission between the edge computing terminal and the mass fog sensing device is as follows:
the edge computing terminal sends a weather command for starting transmission to the weather sensor in an RS485 communication mode, and the weather sensor sends various acquired weather information to the edge computing terminal in a weather message mode; the information of the message transmitted between the edge computing terminal and the meteorological sensor is 296 bytes, and the message information comprises a station number, observation time, longitude and latitude, altitude, air temperature, relative humidity, precipitation, wind speed, wind direction, road temperature, visibility, road surface state and atmospheric pressure value;
the method comprises the following steps that an edge computing terminal sends a visual command for starting data transmission of the radar-vision all-in-one machine to the radar-vision all-in-one machine by adopting a TCP/IP transmission protocol, and the radar-vision all-in-one machine sends obtained radar-vision data to the edge computing terminal in a visual information flow mode; the radar vision data in the radar vision all-in-one machine comprises visual pictures, vehicle types, vehicle numbers and traffic information of lane occupancy;
the edge computing terminal adopts a UART communication mode to send control information to the Zigbee transceiver module, and after the Zigbee transceiver module receives the information, the Zigbee network is started to detect whether the Zigbee network laid by other Zigbee transceiver modules exists nearby:
if yes, networking is carried out, and whether networking information of the Zigbee transceiver module exists or not and data information acquired by other nodes are returned to the edge computing terminal;
after receiving the feedback information of the meteorological sensors, the radar-vision all-in-one unit, and the Zigbee transceiver module, the edge computing terminal reports all state information of the node to the cloud platform through the fog-end drive test terminal.
4. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 3, wherein the edge computing terminal and the fog-end roadside terminal communicate over TCP/IP; an IoT data SIM card is installed in the fog-end roadside terminal, through which the edge computing terminal accesses the Internet over 4G and uploads the collected information to the cloud platform via the 4G network;
the edge computing terminal is powered from 220 V AC; the fog-end roadside terminal and the radar-vision all-in-one machine are powered from a 24 V DC circuit; the weather sensor is powered from 12 V DC; the Zigbee transceiver module is powered from 5 V DC, and its antenna is connected to a power amplifier circuit.
5. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 1 or 2, wherein the weather sensing master station schedules the data resources of all sensing substations on the bridge and controls substation data transmission through the Zigbee network; the weather sensing master station uses the ESN algorithm together with the weather, radar and visual information reported by each substation to judge whether cluster fog is present, and then outputs early-warning information; the specific steps are as follows:
after the weather sensing master station is powered on, the edge computing terminal is initialized;
after each weather sensor is powered on, the master station acquires its data, including visibility, atmospheric temperature and humidity, wind speed and direction, precipitation, road-surface temperature and road-surface state, and simultaneously determines whether the weather sensing substation has acquired data:
if the weather sensing substation has not acquired data, the ESN algorithm on the edge computing terminal judges whether cluster fog is present at the master station only; the substation data comprise visibility and road-surface temperature;
if the weather sensing substation has acquired data, the data are transmitted through the Zigbee network and verified:
if the substation data pass verification, the ESN algorithm on the edge computing terminal judges whether cluster fog is present at both the master station and the substations:
if no cluster fog is found, the master station and substation data are acquired again;
if cluster fog is present, an early warning is output and the next working cycle begins.
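The master-station steps above are pure control flow. The following is a minimal sketch of that flow, with a placeholder visibility-threshold stub standing in for the ESN classifier; all function and parameter names are illustrative, since the patent does not specify a software interface.

```python
# Minimal, self-contained sketch of one master-station working cycle.
# fog_probability is a stand-in for the ESN classifier; the real system
# would feed the full weather/radar/vision feature vector into the trained
# network instead of a simple visibility cut-off.
def fog_probability(features):
    """Placeholder classifier: flag fog when visibility drops below 200 m."""
    return 1.0 if features["visibility"] < 200.0 else 0.0

def master_cycle(master_data, substation_data, checksum_ok, threshold=0.5):
    """One cycle: fuse substation data only if it is present and verified."""
    features = dict(master_data)
    if substation_data is not None and checksum_ok:
        # judge fog at both the master-station and substation locations
        fog = max(fog_probability(features),
                  fog_probability({"visibility": substation_data["visibility"]}))
    else:
        # substation absent or failed verification: judge the master site only
        fog = fog_probability(features)
    return "warn" if fog >= threshold else "reacquire"

print(master_cycle({"visibility": 150.0}, None, False))  # prints warn
```

A "reacquire" result corresponds to looping back to the data-acquisition step; "warn" corresponds to outputting the early warning before the next cycle.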
6. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 1 or 2, wherein the weather sensing substation drives its weather sensors to acquire data, establishes Zigbee network nodes, receives commands from the weather sensing master station, and transmits the requested data to the master station in message form through the Zigbee network; the specific steps are as follows:
after the weather sensing substation is powered on, the Arm platform control program is started, Zigbee child-node communication is established, data from the visibility sensor and the road-surface temperature sensor are acquired, and the data are preprocessed;
the substation then checks whether the master station has issued a command to acquire its data:
if a receive command from the master station has been obtained, the substation transmits the data to the master station through the Zigbee network;
if no receive command from the master station has been obtained, the substation stores the data locally;
after the substation data have been sent, completion of the transmission is confirmed:
if the transmission is complete, the next working cycle begins;
if not, the data are retransmitted; whether or not the retransmission delivers the data correctly, the next working cycle begins once it has finished.
7. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 1, wherein the cluster fog judgment is as follows:
image data preprocessing: the video image data shot by the camera of the radar-vision all-in-one machine are examined and checked, duplicate information is deleted, image errors caused by spots on the camera lens are eliminated, and the image storage format is unified as jpg;
time-series imaging: pictures are extracted from the selected video at a 5 s step, every five pictures form one time-series image, and each is stored in its own folder, named in a stake-number-plus-time format;
determining the background image: the measure of the brightness levels between the brightest white and the darkest black of the bright and dark areas in an image is its contrast, i.e. the magnitude of the grey-level contrast of the image; specifically, the image is converted to greyscale, the differences between adjacent pixels are computed and summed to obtain the contrast, images in which fog forms at different periods are learned with a machine-learning classification algorithm to determine the fog-judgment threshold at each moment, and this threshold is compared with the contrast of the image at the current moment to judge whether fog is present;
judging whether fog is present: the image sharpness is computed and inverted to obtain the image blur; a convolution with a Laplacian mask is applied to the image grey values and the standard deviation is computed; when the standard deviation of the image is below the threshold, the moment is judged foggy, and when it is above the threshold, fog-free; the threshold is obtained by training on the blur values of various foggy pictures by machine learning, and differs from moment to moment;
judging whether the fog is cluster fog: as soon as the camera of one radar-vision all-in-one machine judges that fog is present, cluster fog judgment is started and the cameras of the five adjacent machines ahead and behind are judged jointly:
when more than four adjacent radar-vision cameras judge that fog is present, a fog early warning is started;
when more than two adjacent radar-vision cameras detect no fog, the fog is judged to be cluster fog, its flow direction is analysed and predicted, and a cluster fog early warning is started; at the same time the flow direction and speed of the fog bank are predicted from the order in which the cameras detect fog: if one camera detects fog and the cameras to its south then detect fog in sequence, the fog bank is judged to flow from north to south, and its speed is computed from the spacing between the cameras.
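The two image statistics above (contrast as summed adjacent-pixel grey differences, and blur as the standard deviation of a Laplacian-filtered image) can be sketched in a few lines of numpy. The synthetic frames are placeholders, and the learned per-moment thresholds the patent compares against are omitted.

```python
import numpy as np

# Sketch of the two image statistics used in the fog judgment. Contrast: sum
# of absolute grey-level differences between vertically and horizontally
# adjacent pixels. Blur: standard deviation of the 4-neighbour Laplacian
# response (low values mean a blurred, i.e. possibly foggy, frame).
def contrast(gray):
    g = np.asarray(gray, dtype=np.float64)
    return float(np.abs(np.diff(g, axis=0)).sum() + np.abs(np.diff(g, axis=1)).sum())

def laplacian_std(gray):
    g = np.asarray(gray, dtype=np.float64)
    # 4-neighbour Laplacian mask applied to the interior of the image
    lap = (-4.0 * g[1:-1, 1:-1] + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.std())

rng = np.random.default_rng(0)
sharp = rng.integers(0, 256, (64, 64))      # textured, fog-free scene
foggy = np.full((64, 64), 128)              # flat grey frame, as under dense fog
print(contrast(foggy), laplacian_std(foggy))  # prints 0.0 0.0
```

The flat "foggy" frame scores zero on both statistics, while the textured frame scores high on both, which is exactly the separation the per-moment thresholds exploit.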
8. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 7, wherein the background image is determined as follows:
the first frame f_1 of the time-series image is taken as the initial background, and K Gaussian models are used to represent the feature of each image pixel f_1(x, y); the mean μ_j and standard deviation σ_j of each Gaussian model are determined from the distribution of the image grey-level histogram, and the grey value of each pixel is represented by the superposition of K Gaussian distribution functions;
starting from the second frame f_i(x, y), i > 1, each pixel is tested for membership of the background, i.e. whether the following holds:
|f_i(x, y) − μ_j| ≤ 2.5σ_j
if yes, the pixel point is a background point;
if not, the pixel point is a foreground point;
updating the background-model weights in the image: all Gaussian models are sorted with those of large weight and small standard deviation first, truncated after K, and iterated in turn, the model weights of the background points being updated by the formula:
ω_{k,i} = (1 − α)ω_{k,i−1} + α·M_{k,i}
where M_{k,i} = 1 for a background point, M_{k,i} = 0 for a foreground point, and α is the learning rate;
the background picture is divided into four regions on the basis of its time-series image, and time-variation curves of the contrast of the four regions are established;
a fog monitoring model of the time-series image is established on the basis of the background image, and the real-time images transmitted by the camera are judged with the fog monitoring model.
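The Gaussian-mixture background test above can be sketched per pixel: a sample within 2.5 standard deviations of a model mean is background, and the matched model is updated with learning rate α. A single Gaussian per pixel is used here for brevity where the claim uses K; the numbers are illustrative.

```python
# Minimal per-pixel sketch of the Gaussian-mixture background test:
# |f_i(x, y) - mu_j| <= 2.5 * sigma_j classifies the pixel as background,
# and a matched model's mean is nudged toward the new sample with
# learning rate alpha (the claim also updates weights and keeps K models).
def classify_and_update(pixel, mu, sigma, alpha=0.05):
    """Return ('background' | 'foreground', updated mean) for one grey pixel."""
    if abs(pixel - mu) <= 2.5 * sigma:
        mu = (1 - alpha) * mu + alpha * pixel    # update the matched model
        return "background", mu
    return "foreground", mu                      # no match: model unchanged

label, mu = classify_and_update(130.0, mu=128.0, sigma=4.0)
print(label, mu)   # background; the mean is nudged toward the new sample
```

Foreground pixels leave the model untouched, so a passing vehicle does not corrupt the learned background, while slow illumination drift is absorbed through α.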
9. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 8, wherein the change rule of the time-series image is analysed on the basis of the background image as follows:
first, the contrast X1 of the current image: when cluster fog bursts, the contrast of the pixel values relative to the background image drops sharply;
second, the blur X2 of the current image: when cluster fog bursts, more than 75% of the whole image consists of blurred regions;
third, the contrast X3 of the previous frame in the time series: when cluster fog bursts, X3 drops relative to X1 by more than 300% in three of the four regions;
fourth, the blur X4 of the previous frame in the time series: in the blur identification, before the fog bursts less than 25% of the whole image consists of blurred regions.
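The four quantities X1 to X4 can be assembled into a feature vector from the current and previous frames. The blurred-area measure below (fraction of low-variance tiles) is an illustrative stand-in for the patent's learned blur identification, and all thresholds and frame data are synthetic.

```python
import numpy as np

# Sketch of assembling the feature vector [X1, X2, X3, X4]: contrast and
# blurred-area fraction of the current frame, and the same two quantities
# for the previous frame. The tile-variance blur measure is an assumption.
def frame_contrast(gray):
    g = np.asarray(gray, dtype=np.float64)
    return float(np.abs(np.diff(g, axis=0)).sum() + np.abs(np.diff(g, axis=1)).sum())

def blurred_fraction(gray, tile=8, var_thresh=25.0):
    """Fraction of tile x tile blocks whose grey variance is below a threshold."""
    g = np.asarray(gray, dtype=np.float64)
    h, w = g.shape
    blocks = g[:h - h % tile, :w - w % tile].reshape(h // tile, tile, w // tile, tile)
    variances = blocks.transpose(0, 2, 1, 3).reshape(-1, tile * tile).var(axis=1)
    return float((variances < var_thresh).mean())

def feature_vector(current, previous):
    return [frame_contrast(current), blurred_fraction(current),
            frame_contrast(previous), blurred_fraction(previous)]

rng = np.random.default_rng(1)
clear = rng.integers(0, 256, (64, 64))   # previous, fog-free frame
foggy = np.full((64, 64), 128)           # current frame under a fog burst
x1, x2, x3, x4 = feature_vector(foggy, clear)
print(x2 > x4)   # the foggy frame has the larger blurred fraction
```

A vector of this shape is a natural per-timestep input for the ESN described in claim 10.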
10. The ESN-based cluster fog recognition and early warning method for a highway river-crossing grand bridge according to claim 1, wherein the cluster fog prediction probability value is obtained with the training method of the echo state network as follows:
network initialization: randomly initialize the input weights W_in and the reservoir state weights W_res, adjust the values of W_res to select a suitable spectral radius, and set W_back to 0, ignoring the feedback from the output layer to the reservoir; the ESN training process is mainly embodied in the dynamic adjustment of the output weights W_out, i.e. after the reservoir size and the other weights are randomly initialized, the output weights are trained by linear regression or machine learning;
a network structure with L input nodes, M reservoir neurons and N output nodes is constructed; the reservoir state update and the network output are expressed as:
u(n) = f(W_res·u(n−1) + W_in·x(n) + W_back·y(n−1));  formula (1)
y(n) = f_out(W_out·u(n));  formula (2)
where x(n) ∈ C^(L×1) is the input sequence of the network at time n; u(n) ∈ C^(M×1) is the state of the reservoir neurons at time n; W_res ∈ C^(M×M) is the reservoir connection weight matrix; W_in ∈ C^(M×L) is the connection weight matrix from the network input nodes to the reservoir neurons; W_out ∈ C^(N×M) is the connection weight matrix from the reservoir neurons to the output nodes; W_back ∈ C^(M×N) is the feedback connection weight matrix from the output-layer nodes to the reservoir neurons; y(n) ∈ C^(N×1) is the target output when the network input is x(n); f(·) and f_out(·) are, respectively, the activation function of the reservoir neurons and the readout function of the output layer; f(·) is typically a nonlinear function, and f_out(·) performs nonlinear readout at the network output nodes;
collecting the reservoir states and the target output matrix: at time n (n = 1, 2, …, L), the input data x(n) are fed into the reservoir of the ESN and the reservoir state is updated according to formula (1) to obtain a new state u(n), which is collected; collection of the reservoir state matrix and the target output matrix starts at time t_0, and at n = L yields the reservoir state matrix U = [u(t_0), u(t_0+1), …, u(L)] of dimension M×(L−t_0+1) and the target output matrix T = [y_t(t_0), y_t(t_0+1), …, y_t(L)] of dimension N×(L−t_0+1); x(n) is an L-dimensional input vector, U is the M×(L−t_0+1) reservoir state update matrix, and T is the N×(L−t_0+1) target output matrix;
training the output weights W_out: the output weights W_out of the network are trained from the training samples (x(n), y_t(n)) input at time n and the reservoir states u(n); the reservoir state u(n) and the network output y(n) are set in a linear relationship, i.e. the readout function of the output layer f_out(·) = 1, so that the actual output y(n) of the network approximates the desired output y_t(n) of the data samples:
y_t(n) ≈ y(n) = W_out·u(n);  (3)
the desired output weights W_out of the network minimize the mean square error of the system, i.e.:
W_out = argmin‖W_out·U − T‖²;  (4)
solving formula (3) and formula (4) by linear regression gives the output weights W_out of the network:
W_out = T·U†;  (5)
where U† denotes the pseudo-inverse of the matrix U;
to avoid being limited by the reservoir size M or the number of training samples, the system of normal equations is solved instead:
W_out·U·Uᵀ = T·Uᵀ;  (6)
where U and T are the reservoir state update matrix and the target output matrix, and the superscript ᵀ denotes matrix transposition;
solving this equation gives:
W_out = T·Uᵀ·(U·Uᵀ)⁻¹;  (7)
where the superscript −1 denotes the matrix inverse;
as shown in formula (7), T·Uᵀ ∈ C^(N×M) and U·Uᵀ ∈ C^(M×M) are independent of the training-sample length (L−t_0+1) and can be accumulated incrementally as the training data pass through the reservoir.
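Equations (1), (2) and (7) together amount to the following minimal echo-state-network sketch in numpy, with W_back = 0 as in the initialization step. The sizes, spectral radius and the sine-wave one-step-prediction demo task are illustrative (the patent's inputs would be the fog features), and a small ridge term is added to stabilise the matrix inverse in (7).

```python
import numpy as np

# Minimal ESN following equations (1), (2) and (7): reservoir update
# u(n) = tanh(W_res u(n-1) + W_in x(n)) with W_back = 0, linear readout,
# and output weights fitted by W_out = T U^T (U U^T)^{-1}.
rng = np.random.default_rng(42)
L_in, M = 1, 50                                 # input and reservoir sizes
W_in = rng.uniform(-0.5, 0.5, (M, L_in))
W_res = rng.uniform(-0.5, 0.5, (M, M))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))   # set spectral radius to 0.9

x = np.sin(np.arange(400) * 0.1)                # demo input sequence
u = np.zeros(M)
states = []
for n in range(len(x) - 1):
    u = np.tanh(W_res @ u + W_in @ x[[n]])      # equation (1), W_back = 0
    states.append(u.copy())

t0 = 50                                         # washout before collecting states
U = np.array(states[t0:]).T                     # M x (L - t0 + 1) state matrix
T = x[t0 + 1:][None, :]                         # targets: one-step-ahead values
# equation (7), with a tiny ridge term for numerical stability
W_out = T @ U.T @ np.linalg.inv(U @ U.T + 1e-8 * np.eye(M))

pred = (W_out @ U).ravel()
print(float(np.abs(pred - T.ravel()).mean()))   # mean one-step error on the training data
```

In the patent's setting the output would be passed through a squashing readout to give the cluster fog probability; here the raw linear readout suffices to show the training mechanics.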
CN202210438430.6A 2022-04-25 2022-04-25 ESN-based highway river-crossing grand bridge group fog recognition and early warning method Pending CN114782903A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210438430.6A CN114782903A (en) 2022-04-25 2022-04-25 ESN-based highway river-crossing grand bridge group fog recognition and early warning method


Publications (1)

Publication Number Publication Date
CN114782903A true CN114782903A (en) 2022-07-22

Family

ID=82432171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210438430.6A Pending CN114782903A (en) 2022-04-25 2022-04-25 ESN-based highway river-crossing grand bridge group fog recognition and early warning method

Country Status (1)

Country Link
CN (1) CN114782903A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116153095A (en) * 2023-04-20 2023-05-23 江西师范大学 Expressway mass fog early warning method based on edge calculation


Similar Documents

Publication Publication Date Title
CN111694010B (en) Roadside vehicle identification method based on fusion of vision and laser radar
CN109377726B (en) Expressway agglomerate fog accurate warning and inducing system and method based on Internet of vehicles
CN111458721B (en) Exposed garbage identification and positioning method, device and system
CN112422783A (en) Unmanned aerial vehicle intelligent patrol system based on parking apron cluster
CN110009037B (en) Short-term engineering wind speed prediction method and system based on physical information coupling
CN111339826B (en) Landslide unmanned aerial vehicle linear sensor network frame detecting system
CN108508372B (en) A kind of calculating of unmanned electricity and method for early warning and system based on environmental visual fusion
CN110910440B (en) Power transmission line length determination method and system based on power image data
CN110837800A (en) Port severe weather-oriented target detection and identification method
CN107316457B (en) Method for judging whether road traffic condition accords with automatic driving of automobile
CN116448773B (en) Pavement disease detection method and system with image-vibration characteristics fused
CN108572648A (en) A kind of automatic driving vehicle power supply multi-source fusion prediction technique and system
CN112437501A (en) Multi-sensor beyond-the-horizon ad hoc network method based on traffic semantics and game theory
CN115515077B (en) UAV-based WSN data acquisition track dynamic generation method and system
CN113191030A (en) Automatic driving test scene construction method and device
CN114764973A (en) Method, device and equipment for monitoring abnormal area of road surface and storage medium
CN114782903A (en) ESN-based highway river-crossing grand bridge group fog recognition and early warning method
WO2022107619A1 (en) Data analysis device and method, and program
WO2022107620A1 (en) Data analysis device and method, and program
CN113111876A (en) Method and system for obtaining evidence of traffic violation
CN117387647A (en) Road planning method integrating vehicle-mounted sensor data and road sensor data
CN115440034B (en) Vehicle-road cooperation realization method and realization system based on camera
CN117191419A (en) Automatic driving test platform based on virtual-real integration technology
CN115984768A (en) Multi-target pedestrian real-time detection positioning method based on fixed monocular camera
CN113705442A (en) Outdoor large-board advertising picture monitoring and identifying system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination