CN117290800A - Timing sequence anomaly detection method and system based on hypergraph attention network - Google Patents

Timing sequence anomaly detection method and system based on hypergraph attention network

Info

Publication number
CN117290800A
CN117290800A (application CN202311580493.6A; granted publication CN117290800B)
Authority
CN
China
Prior art keywords
hypergraph
attention
attention network
vertex
network model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311580493.6A
Other languages
Chinese (zh)
Other versions
CN117290800B (en)
Inventor
谢昕
郑文彬
熊申平
郑晗欣
杨志坚
郑星鹏
刘昭阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
East China Jiaotong University
Original Assignee
East China Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by East China Jiaotong University filed Critical East China Jiaotong University
Priority to CN202311580493.6A priority Critical patent/CN117290800B/en
Publication of CN117290800A publication Critical patent/CN117290800A/en
Application granted granted Critical
Publication of CN117290800B publication Critical patent/CN117290800B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/243Classification techniques relating to the number of classes
    • G06F18/2433Single-class perspective, e.g. one-against-all classification; Novelty detection; Outlier detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/042Knowledge-based neural networks; Logical representations of neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • G06N3/0455Auto-encoder networks; Encoder-decoder networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/047Probabilistic or stochastic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2123/00Data types
    • G06F2123/02Data types in the time domain, e.g. time-series data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a time-series anomaly detection method and system based on a hypergraph attention network. The method comprises the following steps: performing data preprocessing on a collected multivariate time series; using a parallel hypergraph attention network model to fuse features over the spatial and temporal dimensions of the preprocessed multivariate time series; passing the feature-fusion data through a gated recurrent unit layer to obtain its sequential pattern, and adopting joint training of a prediction-based model and a reconstruction-based model; computing an anomaly score from the predicted value and the reconstruction probability, and judging whether the multivariate time series is anomalous. The hypergraph attention network model optimizes its objective through joint training and integrates the advantages of prediction-based and reconstruction-based models; the prediction-based model adopts a Transformer prediction sub-network, which effectively captures context information and makes the prediction of a single timestamp more accurate.

Description

Timing sequence anomaly detection method and system based on hypergraph attention network
Technical Field
The invention relates to the technical field of data processing, and in particular to a time-series anomaly detection method and system based on a hypergraph attention network.
Background
Time-series anomaly detection is an important problem in data mining and is widely used in industry. Because real-world systems operate continuously, multiple sensors can take continuous measurements, as in industrial equipment monitoring, water quality monitoring, and network traffic monitoring. Detecting faults from large-scale system monitoring data can be reduced to detecting abnormal time points in a time series, which is significant for ensuring system safety and avoiding economic losses. Time-series anomaly detection research can be divided into univariate and multivariate approaches. Univariate methods analyze each time series independently, with no cross-learning within the data set; they can find anomalies in a single indicator, but misjudgments can occur when the anomaly of a single variable is used to decide whether the whole system is operating normally. Multivariate methods treat multiple time series as a unified object, simultaneously consider the correlations between different series, and take high-dimensional data as input, making them better suited to deep learning frameworks; they can judge the operating condition of a system better than univariate methods.
In recent years, more and more research has applied deep learning methods to time-series anomaly detection, and graph neural networks (GNNs) have been used successfully to represent relationships, from social networks to supply-chain information to disease transmission. Graph learning can be summarized in three steps: first, the initial data are converted into a graph according to vertex features, edge features, and structural features; then a unique embedding vector is assigned to each graph; finally, vertex and edge information is aggregated to output a vector representation of the graph. Recently, graph neural networks have been applied to learn dynamic graphs between the variables of a multivariate time series. Although more expressive, the learned graph is still limited to a single point in time, which is insufficient for complex temporal patterns. Therefore, how to use a multivariate series to capture multi-way correlations for time-series anomaly detection is important for judging the operating state of a system.
For example, patent application No. CN202210905232.6 discloses a multivariate time-series anomaly detection method for wind turbine data based on a fusion framework, comprising the following steps: 1) preprocessing the wind-power SCADA multivariate time-series data using variational modal decomposition; 2) constructing a spatial-feature reconstruction module that takes the multidimensional time-series data as input and extracts the spatial correlation features among the variables; 3) constructing a time-series prediction module to extract the global temporal dependencies; 4) adopting joint training to optimize the two sub-networks in the fusion framework simultaneously, minimizing the reconstruction error and the prediction error; 5) after the anomaly detection model is trained, feeding the data to be detected into the model in real time for anomaly analysis and online monitoring. This scheme has the following defects: (1) its spatial-feature reconstruction module adopts an encoder-decoder structure in which an autoencoder (AE) compresses the input vector and then decodes it back to the original vector to extract spatial features, so spatial features are considered only in the reconstruction stage and not in the Transformer-based prediction stage, which limits the accuracy of the subsequent anomaly detection; (2) when extracting spatial features, the autoencoder repeatedly builds and extracts three compressed representations, and every compression considers all time points regardless of whether their feature information is important, so it cannot selectively attend to the timestamps of relevant regions to extract key information.
Patent application No. CN202110541569.9 discloses a multivariate time-series anomaly detection method using a hybrid model based on a graph neural network, comprising the following steps: 1) dividing the multivariate time series into first multivariate subsequences with a sliding window and preprocessing them to obtain a first feature matrix and a first adjacency matrix; 2) constructing a graph convolutional neural network prediction model, training and testing it with the first feature matrix and first adjacency matrix as inputs, and obtaining the true feature value and the predicted value of the next timestamp of the input series; 3) comparing the true and predicted values at the same timestamp to obtain an anomaly score, and judging anomalous timestamps by the magnitude of that score; 4) dividing the multivariate time series into second multivariate subsequences with a fixed window and preprocessing them to obtain a second adjacency matrix; 5) constructing a hybrid reconstruction model of a convolutional neural network and an attention-based long short-term memory network, and training and testing it with the second adjacency matrix as input to generate a reconstructed adjacency matrix; 6) comparing the input second adjacency matrix with the reconstructed adjacency matrix to obtain a reconstruction error matrix, and judging anomalous time series from the size of its elements and the number of elements exceeding a threshold; 7) combining the judged anomalous timestamps and anomalous time series to obtain the coordinates of the anomalous points. This scheme has the following defects: (1) its prediction model does not consider the temporal features in the multivariate time-series data, and its hybrid reconstruction model uses a long short-term memory network as the extractor, which is limited when extracting global features; (2) the prediction model and the reconstruction model cannot be trained jointly, so the connection between the two parts is broken, which easily harms the overall effect of the model.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a time-series anomaly detection method and system based on a hypergraph attention network, which solve the problems described in the background art.
To achieve the above purpose, the present invention provides the following technical solution: a time-series anomaly detection method based on a hypergraph attention network, comprising the following steps:
step S1: preprocessing data of the collected multivariable time sequence;
step S2: adopting a parallel hypergraph attention network model to perform feature fusion on the space dimension and the time dimension of the preprocessed multivariable time sequence to obtain feature fusion data;
The parallel hypergraph attention network model has two branches: a space-oriented hypergraph attention network model and a time-oriented hypergraph attention network model. Both consist of an attention vertex aggregation module and an attention hyperedge aggregation module, where the attention vertex aggregation module aggregates the information of connected vertices onto the hyperedges and the attention hyperedge aggregation module aggregates hyperedge information; the outputs of the space-oriented and time-oriented hypergraph attention network models are fused;
step S3: passing the feature-fusion data through a gated recurrent unit layer to obtain the sequential pattern in the feature-fusion data;
step S4: jointly training a prediction-based model and a reconstruction-based model on the feature-fusion data after the sequential pattern has been acquired, where the prediction-based model predicts the timestamps of the data to obtain the predicted value of the next timestamp, and the reconstruction-based model reconstructs the data distribution in the data to obtain the reconstruction probability;
step S5: computing an anomaly score from the predicted value and the reconstruction probability, setting a threshold, comparing the anomaly score with the threshold, and judging whether the multivariate time series is anomalous, thereby completing the anomaly detection task.
Further, in step S2, the space-oriented hypergraph attention network model regards the multivariate time series as a complete hypergraph in which each vertex represents one feature; for each vertex, its s nearest neighbours are selected to construct a hyperedge, producing a hyperedge with s+1 vertices, and the relationships between vertices are captured by the space-oriented hypergraph attention network model. Each vertex v_i is represented by a sequence vector x_i = (x_{i,1}, x_{i,2}, …, x_{i,n}), where x_{i,t} denotes the t-th value of the sequence vector, n is the number of timestamps, and k is the total number of features. The time-oriented hypergraph attention network model captures the temporal dependence in the multivariate time series by regarding all timestamps within a sliding window as a complete hypergraph, where vertex v_t represents the feature vector at timestamp t; each hyperedge in the time-oriented model contains the u timestamps closest to the current sliding window. The outputs of the space-oriented and time-oriented hypergraph attention network models are fused to obtain feature information from the two sources.
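The s-nearest-neighbour hyperedge construction described above can be sketched as follows. The Euclidean distance metric is an assumption for illustration, since the text does not fix how "nearest" is measured.

```python
import numpy as np

def build_knn_hyperedges(X, s=2):
    """Build one hyperedge per vertex from its s nearest neighbours.

    X: (k, n) array -- one length-n sequence vector per feature/vertex.
    Returns an incidence matrix H of shape (k, k): H[v, e] = 1 if vertex v
    belongs to the hyperedge built around vertex e (s + 1 vertices each).
    """
    X = np.asarray(X, dtype=float)
    k = X.shape[0]
    H = np.zeros((k, k))
    for e in range(k):
        d = np.linalg.norm(X - X[e], axis=1)  # distance to every vertex
        nearest = np.argsort(d)[: s + 1]      # the vertex itself + s neighbours
        H[nearest, e] = 1.0
    return H
```

Each column of the resulting incidence matrix is one hyperedge containing exactly s + 1 vertices, matching the construction stated in the text.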
Further, before extracting features with the parallel hypergraph attention network model, the multivariate time series must be converted into a hypergraph structure over the spatial features and a hypergraph structure over the temporal features, according to the requirements. A hyperedge in a hypergraph connects two or more vertices. The hypergraph is defined as G = (V, E, W), comprising a vertex set V, a hyperedge set E, and a diagonal edge-weight matrix W that assigns a weight to each hyperedge; the relationships in the hypergraph are represented by a |V| × |E| incidence matrix H, which is specifically defined as follows:
In the formula, h(v, e) = 1 if vertex v lies in hyperedge e, and h(v, e) = 0 otherwise. For a vertex v, the degree is defined as d(v) = Σ_{e∈E} w(e)·h(v, e); for a hyperedge e, the degree is defined as δ(e) = Σ_{v∈V} h(v, e). The framework of the method adopts hypergraph Laplacian regularization:

arg min_f { R_emp(f) + Ω(f) }

where f represents the mapping vector used for classification, R_emp(f) represents the supervised empirical loss, and Ω(f) is the regularizer on the hypergraph, which is expressed as:

Ω(f) = (1/2) Σ_{e∈E} Σ_{u,v∈V} [ w(e)·h(u, e)·h(v, e) / δ(e) ] · ( f(u)/√d(u) − f(v)/√d(v) )²

Let θ = D_v^{−1/2} H W D_e^{−1} Hᵀ D_v^{−1/2}; then Ω(f) can be written as

Ω(f) = fᵀ Δ f, with Δ = I − θ

where I is the identity matrix; Δ is the positive semi-definite hypergraph Laplacian; w(e) is the weight of hyperedge e; h(u, e) and h(v, e) are the probabilities associating vertices u and v with hyperedge e; f is the classification function and fᵀ its transpose; D_e and D_v are the diagonal matrices of the hyperedge degrees and the vertex degrees, respectively; D_v^{−1/2} is the diagonal matrix formed by the reciprocal square roots of the vertex degrees; D_e^{−1} is the diagonal matrix formed by the reciprocals of the hyperedge degrees; Hᵀ is the transpose of the incidence matrix H; and d(u) and d(v) are the degrees of u and v.
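The normalized hypergraph Laplacian above can be sketched directly from an incidence matrix; unit hyperedge weights are assumed by default.

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Delta = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}, as defined in the text."""
    H = np.asarray(H, dtype=float)
    n_v, n_e = H.shape
    w = np.ones(n_e) if w is None else np.asarray(w, dtype=float)
    d_v = H @ w                          # vertex degrees d(v) = sum_e w(e) h(v, e)
    d_e = H.sum(axis=0)                  # hyperedge degrees delta(e) = sum_v h(v, e)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
    theta = Dv_inv_sqrt @ H @ np.diag(w) @ np.diag(1.0 / d_e) @ H.T @ Dv_inv_sqrt
    return np.eye(n_v) - theta
```

The result is symmetric and positive semi-definite, with D_v^{1/2}·1 in its null space, which is what makes Ω(f) = fᵀΔf a valid smoothness regularizer.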
Further, the outputs of the space-oriented and time-oriented hypergraph attention network models are obtained as follows: the hypergraphs are input into the parallel hypergraph attention network model, and a shared attention mechanism computes the attention coefficient α_ij between vertex v_i and hyperedge e_j:

α_ij = exp( σ( sim(v_i W, e_j W) ) ) / Σ_{e_k ∈ E_i} exp( σ( sim(v_i W, e_k W) ) )

where E_i denotes the set of hyperedges connected to vertex v_i; h(v_i, e_j) and h(v_i, e_k) are the association probabilities between vertex v_i and hyperedges e_j and e_k; the features of vertex v_i and of hyperedges e_j and e_k are projected into higher-level features by the weight matrix W; sim(·, ·) is the similarity (attention scoring) function; and σ is a nonlinearity.

The attention coefficient matrix of the attention vertex aggregation module and the generated hyperedge features are given by:

Â = softmax( ReLU( S ) ),  F_E = σ( Âᵀ X W )

where the softmax function is applied row-wise; ReLU is the linear rectification function applied to the raw scores S; Â is the attention coefficient matrix and Âᵀ its transpose; X is the feature set of the vertices, masked by the incidence matrix; Âᵀ X W is the weighted hyperedge feature set; σ is the activation function; and F_E is the generated hyperedge feature matrix.

The attention hyperedge aggregation module generates vertex features symmetrically: the attention coefficients between hyperedges and vertices are computed to obtain an attention coefficient matrix, from which the vertex features are generated.
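A toy sketch of the attention vertex aggregation step, in which each hyperedge feature is an attention-weighted sum of the projected features of its member vertices. The per-vertex LeakyReLU scoring and the column-wise masked softmax are illustrative assumptions; in the real model the projection W and the scoring vector a are learned.

```python
import numpy as np

def masked_softmax_cols(S, mask):
    # Softmax down each column (over vertices), restricted to members (mask == 1).
    S = np.where(mask > 0, S, -1e30)
    E = np.exp(S - S.max(axis=0, keepdims=True)) * (mask > 0)
    return E / E.sum(axis=0, keepdims=True)

def vertex_to_hyperedge(X, H, W, a):
    """Aggregate projected vertex features into hyperedge features."""
    X, H = np.asarray(X, dtype=float), np.asarray(H, dtype=float)
    Xp = X @ W                                    # project vertex features
    score = np.maximum(0.2 * (Xp @ a), Xp @ a)    # LeakyReLU score per vertex
    S = np.tile(score[:, None], (1, H.shape[1]))  # same score toward every edge
    A = masked_softmax_cols(S, H)                 # attention coefficients alpha_ij
    return A.T @ Xp                               # one feature row per hyperedge
```

Each output row is a convex combination of its member vertices' projected features, which is exactly the "aggregate connected vertex information onto the hyperedge" behaviour the text describes.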
Further, the specific process of step S4 is as follows:
prediction-based model: a Transformer prediction sub-network is adopted as the prediction model, and the output of the gated recurrent unit layer is taken as the input of the Transformer prediction sub-network; its predicted value is computed as:

h = Norm( x + Attention(x) )
x̂ = Norm( h + FeedForward(h) )

where Norm(·) represents the normalization operation; Attention(·) represents the attention operation; FeedForward(·) represents the feed-forward layer operation; h is the intermediate value; x̂ is the predicted value; and x is the input. The root mean square error is used as the loss function:

Loss_pred = √( Σ_{i=1}^{k} ( x_{t,i} − x̂_{t,i} )² )

where Loss_pred is the prediction loss; t represents the timestamp of the current input; x_{t,i} is the i-th feature value in x_t; and x̂_{t,i} is the i-th value in x̂_t predicted by the prediction-based model;
reconstruction-based model: a variational autoencoder (VAE) is adopted as the reconstruction model. The VAE consists of an encoder and a decoder: the encoder compresses the data of the multivariate time series into the latent space, its input being x and its output a latent vector z; with parameters φ, the encoder is written q_φ(z | x). The decoder reconstructs the data from the latent vector: its input is the latent vector z and its output is a probability distribution over the multivariate time-series data; with parameters θ, the decoder is written p_θ(x | z). The reconstruction loss is computed as:

Loss_recon = − E_{q_φ(z|x)} [ log p_θ(x | z) ] + KL( q_φ(z | x) ‖ N(0, I) )

where z is the vector representation in the latent space; E[·] denotes the expectation; KL(·‖·) is the KL divergence; Loss_recon is the reconstruction loss; and N(0, I) is the standard normal distribution.

The final loss is then obtained as:

Loss = Loss_pred + Loss_recon
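The joint objective Loss = Loss_pred + Loss_recon can be sketched numerically as below. The closed-form KL term assumes a diagonal-Gaussian encoder (a common choice, not stated in the text), and `neg_log_px` stands in for the decoder's negative log-likelihood term.

```python
import numpy as np

def prediction_loss(x_t, x_hat):
    # Root of the summed squared errors over the k features of one timestamp.
    return float(np.sqrt(np.sum((np.asarray(x_t) - np.asarray(x_hat)) ** 2)))

def kl_std_normal(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, I) ) for a diagonal Gaussian, summed over dims.
    mu, logvar = np.asarray(mu), np.asarray(logvar)
    return float(0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar))

def joint_loss(x_t, x_hat, neg_log_px, mu, logvar):
    # Loss = Loss_pred + Loss_recon, with Loss_recon = neg_log_px + KL term.
    return prediction_loss(x_t, x_hat) + neg_log_px + kl_std_normal(mu, logvar)
```

Summing the two objectives is what lets the prediction and reconstruction sub-networks be optimized jointly, rather than in isolation as in the prior art criticized above.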
furthermore, the gating cycle unit layer comprises two gates, namely an update gate and a reset gate, which adopt sigmoid as an activation function.
Further, the preprocessing in step S1 adopts data normalization and data cleaning.
A hypergraph attention network based timing anomaly detection system comprising:
the data preprocessing module, used for performing data preprocessing on the collected multivariate time series;
the feature fusion module, used for fusing features over the spatial and temporal dimensions of the preprocessed multivariate time series with a parallel hypergraph attention network model to obtain feature-fusion data;
the parallel hypergraph attention network model has two branches: a space-oriented hypergraph attention network model and a time-oriented hypergraph attention network model; both consist of an attention vertex aggregation module and an attention hyperedge aggregation module, where the attention vertex aggregation module aggregates the information of connected vertices onto the hyperedges and the attention hyperedge aggregation module aggregates hyperedge information; the outputs of the space-oriented and time-oriented hypergraph attention network models are fused;
the acquisition module, used for obtaining the sequential pattern in the feature-fusion data through the gated recurrent unit layer;
the training module, used for jointly training a prediction-based model and a reconstruction-based model on the feature-fusion data after the sequential pattern has been acquired, where the prediction-based model predicts the timestamps of the data to obtain the predicted value of the next timestamp and the reconstruction-based model reconstructs the data distribution to obtain the reconstruction probability;
and the judging module, used for computing an anomaly score from the predicted value and the reconstruction probability, setting a threshold, comparing the anomaly score with the threshold, and judging whether the multivariate time series is anomalous, thereby completing the anomaly detection task.
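The score-and-threshold logic of the judging module can be sketched as below. The additive combination with a weight gamma and the squared prediction error are assumptions for illustration; the text states only that the anomaly score is computed from the predicted value and the reconstruction probability and compared with a threshold.

```python
import numpy as np

def anomaly_score(x, x_pred, recon_prob, gamma=1.0):
    """Combine prediction error and reconstruction probability per timestamp."""
    pred_err = np.sum((np.asarray(x) - np.asarray(x_pred)) ** 2, axis=-1)
    # A low reconstruction probability raises the score.
    return pred_err + gamma * (1.0 - np.asarray(recon_prob))

def is_anomalous(score, threshold):
    # A timestamp is flagged when its score exceeds the threshold.
    return np.asarray(score) > threshold
```

A timestamp that is both poorly predicted and assigned a low reconstruction probability thus gets the highest score, reflecting the joint use of the two sub-models.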
Compared with the prior art, the invention has the following beneficial effects:
(1) The invention introduces the hypergraph neural network into the multivariate time-series anomaly detection task, so that the hypergraph can model multi-way relationships more accurately.
(2) The hypergraph attention network model of the invention optimizes its objective through joint training and integrates the advantages of both prediction-based and reconstruction-based models. The prediction model adopts a Transformer prediction sub-network, which effectively captures context information and makes the prediction of a single timestamp more accurate.
(3) The invention adopts an attention mechanism in the parallel hypergraph model to assign weights to all timestamps of the input data; when extracting features, it models the correlations between different features and, at the same time, the temporal dependence within each multivariate time series; clustering based on k-nearest neighbours generates the hypergraph, changing the original feature distribution so that the features of important regions receive more attention.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a diagram of the overall structure of the timing anomaly detection method for the hypergraph attention network of the present invention.
Fig. 3 is a schematic diagram of the structure of the prediction-based model and the reconstruction-based model of the present invention.
Fig. 4 is a schematic diagram of a system structure according to the present invention.
Detailed Description
As shown in FIG. 1, a time-series anomaly detection method based on a hypergraph attention network comprises the following steps:
step S1: data normalization and data cleaning are adopted to preprocess the collected multivariate time series, scaling its attributes into a fixed range and replacing anomalous values in the series with normal values. The specific process is as follows:
A multivariate time series containing different types of anomalies and a normal multivariate time series are collected and divided into a test set and a training set. Because the multivariate time series is collected in a real environment, it must undergo data normalization and data cleaning to improve the stability of the prediction-based and reconstruction-based models used in the subsequent steps; the preprocessed series is then passed through a one-dimensional convolution layer with kernel size 5 to extract its high-level features. Data normalization is applied to both the training set and the test set, while data cleaning is applied only to the training set.
Data normalization: the multivariate time series is normalized using its maximum and minimum values, limiting it to the range [0, 1], with the specific formula:

x̃ = ( x − min(x) ) / ( max(x) − min(x) )

where the input multivariate time series is denoted x ∈ ℝ^{n×k}; ℝ is the real-number field; x is the input multivariate time series; x̃ is the normalized output; n is the maximum length of the timestamps; and k is the number of features in the input multivariate time series. For longer multivariate time series, a sliding window of length n generates fixed-length inputs; min(x) and max(x) are, respectively, the minimum and maximum values in the multivariate time series;
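The min-max normalization can be sketched as below; scaling each feature column by its own minimum and maximum is an assumption, since the text names only the minimum and maximum of the series.

```python
import numpy as np

def minmax_normalize(X, eps=1e-12):
    """Scale an (n, k) series into [0, 1] per feature column."""
    X = np.asarray(X, dtype=float)
    mn, mx = X.min(axis=0), X.max(axis=0)
    return (X - mn) / (mx - mn + eps)   # eps guards constant-valued features
```

This keeps every feature on a comparable scale, which is the stated purpose of the normalization step before the one-dimensional convolution layer.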
data cleaning: in order to alleviate this problem, the Spectrum Residual (SR) is used to detect an abnormal time stamp of each individual time sequence in the multivariate time sequence, and the abnormal time stamp of which the detection result exceeds the threshold value is replaced by a normal value near the time stamp, so that the influence on the distribution of the multivariate time sequence is reduced.
Step S2: adopting a parallel hypergraph attention network model to perform feature fusion on the space dimension and the time dimension of the preprocessed multivariable time sequence to obtain feature fusion data;
two branches exist in the parallel hypergraph attention network model, one is a space-oriented hypergraph attention network model, and the other is a time-oriented hypergraph attention network model;
Both the space-oriented and time-oriented hypergraph attention network models consist of an attention vertex aggregation module and an attention hyperedge aggregation module. The attention vertex aggregation module aggregates the information of connected vertices onto the hyperedges; likewise, the attention hyperedge aggregation module aggregates hyperedge information, so that the correlations between vertices/hyperedges and their neighbourhoods can be explored more fully. In the space-oriented model, the multivariate time series is regarded as a complete hypergraph in which each vertex represents one feature and each hyperedge represents an association among several vertices. In the time-oriented model, the temporal dependence in the multivariate time series is captured by regarding all timestamps within a sliding window as a complete hypergraph, where the vertices in a hyperedge comprise all the other timestamps of the current sliding window.
Parallel hypergraph attention network model construction: the parallel hypergraph attention network model, built on the hypergraph attention (Hypergraph Attention, HGAT) network model, is a new deep learning network whose overall architecture is shown in fig. 2, in which v1 through v9 denote the vertices numbered 1 through 9, and e1 through e4 denote the hyperedges numbered 1 through 4;
before extracting features with the parallel hypergraph attention network model, the multivariate time series must be transformed, according to requirements, into a hypergraph construction for spatial features and a hypergraph construction for temporal features. Hypergraph construction differs from that of simple graphs in that a hyperedge in a hypergraph connects two or more vertices. A hypergraph is generally defined as G = (V, E, W) and comprises a vertex set V, a hyperedge set E, and an edge-weight diagonal matrix W that assigns a weight to each hyperedge; the relationships in the hypergraph can be represented by a |V|×|E| incidence matrix H, which is specifically defined as:
h(v, e) = 1 if vertex v lies in hyperedge e, and h(v, e) = 0 otherwise. For a vertex v ∈ V, the degree of v is defined as d(v) = Σ_{e∈E} w(e) h(v, e); for a hyperedge e ∈ E, the degree of e is defined as δ(e) = Σ_{v∈V} h(v, e). In order to smooth the vertex labels on the hypergraph structure, hypergraph Laplacian regularization is employed, whose framework is as follows:

arg min_f { R_emp(f) + Ω(f) }
where f denotes the mapping vector used for classification; R_emp(f) denotes the supervised empirical loss; Ω(f) is the regularization on the hypergraph, which is expressed as:

Ω(f) = (1/2) Σ_{e∈E} Σ_{u,v∈V} ( w(e) h(u, e) h(v, e) / δ(e) ) ( f(u)/√d(u) − f(v)/√d(v) )²

Let Θ = D_v^{−1/2} H W D_e^{−1} H^T D_v^{−1/2}; then Ω(f) can be written as

Ω(f) = f^T Δ f,  Δ = I − Θ

where I is the identity matrix; Δ is the hypergraph Laplacian operator, which is positive semi-definite; w(e) is the weight of hyperedge e; h(u, e) is the association probability of vertex u with hyperedge e; h(v, e) is the association probability of vertex v with hyperedge e; f is the classification function; f^T is the transpose of f; D_e and D_v are the diagonal matrices of the hyperedge degrees and the vertex degrees, respectively; D_v^{−1/2} is the diagonal matrix formed by the reciprocal square roots of all vertex degrees; D_e^{−1} is the diagonal matrix formed by the reciprocals of all hyperedge degrees; H^T is the transpose of the incidence matrix H; d(u) and d(v) are the degrees of u and v, respectively;
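Given the definitions above, the incidence matrix, degree matrices and the Laplacian Δ can be computed directly; a small numerical sketch on a toy hypergraph (not from the patent), which also confirms that Δ is positive semi-definite:

```python
import numpy as np

# Toy hypergraph: 4 vertices, 2 hyperedges (e1 = {v1, v2, v3}, e2 = {v3, v4}).
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
w = np.array([1.0, 1.0])             # hyperedge weights (diagonal of W)

d_v = H @ w                          # vertex degrees d(v) = sum_e w(e) h(v, e)
d_e = H.sum(axis=0)                  # hyperedge degrees delta(e) = sum_v h(v, e)

Dv_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
De_inv = np.diag(1.0 / d_e)
W = np.diag(w)

theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
delta = np.eye(len(d_v)) - theta     # hypergraph Laplacian Delta = I - Theta

eigvals = np.linalg.eigvalsh(delta)  # all >= 0 for a positive semi-definite matrix
```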
after the hypergraph is built, it is input into the parallel hypergraph attention network model, which is composed of an attention vertex aggregation module and an attention hyperedge aggregation module. The attention vertex aggregation module aggregates connected vertices onto the hyperedges, and the attention hyperedge aggregation module aggregates hyperedge information back to the vertices, so that vertex representations can be learned. Meanwhile, a shared attention mechanism is used to compute the attention coefficient α_ij between vertex v_i and hyperedge e_j:
α_ij = H_ij exp( LeakyReLU( x̂_i · ê_j ) ) / Σ_{e_k∈N_i} H_ik exp( LeakyReLU( x̂_i · ê_k ) )

where N_i denotes the set of hyperedges connected to vertex v_i; H_ij is the association probability between vertex v_i and hyperedge e_j; H_ik is the association probability between vertex v_i and hyperedge e_k; x̂_i, ê_j and ê_k are the high-level features generated by projecting the features of vertex v_i, hyperedge e_j and hyperedge e_k through the weight matrix W;
therefore, the attention coefficient matrix of the attention vertex aggregation module and the generated hyperedge features are given by:
A = softmax_row( LeakyReLU(S) ⊙ H ),  Ê = σ( A^T X̂ )

where softmax_row applies the softmax function in a row-wise fashion; LeakyReLU is the linear rectification function; S is the matrix of raw attention scores; ⊙ denotes element-wise multiplication; σ is an activation function; A is the attention coefficient matrix; A^T is the transpose of the attention coefficient matrix; H is the incidence matrix; X̂ is the projected feature set of the vertices masked by the incidence matrix; A^T X̂ is the weighted hyperedge feature set; Ê is the generated hyperedge feature. The process of the attention hyperedge aggregation module is similar to that of the attention vertex aggregation module: the attention coefficients between hyperedges and vertices are computed to obtain an attention coefficient matrix, from which the vertex features are generated. Through this vertex-hyperedge-vertex transformation mechanism, the high-order relationships in hypergraph data can be effectively represented;
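The masked row-wise softmax aggregation described above can be sketched as follows; the raw score function (a dot product between projected vertex and hyperedge features) and the tanh output activation are assumptions, since the patent does not pin them down:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def masked_softmax(scores, mask):
    """Row-wise softmax restricted to entries where mask == 1 (the incidence matrix)."""
    scores = np.where(mask > 0, scores, -np.inf)
    scores = scores - scores.max(axis=1, keepdims=True)
    expo = np.exp(scores) * (mask > 0)
    return expo / expo.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))            # projected features of 4 vertices
E = rng.normal(size=(2, 3))            # initial features of 2 hyperedges
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)  # incidence matrix

scores = leaky_relu(X @ E.T)            # vertex-hyperedge compatibility scores
A = masked_softmax(scores, H)           # attention coefficients, zero outside H
hyperedge_feats = np.tanh(A.T @ X)      # aggregate connected vertices to hyperedges
```

The hyperedge-to-vertex direction is symmetric: transpose the mask and aggregate hyperedge features back onto vertices.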
the space-oriented hypergraph attention network model regards the multivariate time series as one complete hypergraph in which each vertex represents a feature. For each vertex, its S nearest neighbors are selected to construct a hyperedge, generating a hyperedge with S+1 vertices, and the relationships between vertices are captured by the space-oriented hypergraph attention network model. Each vertex v_i is represented by a sequence vector x_i = (x_{i,1}, …, x_{i,n}), where x_{i,t} denotes the t-th feature-vector value of x_i, n is the number of feature-vector values, and k is the total number of features. The time-oriented hypergraph attention network model captures the time dependence in the multivariate time series, treating all timestamps within a sliding window as one complete hypergraph; a vertex v_t represents the feature vector at timestamp t, and a hyperedge in the time-oriented hypergraph attention network model comprises the u timestamps closest to the current sliding window. Finally, the outputs of the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model are fused to obtain feature information from different sources.
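The two hypergraph constructions can be sketched as follows; Euclidean distance between feature sequences is used as the nearest-neighbor criterion, which is an assumption (the patent does not fix the metric), and the helper names are illustrative:

```python
import numpy as np

def spatial_hyperedges(X, S):
    """One hyperedge per feature: the feature plus its S nearest neighbors.

    X has shape (k, n): k feature sequences of length n."""
    k = X.shape[0]
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    edges = []
    for i in range(k):
        nbrs = np.argsort(dist[i])[1:S + 1]   # skip self at position 0
        edges.append(sorted([i, *nbrs]))      # hyperedge with S + 1 vertices
    return edges

def temporal_hyperedges(n_timestamps, u):
    """One hyperedge per timestamp, containing the u closest timestamps."""
    edges = []
    for t in range(n_timestamps):
        order = sorted(range(n_timestamps), key=lambda s: abs(s - t))
        edges.append(sorted(order[:u + 1]))
    return edges

X = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0], [5.1, 5.1]])
sp = spatial_hyperedges(X, S=1)   # each spatial hyperedge has S + 1 = 2 vertices
tp = temporal_hyperedges(4, u=2)  # each temporal hyperedge covers 3 timestamps
```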
Step S3: the feature fusion data passes through a gated recurrent unit (GRU) layer with hidden dimension d1 to obtain the sequence patterns in the feature fusion data;
the gated recurrent unit (GRU) layer is used to acquire the sequence patterns in the feature fusion data. The GRU is a kind of recurrent neural network (RNN) that can alleviate the long-term dependency problem of RNNs; its structure is simpler than that of a long short-term memory network while performing better, and its smaller number of parameters speeds up training and reduces the risk of overfitting;
the gated recurrent unit (GRU) layer comprises two gates, an update gate and a reset gate, both of which use sigmoid as the activation function. The larger the value of the update gate, the more state information from the previous time step is brought in; the smaller the value of the reset gate, the less information of the previous state is written.
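The gate behaviour can be sketched with a single GRU cell step; the weight shapes and random initialization are placeholders, and the sign convention follows the description above (a larger update gate keeps more of the previous state):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step: update gate z and reset gate r both use sigmoid activations."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h_prev @ Uz)              # update gate
    r = sigmoid(x @ Wr + h_prev @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh)  # candidate state
    # Larger z keeps more previous-state information, as described above.
    return z * h_prev + (1.0 - z) * h_tilde

rng = np.random.default_rng(1)
d_in, d_hid = 3, 4
params = [rng.normal(scale=0.1, size=s) for s in
          [(d_in, d_hid), (d_hid, d_hid)] * 3]
h = np.zeros(d_hid)
for t in range(5):                                 # run over a short sequence
    h = gru_cell(rng.normal(size=d_in), h, params)
```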
Step S4: the method comprises the steps of jointly training feature fusion data after a sequence mode is acquired by adopting a prediction-based model and a reconstruction-based model, predicting a time stamp of the feature fusion data after the sequence mode is acquired by adopting the prediction-based model to obtain a predicted value of a next time stamp, and reconstructing data distribution in the feature fusion data after the sequence mode is acquired by adopting the reconstruction-based model to obtain reconstruction probability;
the prediction-based model and the reconstruction-based model have complementary advantages. The joint training comprises: predicting the timestamp of the feature fusion data after the sequence patterns are acquired with the prediction-based model to obtain the predicted value of the next timestamp; reconstructing the data distribution in the feature fusion data after the sequence patterns are acquired with the reconstruction-based model to obtain the reconstruction probability; and redefining the loss function. The specific structure is shown in figure 3;
prediction-based model: a Transformer prediction sub-network is used as the prediction model, the output of the gated recurrent unit (GRU) layer is used as the input of the Transformer prediction sub-network, and the output predicted value is calculated by the following formulas:
z = LayerNorm( x + Attention(x) ),  x̂ = LayerNorm( z + FeedForward(z) )

where LayerNorm denotes the layer normalization operation; Attention denotes the attention operation; FeedForward denotes the feed-forward layer operation; z is an intermediate value; x̂ is the predicted value; x is the input. Meanwhile, the root mean square error is used as the loss function:

L_pred = √( (1/k) Σ_{i=1}^{k} ( x_{t,i} − x̂_{t,i} )² )
where L_pred is the prediction loss; t denotes the timestamp of the current input; x_{t,i} is the i-th feature value in x_t; x̂_{t,i} is the i-th predicted value of the prediction-based model in x̂_t;
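Under the loss above, the prediction loss of one timestamp is the root mean square error over its k feature values; a minimal sketch:

```python
import numpy as np

def prediction_loss(x_true, x_pred):
    """Root mean square error over the k features of one timestamp."""
    x_true = np.asarray(x_true, dtype=float)
    x_pred = np.asarray(x_pred, dtype=float)
    return float(np.sqrt(np.mean((x_true - x_pred) ** 2)))

loss = prediction_loss([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```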
reconstruction-based model: a variational autoencoder (VAE) is used as the reconstruction model; it provides a probabilistic way to describe observations in latent space, capturing the data distribution of the entire multivariate time series by treating the multivariate time series data as random variables. The VAE is composed of an encoder and a decoder. The encoder compresses the multivariate time series data into the latent space: its input is x, its output is the latent vector z, and its parameters are φ, so the encoder can be expressed as q_φ(z|x). The decoder reconstructs the data from the latent vector: its input is the latent vector z, its output is a probability distribution over the data, and its parameters are θ, so the decoder can be expressed as p_θ(x|z). The reconstruction loss is calculated as:

L_rec = −E_{q_φ(z|x)}[ log p_θ(x|z) ] + KL( q_φ(z|x) ‖ N(0, I) )
where z is a vector representation in the latent space; E denotes the expectation; KL denotes the Kullback-Leibler divergence; L_rec is the reconstruction loss; N(0, I) is the standard normal distribution;
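For a diagonal-Gaussian encoder the KL term has the closed form KL(N(μ, σ²) ‖ N(0, I)) = ½ Σ (μ² + σ² − log σ² − 1); a sketch of the reconstruction loss, assuming a unit-variance Gaussian decoder likelihood (an assumption, since the patent only specifies a probability output):

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL divergence between N(mu, sigma^2) and N(0, I)."""
    return 0.5 * np.sum(mu ** 2 + np.exp(log_var) - log_var - 1.0)

def gaussian_nll(x, x_recon, var=1.0):
    """Negative log-likelihood of x under N(x_recon, var): the reconstruction term."""
    return 0.5 * np.sum((x - x_recon) ** 2 / var + np.log(2 * np.pi * var))

x = np.array([0.5, -0.2])
x_recon = np.array([0.4, -0.1])       # decoder mean for the sampled latent vector
mu = np.array([0.1, 0.0])             # encoder mean
log_var = np.array([0.0, 0.0])        # encoder log-variance (sigma = 1)

loss_rec = gaussian_nll(x, x_recon) + kl_to_standard_normal(mu, log_var)
```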
the resulting total loss is:

L = L_pred + L_rec
step S5: calculating an abnormality score by using the predicted value and the reconstructed probability, setting a threshold value, comparing the abnormality score with the threshold value, and judging whether the multivariate time sequence is abnormal or not so as to complete an abnormality detection task;
the final inference score should consider the inference results of the joint optimization objective to maximize the overall effectiveness and accuracy of anomaly detection, and calculate the inference score for the timestamp by:
score_i = ( ‖x_i − x̂_i‖² + γ (1 − p_i) ) / (1 + γ)

where score_i is the inference score of the i-th timestamp; x_i is the true value; x̂_i is the predicted value; p_i is the reconstruction probability; γ is a learnable hyperparameter. The threshold is selected automatically using the peaks-over-threshold (POT) method, and a timestamp whose inference score is greater than the threshold is considered abnormal.
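The scoring rule can be sketched as follows; a plain high quantile stands in for the peaks-over-threshold procedure (which would fit a generalized Pareto distribution to the score tail), and all data here are synthetic:

```python
import numpy as np

def inference_score(x_true, x_pred, recon_prob, gamma=1.0):
    """Combine prediction error and reconstruction probability per timestamp."""
    err = np.sum((np.asarray(x_true) - np.asarray(x_pred)) ** 2, axis=-1)
    return (err + gamma * (1.0 - np.asarray(recon_prob))) / (1.0 + gamma)

rng = np.random.default_rng(2)
x_true = rng.normal(size=(100, 3))
x_pred = x_true + rng.normal(scale=0.05, size=(100, 3))   # near-perfect predictions
recon_prob = np.clip(rng.normal(0.9, 0.05, size=100), 0, 1)

x_true[42] += 5.0                      # inject an anomaly at timestamp 42
scores = inference_score(x_true, x_pred, recon_prob)
threshold = np.quantile(scores, 0.99)  # stand-in for POT threshold selection
anomalies = np.where(scores > threshold)[0]
```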
As shown in fig. 4, a timing anomaly detection system based on hypergraph attention network includes:
the data preprocessing module is used for preprocessing the data of the collected multivariable time sequence;
the feature fusion module is used for carrying out feature fusion on the space dimension and the time dimension of the preprocessed multivariable time sequence by adopting a parallel hypergraph attention network model to obtain feature fusion data;
two branches exist in the parallel hypergraph attention network model, one is a space-oriented hypergraph attention network model, and the other is a time-oriented hypergraph attention network model; the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model are both composed of an attention vertex aggregation module and an attention superside aggregation module, wherein the attention vertex aggregation module aggregates the communicated vertex information to the superside, and the attention superside aggregation module is used for aggregating the superside information; fusing the outputs of the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model;
the acquisition module is used for acquiring the sequence patterns in the feature fusion data through the gated recurrent unit layer;
the training module is used for jointly training the feature fusion data after the sequence mode is acquired by adopting a prediction-based model and a reconstruction-based model, predicting the time stamp of the feature fusion data after the sequence mode is acquired by adopting the prediction-based model to obtain a predicted value of the next time stamp, and reconstructing the data distribution in the feature fusion data after the sequence mode is acquired by adopting the reconstruction-based model to obtain the reconstruction probability;
and the judging module is used for calculating the abnormality score of the predicted value and the reconstruction probability, setting a threshold value, comparing the abnormality score with the threshold value, and judging whether the multivariate time sequence is abnormal or not so as to complete the abnormality detection task.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. The time sequence abnormality detection method based on the hypergraph attention network is characterized by comprising the following steps of:
step S1: preprocessing data of the collected multivariable time sequence;
step S2: adopting a parallel hypergraph attention network model to perform feature fusion on the space dimension and the time dimension of the preprocessed multivariable time sequence to obtain feature fusion data;
two branches exist in the parallel hypergraph attention network model, one is a space-oriented hypergraph attention network model, and the other is a time-oriented hypergraph attention network model; the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model are both composed of an attention vertex aggregation module and an attention superside aggregation module, wherein the attention vertex aggregation module aggregates the communicated vertex information to the superside, and the attention superside aggregation module is used for aggregating the superside information; fusing the outputs of the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model;
step S3: the feature fusion data passes through a gated recurrent unit layer to obtain the sequence patterns in the feature fusion data;
step S4: the method comprises the steps of jointly training feature fusion data after a sequence mode is acquired by adopting a prediction-based model and a reconstruction-based model, predicting a time stamp of the feature fusion data after the sequence mode is acquired by adopting the prediction-based model to obtain a predicted value of a next time stamp, and reconstructing data distribution in the feature fusion data after the sequence mode is acquired by adopting the reconstruction-based model to obtain reconstruction probability;
step S5: calculating an abnormality score by using the predicted value and the reconstructed probability, setting a threshold value, comparing the abnormality score with the threshold value, and judging whether the multivariate time sequence is abnormal or not so as to complete an abnormality detection task.
2. The hypergraph attention network-based timing anomaly detection method of claim 1, wherein: in step S2, the space-oriented hypergraph attention network model regards the multivariate time series as one complete hypergraph, in which each vertex represents a feature; for a vertex, its S nearest neighbors are selected to construct a hyperedge, generating a hyperedge with S+1 vertices, and the relationships between the vertices are captured by the space-oriented hypergraph attention network model; each vertex v_i is represented by a sequence vector x_i = (x_{i,1}, …, x_{i,n}), where x_{i,t} denotes the t-th feature-vector value of the sequence vector x_i, n is the number of feature-vector values, and k is the total number of features; the time-oriented hypergraph attention network model captures the time dependencies in the multivariate time series, treating all timestamps within a sliding window as one complete hypergraph, with vertex v_t representing the feature vector at timestamp t; a hyperedge in the time-oriented hypergraph attention network model comprises the u timestamps closest to the current sliding window; and the outputs of the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model are fused to obtain feature information from different sources.
3. The hypergraph attention network based timing anomaly detection method of claim 2, wherein: before extracting features with a parallel hypergraph attention network model, it is necessary to transform the multivariate time series into a hypergraph construction for spatial features and a hypergraph construction for temporal features, respectively, according to requirements; the hyperedge in the hypergraph is connected with two or more vertexes, the hypergraph is defined as G= (V, E, W), the hypergraph comprises a vertex set V, a hyperedge set E and an edge weight diagonal matrix W for assigning weight to each hyperedge, and the relation in the hypergraph is represented by an incidence matrix H of |V|×|E|, and the specific definition is as follows:
in the formula, h(v, e) = 1 if vertex v lies in hyperedge e, and h(v, e) = 0 otherwise; for a vertex v ∈ V, the degree of v is defined as d(v) = Σ_{e∈E} w(e) h(v, e); for a hyperedge e ∈ E, the degree of e is defined as δ(e) = Σ_{v∈V} h(v, e); the method adopts hypergraph Laplacian regularization, whose framework is as follows:

arg min_f { R_emp(f) + Ω(f) }
where f denotes the mapping vector used for classification; R_emp(f) denotes the supervised empirical loss; Ω(f) is the regularization on the hypergraph, which is expressed as:

Ω(f) = (1/2) Σ_{e∈E} Σ_{u,v∈V} ( w(e) h(u, e) h(v, e) / δ(e) ) ( f(u)/√d(u) − f(v)/√d(v) )²
let Θ = D_v^{−1/2} H W D_e^{−1} H^T D_v^{−1/2} and Δ = I − Θ; then Ω(f) is written as Ω(f) = f^T Δ f
where I is the identity matrix; Δ is the hypergraph Laplacian operator, which is positive semi-definite; w(e) is the weight of hyperedge e; h(u, e) is the association probability of vertex u with hyperedge e; h(v, e) is the association probability of vertex v with hyperedge e; f is the classification function; f^T is the transpose of f; D_e and D_v are the diagonal matrices of the hyperedge degrees and the vertex degrees, respectively; D_v^{−1/2} is the diagonal matrix formed by the reciprocal square roots of all vertex degrees; D_e^{−1} is the diagonal matrix formed by the reciprocals of all hyperedge degrees; H^T is the transpose of the incidence matrix H; d(u) and d(v) are the degrees of u and v, respectively.
4. The hypergraph attention network-based timing anomaly detection method of claim 3, wherein: the specific process for obtaining the outputs of the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model is as follows: the hypergraphs are input into the parallel hypergraph attention network model, and a shared attention mechanism is used to compute the attention coefficient α_ij between vertex v_i and hyperedge e_j:
α_ij = H_ij exp( LeakyReLU( x̂_i · ê_j ) ) / Σ_{e_k∈N_i} H_ik exp( LeakyReLU( x̂_i · ê_k ) )

where N_i denotes the set of hyperedges connected to vertex v_i; H_ij is the association probability between vertex v_i and hyperedge e_j; H_ik is the association probability between vertex v_i and hyperedge e_k; x̂_i, ê_j and ê_k are the high-level features generated by projecting the features of vertex v_i, hyperedge e_j and hyperedge e_k through the weight matrix W;
the attention coefficient matrix of the attention vertex aggregation module and the generated hyperedge features are given by:
A = softmax_row( LeakyReLU(S) ⊙ H ),  Ê = σ( A^T X̂ )

where softmax_row applies the softmax function in a row-wise fashion; LeakyReLU is the linear rectification function; S is the matrix of raw attention scores; ⊙ denotes element-wise multiplication; σ is an activation function; A is the attention coefficient matrix; A^T is the transpose of the attention coefficient matrix; X̂ is the projected feature set of the vertices masked by the incidence matrix; A^T X̂ is the weighted hyperedge feature set; Ê is the generated hyperedge feature;
the process of generating vertex characteristics by the attention over-edge aggregation module is as follows: and calculating the attention coefficient between the superside and the vertex to obtain an attention coefficient matrix, and generating the vertex characteristics.
5. The hypergraph attention network based timing anomaly detection method of claim 4, wherein: the specific process of the step S4 is as follows:
prediction-based model: a Transformer prediction sub-network is adopted as the prediction model, the output of the gated recurrent unit layer is taken as the input of the Transformer prediction sub-network, and the predicted value of the output is calculated by the following formulas:
z = LayerNorm( x + Attention(x) ),  x̂ = LayerNorm( z + FeedForward(z) )

where LayerNorm denotes the layer normalization operation; Attention denotes the attention operation; FeedForward denotes the feed-forward layer operation; z is an intermediate value; x̂ is the predicted value; x is the input; meanwhile, the root mean square error is used as the loss function:

L_pred = √( (1/k) Σ_{i=1}^{k} ( x_{t,i} − x̂_{t,i} )² )
where L_pred is the prediction loss; t denotes the timestamp of the current input; x_{t,i} is the i-th feature value in x_t; x̂_{t,i} is the i-th predicted value of the prediction-based model in x̂_t;
reconstruction-based model: a variational autoencoder is adopted as the reconstruction model; the variational autoencoder is composed of an encoder and a decoder, wherein the encoder compresses the multivariate time series data into the latent space: its input is x, its output is the latent vector z, and its parameters are φ, so the encoder is denoted q_φ(z|x); the decoder reconstructs the data from the latent vector: its input is the latent vector z, its output is a probability distribution of the multivariate time series data, and its parameters are θ, so the decoder is denoted p_θ(x|z); the reconstruction loss calculation formula is:

L_rec = −E_{q_φ(z|x)}[ log p_θ(x|z) ] + KL( q_φ(z|x) ‖ N(0, I) )
where z is a vector representation in the latent space; E denotes the expectation; KL denotes the Kullback-Leibler divergence; L_rec is the reconstruction loss; N(0, I) is the standard normal distribution;
the final loss is obtained:

L = L_pred + L_rec
6. The hypergraph attention network-based timing anomaly detection method of claim 1, wherein: the gated recurrent unit layer comprises two gates, an update gate and a reset gate, both of which adopt sigmoid as the activation function.
7. The hypergraph attention network-based timing anomaly detection method of claim 1, wherein: the preprocessing in step S1 adopts a data normalization and data cleaning mode.
8. A hypergraph attention network based timing anomaly detection system, comprising:
the data preprocessing module is used for preprocessing the data of the collected multivariable time sequence;
the feature fusion module is used for carrying out feature fusion on the space dimension and the time dimension of the preprocessed multivariable time sequence by adopting a parallel hypergraph attention network model to obtain feature fusion data;
two branches exist in the parallel hypergraph attention network model, one is a space-oriented hypergraph attention network model, and the other is a time-oriented hypergraph attention network model; the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model are both composed of an attention vertex aggregation module and an attention superside aggregation module, wherein the attention vertex aggregation module aggregates the communicated vertex information to the superside, and the attention superside aggregation module is used for aggregating the superside information; fusing the outputs of the space-oriented hypergraph attention network model and the time-oriented hypergraph attention network model;
the acquisition module is used for acquiring the sequence patterns in the feature fusion data through the gated recurrent unit layer;
the training module is used for jointly training the feature fusion data after the sequence mode is acquired by adopting a prediction-based model and a reconstruction-based model, predicting the time stamp of the feature fusion data after the sequence mode is acquired by adopting the prediction-based model to obtain a predicted value of the next time stamp, and reconstructing the data distribution in the feature fusion data after the sequence mode is acquired by adopting the reconstruction-based model to obtain the reconstruction probability;
and the judging module is used for calculating the abnormality score of the predicted value and the reconstruction probability, setting a threshold value, comparing the abnormality score with the threshold value, and judging whether the multivariate time sequence is abnormal or not so as to complete the abnormality detection task.
CN202311580493.6A 2023-11-24 2023-11-24 Timing sequence anomaly detection method and system based on hypergraph attention network Active CN117290800B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311580493.6A CN117290800B (en) 2023-11-24 2023-11-24 Timing sequence anomaly detection method and system based on hypergraph attention network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311580493.6A CN117290800B (en) 2023-11-24 2023-11-24 Timing sequence anomaly detection method and system based on hypergraph attention network

Publications (2)

Publication Number Publication Date
CN117290800A true CN117290800A (en) 2023-12-26
CN117290800B CN117290800B (en) 2024-01-26

Family

ID=89258952

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311580493.6A Active CN117290800B (en) 2023-11-24 2023-11-24 Timing sequence anomaly detection method and system based on hypergraph attention network

Country Status (1)

Country Link
CN (1) CN117290800B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117475518A (en) * 2023-12-27 2024-01-30 华东交通大学 Synchronous human motion recognition and prediction method and system
CN117786374A (en) * 2024-02-28 2024-03-29 南京信息工程大学 Multivariate time sequence anomaly detection method and system based on graph neural network
CN117851920A (en) * 2024-03-07 2024-04-09 国网山东省电力公司信息通信公司 Power Internet of things data anomaly detection method and system
CN118036477A (en) * 2024-04-11 2024-05-14 中国石油大学(华东) Well position and well control parameter optimization method based on space-time diagram neural network

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120137367A1 (en) * 2009-11-06 2012-05-31 Cataphora, Inc. Continuous anomaly detection based on behavior modeling and heterogeneous information analysis
CN113516226A (en) * 2021-05-18 2021-10-19 长沙理工大学 Hybrid model multivariate time sequence anomaly detection method based on graph neural network
CN113962358A (en) * 2021-09-29 2022-01-21 西安交通大学 Information diffusion prediction method based on time sequence hypergraph attention neural network
CN114077811A (en) * 2022-01-19 2022-02-22 华东交通大学 Electric power Internet of things equipment abnormality detection method based on graph neural network
CN115618296A (en) * 2022-10-26 2023-01-17 河海大学 Dam monitoring time sequence data anomaly detection method based on graph attention network
CN116502161A (en) * 2023-03-21 2023-07-28 浙江师范大学 Anomaly detection method based on dynamic hypergraph neural network
CN116665130A (en) * 2023-06-07 2023-08-29 河海大学 Space-time diagram-based dam safety monitoring multivariate time sequence anomaly detection method
CN116680105A (en) * 2023-05-31 2023-09-01 南京大学 Time sequence abnormality detection method based on neighborhood information fusion attention mechanism
CN116845889A (en) * 2023-09-01 2023-10-03 东海实验室 Hierarchical hypergraph neural network-based power load prediction method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHIXUAN WU 等: "Spatial-Temporal Hypergraph Neural Network based on Attention Mechanism for Multi-view Data Action Recognition", 《2023 10TH INTERNATIONAL CONFERENCE ON DEPENDABLE SYSTEMS AND THEIR APPLICATIONS (DSA)》, pages 487 - 493 *
赵文博 等: "基于超图神经网络的恶意流量分类模型", 《网络与信息安全学报》, vol. 09, no. 05, pages 166 - 177 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117475518A (en) * 2023-12-27 2024-01-30 华东交通大学 Synchronous human motion recognition and prediction method and system
CN117475518B (en) * 2023-12-27 2024-03-22 华东交通大学 Synchronous human motion recognition and prediction method and system
CN117786374A (en) * 2024-02-28 2024-03-29 南京信息工程大学 Multivariate time sequence anomaly detection method and system based on graph neural network
CN117786374B (en) * 2024-02-28 2024-05-14 南京信息工程大学 Multivariate time sequence anomaly detection method and system based on graph neural network
CN117851920A (en) * 2024-03-07 2024-04-09 国网山东省电力公司信息通信公司 Power Internet of things data anomaly detection method and system
CN118036477A (en) * 2024-04-11 2024-05-14 中国石油大学(华东) Well position and well control parameter optimization method based on space-time diagram neural network

Also Published As

Publication number Publication date
CN117290800B (en) 2024-01-26

Similar Documents

Publication Publication Date Title
CN112784965B (en) Large-scale multi-element time series data anomaly detection method oriented to cloud environment
CN117290800B (en) Timing sequence anomaly detection method and system based on hypergraph attention network
Chen et al. Anomaly detection and critical SCADA parameters identification for wind turbines based on LSTM-AE neural network
Cheng et al. Autoencoder quasi-recurrent neural networks for remaining useful life prediction of engineering systems
CN111275288B (en) XGBoost-based multidimensional data anomaly detection method and device
Wang et al. A method for rapidly evaluating reliability and predicting remaining useful life using two-dimensional convolutional neural network with signal conversion
Jiang et al. A multi-step progressive fault diagnosis method for rolling element bearing based on energy entropy theory and hybrid ensemble auto-encoder
Tian et al. Identification of abnormal conditions in high-dimensional chemical process based on feature selection and deep learning
CN110929765A (en) Convolution self-coding fault monitoring method based on batch imaging
de Paula Monteiro et al. A hybrid prototype selection-based deep learning approach for anomaly detection in industrial machines
Deng et al. Sparse stacked autoencoder network for complex system monitoring with industrial applications
CN114363195A (en) Network flow prediction early warning method for time and spectrum residual convolution network
Li et al. A novel unsupervised anomaly detection method for rotating machinery based on memory augmented temporal convolutional autoencoder
Zhang et al. Gated recurrent unit-enhanced deep convolutional neural network for real-time industrial process fault diagnosis
CN111222689A (en) LSTM load prediction method, medium, and electronic device based on multi-scale temporal features
Zhu et al. Condition monitoring of wind turbine based on deep learning networks and kernel principal component analysis
CN117076936A (en) Time sequence data anomaly detection method based on multi-head attention model
Zhang et al. MS-TCN: A multiscale temporal convolutional network for fault diagnosis in industrial processes
CN114707577A (en) Anomaly detection method and system based on self-confrontation variational self-encoder
Zhu et al. Hybrid scheme through read-first-LSTM encoder-decoder and broad learning system for bearings degradation monitoring and remaining useful life estimation
Xu et al. Global attention mechanism based deep learning for remaining useful life prediction of aero-engine
CN113283546B (en) Furnace condition abnormity alarm method and system of heating furnace integrity management centralized control device
CN117313015A (en) Time sequence abnormality detection method and system based on time sequence and multiple variables
CN116933643A (en) Intelligent data monitoring method based on partial robust M regression and multiple interpolation
KR20210126378A (en) Real-time sliding window based anomaly detection system for multivariate data generated by manufacturing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant