CN110535723A - Message anomaly detection method using deep learning in an SDN - Google Patents

Message anomaly detection method using deep learning in an SDN

Info

Publication number
CN110535723A
Authority
CN
China
Prior art keywords
message
prediction
deep learning
model
anomaly detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910798422.0A
Other languages
Chinese (zh)
Other versions
CN110535723B (en)
Inventor
王换招
许世民
张方政
王肖晨
张鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201910798422.0A priority Critical patent/CN110535723B/en
Publication of CN110535723A publication Critical patent/CN110535723A/en
Application granted granted Critical
Publication of CN110535723B publication Critical patent/CN110535723B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/049 Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14 Network analysis or design
    • H04L41/145 Network analysis or design involving simulating, designing, planning or modelling of a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14 Network analysis or design
    • H04L41/147 Network analysis or design for predicting network behaviour
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00 Arrangements for monitoring or testing data switching networks
    • H04L43/08 Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters
    • H04L43/0823 Errors, e.g. transmission errors
    • H04L43/0847 Transmission error
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425 Traffic logging, e.g. anomaly detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14 Network analysis or design
    • H04L41/142 Network analysis or design using statistical or mathematical methods

Abstract

The invention discloses a message anomaly detection method using deep learning in an SDN. Streaming messages are converted, by means of a sliding window, into mappings between message sequences and predicted messages, and the output probability corresponding to a given message sequence is predicted by learning from historical messages. The method effectively takes into account the influence of historical messages on the next-hop message and supports real-time anomaly detection. The k messages with the highest prediction probability are regarded as normal messages, which overcomes the problem that the predicted message is not unique under dynamic policies. The message id and the parameters in the parameter vector are predicted separately, and a newly arrived message is judged to be normal only when all of its parameters pass detection. When detection fails, the correct message sequence and predicted message are used as new input for incremental training, which solves the problem that the prior art cannot adapt to time-varying networks and allows normal operation in dynamic networks. The method of the invention is simple and its detection results are accurate.

Description

Message anomaly detection method using deep learning in an SDN
Technical field
The invention belongs to the field of Internet technology, and in particular relates to a message anomaly detection method using deep learning in an SDN.
Background art
Software Defined Networking (SDN) separates the forwarding plane of network devices from the control plane and thereby enables flexible control of network traffic. In an SDN, forwarding devices have no computing capability and forward data packets only according to the flow entries installed by the controller. Although this architecture brings benefits such as programmability, it also introduces new network anomalies, for example switch crashes, link failures, abnormal flow-rule installation and network attacks. These anomalies manifest themselves as unexpected messages exchanged between switches and the controller.
In recent years, the following anomaly detection methods for SDN have emerged:
Technical solution 1: the probe packet method. This is a detection method for flow-rule anomaly events. When an ingress switch receives a data packet, the packet is forwarded to the controller, which precomputes the path for that packet. A tag stamped into the packet during forwarding is delivered to the controller at the egress switch, and the consistency of the two is verified.
Technical solution 2: the automaton method. This method builds an automaton model from the message sequences corresponding to every network event, and for each group of message sequences queries whether a matching automaton model exists.
Technical solution 3: the statistical method. This method counts the occurrence probabilities of historical messages and computes the probability of the next-hop message for each input sequence; unexpected messages are detected by comparing the next-hop message that actually arrives with the message of highest predicted probability.
The main problem of scheme 1 is that it requires additional modification of switches and data packets. Even with code optimization it still incurs extra bandwidth overhead, and it detects only flow-rule anomaly events, so other types of anomalies cannot be detected effectively.
The main problem of scheme 2 is that an automaton model must be built for each type of network event. Whenever the network policy changes, the automaton model must be rebuilt, so the method cannot adapt to strongly time-varying networks.
The main problem of scheme 3 is that the sequence length must be kept below 3, so long-term dependencies cannot be learned and detection accuracy cannot be guaranteed.
Summary of the invention
To solve the above problems, the invention proposes a message anomaly detection method using deep learning in an SDN that supports real-time detection, covers a wide range of anomalies and achieves high detection accuracy.
To achieve the above objectives, the present invention adopts the following technical solution:
A message anomaly detection method using deep learning in an SDN, comprising the following steps:
Step 1: monitor the OpenFlow messages exchanged between the controller and the switches, reorder the OpenFlow message threads according to the controller's thread ids, and obtain reordered messages;
Step 2: parse the reordered messages, generate message ids and parameter vectors to obtain sub-type messages, and write them to a log;
Step 3: one-hot encode the sub-type messages according to their message ids, train on the message sequences of the sub-type messages with a long short-term memory (LSTM) network model from deep learning, and obtain a parameter matrix; the messages in the sliding window at each moment are treated as one sequence that forms the input of the LSTM model;
Step 4: use the sliding window to store the input message sequence at the current time, feed it into the LSTM model and parameter matrix from step 3, and obtain the prediction result via a softmax layer;
Step 5: compare the k predicted messages with the highest probability in the prediction result against the OpenFlow message that actually arrives; if the arriving OpenFlow message is in the set of the top k messages by prediction probability, it is judged to be a normal message, otherwise it is judged to be an unexpected message.
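To make the relationship between the five steps concrete, the following is a minimal Python sketch of the detection loop, assuming the messages have already been reordered (step 1) and reduced to integer message ids (step 2); the toy predictor stands in for the trained LSTM and softmax layer of steps 3 and 4, and all names are illustrative, not part of the patent.

```python
from collections import deque

def top_k_ids(probs, k):
    """Return the set of k message ids with the highest predicted probability."""
    return set(sorted(probs, key=probs.get, reverse=True)[:k])

def detect(messages, predict_next, h=3, k=2):
    """Steps 1-5 in miniature: keep a sliding window of h message ids, ask the
    model for a next-message probability distribution, and flag any arrival
    that falls outside the top-k predictions."""
    window, alarms = deque(maxlen=h), []
    for msg_id in messages:                        # assumed already reordered and parsed
        if len(window) == h:
            probs = predict_next(tuple(window))    # steps 3-4: LSTM + softmax, stubbed below
            if msg_id not in top_k_ids(probs, k):  # step 5: top-k comparison
                alarms.append((tuple(window), msg_id))
        window.append(msg_id)
    return alarms

# Toy "model": after the sequence (1, 2, 3), messages 4 or 5 are the expected next hops.
def toy_predict(seq):
    return {(1, 2, 3): {4: 0.6, 5: 0.3, 9: 0.1}}.get(seq, {7: 0.6, 8: 0.4})

print(detect([1, 2, 3, 9], toy_predict))   # [((1, 2, 3), 9)]: 9 is flagged as unexpected
```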
Further, in step 1, the messages are classified by thread id; messages in the same thread are sorted by time, while for messages in different threads the messages within a time interval t are treated as a whole and sorted by the arrival time of the first message in the same time interval.
Further, in step 2, the parameter vector includes the timestamp, switch id, source ip, destination ip and forwarding action.
Further, in step 3, the LSTM network model of the deep model is used for the solution, and a separate training model is built for each message id and for the parameter vector.
Further, the sub-type messages obtained in step 2 are preprocessed: the message ids and the parameter vectors are normalized separately, i.e. every value in the message ids is scaled against the maximum message id to obtain a normalized message id data set, and every parameter in the parameter vector is scaled against the maximum value of that parameter to obtain a normalized parameter vector data set.
Further, in step 4, a sliding window of size h is set to determine the historical input message sequence; the LSTM model and parameter matrix from step 3 are loaded, and the next-hop message is predicted through the softmax layer of the LSTM model, where the role of the softmax layer is to convert the output of the neural network into a probability distribution P(m_t = k_i | x_t), k_i ∈ K, over the messages of the message set M, so as to realize multi-class prediction. Assuming the raw output of the neural network is [y_1, ..., y_n], the predicted probability distribution after the softmax layer is:

softmax(y)_i = exp(y_i) / (exp(y_1) + exp(y_2) + ... + exp(y_n)), for i = 1, ..., n.
Further, in step 5, the prediction results of step 4 are sorted and the k messages with the highest probability are chosen as the normal message set P; whenever a new message m is observed, the sliding window and the predicted message set P are updated and it is checked whether m is in the set P.
Further, an event flow model is constructed from the prediction results of step 5, associating network events with different message sequences; the event flow model is used to verify whether the judgement of step 5 is correct, and if it is not, the message sequence together with the correct message is used as new input for incremental training via step 4.
Further, the event flow model is constructed from the prediction results; when one input message sequence corresponds to several predicted messages, there is a temporal dependency between the last message of the input sequence and each predicted message, and each predicted message opens a new branch, which realizes streaming prediction.
Further, the event flow model supports a feedback mechanism that allows the user to "correct" the prediction results; the user locates the network event corresponding to the unexpected message through the event flow model and judges the accuracy of the prediction; if the user finds that the detection result is wrong, the correct result is submitted as new input through a REST API and training continues from step 3.
Compared with the prior art, the invention has the following beneficial technical effects:
The message anomaly detection method using deep learning in an SDN of the present invention converts the detection of anomalous network messages into a multi-class prediction problem and solves it with a deep learning method. The invention offers real-time detection, wide detection coverage and high detection accuracy without modifying the logic of switches or data packets, which solves the problem that existing detection methods interfere with normal network operation. Using a sliding window, streaming messages are converted into mappings between message sequences and predicted messages, and the output probability corresponding to a given message sequence is predicted by learning from historical messages; this effectively accounts for the influence of historical messages on the next-hop message and supports real-time anomaly detection. Because the correspondence between a message sequence and its predicted messages reflects the current network policies and state, the same message sequence corresponds to different next-hop messages in different network environments. The invention regards the top k messages by prediction probability as normal messages, which overcomes the inability of existing methods to handle the non-uniqueness of the predicted message under dynamic policies. The invention predicts the message id and the parameters of the parameter vector separately, and a newly arrived message is judged to be normal only when all of its parameters pass detection. When detection fails, the correct message sequence and predicted message are used as new input for incremental training, which solves the problem that the prior art cannot adapt to time-varying networks and allows normal operation in dynamic networks. The invention uses a long short-term memory (LSTM) network to learn from and predict historical messages; with the LSTM model, long-term dependencies can be learned even with a larger sliding window, yielding accurate detection results.
Description of the drawings
Fig. 1 is an exemplary diagram of the invention;
Fig. 2 is a diagram of the event flow model; the arrows in Fig. 2 indicate the temporal dependencies between messages.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings of the embodiments; the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present invention.
The present invention learns from the interaction messages: the message sequence corresponding to the current time is stored, the probability of each possible next-hop message is predicted, and the message that actually arrives is compared with the predicted messages; if they are inconsistent, the arriving message is judged to be an unexpected message.
Since the same network event may correspond to several normal message sequences under different control-plane policies and network environments, an event flow model is built to associate network events with message sequences; the administrator judges the correctness of the detection results through the event flow model, and through incremental updates unexpected-message detection in dynamic networks is realized.
The invention is further described below with reference to Fig. 1. The present invention provides a message anomaly detection method using deep learning in an SDN, comprising the following steps:
Step 1: monitor the OpenFlow (a network communication protocol) messages exchanged between the controller and the switches, reorder the OpenFlow message threads according to the controller's thread ids, and obtain reordered messages.
For each network event, the controller completes its processing logic within a single thread. Messages are therefore classified by thread id: messages in the same thread are sorted by arrival time, while for messages in different threads the messages within a time interval t are treated as a whole and ordered by the arrival time of the first message in the same time interval. Specifically, a time interval t is set, and within that interval the threads are ordered by the arrival time of the first message of each thread, which ultimately produces a relatively ordered time series.
In other words, message reordering sorts messages within the same event by their arrival time, and orders different events by the arrival time of the first message of each event.
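A minimal sketch of this reordering rule, assuming each captured message is represented as a (thread_id, arrival_time, payload) tuple; the tuple layout and the value of the interval t are assumptions for illustration, not prescribed by the patent.

```python
from collections import defaultdict

def reorder(messages, t=0.5):
    """Reorder captured OpenFlow messages: sort within a thread by arrival time,
    and order the threads inside each interval of length t by the arrival time
    of their first message.  `messages` is a list of (thread_id, arrival_time,
    payload) tuples, an assumed capture format."""
    ordered = []
    for interval in sorted({ts // t for _, ts, _ in messages}):
        # group the messages of this time interval by controller thread id
        threads = defaultdict(list)
        for tid, ts, payload in messages:
            if ts // t == interval:
                threads[tid].append((ts, payload))
        # threads ordered by their first message; messages inside a thread by time
        for tid in sorted(threads, key=lambda tid: min(ts for ts, _ in threads[tid])):
            ordered.extend(p for _, p in sorted(threads[tid]))
    return ordered

# Example: two interleaved threads in one interval come out grouped per thread.
msgs = [(1, 0.10, "PacketIn"), (2, 0.12, "FlowMod"), (1, 0.20, "PacketOut")]
print(reorder(msgs))   # ['PacketIn', 'PacketOut', 'FlowMod']
```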
Step 2: parse the reordered messages, generate message ids and parameter vectors to obtain sub-type messages, and write them to a log.
The reordered messages received in step 1 are parsed to generate message ids and parameter vectors. The parameter vector includes the timestamp, switch id, source ip, destination ip and forwarding action; if a parameter is absent from a message, its position in the parameter vector is set to null. Each reordered message is decomposed into its underlying sub-type messages, for example a PacketIn encapsulates an IP packet, and the IP packet in turn encapsulates a TCP packet; the present invention generates an independent message id for each underlying sub-type. The same message type is also regarded as a different message when it is exchanged between different switches and the controller, and an independent message id is generated for it: M_swId = (M_id - 1) * sw_n + sw_id, where M_swId is the encoding after incorporating the switch parameter, M_id is the encoding before incorporating the switch parameter, sw_n is the number of switches and sw_id is the id of the current switch.
The same message on different switches is thus treated as different messages. The parameter vector of every message contains the message id, timestamp, switch id, source ip address, destination ip address and forwarding action; if a message does not contain a certain parameter, the corresponding position in the parameter vector is set to empty.
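A sketch of the parsing step under an assumed dictionary layout for a captured message; only the switch-aware id formula M_swId = (M_id - 1) * sw_n + sw_id comes from the text above, while the field names are hypothetical.

```python
def encode_switch_id(m_id, sw_id, sw_n):
    """Make the same message type distinct per switch: M_swId = (M_id - 1) * sw_n + sw_id."""
    return (m_id - 1) * sw_n + sw_id

def parse(msg, sw_n):
    """Parse one reordered message into (message id, parameter vector).
    `msg` is assumed to be a dict with these keys; absent parameters map to None."""
    m_id = encode_switch_id(msg["type_id"], msg["switch_id"], sw_n)
    params = [msg.get("timestamp"), msg.get("switch_id"),
              msg.get("src_ip"), msg.get("dst_ip"), msg.get("action")]  # None if missing
    return m_id, params

# Example: a PacketIn sub-type (assumed type id 3) from switch 2 in a 4-switch network.
pkt = {"type_id": 3, "switch_id": 2, "timestamp": 12.5, "src_ip": "10.0.0.1", "dst_ip": "10.0.0.2"}
print(parse(pkt, sw_n=4))   # (10, [12.5, 2, '10.0.0.1', '10.0.0.2', None])
```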
Step 3: one-hot encode the sub-type messages according to their message ids, train on the message sequences of the sub-type messages with a long short-term memory (LSTM) network model from deep learning, and obtain a parameter matrix; the messages in the sliding window at each moment are treated as one sequence that forms the input of the LSTM model.
Unexpected-message detection is thus converted into a multi-class prediction problem over messages, solved with the LSTM network of the deep model, and a separate training model is built for each message id and for the parameter vector.
The sub-type messages obtained in step 2 are preprocessed: the message ids and the parameter vectors are normalized separately, i.e. every value in the message ids is scaled against the maximum message id to obtain a normalized message id data set, and every parameter in the parameter vector is scaled against the maximum value of that parameter to obtain a normalized parameter vector data set. Because messages in an SDN exhibit long-term dependencies, the present invention trains on the historical messages with an LSTM model consisting of one input layer, several hidden layers and one output layer. In the input layer, each message of the message set M is one-hot encoded, so that each message corresponds to a unique n-dimensional vector whose i-th element is 1 and whose other elements are 0.
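A small NumPy sketch of the preprocessing described above (one-hot encoding of message ids and scaling against the maximum value); the array shapes and example values are illustrative assumptions.

```python
import numpy as np

def one_hot(msg_ids, n):
    """One-hot encode message ids 1..n: row for id i has a 1 at position i-1."""
    return np.eye(n)[np.asarray(msg_ids) - 1]

def normalize_by_max(values):
    """Scale a column of ids or parameters against its maximum value,
    as described above for the message id and parameter data sets."""
    values = np.asarray(values, dtype=float)
    return values / values.max()

print(one_hot([1, 3], n=4))           # [[1, 0, 0, 0], [0, 0, 1, 0]]
print(normalize_by_max([2, 5, 10]))   # [0.2, 0.5, 1.0]
```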
The present invention builds a separate training model for each message id and for the parameter vector, and the loss functions of the training models fall into two classes: only the training model for the time parameter uses the MSE loss function, while the remaining models use the categorical cross-entropy loss function. The weights between the input layer, the hidden layers and the output layer constitute the parameter matrix. With minimizing the loss function as the objective, the parameter matrix is trained by gradient descent until the value of the loss function stabilizes, after which the parameter matrix is exported and stored locally.
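A minimal Keras sketch of the two kinds of training models described above: an LSTM with a softmax output trained with categorical cross-entropy for the message id, and the same backbone with an MSE loss for the time parameter. Layer sizes, the optimizer, the epoch count and the weights file name are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np
import tensorflow as tf

def build_id_model(h, n, hidden=64):
    """Next-message-id predictor: LSTM + softmax, categorical cross-entropy loss."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(h, n)),                    # window of h one-hot encoded messages
        tf.keras.layers.LSTM(hidden),
        tf.keras.layers.Dense(n, activation="softmax")   # probability over the message set M
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy")
    return model

def build_time_model(h, n, hidden=64):
    """Timestamp-parameter predictor: same LSTM backbone, MSE loss as described above."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(h, n)),
        tf.keras.layers.LSTM(hidden),
        tf.keras.layers.Dense(1)                         # normalized timestamp regression
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Toy training run on random data, only to show the call pattern.
h, n = 3, 8
X = np.random.rand(32, h, n)
y_id = np.eye(n)[np.random.randint(n, size=32)]
id_model = build_id_model(h, n)
id_model.fit(X, y_id, epochs=2, verbose=0)              # gradient descent until the loss stabilizes
id_model.save_weights("lstm_id_model.weights.h5")       # persist the learned parameter matrix locally
```

In this sketch the saved weights file plays the role of the exported parameter matrix referred to above.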
Step 4: use the sliding window to store the input message sequence at the current time, feed it into the LSTM model and parameter matrix from step 3, and obtain the prediction result via a softmax (normalized exponential function) layer.
A sliding window of size h is set to determine the historical input message sequence; the LSTM model and parameter matrix from step 3 are loaded, and the next-hop message is predicted through the softmax layer of the LSTM model. The role of the softmax layer is to convert the output of the neural network into a probability distribution P(m_t = k_i | x_t), k_i ∈ K, over the messages of the message set M, so as to realize multi-class prediction. Assuming the raw output of the neural network is [y_1, ..., y_n], the predicted probability distribution after the softmax layer is:

softmax(y)_i = exp(y_i) / (exp(y_1) + exp(y_2) + ... + exp(y_n)), for i = 1, ..., n.
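A small numeric illustration of the softmax formula above; the raw output values are made up for the example.

```python
import numpy as np

def softmax(y):
    """softmax(y)_i = exp(y_i) / sum_j exp(y_j), with the max subtracted for numerical stability."""
    e = np.exp(y - np.max(y))
    return e / e.sum()

raw = np.array([2.0, 1.0, 0.1])   # hypothetical raw LSTM outputs for three candidate messages
print(softmax(raw))               # approx. [0.659, 0.242, 0.099], a probability distribution over M
```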
Step 5: compare the k predicted messages with the highest probability in the prediction result against the OpenFlow message that actually arrives; if the arriving OpenFlow message is in the set of the top k messages by prediction probability, it is judged to be a normal message, otherwise it is judged to be an unexpected message.
The output softmax(y) from step 4 is sorted, and the k messages with the highest probability are chosen as the normal message set P. Whenever a new message m is observed, the sliding window and the predicted message set P are updated and it is checked whether m is in the set P, thereby realizing streaming prediction.
Step 6: construct an event flow model from the prediction results of step 5, associating network events with different message sequences.
Whenever there are several output messages in step 5, the event flow model has correspondingly many branches; when the probabilities of all output messages are relatively low, it is decided that a new network event has arrived, the construction of the current event flow is terminated, and a new event flow is constructed.
The event flow model is constructed from the prediction results of step 5, as shown in Fig. 2: when one input message sequence corresponds to several predicted messages, there is a temporal dependency between the last message of the input sequence and each predicted message, and each predicted message opens a new branch, which realizes streaming prediction.
The present invention processes messages as a stream through the sliding window, which requires the event flow model to recognize the arrival of new network events. Messages belonging to the same network event have strong dependencies on each other, whereas there is no obvious dependency between different network events. Therefore, during event-flow construction, if the probability of every predicted message for an input sequence is relatively low, the arriving message is marked as the initial message of a new event, the current event flow model is terminated, and a new event flow model is constructed. The present invention decides that a new message belongs to another network event by comparing these prediction probabilities against a threshold.
The event flow model provides a visual link between messages and events, which helps the administrator analyse the root of an anomaly and provides the premise for the feedback mechanism. When an anomalous message is detected, the event flow model can be used to relate network events to messages and to assess the correctness of the anomaly detection.
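A compact sketch of how such an event flow model could be maintained; the node structure, helper names and the concrete threshold value are assumptions for illustration, since the patent's threshold expression is not reproduced here.

```python
class EventFlow:
    """Tree of message branches for one network event: an edge parent -> child means
    the child message may follow the parent (a temporal dependency)."""
    def __init__(self, root_msg):
        self.edges = {root_msg: []}
        self.last = root_msg

    def extend(self, predicted):
        """Add one branch per predicted next message after the current last message."""
        self.edges.setdefault(self.last, []).extend(predicted)

def update_event_flows(flows, msg, probs, threshold=0.05):
    """If every predicted probability for the current context is below the threshold,
    treat `msg` as the first message of a new network event and start a new flow."""
    if not flows or max(probs.values()) < threshold:
        flows.append(EventFlow(msg))          # close the current flow, open a new event
    else:
        flows[-1].extend([m for m, p in probs.items() if p >= threshold])
        flows[-1].last = msg
    return flows

flows = update_event_flows([], "PacketIn", {})    # first message starts the first event
flows = update_event_flows(flows, "FlowMod", {"FlowMod": 0.7, "PacketOut": 0.2})
print(flows[-1].edges)   # {'PacketIn': ['FlowMod', 'PacketOut']}
```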
Step 7: verify whether the judgement of step 5 is correct through the event flow model of step 6; if the detection result is incorrect, use the message sequence and the correct message as new input and carry out incremental training via step 4.
The event flow model of step 6 supports a feedback mechanism that allows the user to "correct" the prediction results. For example, suppose there is an input message sequence {m1, m2, m3} with window size 3 and the corresponding prediction result is {m1: 1, m2: 0, m3: 0}. If the next-hop message is then m2, it is judged to be an unexpected message and an alarm is raised to the user. The user locates the network event corresponding to this unexpected message through the event flow model and judges the accuracy of the prediction. If the user finds that the detection result is wrong, the pair {m1, m2, m3} → m2 is submitted as new input through a REST API and training continues from step 3; the input sequence {m1, m2, m3} now corresponds to two possible outputs {m1, m2}. During this process the original data need not be retrained: it is only necessary to load the trained parameters and weights from step 3, re-initialize the model, and continue incremental training on that basis.
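A hedged Keras sketch of this incremental step: the previously trained weights are reloaded and the model is fitted only on the corrected (sequence, message) pair instead of being retrained from scratch. The builder function and weights file refer to the earlier training sketch and, like the shapes, are assumptions rather than values from the patent.

```python
import numpy as np
import tensorflow as tf

def incremental_update(build_model, weights_path, corrected_window, correct_next, h, n):
    """Re-initialize the model, load the previously trained parameter matrix and
    continue training on the corrected sample instead of retraining from scratch."""
    model = build_model(h, n)
    model.load_weights(weights_path)                    # reuse trained parameters and weights
    x = np.eye(n)[np.asarray(corrected_window) - 1][np.newaxis, ...]   # one corrected window, one-hot
    y = np.eye(n)[correct_next - 1][np.newaxis, ...]
    model.fit(x, y, epochs=1, verbose=0)                # small incremental training step
    model.save_weights(weights_path)
    return model

# Usage (assuming build_id_model and the weights file from the earlier sketch): the window
# {m1, m2, m3} was wrongly flagged and the operator confirms m2 is also a valid next message,
# so (m1, m2, m3) -> m2 is fed back as a new training sample.
# incremental_update(build_id_model, "lstm_id_model.weights.h5", [1, 2, 3], 2, h=3, n=8)
```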

Claims (10)

1. A message anomaly detection method using deep learning in an SDN, characterized by comprising the following steps:
Step 1: monitor the OpenFlow messages exchanged between the controller and the switches, reorder the OpenFlow message threads according to the controller's thread ids, and obtain reordered messages;
Step 2: parse the reordered messages, generate message ids and parameter vectors to obtain sub-type messages, and write them to a log;
Step 3: one-hot encode the sub-type messages according to their message ids, train on the message sequences of the sub-type messages with a long short-term memory (LSTM) network model from deep learning, and obtain a parameter matrix; the messages in the sliding window at each moment are treated as one sequence that forms the input of the LSTM model;
Step 4: use the sliding window to store the input message sequence at the current time, feed it into the LSTM model and parameter matrix from step 3, and obtain the prediction result via a softmax layer;
Step 5: compare the k predicted messages with the highest probability in the prediction result with the OpenFlow message that actually arrives; if the arriving OpenFlow message is in the set of the top k messages by prediction probability, it is judged to be a normal message, otherwise it is judged to be an unexpected message.
2. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that, in step 1, the messages are classified by thread id; messages in the same thread are sorted by time, while for messages in different threads the messages within a time interval t are treated as a whole and sorted by the arrival time of the first message in the same time interval.
3. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that, in step 2, the parameter vector includes the timestamp, switch id, source ip, destination ip and forwarding action.
4. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that, in step 3, the LSTM network model of the deep model is used for the solution, and a separate training model is built for each message id and for the parameter vector.
5. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that the sub-type messages obtained in step 2 are preprocessed: the message ids and the parameter vectors are normalized separately, i.e. every value in the message ids is scaled against the maximum message id to obtain a normalized message id data set, and every parameter in the parameter vector is scaled against the maximum value of that parameter to obtain a normalized parameter vector data set.
6. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that, in step 4, a sliding window of size h is set to determine the historical input message sequence; the LSTM model and parameter matrix from step 3 are loaded, and the next-hop message is predicted through the softmax layer of the LSTM model, where the role of the softmax layer is to convert the output of the neural network into a probability distribution P(m_t = k_i | x_t), k_i ∈ K, over the messages of the message set M, so as to realize multi-class prediction; if the raw output of the neural network is [y_1, ..., y_n], the predicted probability distribution after the softmax layer is softmax(y)_i = exp(y_i) / (exp(y_1) + ... + exp(y_n)).
7. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that, in step 5, the prediction results of step 4 are sorted and the k messages with the highest probability are chosen as the normal message set P; whenever a new message m is observed, the sliding window and the predicted message set P are updated and it is checked whether m is in the set P.
8. The message anomaly detection method using deep learning in an SDN according to claim 1, characterized in that an event flow model is constructed from the prediction results of step 5, associating network events with different message sequences; the event flow model is used to verify whether the judgement of step 5 is correct, and if it is not, the message sequence and the correct message are used as new input for incremental training via step 4.
9. The message anomaly detection method using deep learning in an SDN according to claim 8, characterized in that the event flow model is constructed from the prediction results; when one input message sequence corresponds to several predicted messages, there is a temporal dependency between the last message of the input sequence and each predicted message, each predicted message corresponds to a new branch, and streaming prediction is realized.
10. The message anomaly detection method using deep learning in an SDN according to claim 8, characterized in that the event flow model supports a feedback mechanism that allows the user to "correct" the prediction results; the user locates the network event corresponding to the unexpected message through the event flow model and judges the accuracy of the prediction; if the user finds that the detection result is wrong, the correct result is submitted as new input through a REST API and training continues from step 3.
CN201910798422.0A 2019-08-27 2019-08-27 Message anomaly detection method adopting deep learning in SDN Active CN110535723B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910798422.0A CN110535723B (en) 2019-08-27 2019-08-27 Message anomaly detection method adopting deep learning in SDN

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910798422.0A CN110535723B (en) 2019-08-27 2019-08-27 Message anomaly detection method adopting deep learning in SDN

Publications (2)

Publication Number Publication Date
CN110535723A true CN110535723A (en) 2019-12-03
CN110535723B (en) 2021-01-19

Family

ID=68664560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910798422.0A Active CN110535723B (en) 2019-08-27 2019-08-27 Message anomaly detection method adopting deep learning in SDN

Country Status (1)

Country Link
CN (1) CN110535723B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111190804A (en) * 2019-12-28 2020-05-22 同济大学 Multi-level deep learning log fault detection method for cloud native system
CN111865667A (en) * 2020-06-28 2020-10-30 新华三技术有限公司 Network connectivity fault root cause positioning method and device
CN112016701A (en) * 2020-09-09 2020-12-01 四川大学 Abnormal change detection method and system integrating time sequence and attribute behaviors
CN112069787A (en) * 2020-08-27 2020-12-11 西安交通大学 Log parameter anomaly detection method based on word embedding
CN112187639A (en) * 2020-08-31 2021-01-05 西安交通大学 Method and system for generating data packet path code based on stream attribute
CN113093695A (en) * 2021-03-23 2021-07-09 武汉大学 Data-driven SDN controller fault diagnosis system
CN113992562A (en) * 2021-09-16 2022-01-28 新华三大数据技术有限公司 Method and system for updating routing information and routing analyzer


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3031174A1 (en) * 2013-08-09 2016-06-15 NEC Laboratories America, Inc. Hybrid network management
CN105809248A (en) * 2016-03-01 2016-07-27 中山大学 Method for configuring DANN onto SDN and an interaction method between them
WO2018068867A1 (en) * 2016-10-14 2018-04-19 Telefonaktiebolaget Lm Ericsson (Publ) Time-parallelized integrity testing of software code
US20180109557A1 (en) * 2016-10-17 2018-04-19 Foundation Of Soongsil University Industry Cooperation SOFTWARE DEFINED NETWORK CAPABLE OF DETECTING DDoS ATTACKS USING ARTIFICIAL INTELLIGENCE AND CONTROLLER INCLUDED IN THE SAME
CN107743147A (en) * 2017-11-06 2018-02-27 柏域信息科技(上海)有限公司 Dynamic cloud storage SDN controllers collocation method and device based on global optimization
CN108712292A (en) * 2018-05-29 2018-10-26 广州大学 A kind of network flow type prediction method based on deep learning
CN109039942A (en) * 2018-08-29 2018-12-18 南京优速网络科技有限公司 A kind of Network Load Balance system and equalization methods based on deeply study
CN109120630A (en) * 2018-09-03 2019-01-01 上海海事大学 A kind of SDN network ddos attack detection method based on Optimized BP Neural Network
CN109274673A (en) * 2018-09-26 2019-01-25 广东工业大学 A kind of detection of exception of network traffic and defence method
CN109547251A (en) * 2018-11-27 2019-03-29 广东电网有限责任公司 A kind of operation system failure and performance prediction method based on monitoring data
CN109743261A (en) * 2019-01-07 2019-05-10 中国人民解放军国防科技大学 SDN-based container network resource scheduling method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111190804A (en) * 2019-12-28 2020-05-22 同济大学 Multi-level deep learning log fault detection method for cloud native system
CN111865667A (en) * 2020-06-28 2020-10-30 新华三技术有限公司 Network connectivity fault root cause positioning method and device
CN111865667B (en) * 2020-06-28 2023-10-17 新华三技术有限公司 Network connectivity fault root cause positioning method and device
CN112069787A (en) * 2020-08-27 2020-12-11 西安交通大学 Log parameter anomaly detection method based on word embedding
CN112187639A (en) * 2020-08-31 2021-01-05 西安交通大学 Method and system for generating data packet path code based on stream attribute
CN112187639B (en) * 2020-08-31 2021-11-19 西安交通大学 Method and system for generating data packet path code based on stream attribute
CN112016701A (en) * 2020-09-09 2020-12-01 四川大学 Abnormal change detection method and system integrating time sequence and attribute behaviors
CN112016701B (en) * 2020-09-09 2023-09-15 四川大学 Abnormal change detection method and system integrating time sequence and attribute behaviors
CN113093695A (en) * 2021-03-23 2021-07-09 武汉大学 Data-driven SDN controller fault diagnosis system
CN113992562A (en) * 2021-09-16 2022-01-28 新华三大数据技术有限公司 Method and system for updating routing information and routing analyzer

Also Published As

Publication number Publication date
CN110535723B (en) 2021-01-19

Similar Documents

Publication Publication Date Title
CN110535723A (en) The message method for detecting abnormality of deep learning is used in a kind of SDN
CN109086889B (en) Terminal fault diagnosis method, device and system based on neural network
CN106168799B (en) A method of batteries of electric automobile predictive maintenance is carried out based on big data machine learning
AU2019200498B2 (en) Telecommunications network troubleshooting systems
US10903554B2 (en) Machine learning models for detecting the causes of conditions of a satellite communication system
EP3460496B1 (en) A method and apparatus for automatic localization of a fault
EP2997756B1 (en) Method and network device for cell anomaly detection
CN108989075A (en) A kind of network failure locating method and system
CN104794057B (en) A kind of crossing event automated testing method and device
CN107274011A (en) The equipment state recognition methods of comprehensive Markov model and probability net
CN107103359A (en) The online Reliability Prediction Method of big service system based on convolutional neural networks
CN106841928A (en) A kind of Fault Section Location of Distribution Network and system based on Multi-source Information Fusion
CN103926490A (en) Power transformer comprehensive diagnosis method with self-learning function
CN108090606A (en) Equipment fault finds method and system
CN115455746B (en) Nuclear power device operation monitoring data anomaly detection and correction integrated method
CN111650472A (en) Method for positioning voltage sag source
CN111124852A (en) Fault prediction method and system based on BMC health management module
CN114647525A (en) Diagnostic method, diagnostic device, terminal and storage medium
KR20190107523A (en) System and method for handling network failure using syslog
CN108880909A (en) A kind of network energy-saving method and device based on intensified learning
Scheffel et al. Increasing sensor reliability through confidence attribution
KR20200039877A (en) Apparatus for detecting change of electric load and method thereof
CN112507720A (en) Graph convolution network root identification method based on causal semantic relation transfer
CN106874423A (en) search control method and system
CN109547248A (en) Based on artificial intelligence in orbit aerocraft ad hoc network method for diagnosing faults and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant