CN112147432A - BiLSTM module based on attention mechanism, transformer state diagnosis method and system - Google Patents


Info

Publication number
CN112147432A
CN112147432A (application CN202010861283.4A)
Authority
CN
China
Prior art keywords
transformer
layer
bilstm
module
attention
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010861283.4A
Other languages
Chinese (zh)
Inventor
陈洪岗
王劭菁
任茂鑫
任辰
徐鹏
盛戈皞
宋辉
江秀臣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
State Grid Shanghai Electric Power Co Ltd
East China Power Test and Research Institute Co Ltd
Original Assignee
Shanghai Jiaotong University
State Grid Shanghai Electric Power Co Ltd
East China Power Test and Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University, State Grid Shanghai Electric Power Co Ltd and East China Power Test and Research Institute Co Ltd
Priority to CN202010861283.4A
Publication of CN112147432A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01R MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
    • G01R31/00 Arrangements for testing electric properties; Arrangements for locating electric faults; Arrangements for electrical testing characterised by what is being tested not provided for elsewhere
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Testing Electric Properties And Detecting Electric Faults (AREA)

Abstract

The invention discloses an attention-based BiLSTM module for transformer state diagnosis, comprising: an input layer for inputting raw data characterizing the transformer state type; a feature extraction layer comprising a plurality of BiLSTM layers of different scales and a fusion layer, wherein each BiLSTM layer extracts features at a different scale from the raw data and feeds them into the fusion layer, and the fusion layer splices the multi-scale features to form an original feature matrix; an attention module for optimizing the weight parameters of the original feature matrix to obtain an optimized feature matrix; and a classification layer for classifying the data characterizing the transformer state and outputting the transformer state type based on the optimized feature matrix. In addition, the invention discloses a transformer state diagnosis method and, correspondingly, a transformer state diagnosis system for implementing the diagnosis method.

Description

BiLSTM module based on attention mechanism, transformer state diagnosis method and system
Technical Field
The invention relates to a fault diagnosis method and system, and in particular to a transformer fault diagnosis method and system.
Background
The transformer is one of the most important devices in a power system and is key to its safe, reliable, economical and high-quality operation. It should be noted, however, that power transformer failures can be induced by factors such as natural insulation aging, severe environmental conditions and excessive operating load, and can cause serious social and economic losses.
Exploiting the differentiated expression of different fault types in the index attributes helps to identify the fault type accurately, which in turn provides important guidance for maintaining in-service transformers and formulating an appropriate maintenance strategy.
It should be noted that most conventional transformer fault diagnosis methods rely on expert knowledge and extract features from the original signals by hand, which is inefficient and ill-suited to the rapidly growing mass of data.
In the prior art there are methods that identify the transformer state with a long short-term memory (LSTM) neural network. The LSTM is a deep recurrent neural network well suited to processing time-series data; it overcomes the inability of traditional algorithms to learn long-term feature relations as well as the vanishing-gradient problem. LSTM networks have therefore been used for motor fault diagnosis with good results.
There is also a prior-art method that predicts the remaining service life of a turbofan engine by extracting features from engine signals with an autoencoder and then capturing the bidirectional long-range dependence of those features with a bidirectional long short-term memory (BiLSTM) neural network.
However, despite these good results, LSTM and BiLSTM networks still fall short in the depth and complexity of their feature extraction.
On this basis, and to overcome the above defects, the invention provides an attention-based BiLSTM module for transformer state diagnosis that takes the raw vibration signal as model input, adaptively extracts multi-scale features from it with several groups of BiLSTM networks, and introduces an attention mechanism to optimize the feature weight parameters at different scales, improving the model's diagnostic accuracy and achieving effective diagnosis of fault states.
Disclosure of Invention
One purpose of the invention is to provide an attention-based BiLSTM module for transformer state diagnosis that takes the raw vibration signal as model input, adaptively extracts multi-scale features from it with several groups of BiLSTM networks, and introduces an attention mechanism to optimize the feature weight parameters at different scales, improving the model's diagnostic accuracy and achieving effective diagnosis of fault states.
The attention-based BiLSTM module diagnoses transformer faults accurately and effectively, runs efficiently, and retains good identification and diagnosis capability on minority fault samples while maintaining overall classification performance and operating efficiency.
In accordance with the above object, the present invention provides an attention-based BiLSTM module for transformer status diagnosis, comprising:
an input layer for inputting raw data characterizing the transformer state type;
a feature extraction layer comprising a plurality of BiLSTM layers of different scales and a fusion layer, wherein each BiLSTM layer extracts features at a different scale from the raw data and feeds them into the fusion layer, and the fusion layer splices the multi-scale features to form an original feature matrix;
an attention module for optimizing the weight parameters of the original feature matrix to obtain an optimized feature matrix;
and a classification layer for classifying the data characterizing the transformer state and outputting the transformer state type based on the optimized feature matrix.
Further, in the attention-based BiLSTM module for transformer state diagnosis according to the present invention, the plurality of BiLSTM layers at least comprise a single-layer BiLSTM, a double-layer BiLSTM and a triple-layer BiLSTM.
Further, in the attention-based BiLSTM module for transformer state diagnosis according to the present invention, the classification layer comprises a fully connected layer and a Softmax classifier, wherein the fully connected layer transforms the optimized feature matrix output by the attention module into a one-dimensional sequence, and the Softmax classifier classifies the data characterizing the transformer state and outputs the transformer state type.
Further, in the attention-based BiLSTM module for transformer state diagnosis according to the present invention, the BiLSTM module employs a cross-entropy loss function when outputting the transformer state type.
Accordingly, another objective of the present invention is to provide a transformer state diagnosis method that diagnoses transformer faults accurately and effectively, runs efficiently, and retains good identification and diagnosis capability on minority fault samples while maintaining overall classification performance and operating efficiency.
According to the above object, the present invention provides a transformer status diagnosis method, which comprises the steps of:
(1) collecting transformer oil chromatography sample data of different state types;
(2) preprocessing the collected transformer oil chromatography sample data;
(3) constructing the above attention-based BiLSTM module and training it with the preprocessed transformer oil chromatography sample data, the preprocessed data being input to the input layer during training as the raw data characterizing the transformer state;
(4) inputting the measured transformer oil chromatography sample data into the input layer of the trained attention-based BiLSTM module as raw data characterizing the transformer state type, the classification layer of the module then outputting the transformer state type.
Further, in the transformer state diagnosis method according to the present invention, the preprocessing in step (2) includes normalization.
Further, in the transformer state diagnosis method of the present invention, in step (1) the transformer state types include low-energy discharge, high-energy discharge, low-energy discharge with overheating, high-energy discharge with overheating, partial discharge, medium-temperature overheating, low-temperature overheating and high-temperature overheating.
In addition, another objective of the present invention is to provide a transformer state diagnosis system that diagnoses transformer faults accurately and effectively, runs efficiently, and retains good identification and diagnosis capability on minority fault samples while maintaining overall classification performance and operating efficiency.
In accordance with the above object, the present invention provides a transformer status diagnosis system, comprising:
the data acquisition device is used for acquiring transformer oil chromatography sample data of different state types as well as measured transformer oil chromatography sample data;
the preprocessing unit is used for preprocessing the acquired and the measured transformer oil chromatography sample data;
a control module that performs the steps of:
constructing the above attention-based BiLSTM module and training it with the preprocessed transformer oil chromatography sample data, the preprocessed data being input to the input layer during training as the raw data characterizing the transformer state;
and inputting the measured transformer oil chromatography sample data into the input layer of the trained attention-based BiLSTM module as raw data characterizing the transformer state type, the classification layer of the module then outputting the transformer state type.
Further, in the transformer state diagnosis system of the present invention, the preprocessing unit performs normalization processing.
Compared with the prior art, the BiLSTM module based on the attention mechanism, the transformer state diagnosis method and the system have the following advantages and beneficial effects:
the BilSTM module based on the attention mechanism can take original vibration signals as model input, utilizes a plurality of groups of BilSTM networks to extract multi-scale features from the original vibration signals in a self-adaptive manner, introduces the attention mechanism, optimizes feature weight parameters under different scales, improves the model diagnosis precision and realizes effective diagnosis of fault states.
The BilSTM module based on the attention mechanism can accurately and effectively diagnose the transformer faults, has high operation efficiency, and has good identification and diagnosis capability on a small number of fault samples while ensuring the integral classification performance and the operation efficiency.
Accordingly, the transformer state diagnosis method and system of the invention also have the advantages and beneficial effects.
Drawings
Fig. 1 schematically shows the structure of an LSTM neural network.
Fig. 2 is a fault diagnosis flowchart of the attention-based BiLSTM module for transformer state diagnosis according to an embodiment of the present invention.
Fig. 3 schematically shows the principle of attention-mechanism feature optimization.
Fig. 4 is a schematic diagram of the fault diagnosis process of the transformer state diagnosis method according to an embodiment of the present invention.
Fig. 5 schematically shows an objective function fitting distribution model of the transformer state diagnosis method according to an embodiment of the present invention.
Fig. 6 schematically shows a minimum variation curve of an objective function of the transformer state diagnosis method according to an embodiment of the present invention.
Detailed Description
The attention-based BiLSTM module, the transformer state diagnosis method and the system according to the present invention will be further explained and illustrated with reference to the drawings and the specific embodiments of the specification, which, however, should not be construed as unduly limiting the technical solution of the present invention.
Fig. 1 schematically shows the structure of an LSTM neural network.
As shown in Fig. 1, the figure schematically illustrates the structure of a long short-term memory (LSTM) neural network. In a deep network, when a single time series is long, an RNN suffers from vanishing and exploding gradients and is difficult to train. The memory cell concept was therefore introduced into the RNN, yielding the LSTM model, whose forward-propagation chain structure is shown in Fig. 1.
It should be noted that the bidirectional LSTM (BiLSTM) is a variant structure formed by introducing positive and negative time directions on the basis of the LSTM, borrowing the way humans understand text from its context. Each hidden-layer unit of the BiLSTM stores two pieces of information, A and A*: A participates in the forward operation, A* participates in the backward operation, and the two jointly determine the final output value y. In the forward operation, the hidden-layer unit S_t is correlated with S_(t-1); in the backward operation, S*_t is correlated with S*_(t+1). For time series not constrained by causality, the BiLSTM exploits both the known time order and the reversed order; the forward and backward operations deepen the level of feature extraction on the original sequence, which effectively improves the accuracy of the model's output.
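The forward and backward operations described above can be sketched with a toy recurrence. This is a minimal illustration only, not the patent's model: a single tanh step stands in for the full LSTM cell, and all shapes and weights are invented for the example.

```python
import numpy as np

def cell(h, x, W, U):
    # Hypothetical stand-in for an LSTM cell: a single tanh recurrence step.
    return np.tanh(W @ x + U @ h)

def bidirectional_pass(xs, W, U, hidden_dim):
    h = np.zeros(hidden_dim)
    forward = []                      # A: forward-direction states
    for x in xs:
        h = cell(h, x, W, U)
        forward.append(h)
    h = np.zeros(hidden_dim)
    backward = []                     # A*: backward-direction states
    for x in reversed(xs):
        h = cell(h, x, W, U)
        backward.append(h)
    backward.reverse()
    # Each time step keeps both directions; their concatenation forms the output.
    return [np.concatenate([f, b]) for f, b in zip(forward, backward)]

rng = np.random.default_rng(0)
T, d, hdim = 5, 3, 4
xs = [rng.standard_normal(d) for _ in range(T)]
W = 0.1 * rng.standard_normal((hdim, d))
U = 0.1 * rng.standard_normal((hdim, hdim))
out = bidirectional_pass(xs, W, U, hdim)
print(len(out), out[0].shape)  # 5 time steps, each with 2*hdim features
```

The output at each step thus sees both past and future context, which is what deepens the feature extraction relative to a unidirectional pass.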
Therefore, BiLSTM tends to achieve better results than unidirectional LSTM on time-series problems. Compared with other neural network structures, LSTM also simulates human behavioral logic and neurocognitive processes more realistically.
It should be noted that the core technical feature of the present invention is an attention-based BiLSTM module for transformer state diagnosis; its fault diagnosis process is shown in Fig. 2.
Fig. 2 is a fault diagnosis flowchart of a BiLSTM module based on an attention mechanism for transformer state diagnosis according to an embodiment of the present invention.
As shown in fig. 2, in this embodiment, the attention-based BiLSTM module for transformer status diagnosis according to the present invention may include: the system comprises an input layer, a feature extraction layer, an attention module and a classification layer. Wherein the input layer may be used to input raw data characterizing the transformer state type.
It should be noted that the feature extraction layer includes a plurality of BiLSTM layers of different scales and a fusion layer: each BiLSTM layer extracts features at a different scale from the raw data and feeds them into the fusion layer, and the fusion layer splices the multi-scale features, thereby forming the original feature matrix.
With continued reference to Fig. 2, in this embodiment the BiLSTM layers include a single-layer BiLSTM, a double-layer BiLSTM and a triple-layer BiLSTM.
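The splice performed by the fusion layer amounts to concatenating the per-branch feature vectors. A minimal sketch, with hypothetical feature sizes not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-branch feature vectors for one sample; the sizes are
# invented for illustration only.
f_single = rng.random(16)   # single-layer BiLSTM branch
f_double = rng.random(16)   # double-layer BiLSTM branch
f_triple = rng.random(16)   # triple-layer BiLSTM branch

# The fusion layer splices the multi-scale features into one row of the
# original feature matrix.
original_features = np.concatenate([f_single, f_double, f_triple])
print(original_features.shape)  # (48,)
```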
Accordingly, the attention module may perform weight parameter optimization on the original feature matrix formed by the fusion layer, so as to obtain an optimized feature matrix. The classification layer can classify the data for characterizing the transformer state and output the transformer state type based on the optimized feature matrix.
It is noted that, as shown in Fig. 2, in the present embodiment the classification layer includes a fully connected layer and a Softmax classifier. The main role of the fully connected layer is to flatten the optimized feature matrix output by the attention module into a one-dimensional sequence. The Softmax classifier then classifies the data characterizing the transformer state and outputs the transformer state type.
In the attention-based BiLSTM module, a cross-entropy loss function is adopted; the accuracy of fault type identification is obtained by comparing the similarity between the probability distribution of the prediction output by the Softmax classifier and the probability distribution of the target category. The cross-entropy loss effectively overcomes the slow weight updates of the traditional mean-square-error loss function.
Fig. 3 schematically shows a schematic diagram of the attention mechanism feature optimization.
For transformer fault diagnosis, the common practice is to obtain a feature matrix of the equipment state through a series of feature extraction steps and then perform the diagnosis. However, since the different features in the feature matrix do not contribute equally to the diagnosis, an attention mechanism is introduced to screen the extracted features. The principle of attention-mechanism feature optimization is shown in Fig. 3.
As shown in Fig. 3, with reference to Fig. 2, Fig. 3 schematically illustrates how the attention module optimizes the original feature matrix. Denote the sample label by Y_i and the fault feature by K_i, where K_i = {K_i1, K_i2, …, K_in}, i indexes the samples, and K_ij (j = 1, 2, …, n) is the j-th feature. A fully connected neural network computes for each feature K_ij a feature weight parameter W_ij, where W_ij indicates the correlation between Y_ij, the target value obtained from K_ij, and Y_i. Softmax normalization then yields a probability distribution in which all feature weights sum to 1; weighting the original features K_ij with these weights gives the optimized feature matrix.
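The weighting described above can be sketched as follows; a fixed random vector stands in for the fully connected network that would produce the scores W_ij, so the numbers are purely illustrative:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax; the resulting weights sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(1)
K = rng.standard_normal(6)       # one sample's raw features K_ij
scores = rng.standard_normal(6)  # stand-in for the FC network's outputs

weights = softmax(scores)        # probability distribution over features
optimized = weights * K          # weighted (optimized) features
print(round(float(weights.sum()), 6), optimized.shape)
```

Features whose scores are high thus dominate the optimized matrix, which is how the attention module screens out features that contribute little to the diagnosis.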
Fig. 4 is a schematic diagram of a fault diagnosis process of the transformer state diagnosis method according to an embodiment of the present invention.
As shown in fig. 4, in the present embodiment, the transformer state diagnosis method according to the present invention includes the steps of:
(1) collecting transformer oil chromatography sample data of different state types;
(2) preprocessing the collected transformer oil chromatography sample data;
(3) constructing the attention-based BiLSTM module and training it with the preprocessed transformer oil chromatography sample data, the preprocessed data being input to the input layer during training as the raw data characterizing the transformer state;
(4) inputting the measured transformer oil chromatography sample data into the input layer of the trained attention-based BiLSTM module as raw data characterizing the transformer state type, the classification layer of the module then outputting the transformer state type.
However, in actual oil chromatography fault samples the values of some characteristic gases grow exponentially, so samples of the same fault type can lie far apart, which strongly affects the kNN algorithm, whose classification is based on measured distance. Also, to reduce the influence of fluctuations in the absolute concentration of each characteristic gas across cases, the collected transformer oil chromatography sample data must be preprocessed in step (2), and this preprocessing may include normalization.
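The normalization step is not specified in detail here; one plausible reading is per-gas min-max scaling, sketched below with invented H2 values:

```python
import numpy as np

def minmax_normalize(x):
    # Min-max scaling of one characteristic-gas column to [0, 1];
    # an assumed choice of normalization, not confirmed by the patent.
    lo, hi = x.min(), x.max()
    return (x - lo) / (hi - lo)

# Invented H2 concentrations (ppm) for illustration.
h2 = np.array([12.0, 150.0, 980.0, 45.0, 3100.0])
normalized = minmax_normalize(h2)
print(normalized.min(), normalized.max())
```

Scaling each gas column independently keeps an exponentially growing gas from dominating the distance between samples.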
With reference to Figs. 2, 3 and 4, the application of the transformer state diagnosis method of the present invention is further illustrated on a data set of 662 sample groups in total, formed from the fault case library of a power grid company and oil chromatography data published in the related literature.
In the present invention, the transformer state diagnosis system of the present invention may be used to perform the transformer state diagnosis method of the present invention.
In this embodiment, the transformer state diagnosis system according to the present invention includes: data acquisition device, preprocessing unit and control module. The data acquisition device can acquire the transformer oil chromatographic sample data of different state types and the actually measured transformer oil chromatographic sample data, and the preprocessing unit can preprocess the acquired transformer oil chromatographic sample data and the actually measured transformer oil chromatographic sample data, wherein the preprocessing can include normalization processing.
Accordingly, in the transformer state diagnosis system according to the present invention, the control module may perform the following steps:
constructing the attention-based BiLSTM module and training it with the preprocessed transformer oil chromatography sample data, the preprocessed data being input to the input layer during training as the raw data characterizing the transformer state;
and inputting the measured transformer oil chromatography sample data into the input layer of the trained attention-based BiLSTM module as raw data characterizing the transformer state type, the classification layer of the module then outputting the transformer state type.
In this embodiment, each sample in the power grid company's fault case library contains eight characterizing parameters: H2, CH4, C2H2, C2H4, C2H6, CO, CO2 and total hydrocarbons. The transformer fault types are divided into eight classes: low-energy discharge (LD), high-energy discharge (HD), low-energy discharge with overheating (LDT), partial discharge (PD), medium-temperature overheating (MT, 300 ℃ < T < 700 ℃), low-temperature overheating (LT, T < 300 ℃), high-energy discharge with overheating (HDT) and high-temperature overheating (HT, T > 700 ℃). Of these, 467 samples are taken as the training set and 195 as the test set, used for parameter training and generalization testing of the model; the sample distribution of the data set is shown in Table 1.
TABLE 1
State type | Total samples | Training samples | Test samples
LD         | 80            | 56               | 24
HD         | 279           | 196              | 83
LDT        | 90            | 63               | 27
MT         | 48            | 34               | 14
PD         | 31            | 22               | 9
HT         | 96            | 68               | 28
LT         | 24            | 18               | 6
HDT        | 14            | 10               | 4
Total      | 662           | 467              | 195
Then, based on the oil chromatography data of 1104 historical fault samples from the power grid company, the support of each oil-chromatography gas parameter is calculated by the above method to obtain the initial value M0 of the measurement matrix. Taking H2 and CH4 as an example, there are 37 samples in which both parameter values simultaneously exceed their corresponding means. The association-rule support S is given by formula (1):
S(X→Y) = count(X∪Y) / |T|   (1)
where count(X∪Y) is the number of transactions in T that contain the item set, and |T| is the total number of records in the transaction database.
From equation (1) it can be calculated that S(CH4→H2) = S(H2→CH4) = 37/1104 ≈ 0.0335145. The remaining parameters are calculated in the same way, giving the 8×8 matrix shown in Table 2.
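Formula (1) for the H2/CH4 example reduces to a single division:

```python
# Support S(X→Y) = count(X∪Y) / |T| for the H2/CH4 example:
# 37 of the 1104 historical samples exceed both parameter means.
count_xy = 37
total_records = 1104
support = count_xy / total_records
print(round(support, 7))  # 0.0335145
```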
Table 2 lists the initial matrix for quantitative correlation of oil chromatography sample parameters.
Table 2.
                   | H2    | CH4   | C2H2  | C2H4  | C2H6  | CO    | CO2    | Total hydrocarbons
H2                 | 3.351 | 4.076 | 5.616 | 3.623 | 2.627 | 2.899 | 2.264  | 4.62
CH4                | 2.808 | 5.435 | 3.623 | 6.069 | 1.812 | 2.264 | 2.536  | 5.163
C2H2               | 1.721 | 2.083 | 2.627 | 1.812 | 3.351 | 1.359 | 1.268  | 2.355
C2H4               | 3.351 | 6.341 | 4.076 | 5.435 | 2.083 | 2.808 | 2.174  | 5.344
C2H6               | 3.533 | 3.351 | 3.351 | 2.808 | 1.721 | 2.627 | 1.449  | 3.08
CO                 | 2.627 | 2.808 | 2.899 | 2.264 | 1.359 | 5.254 | 2.627  | 2.627
CO2                | 1.449 | 2.174 | 2.264 | 2.536 | 1.268 | 2.627 | 32.428 | 2.355
Total hydrocarbons | 3.08  | 5.344 | 4.62  | 5.163 | 2.355 | 2.627 | 2.355  | 6.703
Fig. 5 schematically shows an objective function fitting distribution model of the transformer state diagnosis method according to an embodiment of the present invention.
Fig. 6 schematically shows a minimum variation curve of an objective function of the transformer state diagnosis method according to an embodiment of the present invention.
As shown in Figs. 5 and 6, Fig. 5 is the objective-function distribution model obtained from the historical observation set, where the smaller dots indicate sampled observation points and the larger dot is the best feasible estimate, i.e. the acquisition point with the lowest estimated function value under the latest model. Fig. 6 plots the minimum of the objective-function historical observation set against the number of training iterations; it can be seen that training the model with the optimized hyperparameters increases the fault-classification accuracy on the test set and strengthens the model's diagnostic performance.
The transformer state diagnosis system of the invention is used to diagnose transformer faults and, for comparison, four conventional methods are applied to the same task, based respectively on a support vector machine (SVM) with a radial basis function (RBF) kernel, a three-layer BP neural network, kNN and NCA-kNN. Diagnostic accuracy and running time are compared; by fault-sample count, partial discharge (PD), low-temperature overheating (LT) and high-energy discharge with overheating (HDT) are treated as minority classes. The test-set diagnostic accuracy of each model is shown in Table 3.
Table 3.
It should be noted that, to ensure a fair comparison, the same Bayesian optimization algorithm is used to tune the hyperparameters of each model, with the learning rate set to 0.001 and the precision to 1e-5; the SVM additionally uses inter-class imbalance weight adjustment during training.
As shown in Table 4, the conventional NCA-kNN performed best among the five methods in overall diagnostic accuracy, reaching 92.8%, while the accuracy of the attention-mechanism-based BiLSTM model of the present invention was 91.3%. In terms of running time, both kNN models (kNN and NCA-kNN) outperformed the BPNN and SVM algorithms, requiring only about 1/2 to 1/3 of their time.
However, in terms of minority-class classification accuracy, i.e., recall, the attention-mechanism-based BiLSTM module used in the transformer state diagnosis system performed best, reaching 78.9%, with no individual fault type below 60%, and was more stable than the other models. The BPNN model, which applies no countermeasure for training on imbalanced data, achieved a minority-class accuracy of only 47.4%, the worst of all the models. Although the SVM's inter-class imbalance weight adjustment slightly narrowed the performance gap between minority-class and majority-class samples, the effect was still not ideal.
With a total accuracy only 1.5% below the overall best, the attention-mechanism-based BiLSTM module for transformer state diagnosis improved minority-class accuracy by 15%-31% relative to the other models, showing good recognition and diagnosis capability for minority-class samples while preserving overall classification performance and running efficiency.
In conclusion, the attention-mechanism-based BiLSTM module for transformer state diagnosis can diagnose transformer faults accurately and effectively with high running efficiency, and identifies minority-class fault samples well while maintaining overall classification performance.
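The attention module described above can be read as assigning a softmax weight to each row of the original feature matrix and pooling the rows accordingly. A minimal sketch with fixed illustrative scores standing in for learned scoring parameters (`attention_pool` and its inputs are hypothetical names and values, not the patent's implementation):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(feature_matrix, scores):
    """Weight each row of the feature matrix by its softmax attention
    weight and sum the rows, yielding one optimized feature vector."""
    weights = softmax(scores)
    dim = len(feature_matrix[0])
    pooled = [0.0] * dim
    for w, row in zip(weights, feature_matrix):
        for j, v in enumerate(row):
            pooled[j] += w * v
    return pooled

# Three fused multi-scale feature rows and illustrative relevance
# scores in place of learned attention parameters.
features = [[0.2, 1.0], [0.9, 0.1], [0.4, 0.5]]
print(attention_pool(features, scores=[2.0, 0.5, 1.0]))
```

Because the weights sum to 1, the pooled vector is a convex combination of the rows, biased toward the rows the (learned) scores mark as most relevant to the fault class.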
Accordingly, the transformer state diagnosis method and system of the present invention also possess the above-described advantages and beneficial effects.
It should be noted that the prior art within the protection scope of the present invention is not limited to the examples given in this application; any prior art that is not inconsistent with the technical scheme of the present invention, including but not limited to prior patent documents and prior publications, may fall within the protection scope of the present invention.
In addition, combinations of the features of the present application are not limited to those described in the claims or in the embodiments; all features described in the present application may be freely combined in any manner unless they contradict one another.
It should also be noted that the above-listed embodiments are only specific embodiments of the present invention. The present invention is evidently not limited to these embodiments; similar changes or modifications can readily be derived by those skilled in the art from the disclosure of the present invention and shall fall within the protection scope of the present invention.

Claims (9)

1. An attention-mechanism-based BiLSTM module for transformer state diagnosis, comprising:
an input layer for inputting raw data characterizing the transformer state type;
a feature extraction layer comprising a plurality of BiLSTM layers of different scales and a fusion layer, wherein each BiLSTM layer extracts features of a different scale from the raw data and inputs the extracted features into the fusion layer, and the fusion layer concatenates the features of the different scales to form an original feature matrix;
an attention module for optimizing weight parameters of the original feature matrix to obtain an optimized feature matrix;
and a classification layer for classifying, based on the optimized feature matrix, the data characterizing the transformer state and outputting the transformer state type.
2. The attention-mechanism-based BiLSTM module for transformer state diagnosis according to claim 1, wherein the plurality of BiLSTM layers of different scales comprises at least a single-layer BiLSTM, a double-layer BiLSTM, and a triple-layer BiLSTM.
3. The attention-mechanism-based BiLSTM module for transformer state diagnosis according to claim 1, wherein the classification layer comprises a fully connected layer and a Softmax classifier, wherein the fully connected layer transforms the optimized feature matrix output by the attention module into a one-dimensional sequence, and the Softmax classifier classifies the data characterizing the transformer state and outputs the transformer state type.
4. The attention-mechanism-based BiLSTM module for transformer state diagnosis according to claim 1, wherein the BiLSTM module is trained with a cross-entropy loss function to output the transformer state type.
5. A transformer state diagnosis method, comprising the steps of:
(1) collecting transformer oil chromatographic sample data of different state types;
(2) preprocessing the collected transformer oil chromatographic sample data;
(3) constructing the attention-mechanism-based BiLSTM module according to any one of claims 1-4 and training it with the preprocessed transformer oil chromatographic sample data, wherein during training the preprocessed transformer oil chromatographic sample data are input to the input layer as the raw data characterizing the transformer state;
(4) inputting measured transformer oil chromatographic sample data, as raw data characterizing the transformer state type, into the input layer of the trained attention-mechanism-based BiLSTM module, the classification layer of which outputs the transformer state type.
6. The transformer state diagnosis method according to claim 5, wherein in step (2), the preprocessing comprises normalization.
7. The transformer state diagnosis method according to claim 5, wherein in step (1), the transformer state types include low-energy discharge, high-energy discharge, low-energy discharge with overheating, high-energy discharge with overheating, partial discharge, medium-temperature overheating, low-temperature overheating, and high-temperature overheating.
8. A transformer state diagnosis system, comprising:
a data acquisition device for acquiring transformer oil chromatographic sample data of different state types and measured transformer oil chromatographic sample data;
a preprocessing unit for preprocessing the acquired transformer oil chromatographic sample data and the measured transformer oil chromatographic sample data;
a control module that performs the steps of:
constructing the attention-mechanism-based BiLSTM module according to any one of claims 1-4 and training it with the preprocessed transformer oil chromatographic sample data, wherein during training the preprocessed transformer oil chromatographic sample data are input to the input layer as the raw data characterizing the transformer state;
and inputting measured transformer oil chromatographic sample data, as raw data characterizing the transformer state type, into the input layer of the trained attention-mechanism-based BiLSTM module, the classification layer of which outputs the transformer state type.
9. The transformer state diagnosis system of claim 8, wherein the preprocessing unit performs normalization.
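Claims 6 and 9 specify normalization as the preprocessing step without fixing the scheme. A sketch assuming per-feature min-max scaling, one common choice for dissolved-gas concentrations of very different magnitudes (the gas values are illustrative; the patent does not state which normalization is used):

```python
def min_max_normalize(samples):
    """Scale each feature column of the oil-chromatogram samples to [0, 1].
    A constant column (hi == lo) is mapped to 0.0 to avoid division by zero."""
    cols = list(zip(*samples))
    los = [min(c) for c in cols]
    his = [max(c) for c in cols]
    return [
        [(v - lo) / (hi - lo) if hi > lo else 0.0
         for v, lo, hi in zip(row, los, his)]
        for row in samples
    ]

# Illustrative dissolved-gas concentrations (H2, CH4, C2H2), in ppm.
samples = [[10.0, 5.0, 0.1], [50.0, 25.0, 0.5], [30.0, 15.0, 0.3]]
print(min_max_normalize(samples))
```

Scaling each gas to a common range keeps high-concentration gases such as H2 from dominating the BiLSTM inputs over trace gases such as C2H2.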
CN202010861283.4A 2020-08-25 2020-08-25 BiLSTM module based on attention mechanism, transformer state diagnosis method and system Pending CN112147432A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010861283.4A CN112147432A (en) 2020-08-25 2020-08-25 BiLSTM module based on attention mechanism, transformer state diagnosis method and system


Publications (1)

Publication Number Publication Date
CN112147432A true CN112147432A (en) 2020-12-29

Family

ID=73888487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010861283.4A Pending CN112147432A (en) 2020-08-25 2020-08-25 BiLSTM module based on attention mechanism, transformer state diagnosis method and system

Country Status (1)

Country Link
CN (1) CN112147432A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108037378A (en) * 2017-10-26 2018-05-15 上海交通大学 Running state of transformer Forecasting Methodology and system based on long memory network in short-term
CN110501585A (en) * 2019-07-12 2019-11-26 武汉大学 A kind of Diagnosis Method of Transformer Faults based on Bi-LSTM and dissolved gas analysis


Non-Patent Citations (1)

Title
WANG Taiyong: "Intelligent fault diagnosis method for equipment based on attention-mechanism BiLSTM", Journal of Tianjin University (Science and Technology) *

Cited By (10)

Publication number Priority date Publication date Assignee Title
CN113158537A (en) * 2021-01-18 2021-07-23 中国航发湖南动力机械研究所 Aeroengine gas circuit fault diagnosis method based on LSTM combined attention mechanism
CN113361197A (en) * 2021-06-08 2021-09-07 山东大学 Method and system for predicting remaining service life of lithium battery
CN113361197B (en) * 2021-06-08 2022-10-25 山东大学 Method and system for predicting remaining service life of lithium battery
CN113537360A (en) * 2021-07-19 2021-10-22 中国人民解放军国防科技大学 Point-to-point classification fault detection method based on deep learning
CN113537360B (en) * 2021-07-19 2023-02-03 中国人民解放军国防科技大学 Point-to-point classification fault detection method based on deep learning
CN114062812A (en) * 2021-11-15 2022-02-18 国网四川省电力公司营销服务中心 Fault diagnosis method and system for metering cabinet
CN114062812B (en) * 2021-11-15 2024-05-07 国网四川省电力公司营销服务中心 Metering cabinet fault diagnosis method and system
CN114707257A (en) * 2022-02-18 2022-07-05 江苏赛德力制药机械制造有限公司 Mechanical residual service life prediction method based on all-state attention and BiLSTM
CN115563528A (en) * 2022-11-25 2023-01-03 四川飞宇电力股份有限公司 Transformer maintenance equipment control method, electronic equipment and medium
CN116930741A (en) * 2023-07-19 2023-10-24 中国电子产品可靠性与环境试验研究所((工业和信息化部电子第五研究所)(中国赛宝实验室)) Switching device fault degree diagnosis method and device and computer equipment

Similar Documents

Publication Publication Date Title
CN112147432A (en) BiLSTM module based on attention mechanism, transformer state diagnosis method and system
Liao et al. Fault diagnosis of power transformers using graph convolutional network
Li et al. A novel deep autoencoder and hyperparametric adaptive learning for imbalance intelligent fault diagnosis of rotating machinery
CN110929847A (en) Converter transformer fault diagnosis method based on deep convolutional neural network
CN112016251B (en) Nuclear power device fault diagnosis method and system
CN112070128A (en) Transformer fault diagnosis method based on deep learning
CN111046961B (en) Fault classification method based on bidirectional long-time and short-time memory unit and capsule network
CN111680726A (en) Transformer fault diagnosis method and system based on neighbor component analysis and k neighbor learning fusion
CN115618732B (en) Nuclear reactor digital twin key parameter autonomous optimization data inversion method
CN116562114A (en) Power transformer fault diagnosis method based on graph convolution neural network
CN114091504A (en) Rotary machine small sample fault diagnosis method based on generation countermeasure network
CN113887136A (en) Improved GAN and ResNet based electric vehicle motor bearing fault diagnosis method
CN114169091A (en) Method for establishing prediction model of residual life of engineering mechanical part and prediction method
CN114266297A (en) Semantic knowledge base of thermal power equipment, construction method and zero sample fault diagnosis method
CN114462459A (en) Hydraulic machine fault diagnosis method based on 1DCNN-LSTM network model
CN114326639B (en) Industrial process performance evaluation method based on mixed attention convolutional neural network
Chen et al. A novel Bayesian-optimization-based adversarial TCN for RUL prediction of bearings
Saufi et al. Machinery fault diagnosis based on a modified hybrid deep sparse autoencoder using a raw vibration time-series signal
Gao et al. A status-relevant blocks fusion approach for operational status monitoring
CN111695288A (en) Transformer fault diagnosis method based on Apriori-BP algorithm
CN114581699A (en) Transformer state evaluation method based on deep learning model in consideration of multi-source information
CN116894215B (en) Gear box fault diagnosis method based on semi-supervised dynamic graph attention
CN110674791B (en) Forced oscillation layered positioning method based on multi-stage transfer learning
CN112380763A (en) System and method for analyzing reliability of in-pile component based on data mining
CN110490218B (en) Rolling bearing fault self-learning method based on two-stage DBN

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201229