CN107590567A - Recurrent neural network short-term load prediction method based on information entropy clustering and attention mechanism - Google Patents
- Publication number: CN107590567A
- Application number: CN201710848981.9A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention provides a recurrent neural network short-term load forecasting method based on information entropy clustering and an attention (ATTENTION) mechanism, comprising the following steps: analyse the features that affect the power load; compute the information entropy of each feature with respect to the load using the xgboost algorithm; using the information entropy of each feature as its weight, perform cluster analysis on the historical data of the region to be predicted; from the clustering results, select the cluster whose weights are closest to those of the prediction day, and arrange its days into a time series T ordered from farthest to nearest prediction time; feed the time series T to the encoder (Encoder) of the ATTENTION recurrent neural network and obtain the prediction result from the decoder (Decoder). Compared with the prior art, the method offers high prediction accuracy, good adaptivity, and other advantages.
Description
Technical field
The present invention relates to electric power load forecasting, and more particularly to a recurrent neural network short-term load forecasting method based on feature information entropy clustering and an attention mechanism.
Background art
Short-term load forecasting plays an important role in power-grid control, safety, marketing and rational grid management. It is mainly used to predict electricity usage over the coming few hours, day or week. High-precision short-term load forecasting helps reduce the financial cost of grid operation and supports equipment scheduling and safe operation of the power system. Because the electric load is influenced by many factors, high-precision load forecasting is difficult to achieve in actual production.
The short-term load forecasting models in common use divide broadly into conventional methods and artificial-intelligence methods. Conventional methods are built on mathematical models and include multiple linear regression, stochastic time series, exponential smoothing, and methods based on prior knowledge. Load forecasting is a nonlinear problem, so high precision is difficult to achieve with conventional methods. Artificial-intelligence methods mainly use artificial neural networks, support vector machines, expert-system models, fuzzy logic and Bayesian neural networks. Because the load is affected by numerous feature attributes and unknown factors, none of these methods can guarantee high-precision prediction results in all cases.
In addition, with the spread of smart meters and the ever more complete data on natural and social factors, how to select the most effective part of the vast historical data has also become a research focus.
Content of the invention
The object of the present invention is to overcome the above-mentioned drawbacks of the prior art and to provide a recurrent neural network short-term load forecasting method based on information entropy clustering and an attention (ATTENTION) mechanism.
The object of the invention can be achieved through the following technical solution:
A recurrent neural network short-term load forecasting method based on information entropy clustering and an attention (ATTENTION) mechanism, specifically comprising the following steps:
Step 1: Pre-process the power-system load data and determine the input feature variables and output target of the forecast model. The input features of the xgboost algorithm include: season, temperature (°C), humidity (%), wind speed (m/s), rainfall, day-of-week type, legal holidays, the recent daily load peak, the average load of the previous seven days, the load values of the previous seven days, and the load value of the same period last year. The output target is the load value, for which each of the above features obtains an information entropy (importance);
Step 2: Use the extreme gradient boosting algorithm (xgboost) to compute the information entropy of each input feature that influences the electric load, and normalize the entropy of each feature to suit the requirements of the clustering algorithm on the data; denote the information entropies of the m features as {w_1, w_2, …, w_m};
Step 3: Using the information entropy of each feature as importance heuristic information, construct a feature-weighted clustering, i.e. the distance between samples is measured by the weighted Euclidean distance
$$d(x_i,u_j)=\sqrt{\sum_{p=1}^{m}w_p\,(x_{ip}-u_{jp})^{2}}$$
Adjust the clustering parameters so that the clustering algorithm yields the historical days whose load conditions are most similar to the period to be predicted;
Step 4: Arrange the similar historical days into a time series T, ordered from farthest to nearest prediction time;
Step 5: Use the time series T as the input sequence of the encoder (Encoder) of the ATTENTION recurrent neural network and train the forecast model. The task of the encoder is to transform the variable-length input sequence and encode it into a fixed-length vector, which is then used as the input state of the decoder. The decoder then produces a forecast sequence of length n. The encoder is fed the load values y = {y_0, y_1, …, y_{m-1}} and feature vectors f = {f_0, f_1, …, f_m} of the similar historical days, and the attention mechanism allows the decoder to attend to different parts of the input at each output step. The decoder of the model outputs the predicted load values ŷ, realizing high-precision forecasting of the short-term load. To improve prediction accuracy, the recurrent neural network here uses the long short-term memory (LSTM) structure;
Step 6: Judge whether the training model has converged; return to step 5 if it has not yet converged, otherwise obtain the load prediction results from the decoder (Decoder).
Compared with the prior art, the present invention clusters the historical electricity-consumption data with a variable-weight clustering algorithm based on the information entropy of each feature, mining the consumption patterns of the prediction day to the greatest extent and eliminating load curves with low feature similarity to the prediction day. By training the forecast model on the post-clustering historical days whose load conditions are most similar to the period to be predicted, the accuracy of short-term load forecasting can be effectively improved.
In addition, the invention builds the forecast model with an LSTM recurrent neural network equipped with an ATTENTION mechanism, which reflects the load curves of the historical data more effectively and thus further improves the load prediction accuracy.
Brief description of the drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2 is the flow chart of the recurrent neural network forecast model based on the attention (ATTENTION) mechanism.
Embodiment
The present invention is further described below with reference to the accompanying drawings. The following embodiments serve only to clearly illustrate the technical scheme of the present invention and cannot be taken to limit its protection scope.
Step 1: Data pre-processing; determine the input feature variables.
Because the magnitude of the electric load is influenced by many factors, besides the load data itself, features such as season, temperature (°C), humidity (%), wind speed (m/s), rainfall, day-of-week type and legal holidays all exert an influence. Data analysis further shows that the recent daily load peak, the average load of the previous seven days, the load values of the previous seven days, and the load value of the same period last year all present a certain correlation with the day to be predicted. Meanwhile, to ensure the training sample is sufficiently large, this example collects the load data at the 24 hourly points of each day over 10 years; the ten features summarized above then form the sample set.
Step 2: Use the xgboost algorithm (extreme gradient boosting) to compute the information gain of each feature with respect to the load, and rank the features by information gain. The details are as follows:
Suppose a decision tree has J-1 nodes, each feature variable corresponds to a node t, and the samples are split in two at every t. Which feature variable a node t corresponds to is determined by the information gain that the feature brings to the load to be predicted; the principle of node splitting is to make the information gain after the split larger. From the construction of the decision tree we thus obtain the importance of each feature for the load value. The objective function for building the decision tree is:
$$Obj^{(t)}=\sum_{i=1}^{n}\Big[g_i f_t(x_i)+\tfrac{1}{2}h_i f_t^{2}(x_i)\Big]+\Omega(f_t)$$
The first part is the training error and the second part is the sum of the complexities of the trees, where $I_j=\{i\mid q(x_i)=j\}$ is the set of samples on leaf $j$, and $g_i$, $h_i$ are the first and second derivatives of the error function at each data point. Defining $G_j=\sum_{i\in I_j}g_i$ and $H_j=\sum_{i\in I_j}h_i$, the above converts into the problem of minimizing a one-dimensional quadratic function, i.e.:
$$w_j^{*}=-\frac{G_j}{H_j+\lambda},\qquad Obj=-\frac{1}{2}\sum_{j=1}^{T}\frac{G_j^{2}}{H_j+\lambda}+\gamma T$$
The information gain of a split is computed as:
$$Gain=\frac{1}{2}\left[\frac{G_L^{2}}{H_L+\lambda}+\frac{G_R^{2}}{H_R+\lambda}-\frac{(G_L+G_R)^{2}}{H_L+H_R+\lambda}\right]-\gamma$$
The information entropies of the m features so obtained are denoted {w_1, w_2, …, w_m}.
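As an illustrative sketch (not code from the patent), the split gain and optimal leaf weight of step 2 can be computed directly from the first and second derivatives g_i, h_i of a candidate split; the toy data and the λ, γ values below are assumptions:

```python
# Hedged sketch of xgboost's split gain and optimal leaf weight from
# steps 2.1-2.3; data, lambda and gamma values are illustrative assumptions.
def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    """Gain = 1/2 [G_L^2/(H_L+lam) + G_R^2/(H_R+lam)
                   - (G_L+G_R)^2/(H_L+H_R+lam)] - gamma"""
    G_L, H_L = sum(g_left), sum(h_left)
    G_R, H_R = sum(g_right), sum(h_right)
    return 0.5 * (G_L ** 2 / (H_L + lam) + G_R ** 2 / (H_R + lam)
                  - (G_L + G_R) ** 2 / (H_L + H_R + lam)) - gamma

def leaf_weight(g, h, lam=1.0):
    """Optimal leaf weight w_j* = -G_j / (H_j + lam)."""
    return -sum(g) / (sum(h) + lam)

# For squared-error loss 1/2*(y - y_hat)^2, g_i = y_hat - y_i and h_i = 1.
y = [5.0, 6.0, 20.0, 22.0]         # toy load values falling on one tree node
y_hat = 10.0                       # current prediction of the ensemble
g = [y_hat - yi for yi in y]       # first derivatives
h = [1.0] * len(y)                 # second derivatives

# Splitting low-load days from high-load days yields a large positive gain.
print(round(split_gain(g[:2], h[:2], g[2:], h[2:]), 2))   # → 77.27
print(leaf_weight(g[:2], h[:2]))                          # → -3.0
```

A split is kept only when its gain is positive, which is how the tree construction ranks features by the total gain they contribute.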
Step 3: This embodiment takes one day in 2016 in a certain region as an example. To predict the load value at each of the 24 hourly points of that day, the data of nearly 4000 days over 10 years of history are clustered, and the historical days whose 24-hour load conditions are most similar to the day to be predicted are selected. Common clustering methods do not account for the differing influence of each feature on the classification and use a simple Euclidean distance; the present invention instead uses the feature weights as importance heuristic information to construct a feature-weighted clustering. K-means clustering is used here.
Step 3.1: For the set of historical days X = {x_1, x_2, …, x_i, …, x_n}, each x_i has m corresponding features, i.e. x_i = {x_{i1}, x_{i2}, …, x_{im}}; first normalize all features.
Step 3.2: Select the feature vector of the day to be predicted as the first initial cluster centre u_0; select the historical day farthest from u_0 as u_1, the historical day farthest from {u_0, u_1} as u_2, and so on, until K initial cluster centres {u_0, u_1, …, u_j, …, u_{K-1}} have been found.
Step 3.3: For each historical day, choose the nearest centre and assign the day to that class; the distance is computed as
$$d(x_i,u_j)=\sqrt{w_1(x_{i1}-u_{j1})^{2}+\cdots+w_p(x_{ip}-u_{jp})^{2}+\cdots+w_m(x_{im}-u_{jm})^{2}}$$
Step 3.4: Update each cluster centre to the mean of its class.
Step 3.5: Repeat steps 3.3 and 3.4 until the cluster centres no longer change.
Carry out this feature-weight (information entropy) based clustering on the historical data of the prediction region, and adjust the clustering parameters so that the clustering algorithm yields the historical days whose load conditions are most similar to the period to be predicted.
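The clustering of steps 3.1-3.5 can be sketched in plain Python as follows. This is a minimal illustration, not the patent's implementation: the toy days, weights and k are assumptions, and the feature normalization of step 3.1 is assumed to have been done beforehand.

```python
import math

def weighted_distance(x, u, w):
    """Step 3.3 distance: sqrt(sum_p w_p * (x_p - u_p)^2)."""
    return math.sqrt(sum(wp * (xp - up) ** 2 for wp, xp, up in zip(w, x, u)))

def weighted_kmeans(days, w, k, pred_day, iters=100):
    """Feature-weighted k-means of steps 3.1-3.5: the first centre is the
    prediction day's feature vector, the rest are farthest-point seeds."""
    centres = [list(pred_day)]
    while len(centres) < k:        # step 3.2: farthest-point seeding
        far = max(days, key=lambda d: min(weighted_distance(d, c, w)
                                          for c in centres))
        centres.append(list(far))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for d in days:             # step 3.3: assign to the nearest centre
            j = min(range(k), key=lambda j: weighted_distance(d, centres[j], w))
            clusters[j].append(d)
        new = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centres[j]
               for j, cl in enumerate(clusters)]
        if new == centres:         # step 3.5: stop once centres settle
            break
        centres = new              # step 3.4: centres become the class means
    # The "similar historical days" are the cluster nearest the prediction day.
    j = min(range(k), key=lambda j: weighted_distance(pred_day, centres[j], w))
    return clusters[j]

# Toy example: two features, with the first feature weighted heavily.
days = [(0.1, 0.9), (0.2, 0.1), (0.9, 0.5), (1.0, 0.4)]
similar = weighted_kmeans(days, w=(0.9, 0.1), k=2, pred_day=(0.95, 0.45))
print(similar)                     # → [(0.9, 0.5), (1.0, 0.4)]
```

Because the weights come from the xgboost information entropies, features that matter more to the load dominate the distance, so the selected cluster is similar where it counts.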
Step 4: Arrange the similar historical days into a time series T ordered from farthest to nearest prediction time; the time series T is fed to the Encoder of the ATTENTION recurrent neural network, and the prediction result is obtained from the Decoder.
Fig. 2 shows the sequence-to-sequence (sequence2sequence) architecture used to predict the short-term load. The architecture is composed of two LSTM (long short-term memory) networks: an encoder and a decoder. The task of the encoder is to transform the variable-length input sequence and encode it into a fixed-length vector, which is then used as the input state of the decoder. The decoder then produces an output sequence of length n (for example, to predict the load at each of the next 24 hourly points, n = 24). The main advantage of this architecture is that it allows inputs of arbitrary length: any number of past load measurements can be used as input to predict the load over any number of future time steps. With the attention recurrent-neural-network model of Fig. 2, to predict the load value at each of the 24 hourly points of a day R, the model is fed the load values y = {y_0, y_1, …, y_{m-1}} of the historical days similar to R, together with feature variables f = {f_0, f_1, …, f_m} such as season, temperature (°C), humidity (%), wind speed (m/s), rainfall, day-of-week type, legal-holiday status, the recent daily load peak, the average load of the previous seven days, the load values of the previous seven days, and the load value of the same period last year. The model then automatically outputs the load values ŷ at the 24 hourly points of the prediction day R, realizing high-precision forecasting of the short-term load. The attention mechanism allows the decoder to attend to different parts of the input at each output step, and notably lets the model decide what to attend to according to the input sequence and the sequence produced so far. For training, the encoder network is pre-trained to minimize its prediction error; the decoder network is then attached to the encoder and the two networks are trained jointly to minimize the objective function. To improve prediction accuracy, the recurrent units here are LSTM units.
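As a minimal, hedged sketch of the mechanism described above (not the patent's full LSTM encoder-decoder), the following NumPy snippet shows a single dot-product attention step; the toy encoder hidden states and decoder state are illustrative assumptions:

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """One ATTENTION step: score every encoder hidden state against the
    current decoder state, softmax the scores over the input time steps,
    and return the context vector the decoder consumes for this output."""
    scores = encoder_states @ decoder_state      # dot-product alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over time steps
    context = weights @ encoder_states           # weighted sum of states
    return context, weights

# Toy encoder hidden states, one per similar historical day in the series T.
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])
s = np.array([0.0, 0.1, 1.0])                    # current decoder state

context, w = attention(s, H)
print(int(w.argmax()))                           # → 2: attends to matching day
```

At each decoding step the weights redistribute over the encoder states, which is what lets the decoder attend to different parts of the input at every output step instead of relying on a single fixed-length vector.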
The above is only a preferred embodiment of the present invention. It should be noted that those of ordinary skill in the art can make several improvements and variations without departing from the technical principles of the invention, and these improvements and variations should also be regarded as falling within the protection scope of the present invention.
Claims (5)
1. A recurrent neural network short-term load forecasting method based on information entropy clustering and an attention (ATTENTION) mechanism, characterised by comprising the following steps:
Step 1: Pre-process the power-system load data and determine the input feature variables and output target of the forecast model;
Step 2: Use the extreme gradient boosting algorithm (xgboost) to compute the information entropy of each input feature that influences the electric load, and normalize the entropy of each feature to suit the requirements of the clustering algorithm on the data;
Step 3: Using the information entropy of each feature as importance heuristic information, construct a feature-weighted clustering in which the distance between samples is measured by the weighted Euclidean distance; adjust the clustering parameters so that the clustering algorithm yields the historical days whose load conditions are most similar to the period to be predicted;
Step 4: Arrange the similar historical days into a time series T, ordered from farthest to nearest prediction time;
Step 5: Use the time series T as the input sequence of the encoder (Encoder) of the ATTENTION recurrent neural network and train the forecast model;
Step 6: Judge whether the training model has converged; return to step 5 if it has not yet converged, otherwise obtain the load prediction results from the decoder (Decoder).
2. The recurrent neural network short-term load forecasting method based on information entropy clustering and the ATTENTION mechanism according to claim 1, characterised in that the input features of the xgboost algorithm in step 1 include: season, temperature (°C), humidity (%), wind speed (m/s), rainfall, day-of-week type, legal holidays, the recent daily load peak, the average load of the previous seven days, the load values of the previous seven days, and the load value of the same period last year; the output target is the load value, for which each of the above features obtains an information entropy (importance).
3. The recurrent neural network short-term load forecasting method based on information entropy clustering and the ATTENTION mechanism according to claim 1, characterised in that step 2 comprises:
Step 2.1: The objective function of the boosting algorithm is:
$$\begin{aligned}
Obj^{(t)} &= \sum_{i=1}^{n}\Big[g_i f_t(x_i)+\tfrac{1}{2}h_i f_t^{2}(x_i)\Big]+\Omega(f_t)\\
&= \sum_{i=1}^{n}\Big[g_i w_{q(x_i)}+\tfrac{1}{2}h_i w_{q(x_i)}^{2}\Big]+\gamma T+\tfrac{1}{2}\lambda\sum_{j=1}^{T}w_j^{2}\\
&= \sum_{j=1}^{T}\Big[\Big(\sum_{i\in I_j}g_i\Big)w_j+\tfrac{1}{2}\Big(\sum_{i\in I_j}h_i+\lambda\Big)w_j^{2}\Big]+\gamma T\\
&= \sum_{j=1}^{T}\Big[G_j w_j+\tfrac{1}{2}(H_j+\lambda)w_j^{2}\Big]+\gamma T
\end{aligned}$$
The first part is the training error and the second part is the sum of the complexities of the trees, where $I_j=\{i\mid q(x_i)=j\}$ is defined as the set of samples on each leaf, and $g_i$, $h_i$ denote the first and second derivatives of the error function at each data point; define $G_j=\sum_{i\in I_j}g_i$ and $H_j=\sum_{i\in I_j}h_i$.
Step 2.2: This converts the above into the problem of minimizing a one-dimensional quadratic function, i.e.:
$$w_j^{*}=-\frac{G_j}{H_j+\lambda},\qquad Obj=-\frac{1}{2}\sum_{j=1}^{T}\frac{G_j^{2}}{H_j+\lambda}+\gamma T$$
Step 2.3: The feature information entropy (gain) is computed as follows:
$$Gain=\frac{1}{2}\left[\frac{G_L^{2}}{H_L+\lambda}+\frac{G_R^{2}}{H_R+\lambda}-\frac{(G_L+G_R)^{2}}{H_L+H_R+\lambda}\right]-\gamma$$
The information entropies of the m obtained features are denoted {w_1, w_2, …, w_m}.
4. The recurrent neural network short-term load forecasting method based on information entropy clustering and the ATTENTION mechanism according to claim 1, characterised in that the feature-weighted clustering algorithm of step 3 comprises the following steps:
Step 3.1: For the set of historical days X = {x_1, x_2, …, x_i, …, x_n}, each x_i has m corresponding features, i.e. x_i = {x_{i1}, x_{i2}, …, x_{im}}; first normalize all features;
Step 3.2: Select the feature vector of the day to be predicted as the first initial cluster centre u_0; select the historical day farthest from u_0 as u_1, the historical day farthest from {u_0, u_1} as u_2, and so on, until K initial cluster centres {u_0, u_1, …, u_j, …, u_{K-1}} have been found;
Step 3.3: For each historical day, choose the nearest centre and assign the day to that class; the distance is computed as:
$$d(x_i,u_j)=\sqrt{w_1(x_{i1}-u_{j1})^{2}+\cdots+w_p(x_{ip}-u_{jp})^{2}+\cdots+w_m(x_{im}-u_{jm})^{2}}$$
Step 3.4: Update each cluster centre to the mean of its class;
Step 3.5: Repeat steps 3.3 and 3.4 until the cluster centres no longer change.
5. The recurrent neural network short-term load forecasting method based on information entropy clustering and the ATTENTION mechanism according to claim 1, characterised in that step 5 uses a sequence-to-sequence (sequence2sequence) architecture to predict the short-term load. The architecture is composed of two LSTM (long short-term memory) networks: an encoder and a decoder. The task of the encoder is to transform the variable-length input sequence and encode it into a fixed-length vector, which is then used as the input state of the decoder; the decoder produces a forecast sequence of length n. The encoder is fed the load values y = {y_0, y_1, …, y_{m-1}} of the similar historical days and their feature vectors f = {f_0, f_1, …, f_m}, and the attention mechanism allows the decoder to attend to different parts of the input at each output step. The decoder of the model outputs the predicted load values ŷ, realizing high-precision forecasting of the short-term load. To improve prediction accuracy, the recurrent neural network here uses the long short-term memory (LSTM) structure.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710848981.9A CN107590567A (en) | 2017-09-13 | 2017-09-13 | Recurrent neural network short-term load prediction method based on information entropy clustering and attention mechanism |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107590567A (en) | 2018-01-16 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103218675A (en) * | 2013-05-06 | 2013-07-24 | 国家电网公司 | Short-term load prediction method based on clustering and sliding window |
CN105631483A (en) * | 2016-03-08 | 2016-06-01 | 国家电网公司 | Method and device for predicting short-term power load |
Applications Claiming Priority (1)
- 2017-09-13 CN CN201710848981.9A patent/CN107590567A/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103218675A (en) * | 2013-05-06 | 2013-07-24 | 国家电网公司 | Short-term load prediction method based on clustering and sliding window |
CN105631483A (en) * | 2016-03-08 | 2016-06-01 | 国家电网公司 | Method and device for predicting short-term power load |
Non-Patent Citations (2)
Title |
---|
HUITING ZHENG et al.: "Short-Term Load Forecasting Using EMD-LSTM Neural Networks with a Xgboost Algorithm for Feature Importance Evaluation", 《ENERGIES》 * |
TIANQI CHEN et al.: "XGBoost: A Scalable Tree Boosting System", 《ARXIV》 * |
Cited By (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108416690A (en) * | 2018-01-19 | 2018-08-17 | 中国矿业大学 | Load forecasting method based on deep LSTM neural networks |
CN108229580A (en) * | 2018-01-26 | 2018-06-29 | 浙江大学 | Diabetic retinopathy feature grading device for fundus images based on attention mechanism and feature fusion |
CN108229580B (en) * | 2018-01-26 | 2020-12-11 | 浙江大学 | Diabetic retinopathy feature grading device for fundus images based on attention mechanism and feature fusion |
CN108510113A (en) * | 2018-03-21 | 2018-09-07 | 中南大学 | Application of XGBoost to short-term load forecasting |
CN108648746B (en) * | 2018-05-15 | 2020-11-20 | 南京航空航天大学 | Open-domain video natural language description generation method based on multi-modal feature fusion |
CN108648746A (en) * | 2018-05-15 | 2018-10-12 | 南京航空航天大学 | Open-domain video natural language description generation method based on multi-modal feature fusion |
CN109034453A (en) * | 2018-06-21 | 2018-12-18 | 南京邮电大学 | Short-term load forecasting method based on a multi-label neural network |
CN108921341A (en) * | 2018-06-26 | 2018-11-30 | 国网山东省电力公司电力科学研究院 | Short-term heat load forecasting method for thermal power plants based on gated autoencoders |
CN109165854A (en) * | 2018-08-29 | 2019-01-08 | 中国民用航空总局第二研究所 | Air traffic control operational efficiency grade evaluation method and device |
CN109165854B (en) * | 2018-08-29 | 2021-06-18 | 中国民用航空总局第二研究所 | Air traffic control operational efficiency grade evaluation method and device |
CN109034500A (en) * | 2018-09-04 | 2018-12-18 | 湘潭大学 | Mid-term electric load forecasting method with multi-time-series collaboration |
CN109447107B (en) * | 2018-09-14 | 2021-08-10 | 华南理工大学 | Online detection method for abnormal daily energy-consumption patterns of office-building air conditioning based on information entropy |
CN109657890A (en) * | 2018-09-14 | 2019-04-19 | 阿里巴巴集团控股有限公司 | Method and device for determining transfer fraud risk |
CN109447107A (en) * | 2018-09-14 | 2019-03-08 | 华南理工大学 | Online detection method for abnormal daily energy-consumption patterns of office-building air conditioning based on information entropy |
CN109583625A (en) * | 2018-10-19 | 2019-04-05 | 顺丰科技有限公司 | Parcel pickup volume prediction method, system, equipment and storage medium |
CN109840613A (en) * | 2018-10-25 | 2019-06-04 | 浙江理工大学 | Short-term wind speed forecasting method integrating encoder-decoder and linear regression |
CN109840613B (en) * | 2018-10-25 | 2021-02-23 | 浙江理工大学 | Short-term wind speed prediction method integrating encoder-decoder and linear regression |
CN109242220A (en) * | 2018-11-16 | 2019-01-18 | 国家电网有限公司 | Charging station transaction power prediction method, device, electronic equipment and storage medium |
CN109754118A (en) * | 2018-12-26 | 2019-05-14 | 复旦大学 | Adaptive system prediction method |
CN109685290A (en) * | 2019-02-11 | 2019-04-26 | 南方电网科学研究院有限责任公司 | Deep learning-based power consumption prediction method, device and equipment |
CN109685290B (en) * | 2019-02-11 | 2023-06-16 | 南方电网科学研究院有限责任公司 | Power consumption prediction method, device and equipment based on deep learning |
CN109886747A (en) * | 2019-02-22 | 2019-06-14 | 网易(杭州)网络有限公司 | Sales forecasting method, medium, device and computing equipment |
CN110070209A (en) * | 2019-03-18 | 2019-07-30 | 天津理工大学 | Short-term load forecasting method for space-heating systems based on SD-DNNs |
CN110084406B (en) * | 2019-04-03 | 2021-09-24 | 新奥数能科技有限公司 | Load prediction method and device based on an autoencoder and a meta-learning strategy |
CN110084406A (en) * | 2019-04-03 | 2019-08-02 | 新奥数能科技有限公司 | Load forecasting method and device based on an autoencoder and a meta-learning strategy |
CN110084424A (en) * | 2019-04-25 | 2019-08-02 | 国网浙江省电力有限公司 | Electric load forecasting method based on LSTM and LGBM |
CN110210993A (en) * | 2019-05-22 | 2019-09-06 | 重庆大学 | Urban short-term gas load forecasting method based on a recurrent neural network model |
CN110210993B (en) * | 2019-05-22 | 2023-04-07 | 重庆大学 | Urban short-term gas load prediction method based on a recurrent neural network model |
CN110210677A (en) * | 2019-06-06 | 2019-09-06 | 国网山东省电力公司莱芜供电公司 | Bus short-term daily load forecasting method and device combining clustering and deep learning algorithms |
CN110210677B (en) * | 2019-06-06 | 2022-03-04 | 国网山东省电力公司莱芜供电公司 | Bus short-term daily load prediction method and device combining clustering and deep learning algorithms |
CN110245801A (en) * | 2019-06-19 | 2019-09-17 | 中国电力科学研究院有限公司 | Electric load forecasting method and system based on a combined mining model |
CN110266002A (en) * | 2019-06-20 | 2019-09-20 | 北京百度网讯科技有限公司 | Method and apparatus for predicting electric load |
CN111079805A (en) * | 2019-12-03 | 2020-04-28 | 浙江工业大学 | Abnormal image detection method combining attention mechanism and information entropy minimization |
CN111080032A (en) * | 2019-12-30 | 2020-04-28 | 成都数之联科技有限公司 | Load prediction method based on Transformer structure |
CN111080032B (en) * | 2019-12-30 | 2023-08-29 | 成都数之联科技股份有限公司 | Load prediction method based on Transformer structure |
CN111291940A (en) * | 2020-03-02 | 2020-06-16 | 桂林电子科技大学 | Student class dropping prediction method based on Attention deep learning model |
CN111291940B (en) * | 2020-03-02 | 2022-06-07 | 桂林电子科技大学 | Student class dropping prediction method based on Attention deep learning model |
CN111680786A (en) * | 2020-06-10 | 2020-09-18 | 中国地质大学(武汉) | Time sequence prediction method based on improved weight gating unit |
CN111680786B (en) * | 2020-06-10 | 2023-12-05 | 中国地质大学(武汉) | Time sequence prediction method based on improved weight gating unit |
CN112163689A (en) * | 2020-08-18 | 2021-01-01 | 国网浙江省电力有限公司绍兴供电公司 | Short-term load quantile probability prediction method based on deep Attention-LSTM |
CN113051130A (en) * | 2021-03-19 | 2021-06-29 | 南京航空航天大学 | Mobile cloud load prediction method and system of LSTM network combined with attention mechanism |
CN113051130B (en) * | 2021-03-19 | 2023-05-02 | 南京航空航天大学 | Mobile cloud load prediction method and system of LSTM network combined with attention mechanism |
CN113762356B (en) * | 2021-08-17 | 2023-06-16 | 中山大学 | Cluster load prediction method and system based on clustering and attention mechanism |
CN113762356A (en) * | 2021-08-17 | 2021-12-07 | 中山大学 | Cluster load prediction method and system based on clustering and attention mechanism |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107590567A (en) | Recurrent neural network short-term load prediction method based on information entropy clustering and attention mechanism | |
Tang et al. | Short‐term power load forecasting based on multi‐layer bidirectional recurrent neural network | |
Ke et al. | Short-term electrical load forecasting method based on stacked auto-encoding and GRU neural network | |
CN109754113B (en) | Load prediction method based on dynamic time warping and long short-term memory | |
CN108510113A (en) | Application of XGBoost to short-term load forecasting | |
Wang et al. | Deep belief network based k-means cluster approach for short-term wind power forecasting | |
CN108022001A (en) | Short-term probability density forecasting method based on PCA and quantile regression forests | |
CN109948845A (en) | Long short-term memory neural network prediction method for distribution network load | |
CN113554466B (en) | Short-term electricity consumption prediction model construction method, prediction method and device | |
CN111160620B (en) | Short-term wind power prediction method based on end-to-end memory network | |
CN110969290A (en) | Runoff probability prediction method and system based on deep learning | |
CN109492748B (en) | Method for establishing medium-and-long-term load prediction model of power system based on convolutional neural network | |
Liu et al. | Heating load forecasting for combined heat and power plants via strand-based LSTM | |
CN110717610A (en) | Wind power prediction method based on data mining | |
CN114792156A (en) | Photovoltaic output power prediction method and system based on curve characteristic index clustering | |
Al Mamun et al. | A hybrid deep learning model with evolutionary algorithm for short-term load forecasting | |
Inteha | A GRU-GA hybrid model based technique for short term electrical load forecasting | |
Kosana et al. | Hybrid convolutional Bi-LSTM autoencoder framework for short-term wind speed prediction | |
CN111697560A (en) | Method and system for predicting load of power system based on LSTM | |
TWI810487B (en) | Solar power forecasting method | |
Sang et al. | Ensembles of gradient boosting recurrent neural network for time series data prediction | |
CN117674098A (en) | Multi-energy load spatio-temporal probability distribution prediction method and system under different penetration levels | |
CN117390550A (en) | Low-carbon park carbon emission dynamic prediction method and system considering emission training set | |
CN115481788B (en) | Phase change energy storage system load prediction method and system | |
CN116883057A (en) | XGBoost-based high-precision power customer marketing channel preference prediction system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20180116 |