CN117081166A - Wind power day scene generation method based on tensor self-organizing mapping neural network - Google Patents

Wind power day scene generation method based on tensor self-organizing mapping neural network

Info

Publication number
CN117081166A
Authority
CN
China
Prior art keywords
cluster
scene
daily
wind power
self
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310972772.0A
Other languages
Chinese (zh)
Inventor
李丹
王奇
孙光帆
杨帆
谭雅
章可
甘月琳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Three Gorges University CTGU
Original Assignee
China Three Gorges University CTGU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Three Gorges University CTGU filed Critical China Three Gorges University CTGU
Priority to CN202310972772.0A priority Critical patent/CN117081166A/en
Publication of CN117081166A publication Critical patent/CN117081166A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J3/00Circuit arrangements for ac mains or ac distribution networks
    • H02J3/38Arrangements for parallely feeding a single network by two or more generators, converters or transformers
    • H02J3/46Controlling of the sharing of output between the generators, converters, or transformers
    • H02J3/466Scheduling the operation of the generators, e.g. connecting or disconnecting generators to meet a given demand
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • G06N3/0455Auto-encoder networks; Encoder-decoder networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0475Generative networks
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J2203/00Indexing scheme relating to details of circuit arrangements for AC mains or AC distribution networks
    • H02J2203/20Simulating, e.g. planning, reliability check, modelling or computer assisted design [CAD]
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J2300/00Systems for supplying or distributing electric power characterised by decentralized, dispersed, or local generation
    • H02J2300/20The dispersed energy generation being of renewable origin
    • H02J2300/28The renewable source being wind energy
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J2300/00Systems for supplying or distributing electric power characterised by decentralized, dispersed, or local generation
    • H02J2300/40Systems for supplying or distributing electric power characterised by decentralized, dispersed, or local generation wherein a plurality of decentralised, dispersed or local energy generation technologies are operated simultaneously
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E10/00Energy generation through renewable energy sources
    • Y02E10/70Wind energy
    • Y02E10/76Power conversion electric or electronic aspects
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04SSYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Power Engineering (AREA)
  • Wind Motors (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a wind power daily scene generation method based on a tensor self-organizing map neural network, which comprises the following steps: acquiring a daily scene data set of multiple wind farms; clustering the daily scene data set with a tensor self-organizing map neural network; constructing a variational auto-encoder for each cluster and extracting implicit features from the daily scene data; randomly simulating and sampling the daily scene data of each cluster using the implicit features to obtain a daily scene implicit variable data set; decoding and reconstructing the daily scene implicit variable data set to obtain the reconstructed daily scene data of each cluster; and aggregating the reconstructed daily scene data of all clusters to obtain the reconstructed daily scene data set. According to the invention, after the daily scene samples are clustered, the samples of each cluster are dimension-reduced and reconstructed, which improves the accuracy of the daily scene data generated by reconstruction.

Description

Wind power day scene generation method based on tensor self-organizing mapping neural network
Technical Field
The invention belongs to the field of renewable energy power generation and integrated accommodation, and particularly relates to a wind power daily scene generation method based on a tensor self-organizing map neural network.
Background
Scene analysis handles the uncertainty of the power system by constructing deterministic scenes and has become an effective way to address the planning and optimal-operation problems of power systems containing renewable energy. Wind power output scene generation is usually modeled with statistical methods: wind power output uncertainty is assumed to follow a statistical model, a probability model matching the distribution law of wind power output is fitted from historical data, and output scene samples are then generated randomly with a sampling method. However, the difficulty of traditional statistical models lies in building a suitable probability model for wind power output uncertainty; they are mostly used to describe the uncertainty of single-feature time-series scenes and are suited to generating the aggregated total wind power output scene of a single site or system. Neural networks can, to some extent, avoid the difficulty of probability-distribution modeling when describing wind power output uncertainty, but traditional supervised learning models fit probability distributions poorly and require large amounts of training data. A further difficulty of multi-wind-farm scene generation is the complex spatio-temporal correlation involved. The rapid development of unsupervised generative models is expected to solve these problems: such models can learn the probability distribution of the training data and generate new samples that obey that distribution. At present, the deep learning methods applied to power system scene generation fall mainly into two classes: generative adversarial networks (GAN) and variational auto-encoders (VAE). Deep learning has achieved certain results in renewable-energy day-ahead scene generation, but existing methods flatten the spatio-temporal power of a sample scene into a one-dimensional vector before processing and modeling it, so the model cannot accurately reflect the true distribution of the original power in the time and space dimensions, which affects the accuracy of scene generation.
Therefore, to generate spatio-temporal power daily scenes of multiple wind farms according to the characteristics of wind power data, a method combining tensor-distance self-organizing map neural network (self-organizing feature map, SOM) clustering with variational auto-encoder (VAE) dimension reduction is proposed to realize the random generation of multi-wind-farm spatio-temporal power daily scenes.
Disclosure of Invention
In view of the above problems, and considering the diversity of spatio-temporal power correlation and the need to accurately reflect the spatio-temporal power distribution of the original scenes, the invention provides a scene generation method combining a tensor-distance self-organizing map neural network with a variational auto-encoder neural network, so that spatio-temporal power daily scenes of multiple wind farms can be generated randomly. The method clusters the historical spatio-temporal power daily scene sample set on the basis of the spatio-temporal second-order tensor distance between power daily scenes, ensuring that daily scene samples within the same cluster have similar spatio-temporal correlation; a VAE encoding-decoding network is built for each cluster of daily scenes, realizing a bidirectional transformation between the high-dimensional actual spatio-temporal power of the daily scenes and low-dimensional implicit features obeying independent normal distributions; by randomly sampling the independent multidimensional normal distribution of the implicit features, decoding the samples to obtain generated daily scenes for each cluster and aggregating them in proportion, a new scene set with probability distribution and spatio-temporal correlation rules similar to those of the original data is obtained, improving the effectiveness of scene generation.
The technical solution of the invention is a wind power daily scene generation method based on a tensor self-organizing map neural network: the tensor self-organizing map neural network is first used to cluster the multi-wind-farm spatio-temporal power daily scenes, the daily scenes of each cluster are then dimension-reduced, the dimension-reduced implicit features are sampled independently, and the sampled features are decoded to obtain a new multi-wind-farm spatio-temporal power scene set. The multi-wind-farm spatio-temporal power daily scene generation method comprises the following steps,
step 1: collecting wind power output daily scene data of a plurality of wind power stations to obtain a daily scene data set;
step 2: clustering the daily scene data set by using a tensor self-organizing map neural network to obtain a daily scene cluster;
step 2.1: input the multi-wind-farm spatio-temporal power historical sample data X = {x_1, x_2, ..., x_N}, where x_i ∈ R^(m×h) denotes the i-th daily scene datum, N denotes the number of historical sample days, m is the number of wind farms, h is the number of time points per day, and R^(m×h) denotes the m×h-dimensional real space;
step 2.2: randomly assign an initial value to each node weight W_j, j = 1, 2, ..., J, of the output layer of the tensor self-organizing map neural network;
step 2.3: for the multi-wind-farm spatio-temporal power daily scene data x_i, calculate the connection weight matrix with the shortest distance to x_i, obtaining the winning unit j* of the tensor self-organizing map neural network:

j* = arg min_{1≤j≤J} ||x_i − W_j||_TD

where j* denotes the winning unit; J denotes the number of neurons of the output layer of the tensor self-organizing map neural network; ||·||_TD denotes the tensor distance function; g_lm is the element position metric coefficient and G denotes the metric matrix related to the element position distances;
step 2.4: define a neighborhood N_j*(t) of the winning unit; for the units in the neighborhood, adjust the weights towards x_i. The iterative calculation formula of the weight adjustment is:

w_ij(t+1) = w_ij(t) + α(t, D)[x_i(t) − w_ij(t)]

where w_ij(t) and w_ij(t+1) denote the weights from neuron i to neuron j at times t and t+1, respectively; α(t, D) is a function of the training time t and the topological distance D between the i-th neuron in the neighborhood and the winning neuron j*;
step 2.5: repeat step 2.3 and step 2.4; when the training end condition α(t) ≤ α_min is met, stop training, where α(t) denotes the learning rate and α_min the minimum learning rate, and output the clusters {X_1, X_2, ..., X_K}, where X_k = {x_1, x_2, ..., x_{n_k}} denotes the k-th cluster of wind power data, x_{n_k} denotes the n_k-th daily scene datum, n_k is the number of samples of the k-th cluster of wind power data, and K is the number of clusters (a minimal code sketch of this clustering procedure is given after step 5 below);
step 3: construct a variational auto-encoder for each cluster, and extract implicit features from the daily scene data with the encoder of the variational auto-encoder;
step 4: randomly simulate and sample the daily scene data of each cluster using the implicit features extracted in step 3 to obtain the daily scene implicit variable data set of each cluster;
step 5: decode and reconstruct the daily scene implicit variable data set with the decoder of the variational auto-encoder to obtain the reconstructed daily scene data of each cluster; and aggregate the reconstructed daily scene data of all clusters to obtain the reconstructed daily scene data set.
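As an illustration of the clustering procedure of step 2, the following NumPy sketch implements steps 2.2 to 2.5 for a one-dimensional output layer under simplifying assumptions; the function name tensor_som_cluster, the decay schedule and all default parameter values are illustrative and not specified by the invention, and the metric matrix G is assumed to be precomputed as described in the tensor distance section below.

```python
import numpy as np

def tensor_som_cluster(X, G, n_units=4, alpha0=0.5, alpha_min=0.01,
                       radius0=1.0, decay=0.95, seed=0):
    """Sketch of steps 2.2-2.5 with a one-dimensional output layer of J = n_units nodes.

    X : (N, d) array of daily scenes, each flattened from an (m, h) power matrix.
    G : (d, d) metric matrix; G = np.eye(d) reduces the tensor distance to the
        Euclidean distance.
    Returns a cluster label (index of the winning unit) per sample and the weights.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    W = rng.uniform(X.min(), X.max(), size=(n_units, d))   # step 2.2: random initial weights
    units = np.arange(n_units)

    alpha, radius = alpha0, radius0
    while alpha > alpha_min:                                # step 2.5: stop when alpha <= alpha_min
        for x in X[rng.permutation(N)]:
            diff = W - x                                    # (J, d)
            d_td = np.sqrt(np.maximum(np.einsum('jd,de,je->j', diff, G, diff), 0.0))
            j_star = int(np.argmin(d_td))                   # step 2.3: winning unit j*
            # step 2.4: pull the winner and its topological neighbourhood towards x_i;
            # the neighbourhood factor decays with the topological distance D = |j - j*|.
            h = np.exp(-(units - j_star) ** 2 / (2.0 * radius ** 2))
            W += (alpha * h)[:, None] * (x - W)
        alpha *= decay                                      # decaying learning rate alpha(t)
        radius = max(radius * decay, 1e-3)

    diff = X[:, None, :] - W[None, :, :]                    # (N, J, d)
    d_all = np.sqrt(np.maximum(np.einsum('njd,de,nje->nj', diff, G, diff), 0.0))
    return np.argmin(d_all, axis=1), W
```

Grouping the samples that share a winning unit then yields the clusters X_1, ..., X_K used in steps 3 to 5.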
Further, the calculation of the tensor distance used by the tensor self-organizing map neural network is specifically as follows:
Let χ ∈ R^(I_1×I_2×...×I_N), N > 1, be a tensor and x its vectorized form; for an element χ_{i_1 i_2 ... i_N} with 1 ≤ i_j ≤ I_j, 1 ≤ j ≤ N, x_l and x_m denote the l-th and m-th elements of the vectorized form, where 1 ≤ l, m ≤ ∏_{j=1}^{N} I_j. Let Y be a tensor of the same space as χ with vectorized form y. The tensor distance between χ and Y can be calculated by equation (4):

d_TD = sqrt( Σ_{l,m} g_lm (x_l − y_l)(x_m − y_m) ) = sqrt( (x − y)^T G (x − y) )    (4)

where d_TD denotes the tensor distance, g_lm is the element position metric coefficient, and G denotes the metric matrix related to the element position distances, reflecting the intrinsic relation between the different coordinates of multi-order data. The element position metric coefficient g_lm is a decreasing function of the position distance ||p_l − p_m||_2 with regularization parameter σ, where ||p_l − p_m||_2 denotes the distance between the position coordinates (i_1, i_2, ..., i_N) of x_l and of x_m in the original tensor space:

||p_l − p_m||_2 = sqrt( Σ_{j=1}^{N} (i_j^(l) − i_j^(m))^2 )
When G is the identity matrix I, the calculated tensor distance equals the Euclidean distance.
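The construction of the metric matrix G for an m × h daily scene can be sketched as follows; the Gaussian form of g_lm used here is a commonly used choice consistent with the position-distance description above, and the helper names build_metric_matrix and tensor_distance, as well as the default σ, are assumptions for illustration.

```python
import numpy as np

def build_metric_matrix(m, h, sigma=1.0):
    """Metric matrix G for an m x h daily scene (wind farms x intraday time points).

    Each vectorised element x_l has a position p_l = (farm index, time index);
    g_lm is taken as a Gaussian function of the squared position distance
    (an assumed, commonly used form), so elements close in space and time are
    coupled more strongly.  sigma is the regularization parameter.
    """
    coords = np.array([(i, j) for i in range(m) for j in range(h)], dtype=float)
    sq_dist = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1)  # ||p_l - p_m||^2
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def tensor_distance(x, y, G):
    """d_TD(x, y) = sqrt((x - y)^T G (x - y)); with G = I it equals the Euclidean distance."""
    d = (np.asarray(x) - np.asarray(y)).ravel()
    return float(np.sqrt(d @ G @ d))

# Example: two random 3-farm x 24-point daily scenes.
rng = np.random.default_rng(0)
a, b = rng.random((3, 24)), rng.random((3, 24))
G = build_metric_matrix(3, 24, sigma=1.0)
print(tensor_distance(a, b, G), tensor_distance(a, b, np.eye(72)))
```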
Further, step 3 comprises the sub-steps of:
step 3.1: based on the scene sample set X_k of each cluster, train the corresponding variational auto-encoder VAE_k without supervision, k = 1, 2, ..., K;
step 3.2: use the encoder of the variational auto-encoder VAE_k to extract the implicit features of the sample set X_k, obtaining the implicit feature representation μ and σ of X_k, where μ denotes the mean of the sample set X_k and σ denotes the standard deviation of the sample set X_k.
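As a concrete illustration of step 3, the following PyTorch sketch defines a per-cluster variational auto-encoder; the class name ClusterVAE, the layer widths and the latent dimension are illustrative assumptions, the embodiment only requiring that the encoder and decoder be feed-forward networks with several hidden layers.

```python
import torch
import torch.nn as nn

class ClusterVAE(nn.Module):
    """Variational auto-encoder VAE_k trained separately on each cluster X_k (step 3.1)."""

    def __init__(self, input_dim, latent_dim=8, hidden_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        self.fc_mu = nn.Linear(hidden_dim, latent_dim)       # mu(x_k)
        self.fc_logvar = nn.Linear(hidden_dim, latent_dim)   # log sigma^2(x_k)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, input_dim), nn.Sigmoid(),  # powers normalised to [0, 1]
        )

    def encode(self, x):
        # step 3.2: implicit features mu and sigma of the input daily scenes
        h = self.encoder(x)
        return self.fc_mu(h), self.fc_logvar(h)

    def reparameterize(self, mu, logvar):
        # z = mu + eps * sigma with eps ~ N(0, 1)  (cf. step 4.1)
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar
```

Each cluster's flattened m·h-dimensional daily scenes are used to train one such network without supervision.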
Further, step 4 comprises the sub-steps of:
step 4.1: perform random simulation using the implicit features μ and σ of the sample set X_k to obtain the implicit variable z_k = μ(x_k) + ε·σ(x_k) of the sample set X_k = {x_1, x_2, ..., x_{n_k}}, where μ(x_k) and σ(x_k) denote the mean and standard deviation of the sample set X_k = {x_1, x_2, ..., x_{n_k}} generated by the encoder, and ε is a random number obeying the standard normal distribution N(0, 1);
step 4.2: sample the implicit variable z_k independently to obtain n'_k = M·n_k/N implicit variable samples obeying the standard normal distribution, where M denotes the total number of scenes to generate, N denotes the number of historical sample days, and n_k denotes the number of samples of the k-th cluster;
step 4.3: form the extracted implicit variable samples into the r-dimensional implicit feature vector sample set Z'_k = {z'_1, z'_2, ..., z'_{n'_k}}, where z'_j denotes the j-th implicit feature vector.
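Step 4 can be sketched as follows: because the VAE training objective drives the implicit variables towards an independent standard normal distribution, new latent samples for cluster k are drawn directly from N(0, I) in the assumed proportion n'_k = M·n_k/N. The helper name sample_latents is an assumption.

```python
import numpy as np

def sample_latents(M, N, n_k, latent_dim, seed=0):
    """Steps 4.2-4.3: draw n'_k = M * n_k / N standard-normal latent vectors for cluster k,
    so that the generated scene set keeps the clusters' historical sample proportions."""
    rng = np.random.default_rng(seed)
    n_new = int(round(M * n_k / N))
    return rng.standard_normal(size=(n_new, latent_dim))   # Z'_k with shape (n'_k, r)
```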
Further, step 5 comprises the following sub-steps:
step 5.1: input the implicit feature vector sample set Z'_k into the decoder of the variational auto-encoder corresponding to the cluster and reconstruct Z'_k with the decoder, obtaining the daily scene sample set generated by reconstruction for the k-th cluster, X̂_k = {x̂_1, x̂_2, ..., x̂_{n'_k}}, 1 ≤ j ≤ n'_k, where x̂_j denotes the j-th reconstructed daily scene datum;
step 5.2: aggregate the daily scene sample sets generated for the K clusters to obtain the M randomly simulated new scenes of the multi-wind-farm spatio-temporal power, X̂ = X̂_1 ∪ X̂_2 ∪ ... ∪ X̂_K.
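Steps 5.1 and 5.2 can then be sketched as follows, assuming the ClusterVAE and sample_latents helpers above; the function name generate_scenes is illustrative.

```python
import numpy as np
import torch

def generate_scenes(vaes, latent_sets, m, h):
    """Steps 5.1-5.2: decode each Z'_k with the corresponding decoder and aggregate the K outputs.

    vaes        : list of trained ClusterVAE models, one per cluster.
    latent_sets : list of (n'_k, r) NumPy arrays, e.g. from sample_latents().
    Returns an array of shape (sum of n'_k, m, h) of generated daily scenes.
    """
    scenes = []
    for vae, z in zip(vaes, latent_sets):
        with torch.no_grad():
            x_hat = vae.decoder(torch.as_tensor(z, dtype=torch.float32))   # step 5.1
        scenes.append(x_hat.numpy().reshape(-1, m, h))
    return np.concatenate(scenes, axis=0)                                   # step 5.2: aggregated set
```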
Compared with the prior art, the invention has the beneficial effects that:
1) The scene generation method applies tensor-distance self-organizing map neural network clustering to the multi-wind-farm data, so that the real distribution of wind power in the time and space dimensions is accurately reflected, clustering results with lower inter-class similarity and more diversified information are obtained, and the diversity and accuracy of the clustering results are improved;
2) The invention constructs a generative network model for each cluster, which effectively improves the training effect of the generative network, reduces the reconstruction error, and improves the precision of the generative network and the diversity of the generated scenes;
3) The scene generation method of the invention markedly reduces the spatial and temporal correlation errors of the wind power, effectively improves the accuracy of the probability distribution characteristics, and enhances the feature expression capability of the scene generation method.
Drawings
The invention is further described below with reference to the drawings and examples.
Fig. 1 is a SOM neural network model according to an embodiment of the present invention.
FIG. 2 is a graph of SOM feature contrast based on tensor distance and Euclidean distance.
Fig. 3 is a block diagram of a VAE model.
Fig. 4 is a scene generation flowchart of a multi-wind farm scene generation method according to an embodiment of the present invention.
FIG. 5 is a comparison of the sample proportions of each cluster under the two clustering methods, Euclidean distance SOM and tensor distance SOM, according to an embodiment of the present invention.
Fig. 6a is a cluster center diagram of a tensor distance SOM clustering result according to an embodiment of the present invention.
Fig. 6b is a cluster center diagram of the euclidean distance SOM clustering result according to an embodiment of the present invention.
FIG. 7 is a graph comparing the change in loss values during training of clustered and non-clustered VAEs according to an embodiment of the present invention.
FIG. 8 is a comparison of the MAPE errors of the scenes generated with and without clustering according to an embodiment of the present invention.
Fig. 9a is a matrix diagram of spatial correlation of historical scene data according to an embodiment of the present invention.
Fig. 9b is an absolute error diagram of a spatial correlation matrix of a scene generated by the daily scene generation method of the present invention.
Fig. 9c is an absolute error diagram of a spatial correlation matrix of scenes generated by a daily scene generation method without clustering applied VAEs.
Fig. 9d is an absolute error diagram of a spatial correlation matrix of a scene generated by a daily scene generation method combining euclidean distance SOM and VAE.
Fig. 10a is a matrix diagram of time-dependent relationships of historical scene data according to an embodiment of the present invention.
Fig. 10b is a graph of absolute error of time correlation coefficients of a generated scene by the daily scene generation method of the present invention.
Fig. 10c is a graph of absolute error of time correlation coefficients of a non-clustered daily scene generation method using VAE.
Fig. 10d is a graph of absolute error of time correlation coefficients of a scene generated by a daily scene generation method combining euclidean distance SOM with VAE.
FIG. 11 is a cumulative probability distribution of scene average power generated by three methods according to embodiments of the invention.
Detailed Description
As shown in fig. 1, the wind power daily scene generation method based on the tensor self-organizing map neural network adopts a SOM neural network, which extracts the important features or internal rules of a group of data for classification by imitating the self-organizing map function of the brain nervous system through unsupervised competitive learning. A SOM network can map any high-dimensional input to a low-dimensional space and represent similar internal properties of the input data as geometrically adjacent feature maps, preserving the topological structure of the data.
As shown in fig. 2, comparing the tensor distance with the Euclidean distance in the wind power daily scene generation method based on the tensor self-organizing map neural network shows that the tensor distance preserves the spatio-temporal characteristics of the scene data and reflects, through the weights, the spatio-temporal position relation of the data within a scene.
As shown in fig. 3, the variational auto-encoder network in the wind power daily scene generation method based on the tensor self-organizing map neural network comprises two parts, an encoder (Encoder) and a decoder (Decoder). The encoder generates the hidden variable z of the corresponding sample from the input sample data and is denoted as the recognition model q_φ(z|x); the decoder generates the observation x̂ from a series of hidden variables z and is denoted as the generation model p_θ(x|z). In the embodiment, both the encoder and the decoder adopt feed-forward neural networks comprising a plurality of hidden layers.
Since the distribution of the hidden variable z cannot be observed directly, the VAE introduces in the inference network a recognition model q_φ(z|x) to replace the true posterior distribution p_θ(z|x), which cannot be determined, and assumes that q_φ(z|x) has a known distribution form, usually a standard normal distribution. The recognition model q_φ(z|x) therefore serves as the inference network part of the VAE and the conditional distribution p_θ(x|z) as the generation network part. To make the recognition model q_φ(z|x) approximately equal to the true posterior distribution p_θ(z|x), the VAE uses the KL divergence to measure the similarity between the two. The complete calculation formula of the loss function L(θ, φ; x) of the VAE is as follows:

L(θ, φ; x) = KL( q_φ(z|x) || p_θ(z) ) − E_{q_φ(z|x)}[ log p_θ(x|z) ]    (7)

where the first term KL( q_φ(z|x) || p_θ(z) ) characterizes how close the probability distribution of the implicit variable z given by q_φ(z|x) is to its prior p_θ(z); the closer the two distributions, the smaller the KL divergence. The second term E_{q_φ(z|x)}[ log p_θ(x|z) ] characterizes the error between the reconstructed sample and the original sample. That is, during training of the VAE network, the objective is that the reconstruction error between the reconstructed sample and the original sample is minimal while the probability distribution of the implicit variable z is as close as possible to a prior distribution such as the standard normal N(0, 1).
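Under the diagonal-Gaussian encoder and standard-normal prior assumed above, the KL term of the loss has a closed form. The sketch below pairs it with a sum-of-squares reconstruction term, which corresponds to a Gaussian likelihood assumption for p_θ(x|z) and is an illustrative choice rather than the patent's stated likelihood.

```python
import torch
import torch.nn.functional as F

def vae_loss(x, x_hat, mu, logvar):
    """L(theta, phi; x): KL(q_phi(z|x) || N(0, I)) plus a reconstruction term.

    The KL term has a closed form for a diagonal-Gaussian encoder; the
    reconstruction term below is a sum-of-squares error, i.e. an assumed
    Gaussian likelihood for p_theta(x|z).
    """
    kl = -0.5 * torch.sum(1.0 + logvar - mu.pow(2) - logvar.exp())
    recon = F.mse_loss(x_hat, x, reduction='sum')
    return kl + recon
```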
As shown in fig. 4, the wind power daily scene generation method based on the tensor self-organizing map neural network specifically includes the following steps:
step 1: collecting wind power output daily scene data of a plurality of wind power stations to obtain a daily scene data set;
step 2: clustering the daily scene data set by using a tensor self-organizing map neural network to obtain a daily scene cluster;
Step 2.1: input the multi-wind-farm spatio-temporal power historical sample data X = {x_1, x_2, ..., x_N}, where x_i ∈ R^(m×h) denotes the i-th daily scene datum, N is the number of historical sample days, m is the number of wind farms, and h is the number of time points per day.
Step 2.2: randomly assign an initial value to each node weight W_j (j = 1, 2, ..., J) of the output layer of the tensor self-organizing map neural network.
Step 2.3: for multiple wind farmsSpatiotemporal power daily scene data x i Calculate x i And W is equal to j The connection weight matrix with the shortest distance (j=1, 2, … J) is calculated as follows:
in j * Is a winning unit; j is the number of neurons of the output layer; the I & I is a distance function; g lm For element position metric coefficients, G represents a metric matrix related to element position distances;
Step 2.4: define a neighborhood N_j*(t) of the winning unit; for the units in the neighborhood, adjust the weights towards x_i; the iterative calculation formula of the weight adjustment is:

w_ij(t+1) = w_ij(t) + α(t, D)[x_i(t) − w_ij(t)]

where w_ij(t) and w_ij(t+1) denote the weights from neuron i to neuron j at times t and t+1, respectively; α(t, D) is a function of the training time t and the topological distance D between the i-th neuron in the neighborhood and the winning neuron j*;
Step 2.5: repeat step 2.3 and step 2.4; stop training when the end condition α(t) ≤ α_min is met, where α_min denotes the minimum learning rate, and output the clusters {X_1, X_2, ..., X_K}, where n_k is the number of samples of the k-th cluster of wind power data and K is the number of clusters.
The calculation process of the tensor distance specifically comprises the following steps:
Let χ ∈ R^(I_1×I_2×...×I_N), N > 1, be a tensor and x its vectorized form; for an element χ_{i_1 i_2 ... i_N} with 1 ≤ i_j ≤ I_j, 1 ≤ j ≤ N, x_l and x_m denote the l-th and m-th elements of the vectorized form, where 1 ≤ l, m ≤ ∏_{j=1}^{N} I_j. Let Y be a tensor of the same space as χ with vectorized form y. The tensor distance between χ and Y can be calculated by equation (4):

d_TD = sqrt( Σ_{l,m} g_lm (x_l − y_l)(x_m − y_m) ) = sqrt( (x − y)^T G (x − y) )    (4)

where d_TD denotes the tensor distance, g_lm is the element position metric coefficient, and G is the metric matrix related to the element position distances, reflecting the intrinsic relation between the different coordinates of multi-order data. g_lm is a decreasing function of the position distance ||p_l − p_m||_2 with regularization parameter σ, where ||p_l − p_m||_2 is the distance between the position coordinates (i_1, i_2, ..., i_N) of x_l and of x_m in the original tensor space:

||p_l − p_m||_2 = sqrt( Σ_{j=1}^{N} (i_j^(l) − i_j^(m))^2 )
When G is the identity matrix I, the calculated tensor distance equals the Euclidean distance.
Step 3: construct a variational auto-encoder for each cluster, and extract implicit features from the daily scene data with the encoder of the variational auto-encoder;
Step 3.1: based on the scene sample set X_k of each cluster, train the corresponding VAE_k network without supervision, k = 1, 2, ..., K;
Step 3.2: use the encoder of the variational auto-encoder VAE_k to extract the implicit features of the sample set X_k, obtaining the implicit feature representation μ and σ of X_k, where μ denotes the mean of the sample set X_k and σ denotes the standard deviation of the sample set X_k.
Step 4: randomly simulate and sample the daily scene data of each cluster according to the extracted implicit features to obtain the daily scene implicit variable data set of each cluster;
Step 4.1: perform random simulation using the implicit features μ and σ of the sample set X_k to obtain the implicit variable z_k = μ(x_k) + ε·σ(x_k) of the sample set X_k = {x_1, x_2, ..., x_{n_k}}, where μ(x_k) and σ(x_k) denote the mean and standard deviation of the sample set generated by the encoder, and ε is a random number obeying the standard normal distribution N(0, 1);
Step 4.2: sample the implicit features independently to obtain n'_k = M·n_k/N implicit variable samples obeying the standard normal distribution, where M is the total number of scenes, N denotes the number of historical sample days, and n_k denotes the number of samples of the k-th cluster;
Step 4.3: form the extracted implicit variable samples into the r-dimensional implicit feature vector sample set Z'_k = {z'_1, z'_2, ..., z'_{n'_k}}, where z'_j denotes the j-th implicit feature vector.
Step 5: decode and reconstruct the daily scene implicit variable data set with the decoder of the variational auto-encoder to obtain the reconstructed daily scene data of each cluster; aggregate the reconstructed daily scene data of all clusters to obtain the reconstructed daily scene data set.
Step 5.1: input the implicit feature vector sample set Z'_k into the decoder of the variational auto-encoder corresponding to the cluster and reconstruct Z'_k with the decoder, obtaining the daily scene sample set generated by reconstruction for the k-th cluster, X̂_k = {x̂_1, x̂_2, ..., x̂_{n'_k}}, where x̂_j denotes the j-th reconstructed daily scene datum;
Step 5.2: aggregate the daily scene sample sets generated for the K clusters to obtain the M randomly simulated new scenes of the multi-wind-farm spatio-temporal power, X̂ = X̂_1 ∪ X̂_2 ∪ ... ∪ X̂_K.
Fig. 5 shows the proportion of the number of samples of each cluster in the euclidean distance SOM neural network clustering result and the tensor distance SOM neural network clustering result.
Fig. 6a and fig. 6b show the cluster-center daily scenes of the cluster samples in the tensor distance SOM neural network clustering result and in the Euclidean distance SOM neural network clustering result, respectively.
Fig. 7 compares the network loss values of the method of the invention with those of a scene generation method that applies the VAE without clustering. Fig. 7 shows that clustering the daily scene historical samples with similar spatio-temporal correlation first, and then reducing their dimension with the VAE, helps to extract in a targeted way the essential daily scene features of the different spatio-temporal correlation rules, thereby improving the accuracy of the VAE generation network and the diversity of the generated scenes.
Fig. 8 shows the mean absolute percentage error (MAPE) between the reconstructed scenes generated by the VAE networks and the original scenes, calculated as:

MAPE = 100% / (N·m·h) · Σ_{i=1}^{N} Σ_{j=1}^{m} Σ_{t=1}^{h} |x_{j,t,i} − x̂_{j,t,i}| / P_j

where P_j is the rated power of the j-th wind farm, x_{j,t,i} denotes the actual power at the t-th time point of the j-th wind farm in the i-th daily scene of the original scene data set, and x̂_{j,t,i} denotes the power at the t-th time point of the j-th wind farm in the i-th daily scene of the generated scene.
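A short sketch of this error measure, normalising each wind farm's absolute error by its rated power P_j and averaging over all days, farms and time points; the array layout (n_days, m, h) is an assumption.

```python
import numpy as np

def scene_mape(x_true, x_gen, rated_power):
    """MAPE between matched original and reconstructed daily scene sets.

    x_true, x_gen : arrays of shape (n_days, m, h).
    rated_power   : array of shape (m,) holding each wind farm's rated power P_j.
    """
    err = np.abs(x_true - x_gen) / rated_power[None, :, None]
    return 100.0 * err.mean()
```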
As can be seen from fig. 8, the MAPE of the data generated for the 4 clusters of the invention is smaller overall than the MAPE of the results of the direct, non-clustered VAE. Weighting the mean absolute percentage errors of the 4 clusters by their sample proportions gives an overall reconstruction error MAPE of 5.32%, which is 17.37% lower than the MAPE of the direct VAE method without clustering.
Fig. 9a shows the spatial correlation matrix of the historical power daily scene data of the multiple wind farms of the embodiment. Fig. 9b, fig. 9c and fig. 9d show the absolute errors of the spatial correlation matrices of the scenes generated by the daily scene generation method of the invention, by the daily scene generation method applying the VAE without clustering, and by the daily scene generation method combining the Euclidean distance SOM with the VAE, respectively.
Fig. 10a shows the time correlation coefficient matrix of the historical power daily scene data of the multiple wind farms of the embodiment. Fig. 10b, fig. 10c and fig. 10d show the absolute errors of the time correlation coefficients of the scenes generated by the daily scene generation method combining the tensor distance SOM with the VAE, by the daily scene generation method applying the VAE without clustering, and by the daily scene generation method combining the Euclidean distance SOM with the VAE, respectively. From fig. 9a to 9d and fig. 10a to 10d it can be seen that, for both the spatial and the temporal correlation, the error color between the scenes generated by the method of the invention and the original scenes is the deepest, indicating that the error of the correlation coefficients is the smallest.
Fig. 11 shows the empirical cumulative probability distribution of the total average power, obtained by computing the average power of the 18 wind farms at each time point for the original scenes and for the scenes generated by the three methods. Comparison of the cumulative probability distributions of the three methods in fig. 11 shows that all three perform well in terms of probability distribution, realizing in an unsupervised manner the effect of the probability modeling used in conventional scene generation methods. However, the scene generated by the method of the invention is the closest to the original scene, indicating that the method of the invention captures the probability distribution law of the historical wind power data more accurately.
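The comparison of fig. 11 can be reproduced with a short sketch that averages the power over the wind farms at every time point and builds the empirical cumulative distribution of the pooled values; the helper names ecdf and average_power_cdf are assumptions.

```python
import numpy as np

def ecdf(values):
    """Empirical cumulative distribution of a one-dimensional sample."""
    x = np.sort(np.asarray(values).ravel())
    p = np.arange(1, x.size + 1) / x.size
    return x, p

def average_power_cdf(scenes):
    """scenes: (n_days, m, h) -> empirical CDF of the farm-averaged power at every time point."""
    avg_power = scenes.mean(axis=1)      # average over the m wind farms
    return ecdf(avg_power)
```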

Claims (5)

1. The wind power day scene generation method based on tensor self-organizing map neural network is characterized by comprising the following steps of:
step 1: collecting wind power output daily scene data of a plurality of wind power stations to obtain a daily scene data set;
step 2: clustering the daily scene data set by using a tensor self-organizing map neural network to obtain a daily scene cluster;
step 2.1: input the multi-wind-farm spatio-temporal power historical sample data X = {x_1, x_2, ..., x_N}, where x_i ∈ R^(m×h) denotes the i-th daily scene datum, N denotes the number of historical sample days, m is the number of wind farms, h is the number of time points per day, and R^(m×h) denotes the m×h-dimensional real space;
step 2.2: randomly assign an initial value to each node weight W_j, j = 1, 2, ..., J, of the output layer of the tensor self-organizing map neural network;
step 2.3: for the multi-wind-farm spatio-temporal power daily scene data x_i, calculate the connection weight matrix with the shortest distance to x_i to obtain the winning unit j* of the tensor self-organizing map neural network;
step 2.4: define a neighborhood N_j*(t) of the winning unit; for the units in the neighborhood, adjust the weights towards x_i;
step 2.5: repeat step 2.3 and step 2.4; when the training end condition α(t) ≤ α_min is met, stop training, where α(t) denotes the learning rate and α_min the minimum learning rate, and output the clusters {X_1, X_2, ..., X_K}, where X_k = {x_1, x_2, ..., x_{n_k}} denotes the k-th cluster of wind power data, x_{n_k} denotes the n_k-th daily scene datum, n_k is the number of samples of the k-th cluster of wind power data, and K is the number of clusters;
step 3: construct a variational auto-encoder for each cluster, and extract implicit features from the daily scene data with the encoder of the variational auto-encoder;
step 4: randomly simulate and sample the daily scene data of each cluster using the implicit features extracted in step 3 to obtain the daily scene implicit variable data set of each cluster;
step 5: decode and reconstruct the daily scene implicit variable data set with the decoder of the variational auto-encoder to obtain the reconstructed daily scene data of each cluster; and aggregate the reconstructed daily scene data of all clusters to obtain the reconstructed daily scene data set.
2. The method for generating a daily scene of wind power according to claim 1, wherein in step 2.4, the iterative calculation formula of the weight adjustment is as follows:
w_ij(t+1) = w_ij(t) + α(t, D)[x_i(t) − w_ij(t)]

where w_ij(t) and w_ij(t+1) denote the weights from neuron i to neuron j at times t and t+1, respectively; α(t, D) is a function of the training time t and the topological distance D between the i-th neuron in the neighborhood and the winning neuron j*.
3. A method of generating a wind power day scene according to claim 2, wherein step 3 comprises the sub-steps of:
step 3.1: based on the scene sample set X_k of each cluster, train the corresponding variational auto-encoder VAE_k without supervision, k = 1, 2, ..., K;
step 3.2: use the encoder of the variational auto-encoder VAE_k to extract the implicit features of the sample set X_k, obtaining the implicit feature representation μ and σ of X_k, where μ denotes the mean of the sample set X_k and σ denotes the standard deviation of the sample set X_k.
4. A method of generating a wind power day scene according to claim 3 wherein step 4 comprises the sub-steps of:
step 4.1: perform random simulation using the implicit features μ and σ of the sample set X_k to obtain the implicit variable z of the sample set X_k;
step 4.2: sample the implicit variable z independently to obtain n'_k = M·n_k/N implicit variable samples obeying the standard normal distribution, where M denotes the total number of scenes, N denotes the number of historical sample days, and n_k denotes the number of samples of the k-th cluster of wind power data;
step 4.3: form the extracted implicit variable samples into the r-dimensional implicit feature vector sample set Z'_k = {z'_1, z'_2, ..., z'_{n'_k}}, where z'_j denotes the j-th implicit feature vector.
5. The method for generating a wind power day scene according to claim 4, wherein the step 5 comprises the following sub-steps:
step 5.1: input the implicit feature vector sample set Z'_k into the decoder of the variational auto-encoder corresponding to the cluster and reconstruct Z'_k with the decoder, obtaining the daily scene sample set generated by reconstruction for the k-th cluster, X̂_k = {x̂_1, x̂_2, ..., x̂_{n'_k}}, where x̂_j denotes the j-th reconstructed daily scene datum;
step 5.2: aggregate the daily scene sample sets generated for the K clusters to obtain the M randomly simulated new scenes of the multi-wind-farm spatio-temporal power, X̂ = X̂_1 ∪ X̂_2 ∪ ... ∪ X̂_K.
CN202310972772.0A 2021-09-05 2021-09-05 Wind power day scene generation method based on tensor self-organizing mapping neural network Pending CN117081166A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310972772.0A CN117081166A (en) 2021-09-05 2021-09-05 Wind power day scene generation method based on tensor self-organizing mapping neural network

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111035137.7A CN113890109B (en) 2021-09-05 2021-09-05 Multi-wind power plant power daily scene generation method with time-space correlation
CN202310972772.0A CN117081166A (en) 2021-09-05 2021-09-05 Wind power day scene generation method based on tensor self-organizing mapping neural network

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202111035137.7A Division CN113890109B (en) 2021-09-05 2021-09-05 Multi-wind power plant power daily scene generation method with time-space correlation

Publications (1)

Publication Number Publication Date
CN117081166A true CN117081166A (en) 2023-11-17

Family

ID=79008202

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111035137.7A Active CN113890109B (en) 2021-09-05 2021-09-05 Multi-wind power plant power daily scene generation method with time-space correlation
CN202310972772.0A Pending CN117081166A (en) 2021-09-05 2021-09-05 Wind power day scene generation method based on tensor self-organizing mapping neural network

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111035137.7A Active CN113890109B (en) 2021-09-05 2021-09-05 Multi-wind power plant power daily scene generation method with time-space correlation

Country Status (1)

Country Link
CN (2) CN113890109B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201718756D0 (en) * 2017-11-13 2017-12-27 Cambridge Bio-Augmentation Systems Ltd Neural interface
EP3576029A1 (en) * 2018-05-31 2019-12-04 IMEC vzw Method and device for determining energy system operating scenarios
US11748414B2 (en) * 2018-06-19 2023-09-05 Priyadarshini Mohanty Methods and systems of operating computerized neural networks for modelling CSR-customer relationships
DE102019123455A1 (en) * 2018-09-04 2020-03-05 Nvidia Corporation COMMON SYNTHESIS AND PLACEMENT OF OBJECTS IN SCENES
CN112186761B (en) * 2020-09-30 2022-03-01 山东大学 Wind power scene generation method and system based on probability distribution

Also Published As

Publication number Publication date
CN113890109B (en) 2023-08-25
CN113890109A (en) 2022-01-04

Similar Documents

Publication Publication Date Title
CN112988723B (en) Traffic data restoration method based on space self-attention force diagram convolution cyclic neural network
CN109255505B (en) Short-term load prediction method of multi-model fusion neural network
CN109490814B (en) Metering automation terminal fault diagnosis method based on deep learning and support vector data description
CN112464004A (en) Multi-view depth generation image clustering method
WO2020143253A1 (en) Method employing sparse autoencoder to cluster power system operation modes
CN113177587B (en) Generalized zero sample target classification method based on active learning and variational self-encoder
CN111371611B (en) Weighted network community discovery method and device based on deep learning
CN116050621A (en) Multi-head self-attention offshore wind power ultra-short-time power prediction method integrating lifting mode
CN110691319B (en) Method for realizing high-precision indoor positioning of heterogeneous equipment in self-adaption mode in use field
CN113890109B (en) Multi-wind power plant power daily scene generation method with time-space correlation
CN117093924A (en) Rotary machine variable working condition fault diagnosis method based on domain adaptation characteristics
Ding et al. Greedy broad learning system
CN114168822A (en) Method for establishing time series data clustering model and time series data clustering
CN110569807B (en) Multi-source target tracking method for complex scene
CN114372418A (en) Wind power space-time situation description model establishing method
Cataño et al. Wavelet estimation for factor models with time-varying loadings
Liu et al. Generative Adversarial Network and CNN-LSTM Based Short-Term Power Load Forecasting
WO2021046681A1 (en) Complex scenario-oriented multi-source target tracking method
Jürgens et al. Synthetic time series generation using gans with application in energy economics
CN116662766B (en) Wind speed prediction method and device based on data two-dimensional reconstruction and electronic equipment
CN114842276B (en) Dimension reduction method based on multi-graph fusion typical correlation analysis
Singh et al. A hybrid neuro-wavelet based pre-processing technique for data representation
Peng et al. A Deep Convolutional Embedded Clustering Method for Scenario Reduction of Production Simulation
Hu et al. A Scenario Generation Method for Wind, PV and Load Uncertainty Based on DFKM
FangYuan et al. A Multi-view Images Classification Based on Deep Graph Convolution

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination