CN112561288A - Intelligent identification method for running state of wind driven generator by adopting image model - Google Patents


Info

Publication number
CN112561288A
Authority
CN
China
Prior art keywords
image
wind
image model
driven generator
channel
Prior art date
Legal status
Pending
Application number
CN202011438196.4A
Other languages
Chinese (zh)
Inventor
宋崇辉
陈庆
张海峰
高俊山
王文文
王景琦
赵青青
邢旭进
张子阔
Current Assignee
Northeastern University China
Original Assignee
Northeastern University China
Priority date
Filing date
Publication date
Application filed by Northeastern University China
Priority to CN202011438196.4A
Publication of CN112561288A
Legal status: Pending

Classifications

    • G06Q 10/0639 - Performance analysis of employees; performance analysis of enterprise or organisation operations
    • F03D 17/00 - Monitoring or testing of wind motors, e.g. diagnostics
    • G06F 18/214 - Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06N 3/045 - Neural networks; combinations of networks
    • G06N 3/08 - Neural networks; learning methods
    • G06Q 50/06 - ICT specially adapted for energy or water supply
    • G06T 17/00 - Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06V 20/64 - Scene recognition; three-dimensional objects


Abstract

The invention relates to an intelligent identification method for the running state of a wind driven generator using an image model. The identification method comprises the following steps: S1, obtaining an image model of the operation of the wind generating set; S2, identifying the image model with a deep convolutional neural network, trained in advance on preset image model training samples, and determining the running state of the wind driven generator. The method identifies the running state from the geometric distribution characteristics of the point cloud formed by the sampling points, rather than from the numerical characteristics of individual sampling points, and therefore provides a new angle for running state identification and fault prediction of the wind generating set.

Description

Intelligent identification method for running state of wind driven generator by adopting image model
Technical Field
The invention relates to the technical field of wind generating set operation state identification and fault prediction, in particular to an intelligent identification method for an operation state of a wind driven generator by adopting an image model.
Background
Monitoring the operating state of a wind turbine is an important component of a wind power system. When a wind driven generator breaks down, not only is economic benefit affected, but the safety of the wind farm is also threatened. As the installed capacity of wind generating sets grows, accurately and efficiently identifying their operation modes becomes increasingly important. Existing methods for identifying or predicting the running state of a wind driven generator are mainly based either on model analysis or on data driving. Model-analysis methods identify the running state of the wind turbine by analyzing the residual errors between measured data and prior data; however, the running of a wind turbine is a complex process, and an accurate model is difficult to establish. Many results have been achieved on wind turbine operating state identification and fault prediction, but the following problems remain to be solved:
(1) The operation of the wind driven generator is a dynamic process influenced by inherent characteristics such as system inertia and control delay, so the data of different transient points in the same state differ greatly. Existing methods analyze the numerical values of sampling points at the microscopic level; when the system input changes frequently, transient values deviate strongly from theoretical values, the expressed characteristics are insufficient to explain the running state of the wind turbine, and such data interfere with the feature-learning process of the downstream network.
(2) The wind turbine structure is complicated, the sensors collect many kinds of information in large amounts, most of it flows within a closed loop, and complicated nonlinear coupling relations exist between different pieces of information. Changes in input variables cause dynamic adjustments in multiple links within the system, which can make the values of certain variables overlap in different states. Analysis in only one or two dimensions is therefore severely limited.
(3) Different states may lead to the same result: for example, both excessive wind speed and a shutdown eventually bring the power to 0, and all links of the system then tend toward the state with an output of 0. From a numerical analysis of the sampling points, only a few variables (such as wind speed) differ between the two states, and such a small information difference has little effect against the large amount of information.
Disclosure of Invention
Technical problem to be solved
In view of the above disadvantages and shortcomings of the prior art, the present invention provides a method and system for intelligently identifying the operating state of a wind turbine generator system using an image model.
(II) technical scheme
In order to achieve the purpose, the invention adopts the main technical scheme that:
in a first aspect, the embodiment of the invention provides an intelligent identification method for an operating state of a wind driven generator by using an image model.
The embodiment of the invention provides an intelligent identification method for the running state of a wind driven generator by adopting an image model, which comprises the following steps:
S1, obtaining an image model of the operation of the wind generating set;
S2, identifying the image model by adopting a deep convolutional neural network, and determining the running state of the wind driven generator;
the deep convolutional neural network is trained by preset image model training samples in advance.
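As an illustrative sketch of step S2 (not the claimed implementation): the embodiments use a pretrained deep network (fig. 3 names Xception as an example), but the classify-an-image-model idea can be shown with a tiny NumPy forward pass. The layer sizes, random weights, and the two state labels below are invented for the example.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def identify_state(image_model, kernels, dense_w, labels):
    """Toy CNN forward pass: conv -> ReLU -> global average pool -> softmax."""
    # image_model: (H, W, 3) stacked grayscale channels (the 'image model')
    feats = []
    for k in kernels:                                  # one kernel per feature map
        fmap = sum(conv2d(image_model[:, :, c], k) for c in range(3))
        feats.append(np.maximum(fmap, 0).mean())       # ReLU + global average pool
    logits = np.array(feats) @ dense_w                 # dense classification layer
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                               # softmax over state classes
    return labels[int(np.argmax(probs))], probs

rng = np.random.default_rng(0)
x = rng.random((16, 16, 3))                            # a stacked image-model sample
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
dense_w = rng.standard_normal((4, 2))
state, probs = identify_state(x, kernels, dense_w, ["normal", "fault"])
```

In practice the network would be a deep pretrained DCNN fine-tuned on image model training samples, not random weights; this sketch only fixes the data flow.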
Preferably, before S1, the method further includes:
and S0, obtaining an image model training sample according to the pre-obtained observation data of the wind generating set operation.
Preferably, the S0 includes:
S01, acquiring auxiliary variables according to the pre-acquired observation data of the wind generating set operation;
S02, establishing a three-dimensional point cloud in a three-dimensional space according to the auxiliary variables;
S03, acquiring a three-dimensional characteristic curved surface based on the three-dimensional point cloud;
S04, representing the amplitude of the curved surface by gray scale, and converting the three-dimensional characteristic curved surface into a 2D gray-scale image;
S05, stacking the 2D gray-scale images of the three RGB channels to obtain an image model training sample containing the state information of the wind driven generator.
Preferably, the step S01 specifically includes:
selecting auxiliary variables from S according to a preset selection rule, based on the pre-acquired observation data of the operation of the wind generating set;
wherein
S = [v, P, s_1, …, s_r] ∈ R^{n_s × (r+2)}
is the initial SCADA data, containing the wind speed v, the power P and r further variables; each of the r+2 observed variables has n_s sampling points;
wherein the preset selection rule is: select s_q as an auxiliary variable if KLD(P‖s_q) > ε and Δ_q > γ;
wherein P_k is the power of the k-th sampling point, k = 1, …, n_s is the row index; s_q is the q-th auxiliary variable, q = 1, …, r is the column index; s_{q,k} is the q-th variable of the k-th sampling point; ε and γ are preset thresholds; KLD(P‖s_q) is the K-L divergence of the variable s_q and the power P:
KLD(P‖s_q) = Σ F(P) · log( F(P) / F(s_q) ),
wherein F(·) is the probability distribution function;
Δ_q = | ρ_q^(n) − ρ_q^(f) | is the amount of change between ρ_q^(n), which denotes ρ_q under the normal condition, and ρ_q^(f), which denotes ρ_q under the fault condition;
ρ_q is the Spearman correlation of the variable s_q and the power P:
ρ_q = 1 − 6 Σ_{k=1}^{n_s} d_{q,k}² / ( n_s (n_s² − 1) ),
wherein d_{q,k} is the rank difference of P_k and s_{q,k} when P and s_q are respectively arranged in ascending order.
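The auxiliary-variable selection rule can be sketched in NumPy as below. The histogram-based estimate of KLD(P‖s_q), the no-ties rank-difference formula for ρ_q, and the example thresholds ε = 0.05, γ = 0.1 are assumptions made for illustration; the patent states the rule only symbolically.

```python
import numpy as np

def spearman(p, s):
    """Spearman correlation via the rank-difference formula (assumes no ties)."""
    n = len(p)
    rank = lambda x: np.argsort(np.argsort(x))          # ascending ranks 0..n-1
    d = rank(p) - rank(s)                               # rank differences d_{q,k}
    return 1.0 - 6.0 * np.sum(d.astype(float) ** 2) / (n * (n ** 2 - 1))

def kld(p, s, bins=16):
    """Histogram estimate of the K-L divergence between distributions of P and s_q."""
    lo, hi = min(p.min(), s.min()), max(p.max(), s.max())
    fp, _ = np.histogram(p, bins=bins, range=(lo, hi))
    fs, _ = np.histogram(s, bins=bins, range=(lo, hi))
    fp = (fp + 1e-9) / fp.sum()                         # smooth to avoid log(0)
    fs = (fs + 1e-9) / fs.sum()
    return float(np.sum(fp * np.log(fp / fs)))

def select_auxiliary(P_n, S_n, P_f, S_f, eps=0.05, gamma=0.1):
    """Keep variable q when KLD(P||s_q) > eps and |rho_normal - rho_fault| > gamma."""
    chosen = []
    for q in range(S_n.shape[1]):
        delta = abs(spearman(P_n, S_n[:, q]) - spearman(P_f, S_f[:, q]))
        if kld(P_n, S_n[:, q]) > eps and delta > gamma:
            chosen.append(q)
    return chosen
```

A variable identical to P is rejected (zero divergence, zero correlation change), while a variable whose distribution differs from P and whose correlation with P flips between normal and fault data is kept.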
Preferably, the auxiliary variables are the three selected variables s* = {s*_1, s*_2, s*_3}, one per image channel.
Preferably, the step S02 specifically includes:
S021, according to the auxiliary variable, forming the points (v, P, s*) in the v-P-s* coordinate system into a three-dimensional point cloud distribution map C*(v, P, s*), wherein
C*(v, P, s*) = { (v_k, P_k, s*_k) | k = 1, …, n_s },
and the point cloud is distributed in the bounding box
B = { (v, P, s*) | v_{b-} ≤ v ≤ v_{b+}, P_{b-} ≤ P ≤ P_{b+}, s*_{b-} ≤ s* ≤ s*_{b+} },
wherein the subscripts b+ and b- respectively represent the theoretical maximum and minimum values: v_{b+} and v_{b-} are the maximum and minimum values of the wind speed v, P_{b+} and P_{b-} are the maximum and minimum values of the power P, and s*_{b+} and s*_{b-} are the maximum and minimum values of s*;
S022, according to the mapping rule, discretizing the v-P-s* coordinate system into the discrete coordinate system i-j-k*, and obtaining the three-dimensional point cloud picture C*(i, j, k*) under the i-j-k* coordinate system;
wherein, under the mapping rule, the region [v_{b-}, v_{b+}] is equally divided into n_i parts, the region [P_{b-}, P_{b+}] is equally divided into n_P parts, and the region [s*_{b-}, s*_{b+}] is equally divided into n_{k*} parts, each point being assigned the index of the part into which it falls;
after the coordinate conversion, C*(i, j, k*) is obtained with i = 1, …, n_i, j = 1, …, n_P and k* = 1, …, n_{k*}.
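Under the stated mapping rule (each axis range equally divided into a fixed number of parts), the discretization into the i-j-k* grid might look as follows. The 1-based ceiling-style binning and the axis bounds in the example are assumptions for illustration, since the patent's mapping formula survives only as an equation image.

```python
import numpy as np

def discretize(points, bounds, divisions):
    """Map continuous (v, P, s*) samples into a 1-based discrete grid i-j-k*.

    points    : (n, 3) array of (v, P, s*) samples
    bounds    : [(lo, hi)] theoretical minimum/maximum per axis (the b-/b+ box)
    divisions : (n_i, n_P, n_k) number of equal parts per axis
    """
    pts = np.asarray(points, dtype=float)
    idx = np.empty_like(pts, dtype=int)
    for a, ((lo, hi), n) in enumerate(zip(bounds, divisions)):
        frac = (pts[:, a] - lo) / (hi - lo)            # normalise into [0, 1]
        idx[:, a] = np.clip(np.ceil(frac * n), 1, n)   # 1-based bin index
    return idx

# toy example: wind speed 0-25 m/s, power 0-2000 kW, auxiliary variable 0-100
cloud = discretize([[12.5, 1000.0, 50.0], [0.0, 0.0, 0.0], [25.0, 2000.0, 100.0]],
                   bounds=[(0, 25), (0, 2000), (0, 100)],
                   divisions=(64, 64, 64))
```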
Preferably, the step S03 includes:
according to the three-dimensional point cloud picture C*(i, j, k*) under the i-j-k* coordinate system, extracting at each pixel (i, j) the centroid characteristic along the k* direction, the extracted centroid characteristics forming a three-dimensional characteristic surface under the i-j-k* coordinate system;
wherein each channel of the three-dimensional feature surface is
Z* = { z*(i, j) | i = 1, …, n_i, j = 1, …, n_P },
wherein the centroid characteristic at (i, j) along the k* direction is
z*(i, j) = (1 / n*_{i,j}) Σ_{m=1}^{n*_{i,j}} k*_m,
wherein n*_{i,j} is the number of points along the k* direction at (i, j), and k*_m, m = 1, …, n*_{i,j}, are the corresponding points.
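The centroid feature z*(i, j), i.e. the mean k* index of the points falling in grid cell (i, j), can be sketched as below; the function and variable names are illustrative, and empty cells are set to 0 by assumption.

```python
import numpy as np

def centroid_surface(grid_points, n_i, n_j):
    """Per-pixel centroid along k*: mean of the k* indices at each (i, j).

    grid_points : (n, 3) integer array of 1-based (i, j, k*) indices
    returns     : (n_i, n_j) feature surface, 0 where a cell holds no points
    """
    z = np.zeros((n_i, n_j))
    count = np.zeros((n_i, n_j))
    for i, j, k in grid_points:
        z[i - 1, j - 1] += k                  # accumulate k* values per cell
        count[i - 1, j - 1] += 1
    nonzero = count > 0
    z[nonzero] /= count[nonzero]              # mean k* = centroid along k*
    return z

surface = centroid_surface(np.array([[1, 1, 2], [1, 1, 4], [2, 3, 7]]), n_i=4, n_j=4)
```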
Preferably, the step S04 includes:
S041, converting, according to a quantization formula, the amplitude z*(i, j) of each channel of the three-dimensional feature surface at (i, j) into the gray scale of (i, j), and acquiring a single-channel 2D gray-scale foreground image;
the quantization formula is
G_F(i, j) = Round( (z*(i, j) - z*_min) / (z*_max - z*_min) · (L - 1) ),
wherein L is the number of gray levels, z*_max and z*_min are the maximum and minimum values of z*, G_F(i, j) is the gray value of (i, j), and Round(·) is the rounding function;
wherein the single-channel 2D gray-scale foreground image is G_F = { G_F(i, j) };
S042, setting the quantized single-channel 2D gray-scale background image G_B within the sampling window according to the average ambient temperature within the sampling window, the mean square error ζ_v of the wind speed and the mean square error ζ_ω of the wind direction within the sampling window, wherein the subscripts b+ and b- respectively represent the theoretical maximum and minimum values, and ζ_v and ζ_ω indicate the degree of environmental change.
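The foreground quantization, mapping the surface amplitude linearly onto L gray levels, might be implemented as below. The scaling to L - 1 and the handling of a flat surface are assumptions for illustration, since the exact quantization formula survives only as an equation image.

```python
import numpy as np

def quantize_to_gray(z, levels=256):
    """Quantize a feature surface's amplitude into gray levels 0 .. levels-1."""
    z = np.asarray(z, dtype=float)
    z_min, z_max = z.min(), z.max()
    if z_max == z_min:                        # flat surface -> all zeros
        return np.zeros_like(z, dtype=int)
    g = np.round((z - z_min) / (z_max - z_min) * (levels - 1))
    return g.astype(int)

gray = quantize_to_gray([[0.0, 5.0], [10.0, 2.5]], levels=256)
```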
Preferably, the step S05 includes:
S051, obtaining the 2D gray-scale image G of each channel from the single-channel 2D gray-scale foreground image G_F and the single-channel 2D gray-scale background image G_B;
S052, stacking the 2D gray-scale images of the three channels to obtain an image model training sample X containing the state information of the wind driven generator, namely
X = STACK(G_1, G_2, G_3),
wherein STACK(·) is the stacking process from the gray space to the corresponding color space.
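The STACK(·) operation of S052, which combines three single-channel gray images into one three-channel training sample, is in NumPy terms a channel-axis stack; the RGB channel order and uint8 output below are assumptions for illustration.

```python
import numpy as np

def stack_channels(g1, g2, g3):
    """STACK(.): combine three (H, W) gray images into one (H, W, 3) RGB sample."""
    g1, g2, g3 = (np.asarray(g) for g in (g1, g2, g3))
    assert g1.shape == g2.shape == g3.shape, "channels must share a shape"
    return np.stack([g1, g2, g3], axis=-1).astype(np.uint8)

h, w = 64, 64
sample = stack_channels(np.zeros((h, w)),
                        np.full((h, w), 128),
                        np.full((h, w), 255))
```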
In a second aspect, an embodiment of the present invention provides an intelligent recognition system for an operating state of a wind turbine generator using an image model, including:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein the memory stores program instructions executable by the processor, and the processor invokes the program instructions to perform a method for intelligent wind turbine operating state recognition using an image model as described above.
(III) advantageous effects
The invention has the beneficial effects that: in the intelligent identification method for the running state of the wind driven generator adopting the image model, the image model of the operation of the wind generating set is identified by a deep convolutional neural network trained in advance on preset image model training samples, so that the running mode of the wind driven generator is identified. The method analyzes from a brand-new angle, can effectively solve the problem of information overlapping between different running states, amplifies the influence caused by changes of the input variables, and has high robustness and generalization capability for analyzing the dynamic running process of the wind turbine (WT).
Drawings
FIG. 1 is a flow chart of an intelligent identification method for the operating state of a wind turbine generator using an image model according to the present invention;
FIG. 2 is a schematic flow chart of constructing a dynamic process image model according to a second embodiment of the present invention;
fig. 3 is a schematic diagram illustrating the DCNN transfer-learning network identification process, taking Xception as an example, in the second embodiment of the present invention.
Detailed Description
For the purpose of better explaining the present invention and to facilitate understanding, the present invention will be described in detail by way of specific embodiments with reference to the accompanying drawings.
In order to better understand the above technical solutions, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Example one
Referring to fig. 1, the present embodiment provides a method for intelligently identifying an operating state of a wind turbine generator using an image model, including:
S1, obtaining an image model of the operation of the wind generating set;
S2, identifying the image model by adopting a deep convolutional neural network, and determining the running state of the wind driven generator;
the deep convolutional neural network is trained by preset image model training samples in advance.
In this embodiment, it is preferable that, before S1, the method further includes:
and S0, obtaining an image model training sample according to the pre-obtained observation data of the wind generating set operation.
Preferably, in this embodiment, the S0 includes:
S01, acquiring auxiliary variables according to the pre-acquired observation data of the wind generating set operation;
S02, establishing a three-dimensional point cloud in a three-dimensional space according to the auxiliary variables;
S03, acquiring a three-dimensional characteristic curved surface based on the three-dimensional point cloud;
S04, representing the amplitude of the curved surface by gray scale, and converting the three-dimensional characteristic curved surface into a 2D gray-scale image;
S05, stacking the 2D gray-scale images of the three RGB channels to obtain an image model training sample containing the state information of the wind driven generator.
Preferably in this embodiment, the step S01 specifically includes:
selecting auxiliary variables from S according to a preset selection rule, based on the pre-acquired observation data of the operation of the wind generating set;
wherein
S = [v, P, s_1, …, s_r] ∈ R^{n_s × (r+2)}
is the initial SCADA data, containing the wind speed v, the power P and r further variables; each of the r+2 observed variables has n_s sampling points;
wherein the preset selection rule is: select s_q as an auxiliary variable if KLD(P‖s_q) > ε and Δ_q > γ;
wherein P_k is the power of the k-th sampling point, k = 1, …, n_s is the row index; s_q is the q-th auxiliary variable, q = 1, …, r is the column index; s_{q,k} is the q-th variable of the k-th sampling point; ε and γ are preset thresholds; KLD(P‖s_q) is the K-L divergence of the variable s_q and the power P:
KLD(P‖s_q) = Σ F(P) · log( F(P) / F(s_q) ),
wherein F(·) is the probability distribution function;
Δ_q = | ρ_q^(n) − ρ_q^(f) | is the amount of change between ρ_q^(n), which denotes ρ_q under the normal condition, and ρ_q^(f), which denotes ρ_q under the fault condition;
ρ_q is the Spearman correlation of the variable s_q and the power P:
ρ_q = 1 − 6 Σ_{k=1}^{n_s} d_{q,k}² / ( n_s (n_s² − 1) ),
wherein d_{q,k} is the rank difference of P_k and s_{q,k} when P and s_q are respectively arranged in ascending order.
Preferably in this embodiment, the auxiliary variables are the three selected variables s* = {s*_1, s*_2, s*_3}, one per image channel.
Preferably in this embodiment, the step S02 specifically includes:
S021, according to the auxiliary variable, forming the points (v, P, s*) in the v-P-s* coordinate system into a three-dimensional point cloud distribution map C*(v, P, s*), wherein
C*(v, P, s*) = { (v_k, P_k, s*_k) | k = 1, …, n_s },
and the point cloud is distributed in the bounding box
B = { (v, P, s*) | v_{b-} ≤ v ≤ v_{b+}, P_{b-} ≤ P ≤ P_{b+}, s*_{b-} ≤ s* ≤ s*_{b+} },
wherein the subscripts b+ and b- respectively represent the theoretical maximum and minimum values: v_{b+} and v_{b-} are the maximum and minimum values of the wind speed v, P_{b+} and P_{b-} are the maximum and minimum values of the power P, and s*_{b+} and s*_{b-} are the maximum and minimum values of s*.
S022, according to the mapping rule, discretizing the v-P-s* coordinate system into the discrete coordinate system i-j-k*, and obtaining the three-dimensional point cloud picture C*(i, j, k*) under the i-j-k* coordinate system;
wherein, under the mapping rule, the region [v_{b-}, v_{b+}] is equally divided into n_i parts, the region [P_{b-}, P_{b+}] is equally divided into n_P parts, and the region [s*_{b-}, s*_{b+}] is equally divided into n_{k*} parts, each point being assigned the index of the part into which it falls;
after the coordinate conversion, C*(i, j, k*) is obtained with i = 1, …, n_i, j = 1, …, n_P and k* = 1, …, n_{k*}.
preferably, in this embodiment, the step S03 includes:
according to i-j-k*Three-dimensional point cloud picture C under coordinate system*(i,j,k*) Extracting k from pixel (i, j)*The centroid characteristic in the direction is extracted from the centroid characteristic in the i-j-k*Forming a three-dimensional characteristic surface under a coordinate system;
wherein each channel of the three-dimensional feature surface
Figure BDA0002821377380000107
Comprises the following steps:
Figure BDA0002821377380000108
wherein at (i, j) along k*The centroid of the direction is characterized in that:
Figure BDA0002821377380000109
wherein the content of the first and second substances,
Figure BDA00028213773800001010
is along k at (i, j)*The number of points in the direction of the direction,
Figure BDA00028213773800001011
in order to correspond to the point(s),
Figure BDA00028213773800001012
Preferably, in this embodiment, the step S04 includes:
S041, converting, according to a quantization formula, the amplitude z*(i, j) of each channel of the three-dimensional feature surface at (i, j) into the gray scale of (i, j), and acquiring a single-channel 2D gray-scale foreground image;
the quantization formula is
G_F(i, j) = Round( (z*(i, j) - z*_min) / (z*_max - z*_min) · (L - 1) ),
wherein L is the number of gray levels, z*_max and z*_min are the maximum and minimum values of z*, G_F(i, j) is the gray value of (i, j), and Round(·) is the rounding function;
wherein the single-channel 2D gray-scale foreground image is G_F = { G_F(i, j) };
S042, setting the quantized single-channel 2D gray-scale background image G_B within the sampling window according to the average ambient temperature within the sampling window, the mean square error ζ_v of the wind speed and the mean square error ζ_ω of the wind direction within the sampling window, wherein the subscripts b+ and b- respectively represent the theoretical maximum and minimum values, and ζ_v and ζ_ω indicate the degree of environmental change.
Preferably, in this embodiment, the step S05 includes:
S051, obtaining the 2D gray-scale image G of each channel from the single-channel 2D gray-scale foreground image G_F and the single-channel 2D gray-scale background image G_B;
S052, stacking the 2D gray-scale images of the three channels to obtain an image model training sample X containing the state information of the wind driven generator, namely
X = STACK(G_1, G_2, G_3),
wherein STACK(·) is the stacking process from the gray space to the corresponding color space.
In the intelligent identification method of this embodiment, the image model of the operation of the wind generating set is identified by a deep convolutional neural network trained in advance on preset image model training samples, so that the running mode of the wind driven generator is identified. The method analyzes from a brand-new angle, can effectively solve the problem of information overlapping between different running states, amplifies the influence caused by changes of the input variables, and has high robustness and generalization capability for analyzing the dynamic running process of the wind turbine (WT).
Example two
Referring to fig. 1, the present embodiment provides a method for intelligently identifying an operating state of a wind turbine generator using an image model, including:
S1, obtaining an image model of the operation of the wind generating set;
S2, identifying the image model by adopting a deep convolutional neural network, and determining the running state of the wind driven generator;
the deep convolutional neural network is trained by preset image model training samples in advance.
In this embodiment, it is preferable that, before S1, the method further includes:
and S0, obtaining an image model training sample according to the pre-obtained observation data of the wind generating set operation.
Preferably, in this embodiment, the S0 includes:
s01, acquiring auxiliary variables according to the pre-acquired observation data of the wind generating set operation;
In the practical application of this embodiment, the three-dimensional point cloud model is constructed in the three-dimensional space formed by the main variables, wind speed v and power P, and the auxiliary variable s*. The auxiliary variable s* needs to meet two requirements: low redundancy with the variable P, and positive help to operation pattern recognition. The auxiliary variable s* is selected by combining the K-L divergence (KLD) and the Spearman rank correlation change (SRCC). The KLD judges the redundancy between a variable and the power P: the larger the KLD value, the smaller the redundancy, and the better the selected auxiliary variable meets the requirement. The Spearman correlation assesses the sensitivity of a variable to different operating conditions: the larger the change of the SRCC of a variable with the power P between different states, the more information the variable contains.
Definition of
Figure BDA0002821377380000121
For the initial SCADA data, the wind speed v and power P are included, and the other r variables, the r +2 observed variables each have nsAnd (4) sampling points.
Figure BDA0002821377380000131
Wherein
Figure BDA0002821377380000132
The auxiliary variable is selected from S. PkIs the power of the kth sample point, k is 1, …, nsIs the row index. sqIs the qth auxiliary variable, q is 1, …, and r is the column index. sq,kIs the qth variable of the kth sample point. Variable sqAnd the value of the K-L divergence of the power P is:
Figure BDA0002821377380000133
where F (-) is the probability distribution function. Variable sqThe spearman correlation with power P is:
Figure BDA0002821377380000134
d isq,kIs P and sqRespectively arranged in ascending order PkAnd sq,kThe rank difference of (2).
ρ_q is calculated from the SCADA data set. Let ρ_q^N denote ρ_q under normal conditions and ρ_q^F denote ρ_q under a fault condition. The amount of change between ρ_q^N and ρ_q^F is:

Δ_q = | ρ_q^N - ρ_q^F |.
The given thresholds are ε and γ. The selection rule for the auxiliary variable, combining KLD and SRCC, is:

select s_q as an auxiliary variable if KLD(P||s_q) > ε and Δ_q > γ.
In this embodiment, since 3 image channels are used, 3 auxiliary variables are selected for dimension raising according to the rule above. (The three specific variables are given by a formula image that could not be recovered.)
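The selection procedure above can be sketched in Python. This is a minimal illustration under stated assumptions: the K-L divergence is estimated from histograms of the two variables, the Spearman coefficient uses the rank-difference formula (exact only without ties), and all function names and arguments are illustrative rather than taken from the patent.

```python
import numpy as np

def spearman_rho(p, s):
    """Spearman rank correlation via the rank-difference formula (exact without ties)."""
    n = len(p)
    rank_p = np.argsort(np.argsort(p))
    rank_s = np.argsort(np.argsort(s))
    d = (rank_p - rank_s).astype(float)
    return 1.0 - 6.0 * np.sum(d ** 2) / (n * (n ** 2 - 1))

def kld(p, s, bins=32):
    """Histogram estimate of the K-L divergence between the distributions of P and s_q."""
    fp, _ = np.histogram(p, bins=bins)
    fs, _ = np.histogram(s, bins=bins)
    fp = fp / fp.sum() + 1e-12   # small epsilon avoids log(0)
    fs = fs / fs.sum() + 1e-12
    return float(np.sum(fp * np.log(fp / fs)))

def select_auxiliary(P_n, S_n, P_f, S_f, eps, gamma):
    """Indices q with KLD(P||s_q) > eps and |rho_q^N - rho_q^F| > gamma."""
    chosen = []
    for q in range(S_n.shape[1]):
        redundancy_ok = kld(P_n, S_n[:, q]) > eps
        delta = abs(spearman_rho(P_n, S_n[:, q]) - spearman_rho(P_f, S_f[:, q]))
        if redundancy_ok and delta > gamma:
            chosen.append(q)
    return chosen
```

The two scores are computed per candidate column, and only candidates passing both thresholds are kept, matching the combined KLD/SRCC rule.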
S02, establishing a three-dimensional point cloud in a three-dimensional space according to the auxiliary variable;
In the practical application of this embodiment, after the auxiliary variable s* is determined, the points (v, P, s*) in the v-P-s* coordinate system form a 3D point cloud distribution map, denoted C*(v, P, s*), that is,

C*(v, P, s*) = { (v_k, P_k, s*_k) : k = 1, …, n_s }.

Because v, P, and s* are bounded, the three-dimensional point cloud C*(v, P, s*) is distributed in a bounding box B delimited by the theoretical maxima and minima, i.e.,

B = { (v, P, s*) : v_{b-} ≤ v ≤ v_{b+}, P_{b-} ≤ P ≤ P_{b+}, s*_{b-} ≤ s* ≤ s*_{b+} },

where the subscripts b+ and b- denote the theoretical maximum and minimum values, respectively.
Since the pixel coordinate system is a discrete coordinate system, the variable coordinate system is discretized and mapped into the discrete coordinate system i-j-k*. The region [v_{b-}, v_{b+}] is equally divided into n_i parts, the region [P_{b-}, P_{b+}] into n_P parts, and the region [s*_{b-}, s*_{b+}] into n_{k*} parts. The mapping rule is:

i = ⌈ n_i · (v - v_{b-}) / (v_{b+} - v_{b-}) ⌉,
j = ⌈ n_P · (P - P_{b-}) / (P_{b+} - P_{b-}) ⌉,
k* = ⌈ n_{k*} · (s* - s*_{b-}) / (s*_{b+} - s*_{b-}) ⌉.

As shown in fig. 2, after the coordinate conversion the point cloud becomes C*(i, j, k*).
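The equal-division mapping from the v-P-s* bounding box to the discrete i-j-k* grid can be sketched as follows. The ceiling-based cell index is one plausible reading of the mapping rule (floor- or round-based variants are equally possible), and all names are illustrative.

```python
import numpy as np

def discretize(points, bounds, bins):
    """Map (v, P, s*) points into 1-based integer grid cells i-j-k*.

    points: (n, 3) array of (v, P, s*) samples
    bounds: [(v_min, v_max), (P_min, P_max), (s_min, s_max)] theoretical limits
    bins:   (n_i, n_P, n_k) number of equal divisions per axis
    """
    pts = np.asarray(points, dtype=float)
    cells = np.empty_like(pts, dtype=int)
    for axis, ((lo, hi), n) in enumerate(zip(bounds, bins)):
        frac = (pts[:, axis] - lo) / (hi - lo)              # normalize to [0, 1]
        cells[:, axis] = np.clip(np.ceil(frac * n), 1, n)   # 1-based cell index
    return cells
```

Points sitting exactly on the lower bound are clipped into the first cell so every in-range sample gets a valid pixel.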
S03, acquiring a three-dimensional characteristic curved surface based on the three-dimensional point cloud;
In the practical application of this embodiment, the three-dimensional point cloud under the i-j-k* coordinate system is not convenient as direct input to a deep convolutional neural network, so the centroid feature along the k* direction is extracted at each pixel (i, j), and these features form a three-dimensional characteristic surface under the i-j-k* coordinate system. Suppose there are n_{i,j} points along the k* direction at (i, j), the corresponding points being

k*_{i,j,1}, k*_{i,j,2}, …, k*_{i,j,n_{i,j}}.

The centroid feature along the k* direction at (i, j) is:

κ*(i, j) = ( 1 / n_{i,j} ) · Σ_{m=1}^{n_{i,j}} k*_{i,j,m}.

All κ*(i, j) form a three-dimensional characteristic surface under the i-j-k* coordinate system. In each channel, the characteristic surface is:

Ψ* = { κ*(i, j) : i = 1, …, n_i, j = 1, …, n_P }.
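Reading the centroid feature as the mean k* index of the points falling on each pixel (i, j), the characteristic surface can be computed as in this sketch; the interpretation and the function name are assumptions.

```python
import numpy as np

def centroid_surface(cells, n_i, n_p):
    """Per-pixel centroid of k* values: the 3D point cloud collapses to a surface.

    cells: (n, 3) integer array of 1-based (i, j, k*) cells from the discretization step
    returns an (n_i, n_p) array; pixels with no points stay 0
    """
    total = np.zeros((n_i, n_p))
    count = np.zeros((n_i, n_p))
    for i, j, k in cells:
        total[i - 1, j - 1] += k
        count[i - 1, j - 1] += 1
    surface = np.zeros((n_i, n_p))
    mask = count > 0
    surface[mask] = total[mask] / count[mask]   # mean k* per pixel = centroid
    return surface
```

Empty pixels are left at 0, which later quantizes to the lowest gray level.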
S04, representing the amplitude of the curved surface by gray scale, and converting the three-dimensional characteristic curved surface into a 2D grayscale image;
In the practical application of this embodiment, since the DCNN cannot directly take a three-dimensional characteristic surface as input, a two-dimensional RGB image G_2d is used to represent the three-dimensional characteristic surface of each channel. The amplitude κ*(i, j) of the surface at (i, j) is converted into the gray value at (i, j). This completes the image modeling: the three-dimensional characteristic surface is represented by a two-dimensional RGB image without loss of information.
Assuming that the gray scale has L levels, to avoid data overlap the amplitude of the 3D characteristic surface along the k* direction must be quantized to match the gray scale.
Let κ*_{b+} and κ*_{b-} be the ideal maximum and minimum values of κ*(i, j). The quantization formula is:

g*(i, j) = Round( (L - 1) · ( κ*(i, j) - κ*_{b-} ) / ( κ*_{b+} - κ*_{b-} ) ),

where g*(i, j) is the gray value at (i, j) and Round(·) is the rounding function. The single-channel 2D grayscale foreground image G*_fore is then:

G*_fore = { g*(i, j) : i = 1, …, n_i, j = 1, …, n_P }.
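A minimal sketch of the quantization step, mapping surface amplitudes onto L gray levels with a rounding function; the clipping of out-of-range amplitudes is an added safeguard not stated in the source, and the function name is illustrative.

```python
import numpy as np

def quantize_surface(surface, k_min, k_max, levels=256):
    """Quantize surface amplitudes into gray levels 0 .. levels-1."""
    frac = (np.asarray(surface, dtype=float) - k_min) / (k_max - k_min)
    gray = np.round(frac * (levels - 1))          # Round(.) in the quantization formula
    return np.clip(gray, 0, levels - 1).astype(np.uint8)
```

Note that numpy's `np.round` rounds halves to the nearest even integer, which is a slightly different convention than round-half-up.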
Setting of the single-channel 2D grayscale image background:

The operating state of the wind generating set is also related to the external environment, such as the ambient temperature and the wind direction. When the wind speed received by the wind generating set exceeds its limit, the turbine sheds the wind, causing the output power P to be 0; this state looks the same as the shutdown state of the turbine. To avoid such misjudgments, environment variables are introduced as the background of the 2D grayscale image so that it contains more information. The external environment variables comprise the average ambient temperature within the sampling window, the mean-square error ζ_v of the wind speed, and the mean-square error ζ_ω of the wind direction; together they indicate the degree of environmental change within the sampling window. The quantized single-channel 2D grayscale background image G*_back is obtained by quantizing these environment variables with the same rule as the foreground, each variable being normalized between its theoretical minimum and maximum (denoted by the subscripts b- and b+, respectively).
And S05, stacking the RGB three-channel 2D gray level images, and obtaining an image model training sample containing the state information of the wind driven generator.
In the practical application of this embodiment, the two-dimensional RGB image G_2d modeling the dynamic process is stacked from three channels of 2D grayscale images. The 2D grayscale image G* of each channel is composed of the foreground and the background, with the foreground pixels overlaid on the background gray level. The grayscale images of the three channels are then stacked, i.e.:

G_2d = STACK( G^1, G^2, G^3 ),

where STACK(·) is the stacking process from gray space to the corresponding color space.
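A sketch of composing one channel from foreground and background and stacking three channels into an RGB array. The rule that foreground pixels simply overwrite the background gray level is an assumption; the patent's exact composition formula is in an unrecovered image.

```python
import numpy as np

def compose_channel(foreground, background_gray):
    """Overlay a quantized foreground on a constant background gray level."""
    fg = np.asarray(foreground, dtype=np.uint8)
    channel = np.full(fg.shape, background_gray, dtype=np.uint8)
    mask = fg > 0                  # pixels carrying point-cloud information
    channel[mask] = fg[mask]
    return channel

def stack_rgb(ch_r, ch_g, ch_b):
    """STACK(.): combine three single-channel grayscale images into one RGB image."""
    return np.stack([ch_r, ch_g, ch_b], axis=-1)
```

The resulting array has shape (n_i, n_P, 3) and can be fed to any standard image-classification network.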
In the practical application of this embodiment, the present invention uses a DCNN with transfer learning. The network is trained on offline data to form a pre-trained model, and this pre-trained model is adopted directly to identify the operation mode of the wind generating set in the online process. As shown in fig. 3, the transfer deep convolutional learning network based on Xception is taken as an example. f_Xception is the original Xception, f^(conv) is the convolutional part of f_Xception, and f^(optional) is the fully connected part. The original network is

y° = f_Xception(x°) = f^(optional)( f^(conv)(x°) ) (19);

where x° is the input image with 299 × 299 × 3 pixels and y° is the corresponding output.
To meet the requirements of the present invention, f_Xception needs fine tuning. First, the parameters of the convolutional layers f^(conv) are frozen. Then the output size is adjusted to n_v × n_P. Finally, 3 fully connected layers z^(1), z^(2), and z^(out) are connected after the convolutional layers f^(conv). Thus, when training the network, f^(conv) serves as pre-trained feature extraction, and only the parameters of z^(1), z^(2), and z^(out) are trained. The fine-tuned f_Xception network is denoted by z:

y = z(x) = z^(out)( z^(2)( z^(1)( f^(conv)(G_2d) ) ) ) (20);
To speed up the convergence of the model, the parameters are optimized using stochastic gradient descent with momentum (mSGD). To prevent overfitting, Dropout is added after each fully connected layer: in each training batch, half of the hidden-layer nodes are randomly discarded, which reduces the coupling between hidden-layer nodes and thereby mitigates overfitting.
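The optimizer and regularization described above can be illustrated in a few lines of numpy. The update rule v ← μ·v - η·∇ and inverted dropout are the standard formulations; the function names are illustrative, not from the patent.

```python
import numpy as np

def msgd_step(w, grad, velocity, lr=0.01, momentum=0.9):
    """One SGD-with-momentum update: v <- mu*v - lr*grad, then w <- w + v."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

def dropout(activations, rate=0.5, rng=None):
    """Inverted dropout: zero a fraction `rate` of nodes, rescale the survivors."""
    rng = rng or np.random.default_rng()
    mask = rng.random(np.shape(activations)) >= rate
    return activations * mask / (1.0 - rate)
```

With rate = 0.5, half of the fully connected layer's activations are dropped in expectation on each training batch, as the description specifies.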
In the offline process, the image models and corresponding labels generated from the SCADA data set are stored in the image set Ω. The image set Ω is divided into a training set Ω_train and a test set Ω_test; accordingly, the label set Y is divided into Y_train and Y_test. In the online process, the parameters trained offline are adopted directly to identify the dynamic-process image model containing the running information of the wind driven generator, so that the running state of the wind driven generator is identified and its faults are predicted.
Compared with existing methods, the method provided by the invention is based on the geometric distribution characteristics of the point cloud formed by the sampling points, rather than on an analysis of the numerical characteristics of the sampling points, and provides a new angle for the running-state identification and fault prediction of the wind driven generator. Verification with measured data shows that the proposed method achieves the expected effect.

In the intelligent identification method for the running state of the wind driven generator using an image model in this embodiment, the operation image model of the wind generating set is identified by a deep convolutional neural network trained in advance with preset image model training samples, so as to identify the operation mode of the wind driven generator. The identification method provided by this embodiment analyzes the problem from a brand-new angle, can effectively resolve the overlap of information between different operating states, amplifies the influence caused by changes of the input variables, and has high robustness and generalization capability for analyzing the WT dynamic operation process.
Since the system described in the above embodiment of the present invention is a system used for implementing the method of the above embodiment of the present invention, a person skilled in the art can understand the specific structure and the modification of the system based on the method described in the above embodiment of the present invention, and thus the detailed description is omitted here. All systems adopted by the method of the above embodiments of the present invention are within the intended scope of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the terms first, second, third and the like are for convenience only and do not denote any order. These words are to be understood as part of the name of the component.
Furthermore, it should be noted that in the description of the present specification, the description of the term "one embodiment", "some embodiments", "examples", "specific examples" or "some examples", etc., means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the claims should be construed to include preferred embodiments and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention should also include such modifications and variations.

Claims (9)

1. An intelligent identification method for the running state of a wind driven generator by adopting an image model is characterized by comprising the following steps:
S1, obtaining an image model of the operation of the wind generating set;
S2, identifying the image model by adopting a deep convolutional neural network, and determining the running state of the wind driven generator;
the deep convolutional neural network is trained by preset image model training samples in advance.
2. The method of claim 1, further comprising, prior to S1:
and S0, obtaining an image model training sample according to the pre-obtained observation data of the wind generating set operation.
3. The method according to claim 2, wherein the S0 includes:
S01, acquiring auxiliary variables according to the pre-acquired observation data of the wind generating set operation;
S02, establishing a three-dimensional point cloud in a three-dimensional space according to the auxiliary variable;
S03, acquiring a three-dimensional characteristic curved surface based on the three-dimensional point cloud;
S04, representing the amplitude of the curved surface by gray scale, and converting the three-dimensional characteristic curved surface into a 2D grayscale image;
and S05, stacking the RGB three-channel 2D grayscale images, and obtaining an image model training sample containing the state information of the wind driven generator.
4. The method according to claim 3, wherein the step S01 specifically includes:
selecting an auxiliary variable from S according to a preset selection rule, based on the pre-acquired observation data of the operation of the wind generating set;

wherein

S = [s_1, s_2, …, s_r], with s_q = (s_{q,1}, s_{q,2}, …, s_{q,n_s})^T,

and

X = [v, P, s_1, s_2, …, s_r]

is the initial SCADA data, comprising the wind speed v, the power P, and r further variables, each of the r + 2 observed variables having n_s sampling points;

wherein the preset selection rule is:

select s_q as an auxiliary variable if KLD(P||s_q) > ε and Δ_q > γ;

wherein P_k is the power at the k-th sampling point, k = 1, …, n_s is the row index, s_q is the q-th auxiliary variable, q = 1, …, r is the column index, s_{q,k} is the q-th variable at the k-th sampling point, and the preset thresholds are ε and γ; KLD(P||s_q) is the K-L divergence of the variable s_q and the power P:

KLD(P||s_q) = Σ_{k=1}^{n_s} F(P_k) · ln( F(P_k) / F(s_{q,k}) ),

wherein F(·) is the probability distribution function;

Δ_q is the amount of change between ρ_q^N (ρ_q under normal conditions) and ρ_q^F (ρ_q under a fault condition):

Δ_q = | ρ_q^N - ρ_q^F |;

ρ_q is the Spearman correlation of the variable s_q and the power P:

ρ_q = 1 - ( 6 · Σ_{k=1}^{n_s} d_{q,k}^2 ) / ( n_s · (n_s^2 - 1) ),

wherein d_{q,k} is the rank difference between P_k and s_{q,k} after P and s_q are each arranged in ascending order.
5. The method of claim 4, wherein the auxiliary variables are the three variables selected by the preset selection rule of claim 4. (The specific variables are given by a formula image that could not be recovered.)
6. The method according to claim 5, wherein the step S02 specifically comprises:

S021, according to the auxiliary variable, forming a three-dimensional point cloud distribution map from the points (v, P, s*) in the v-P-s* coordinate system, denoted C*(v, P, s*), wherein

C*(v, P, s*) = { (v_k, P_k, s*_k) : k = 1, …, n_s },

and the point cloud is distributed in the bounding box

B = { (v, P, s*) : v_{b-} ≤ v ≤ v_{b+}, P_{b-} ≤ P ≤ P_{b+}, s*_{b-} ≤ s* ≤ s*_{b+} },

wherein the subscripts b+ and b- denote the theoretical maximum and minimum values, respectively; v_{b+} and v_{b-} are the maximum and minimum of the wind speed v, P_{b+} and P_{b-} are the maximum and minimum of the power P, and s*_{b+} and s*_{b-} are the maximum and minimum of s*;

S022, according to the mapping rule, discretizing the v-P-s* coordinate system into the discrete coordinate system i-j-k*, obtaining the three-dimensional point cloud map C*(i, j, k*) under the i-j-k* coordinate system;

wherein the mapping rule is:

i = ⌈ n_i · (v - v_{b-}) / (v_{b+} - v_{b-}) ⌉,
j = ⌈ n_P · (P - P_{b-}) / (P_{b+} - P_{b-}) ⌉,
k* = ⌈ n_{k*} · (s* - s*_{b-}) / (s*_{b+} - s*_{b-}) ⌉,

the region [v_{b-}, v_{b+}] being equally divided into n_i parts, the region [P_{b-}, P_{b+}] into n_P parts, and the region [s*_{b-}, s*_{b+}] into n_{k*} parts;

after the coordinate conversion, the point cloud becomes C*(i, j, k*).
7. The method according to claim 6, wherein the step S03 comprises:

according to the three-dimensional point cloud map C*(i, j, k*) under the i-j-k* coordinate system, extracting the centroid feature along the k* direction at each pixel (i, j), the extracted features forming a three-dimensional characteristic surface under the i-j-k* coordinate system;

wherein each channel of the three-dimensional characteristic surface is:

Ψ* = { κ*(i, j) : i = 1, …, n_i, j = 1, …, n_P },

wherein the centroid feature along the k* direction at (i, j) is:

κ*(i, j) = ( 1 / n_{i,j} ) · Σ_{m=1}^{n_{i,j}} k*_{i,j,m},

wherein n_{i,j} is the number of points along the k* direction at (i, j), the corresponding points being k*_{i,j,1}, …, k*_{i,j,n_{i,j}}.
8. The method according to claim 7, wherein the step S04 comprises:

S041, converting the amplitude κ*(i, j) of each channel of the three-dimensional characteristic surface at (i, j) into the gray value at (i, j) according to a quantization formula, and acquiring a single-channel 2D grayscale foreground image;

the quantization formula being:

g*(i, j) = Round( (L - 1) · ( κ*(i, j) - κ*_{b-} ) / ( κ*_{b+} - κ*_{b-} ) ),

wherein L is the number of gray levels, κ*_{b+} and κ*_{b-} are the maximum and minimum values of κ*(i, j), g*(i, j) is the gray value at (i, j), and Round(·) is the rounding function;

wherein the single-channel 2D grayscale foreground image G*_fore is:

G*_fore = { g*(i, j) : i = 1, …, n_i, j = 1, …, n_P };

S042, setting the quantized single-channel 2D grayscale background image G*_back within the sampling window, wherein the subscripts b+ and b- denote the theoretical maximum and minimum values, respectively, and the average ambient temperature, the mean-square error ζ_v of the wind speed, and the mean-square error ζ_ω of the wind direction within the sampling window indicate the degree of environmental change.
9. The method according to claim 8, wherein the step S05 comprises:

S051, based on the single-channel 2D grayscale foreground image G*_fore and the single-channel 2D grayscale background image G*_back, obtaining the 2D grayscale image G* of each channel, the foreground being overlaid on the background;

S052, stacking the 2D grayscale images of the three channels to obtain an image model training sample containing the state information of the wind driven generator, namely:

G_2d = STACK( G^1, G^2, G^3 ),

wherein STACK(·) is the stacking process from gray space to the corresponding color space.
CN202011438196.4A 2020-12-07 2020-12-07 Intelligent identification method for running state of wind driven generator by adopting image model Pending CN112561288A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011438196.4A CN112561288A (en) 2020-12-07 2020-12-07 Intelligent identification method for running state of wind driven generator by adopting image model


Publications (1)

Publication Number Publication Date
CN112561288A true CN112561288A (en) 2021-03-26

Family

ID=75060463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011438196.4A Pending CN112561288A (en) 2020-12-07 2020-12-07 Intelligent identification method for running state of wind driven generator by adopting image model

Country Status (1)

Country Link
CN (1) CN112561288A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109214356A (en) * 2018-09-29 2019-01-15 南京东振测控技术有限公司 A kind of fan transmission system intelligent fault diagnosis method based on DCNN model
WO2019048597A1 (en) * 2017-09-08 2019-03-14 Sulzer & Schmid Laboratories Ag Method for analysis of sensor data related to a wind turbine
CN109614981A (en) * 2018-10-17 2019-04-12 东北大学 The Power System Intelligent fault detection method and system of convolutional neural networks based on Spearman rank correlation
CN109657839A (en) * 2018-11-22 2019-04-19 天津大学 A kind of wind power forecasting method based on depth convolutional neural networks
CN110111328A (en) * 2019-05-16 2019-08-09 上海中认尚科新能源技术有限公司 A kind of blade crack of wind driven generator detection method based on convolutional neural networks
CN111458144A (en) * 2020-03-04 2020-07-28 华北电力大学 Wind driven generator fault diagnosis method based on convolutional neural network
US20210174543A1 (en) * 2018-07-03 2021-06-10 Promaton Holding B.V. Automated determination of a canonical pose of a 3d objects and superimposition of 3d objects using deep learning
US20210180571A1 (en) * 2017-06-26 2021-06-17 Beijing Goldwind Science & Creation Windpower Equipment Co., Ltd. Method and apparatus for monitoring formation of ice on wind turbine blade


Similar Documents

Publication Publication Date Title
CN113935406B (en) Mechanical equipment unsupervised fault diagnosis method based on countermeasure flow model
US20210174149A1 (en) Feature fusion and dense connection-based method for infrared plane object detection
EP3690714A1 (en) Method for acquiring sample images for inspecting label among auto-labeled images to be used for learning of neural network and sample image acquiring device using the same
US11741356B2 (en) Data processing apparatus by learning of neural network, data processing method by learning of neural network, and recording medium recording the data processing method
CN106446896A (en) Character segmentation method and device and electronic equipment
CN110598693A (en) Ship plate identification method based on fast-RCNN
Sony et al. Multiclass damage identification in a full-scale bridge using optimally tuned one-dimensional convolutional neural network
US20200125930A1 (en) Artificial neural network and method of training an artificial neural network with epigenetic neurogenesis
CN114332578A (en) Image anomaly detection model training method, image anomaly detection method and device
DK201770681A1 (en) A method for (re-)training a machine learning component
CN113221852B (en) Target identification method and device
WO2020240808A1 (en) Learning device, classification device, learning method, classification method, learning program, and classification program
CN114842343A (en) ViT-based aerial image identification method
CN116186633A (en) Power consumption abnormality diagnosis method and system based on small sample learning
CN114972904B (en) Zero sample knowledge distillation method and system based on fighting against triplet loss
CN112800934A (en) Behavior identification method and device for multi-class engineering vehicle
CN112084860A (en) Target object detection method and device and thermal power plant detection method and device
CN112819024A (en) Model processing method, user data processing method and device and computer equipment
CN109101984B (en) Image identification method and device based on convolutional neural network
CN113052103A (en) Electrical equipment defect detection method and device based on neural network
CN116821730A (en) Fan fault detection method, control device and storage medium
CN112561288A (en) Intelligent identification method for running state of wind driven generator by adopting image model
CN116206275A (en) Knowledge distillation-based recognition model training method and device
CN110750876A (en) Bearing data model training and using method
CN114998194A (en) Product defect detection method, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination