CN103428282A - On-line energy-saving control method and device for cloud computing data center - Google Patents

On-line energy-saving control method and device for cloud computing data center

Info

Publication number
CN103428282A
Authority
CN
China
Prior art keywords
energy consumption
energy saving
neural network
network model
saving strategy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013103395855A
Other languages
Chinese (zh)
Other versions
CN103428282B (en)
Inventor
亓开元
刘俊朋
刘正伟
张东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inspur Beijing Electronic Information Industry Co Ltd
Original Assignee
Inspur Beijing Electronic Information Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inspur Beijing Electronic Information Industry Co Ltd filed Critical Inspur Beijing Electronic Information Industry Co Ltd
Priority to CN201310339585.5A priority Critical patent/CN103428282B/en
Publication of CN103428282A publication Critical patent/CN103428282A/en
Application granted granted Critical
Publication of CN103428282B publication Critical patent/CN103428282B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses an on-line energy-saving control method and device for a cloud computing data center. The method comprises: monitoring and collecting load energy consumption data of each node in the cloud computing data center; filtering the load energy consumption data according to preset performance constraint conditions, and using the filtered data as input samples of an adaptive neural network model; generating an energy-saving strategy from the input samples with the adaptive neural network model, and performing energy-saving control operations on each node according to the energy-saving strategy. Based on the adaptive neural network model, the method and device match the load energy consumption data of the nodes against energy-saving strategies and select a suitable energy-saving strategy for the cloud computing data center; by filtering the input samples, namely the load energy consumption data, input data that require no energy-saving planning are excluded, the scale of the model input sample space is reduced, and the on-line energy-saving control efficiency of the data center is improved.

Description

Online energy-saving control method and device for a cloud computing data center
Technical field
The present invention relates to the field of communications, and in particular to an online energy-saving control method and device for a cloud computing data center.
Background technology
Cloud computing is an important form of information infrastructure and of the application service model in the Internet era, and reflects the trend toward intensive development of new-generation information technology. The data center is the foundation of cloud computing. At present, cloud data centers in China are generally built from servers, storage and network equipment, supplemented by virtualization software, management software, operating systems and application software. On average, the resource scale that can be managed and scheduled with China's existing technology is more than 10 times smaller than Google's, and the large-scale data processing capability differs by more than 100 times. A major reason for this gap is that Google has carried out extensive research on data center energy-saving technology: the PUE (power usage effectiveness) of its data centers has dropped to 1.1, while domestic data centers only reach 1.4-1.5, and their overall energy consumption is more than 40% higher than Google's. There is therefore an urgent need for research on data center energy saving to resolve this bottleneck of cloud computing data centers.
Conventional data center energy-saving management takes a predefined approach: control strategies are determined at deployment time and run entirely according to fixed modes and processes, and adjusting the predefined strategy requires reconfiguration by an administrator. Because of the complexity of the data center architecture and the diversity of the applications it runs, the temperature, energy consumption and load of a data center vary in complicated ways. A fixed-mode management strategy can hardly meet the requirement of sustained availability under such a complex environment, and relying on manual reconfiguration becomes inefficient and error-prone as data center complexity grows and the running state changes frequently. In addition, to meet peak load demand, current data centers usually reserve a large proportion of redundant resources, while the actual load mostly stays at a relatively low level. In this case, a large number of hardware devices provide no effective performance output, yet their energy consumption is not reduced.
By monitoring and collecting the load information of key system resources and the running state data of each key resource management subsystem of the operating system, and by using a neural network to effectively analyze and identify the operation and features of this nonlinear, complex system, a simulation model of performance, load and energy consumption can be established. Such a model can automatically produce an applicable energy-saving operation strategy, guide the energy-saving master control program to adjust and set the operation modes of the various devices in the system, and distribute the load among multiple devices in a centralized manner, finally reducing the energy consumption of the data center effectively while guaranteeing that the system stably delivers the performance required by the applications.
Following this line of thought, existing methods use the BP (Back Propagation) algorithm and the Hopfield algorithm to establish data center energy-saving control strategies. However, BP and Hopfield are off-line training methods: the model has to be retrained whenever a new sample is added, so they cannot meet the demand of on-line control. An adaptive neural network (Adaptive Resonance Theory, ART) can train on new sample data individually and has a strong on-line adaptive capability, but using an adaptive neural network model for efficient data center energy-saving control still faces the following problems:
In the initial go-live period of a data center, empirical historical data for energy-saving control is often lacking and only a small number of self-defined strategies are available; the data samples needed to train the neural network model are therefore also lacking, so the data center cannot perform adaptive energy-saving control.
The characteristic parameters describing the data center resources, load and running state are of many categories and large in number, which makes the sample space huge; even an adaptive neural network model that converges well under complex nonlinear relations still needs a long training time.
Summary of the invention
The technical problem to be solved by the present invention is to provide an online energy-saving control method and device for a cloud computing data center which, based on an adaptive neural network model, select a suitable energy-saving strategy for the cloud computing data center, reduce the scale of the model input sample space, and improve the online energy-saving control efficiency of the data center.
In order to solve the above technical problem, the invention provides an online energy-saving control method for a cloud computing data center, comprising:
monitoring and collecting the load energy consumption data of each node in the cloud computing data center;
filtering the load energy consumption data according to preset performance constraint conditions, and using the filtered data as input samples of an adaptive neural network model;
generating an energy-saving strategy from the input samples with the adaptive neural network model, and performing an energy-saving control operation on each node according to the energy-saving strategy.
Further, filtering the load energy consumption data according to the preset performance constraint conditions comprises:
filtering out, from the load energy consumption data, the data whose energy consumption is lower than a preset energy consumption threshold and whose actual performance is higher than a preset performance index, and/or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index.
Further, before generating the energy-saving strategy from the input samples with the adaptive neural network model, the method also comprises:
judging whether the adaptive neural network model has converged; if it has not converged, searching for a near-optimal energy-saving strategy with a planning-type algorithm model according to the load energy consumption data of each node, and training the adaptive neural network model with the near-optimal energy-saving strategy as a strategy sample so as to establish the adaptive neural network model; if it has converged, generating the energy-saving strategy with the current adaptive neural network model.
Further, searching for the near-optimal energy-saving strategy with the planning-type algorithm model according to the load energy consumption data of each node comprises:
filtering out, from the load energy consumption data of each node, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, and using the filtered data as input samples of the planning-type algorithm model to search for the near-optimal energy-saving strategy.
Further, training the adaptive neural network model with the near-optimal energy-saving strategy as a sample also comprises:
applying coarse-grained pattern classification to the near-optimal energy-saving strategy, and then training the adaptive neural network model with the result as the strategy sample.
Further, the method also comprises: applying a heuristic greedy algorithm to the energy-saving strategy to generate a final energy-saving strategy, which specifically comprises:
sorting the nodes in each node group in ascending order of resource utilization;
for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
In order to solve the above technical problem, the present invention also provides an online energy-saving control device for a cloud computing data center, comprising:
a data acquisition module, configured to monitor and collect the load energy consumption data of each node in the cloud computing data center;
a filtering module, configured to filter the load energy consumption data according to preset performance constraint conditions, the filtered data serving as input samples of an adaptive neural network model;
a strategy generation module, configured to generate an energy-saving strategy from the input samples with the adaptive neural network model;
an execution module, configured to perform an energy-saving control operation on each node according to the energy-saving strategy.
Further, the filtering module being configured to filter the load energy consumption data according to the preset performance constraint conditions comprises:
filtering out, from the load energy consumption data, the data whose energy consumption is lower than a preset energy consumption threshold and whose actual performance is higher than a preset performance index, and/or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index.
Further, the device also comprises an adaptive neural network model building module connected to the strategy generation module and configured, before the strategy generation module generates the energy-saving strategy from the input samples with the adaptive neural network model, to judge whether the adaptive neural network model has converged; if it has not converged, to search for a near-optimal energy-saving strategy with a planning-type algorithm model according to the load energy consumption data of each node and to train the adaptive neural network model with the near-optimal energy-saving strategy as a strategy sample so as to establish the adaptive neural network model; if it has converged, to notify the strategy generation module to generate the energy-saving strategy with the current adaptive neural network model.
Further, the adaptive neural network model building module being configured to search for the near-optimal energy-saving strategy with the planning-type algorithm model according to the load energy consumption data of each node comprises:
filtering out, from the load energy consumption data of each node, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, and using the filtered data as input samples of the planning-type algorithm model to search for the near-optimal energy-saving strategy.
Further, the adaptive neural network model building module being configured to train the adaptive neural network model with the near-optimal energy-saving strategy as a sample also comprises:
applying coarse-grained pattern classification to the near-optimal energy-saving strategy, and then training the adaptive neural network model with the result as the strategy sample.
Further, the strategy generation module is also configured to apply a heuristic greedy algorithm to the energy-saving strategy to generate a final energy-saving strategy, which specifically comprises:
sorting the nodes in each node group in ascending order of resource utilization;
for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
Compared with the prior art, the online energy-saving control method and device for a cloud computing data center provided by the present invention match the load energy consumption data of the nodes against energy-saving strategies on the basis of an adaptive neural network model, select a suitable energy-saving strategy for the cloud computing data center, reduce the scale of the model input sample space, and improve the online energy-saving control efficiency of the data center. In addition, the input samples (that is, the load energy consumption data) are filtered to exclude input data requiring no energy-saving planning, and coarse-grained clustering is further applied to the output samples (that is, the energy-saving strategies), which reduces the output strategy sample space, reduces the number of samples required for model training, and accelerates model convergence. A heuristic greedy algorithm then further refines the strategy output by the model to produce the final strategy, so that the samples required for model training are reduced and model convergence is accelerated without affecting the online energy-saving control efficiency. Moreover, by combining the adaptive neural network algorithm with an optimizing search algorithm, a near-optimal energy-saving strategy is searched for with simulated annealing before the model converges and is used as a sample to train the neural network model of the core module, which solves the problem that existing online energy-saving control of cloud data centers is limited by the lack of strategy samples for training the neural network model, so that the data center cannot perform adaptive energy-saving control.
Brief description of the drawings
Fig. 1 is a flowchart of the online energy-saving control method for the cloud computing data center in an embodiment;
Fig. 2 is a flowchart of obtaining, with the simulated annealing model, the strategy samples for training the adaptive neural network model in an embodiment;
Fig. 3 is a structure diagram of the adaptive neural network in an embodiment;
Fig. 4 is a flowchart of the online energy-saving control method for the cloud computing data center in an application example;
Fig. 5 is a schematic diagram of the two-dimensional load sample space in an embodiment;
Fig. 6 is a flowchart of the online energy-saving strategy generation method for the cloud computing data center in an application example.
Embodiment
To make the purpose, technical solution and advantages of the present invention clearer, the embodiments of the invention are described in detail below with reference to the accompanying drawings. It should be noted that, provided there is no conflict, the embodiments in this application and the features in the embodiments may be combined with each other arbitrarily.
Embodiment:
As shown in Fig. 1, the present embodiment provides an online energy-saving control method for a cloud computing data center, comprising:
S101: monitoring and collecting the load energy consumption data of each node in the cloud computing data center;
Here, the load energy consumption data comprises: the load of each node periodically sampled by the data center monitoring system (utilization of CPU, memory, network and disk), the performance constraint conditions and the energy consumption control target.
S102: filtering the load energy consumption data according to the preset performance constraint conditions, the filtered data serving as input samples of the adaptive neural network model;
Here, filtering the load energy consumption data according to the preset performance constraint conditions comprises:
filtering out, from the load energy consumption data, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index.
S103: generating an energy-saving strategy from the input samples with the adaptive neural network model;
S104: performing an energy-saving control operation on each node according to the energy-saving strategy.
Here, before step S103, the method further comprises the following step:
judging whether the adaptive neural network model has converged; if it has not converged, searching for a near-optimal energy-saving strategy with a planning-type algorithm model according to the load energy consumption data of each node, and training the adaptive neural network model with this energy-saving strategy as a strategy sample so as to establish the adaptive neural network model; if it has converged, generating the energy-saving strategy with the current adaptive neural network model.
Here, the planning-type algorithm models include simulated annealing, the ant colony algorithm, the greedy algorithm and the like; using these planning-type algorithm models to search for a near-optimal energy-saving strategy means obtaining an energy-saving strategy that is not optimal but is practical, feasible and reasonably satisfactory in effect.
Preferably, the input data of the planning-type algorithm model, that is, the load energy consumption data of each node, may be further filtered, specifically: filtering out, from the load energy consumption data of each node, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, and using the filtered data as input samples of the planning-type algorithm model to search for the near-optimal energy-saving strategy.
Because an existing adaptive neural network model still faces the following problem when performing efficient data center energy-saving control, namely that in the initial go-live period of a data center the empirical historical data of energy-saving control is often lacking, only a few self-defined strategies are available, and the data samples for training the neural network model are therefore also lacking, a preferred approach is to use a simulated annealing model to obtain the output strategy sample space for training the adaptive neural network model. As shown in Fig. 2, the process of obtaining the strategy samples specifically comprises the following steps (a minimal code sketch of this annealing loop is given after the steps):
S201: inputting the load energy consumption data;
S202: setting the temperature T, the maximum number of iterations L and the cooling control iteration count L', constructing an initial energy-saving strategy x0, and entering the annealing iteration process;
S203: judging whether the maximum number of iterations has been reached; if so, outputting the energy-saving strategy; if not, executing step S204 and entering the inner-loop iterative search;
S204: judging whether the cooling control iteration count has been reached; if so, recording the current energy-saving strategy as the optimum energy-saving strategy, performing cooling control, returning to step S203, judging again whether the maximum number of iterations has been reached, and outputting this optimum energy-saving strategy if it has; if not, continuing the iterative search and executing step S205;
S205: adjusting the energy-saving strategy, specifically: comparing the energy-saving benefit of the newly produced energy-saving strategy with that of the current optimum energy-saving strategy; if the newly produced strategy is better, that is, its energy-saving benefit is greater than that of the current optimum strategy, taking the newly produced strategy as the current optimum energy-saving strategy and executing step S207; otherwise, executing step S206;
S206: judging whether exp(ΔU/T) is greater than a random number in (0, 1), where ΔU is the difference between the energy-saving benefit of the newly produced strategy and that of the current optimum strategy (negative at this point) and T is the set temperature; if so, accepting the newly produced strategy as the current optimum energy-saving strategy; otherwise, executing step S207;
Accepting the newly produced energy-saving strategy as the current optimum with a certain probability prevents the algorithm from being trapped in a locally optimal solution.
S207: incrementing the iteration counter and returning to step S204; after the value of the iteration counter reaches the cooling control iteration count, judging again whether the maximum number of iterations has been reached, and outputting this optimum energy-saving strategy if it has.
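As an illustration of steps S201 to S207, the following Python sketch shows one possible form of the annealing loop; it is a minimal sketch rather than the patented implementation, and the energy-benefit function, the neighbour-generation rule and the parameter values (initial temperature, cooling factor) are assumptions introduced for the example.

```python
import math
import random

def simulated_annealing(load_samples, init_strategy, energy_benefit, neighbour,
                        temperature=100.0, max_iters=1000, cooling_iters=50, alpha=0.9):
    """Search a near-optimal energy-saving strategy (sketch of steps S201-S207).

    energy_benefit(strategy, load_samples) -> float  # higher means more energy saved (assumed)
    neighbour(strategy) -> a slightly adjusted candidate strategy (assumed)
    """
    best = init_strategy                                   # S202: initial strategy x0
    iters = 0
    while iters < max_iters:                               # S203: outer loop over iterations
        for _ in range(cooling_iters):                     # S204: inner loop until cooling control
            candidate = neighbour(best)                    # S205: adjust the strategy
            delta_u = (energy_benefit(candidate, load_samples)
                       - energy_benefit(best, load_samples))
            if delta_u > 0:                                # better strategy: take it
                best = candidate
            elif math.exp(delta_u / temperature) > random.random():
                best = candidate                           # S206: accept a worse strategy with some probability
            iters += 1
            if iters >= max_iters:
                break
        temperature *= alpha                               # cooling control before returning to S203
    return best                                            # output the near-optimal strategy
```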
The output strategy sample space for training the adaptive neural network model is obtained with this simulated annealing model and used as the strategy samples for training the adaptive neural network model. When training samples are lacking, the embodiment of the present invention takes full advantage of the fact that the adaptive neural network can be trained online and combines it with an optimizing search algorithm: before the neural network model converges, the energy-saving strategies produced by the search algorithm are used for energy-saving control and form new samples for online training, which solves the problem that an existing adaptive neural network model lacks the empirical historical data of energy-saving control and the strategy samples for training the neural network model in the initial go-live period of a data center.
In addition, preferably, training the adaptive neural network model with the near-optimal energy-saving strategy as a sample also comprises:
applying coarse-grained pattern classification to the energy-saving strategy, and then training the adaptive neural network model with the result as the strategy sample.
The adaptive neural network can train on new sample data individually and has a strong online adaptive capability, so the adaptive neural network model can generate energy-saving strategies while being trained. As shown in Fig. 3, the adaptive neural network comprises an input layer, a computation layer and an output layer; each input layer node represents an input sample, each output layer node represents a pattern, and each pattern corresponds to an energy-saving strategy.
In step S103, generating the energy-saving strategy from the input samples with this adaptive neural network model specifically comprises the following steps, which are at the same time the training process of the adaptive neural network model (a minimal code sketch of this process is given further below):
A. The adaptive neural network model receives the input sample at its input layer; the computation layer computes the pattern to which the input sample belongs according to the weight coefficients and matches it against all stored patterns at the output layer to obtain the pattern with the highest matching degree; the input sample is the load energy consumption data of each node, and each pattern corresponds to an energy-saving strategy;
B. It is judged whether the similarity between the pattern to which the input sample belongs and the pattern with the highest matching degree reaches the reference threshold; if so, the energy-saving strategy corresponding to the pattern with the highest matching degree is output as the energy-saving strategy and the method proceeds to learning phase D; otherwise it proceeds to search phase C;
C. The search phase comprises: obtaining a new energy-saving strategy from the input sample with simulated annealing, creating a new pattern node at the output layer corresponding to this energy-saving strategy, and establishing the weights connected to this pattern so as to establish the mapping from this pattern to this strategy.
D. The learning phase comprises: selecting, from all patterns whose similarity exceeds the reference threshold, the pattern with the highest similarity, and adjusting the weights related to this pattern so that, when a later input sample similar to this input sample is matched against this stored pattern again, the pattern computed for it obtains a larger similarity.
In the above process, if the similarity exceeds the reference threshold, the energy-saving strategy generated by the adaptive neural network model from the input sample is the strategy corresponding to the pattern with the highest matching degree, and in step D the model is further trained by adjusting the weights of the pattern with the highest similarity, which then serves as a strategy sample of the model;
if the similarity does not exceed the reference threshold, the energy-saving strategy generated by the adaptive neural network model from the input sample is the new strategy obtained in step C; in addition, in step C the model is further trained by establishing the weights connected to this new pattern and the mapping from this pattern to the new strategy.
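To make the match/search/learn cycle of steps A to D concrete, the following sketch shows one way it could be organised; the cosine similarity measure, the weight-update rule, the vigilance value and the reliance on a search function such as the simulated annealing sketch above are illustrative assumptions, not the formulas prescribed by the embodiment.

```python
import numpy as np

class AdaptiveStrategyModel:
    """Sketch of the ART-style match/search/learn cycle of steps A-D."""

    def __init__(self, vigilance=0.8, learning_rate=0.5):
        self.vigilance = vigilance          # reference threshold for similarity
        self.learning_rate = learning_rate
        self.patterns = []                  # output-layer weight vectors, one per pattern
        self.strategies = []                # energy-saving strategy associated with each pattern

    def _similarity(self, sample, pattern):
        # assumed similarity: cosine of the angle between sample and stored pattern
        denom = np.linalg.norm(sample) * np.linalg.norm(pattern) + 1e-12
        return float(np.dot(sample, pattern) / denom)

    def generate_strategy(self, sample, search_new_strategy):
        """Steps A/B: match the input sample against all stored patterns."""
        sample = np.asarray(sample, dtype=float)
        if self.patterns:
            sims = [self._similarity(sample, p) for p in self.patterns]
            best = int(np.argmax(sims))
            if sims[best] >= self.vigilance:        # step D: learning phase
                lr = self.learning_rate
                self.patterns[best] = (1 - lr) * self.patterns[best] + lr * sample
                return self.strategies[best]
        # step C: search phase - create a new pattern node and its strategy
        strategy = search_new_strategy(sample)      # e.g. the simulated annealing sketch
        self.patterns.append(sample.copy())
        self.strategies.append(strategy)
        return strategy
```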
The energy-saving strategy produced in step S103 usually only indicates, for example, how many nodes are to be shut down or how many are to have their frequency reduced, without specifying which particular nodes to shut down or slow down. Therefore, after step S103 produces the energy-saving strategy, the method further comprises: applying a heuristic greedy algorithm to the energy-saving strategy to generate the final energy-saving strategy, which specifically comprises (a minimal code sketch follows the example below):
sorting the nodes in each node group in ascending order of resource utilization;
for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
Through the above process it can be determined which nodes are to be shut down or slowed down. For example, if the energy-saving strategy produced in step S103 includes shutting down 2 nodes, the above greedy algorithm selects the two nodes with the lowest resource utilization to be shut down.
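The greedy refinement can be sketched as follows; the Node structure, its field names and the utilization figures in the usage example are made up for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Node:
    name: str
    utilization: float              # aggregate resource utilization, assumed normalised to [0, 1]
    action: str = "keep"            # becomes "shutdown" or "downclock" when selected

def refine_strategy(group: List[Node], shutdown_count: int, downclock_count: int) -> List[Node]:
    """Expand a coarse group-level strategy into per-node actions (greedy sketch)."""
    ordered = sorted(group, key=lambda n: n.utilization)   # ascending resource utilization
    for node in ordered[:shutdown_count]:                  # shut down the least-utilized nodes first
        node.action = "shutdown"
    for node in ordered[shutdown_count:shutdown_count + downclock_count]:
        node.action = "downclock"                          # then reduce frequency on the next ones
    return ordered

# Example: a group-level strategy of "shut down 2 nodes, downclock 1 node"
group = [Node("n1", 0.05), Node("n2", 0.60), Node("n3", 0.10), Node("n4", 0.30)]
for n in refine_strategy(group, shutdown_count=2, downclock_count=1):
    print(n.name, n.action)         # n1 and n3 shut down, n4 downclocked, n2 kept running
```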
In an application example, as shown in Fig. 4, an online energy-saving control method for a cloud computing data center is provided, comprising the following steps:
S401: inputting the load samples;
Here, the load samples (that is, the load energy consumption data) are the load of each node periodically sampled by the data center monitoring system (utilization of CPU, memory, network and disk), the performance constraint conditions and the energy consumption control target; the final output is the energy-saving strategy of the data center, which specifically includes reducing the frequency of, or shutting down, certain nodes.
S402: filtering the load samples;
As a preferred approach, the input load energy consumption data is modelled as a two-dimensional space, as shown in Fig. 5, so that the sampled input data can be divided into four quadrants. The data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, and/or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, is filtered out; the data whose energy consumption is higher than the preset threshold and whose actual performance is higher than the performance index, and the data whose energy consumption is lower than the preset threshold and whose actual performance is lower than the performance index, is retained for subsequent strategy output.
Through the above filtering algorithm up to 50% of the invalid input load samples can be filtered out, which reduces the number of strategies output by the adaptive neural network model and increases the speed of strategy output. A minimal sketch of this quadrant filter is given below.
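This sketch assumes each sample is reduced to a scalar energy consumption value and a scalar actual performance value; the thresholds and figures in the usage example are made up for illustration.

```python
from typing import List, Tuple

# Each sample is assumed to be (energy_consumption, actual_performance).
Sample = Tuple[float, float]

def filter_load_samples(samples: List[Sample],
                        energy_threshold: float,
                        performance_index: float) -> List[Sample]:
    """Keep only the quadrants that still need energy-saving planning (step S402).

    Dropped: low energy & high performance, high energy & low performance.
    Kept:    high energy & high performance, low energy & low performance.
    """
    kept = []
    for energy, performance in samples:
        low_energy_high_perf = energy < energy_threshold and performance > performance_index
        high_energy_low_perf = energy > energy_threshold and performance < performance_index
        if not (low_energy_high_perf or high_energy_low_perf):
            kept.append((energy, performance))
    return kept

# Example with illustrative thresholds: energy threshold 200 W, performance index 0.7
samples = [(120.0, 0.9), (260.0, 0.4), (280.0, 0.8), (150.0, 0.5)]
print(filter_load_samples(samples, energy_threshold=200.0, performance_index=0.7))
# -> [(280.0, 0.8), (150.0, 0.5)]
```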
S403: judging whether the adaptive model has converged; if so, executing step S404; otherwise, executing step S405;
Here, the method of judging whether the adaptive model has converged is the process of step B above:
judging whether the similarity between the pattern to which the input sample belongs and the pattern with the highest matching degree reaches the reference threshold; if so, the adaptive model has converged, otherwise it has not.
S404: outputting the energy-saving strategy with the current adaptive neural network model;
This is similar to generating the energy-saving strategy from the input samples with the adaptive neural network model in step S103, which is at the same time the training process of the adaptive neural network model, and is not repeated here.
S405: outputting an energy-saving strategy with simulated annealing, and training the neural network model with this energy-saving strategy;
This is similar to the process of obtaining the strategy samples shown in Fig. 2 and is not repeated here. When outputting the energy-saving strategy with simulated annealing, preferably, the input samples of the simulated annealing model may also be filtered: the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, is filtered out of the load energy consumption data of each node, and the filtered data is used as the input samples of the simulated annealing model to search for the near-optimal energy-saving strategy.
Through the above filtering algorithm up to 50% of the invalid input load samples can be filtered out, which reduces the strategy sample space needed to train the adaptive neural network model and thereby increases the convergence speed of the adaptive neural network model.
S406: applying coarse-grained pattern output to the output strategies of the adaptive neural network model;
In an existing adaptive neural network model the output energy-saving strategy is expressed per node, that is, the strategy comprises shutting down or reducing the frequency of a specific node. In the embodiment of the present invention, coarse-grained pattern output means that the output energy-saving strategy is no longer expressed per node but per node group: the data center divides the nodes into node groups according to node type, each node group consisting of nodes of the same (homogeneous) structure, and the energy-saving strategy may shut down or reduce the frequency of some nodes in a group. The coarse-grained output pattern therefore takes the form node group 1 (number of frequency-reduction nodes, number of shutdown nodes), node group 2 (number of frequency-reduction nodes, number of shutdown nodes), and so on for each combination. A small illustrative representation of such a group-level strategy follows.
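Purely for illustration, a coarse-grained, group-level strategy could be represented as follows; the group names and counts are invented for the example, and nodes_of is a hypothetical lookup returning the Node list of a group.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class GroupStrategy:
    downclock_nodes: int       # number of nodes in the group whose frequency is reduced
    shutdown_nodes: int        # number of nodes in the group that are shut down

# Coarse-grained output pattern: one entry per homogeneous node group (example values)
coarse_strategy: Dict[str, GroupStrategy] = {
    "compute_group_1": GroupStrategy(downclock_nodes=1, shutdown_nodes=2),
    "storage_group_1": GroupStrategy(downclock_nodes=0, shutdown_nodes=1),
}

# Each entry is later expanded into concrete per-node actions by the greedy
# refinement sketched earlier, e.g. (nodes_of is hypothetical):
#   refine_strategy(nodes_of("compute_group_1"), shutdown_count=2, downclock_count=1)
```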
At this point the adaptive neural network model has converged (its establishment is complete);
S407: after the adaptive neural network model has been established, feeding load samples to the neural network model and outputting the energy-saving strategy with the adaptive neural network model;
S408: further applying the heuristic greedy algorithm to the output strategy to generate the final energy-saving strategy;
Because the energy-saving strategy is output in the coarse-grained pattern, it is expressed per node group and has not yet been refined to individual nodes; the heuristic greedy algorithm is therefore used to further plan the strategy output by the model and produce the final strategy of shutting down or slowing down specific nodes, so that the samples required for model training are reduced and model convergence is accelerated without affecting the online energy-saving control efficiency. As shown in Fig. 6, generating the final energy-saving strategy with the heuristic greedy algorithm specifically comprises:
first, sorting the nodes in each node group in ascending order of resource utilization;
then, for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
The greedy algorithm finally determines which specific nodes adopt the shutdown strategy and which adopt the frequency-reduction strategy.
S409: performing the energy-saving control operation on each node according to the energy-saving strategy. A sketch composing steps S401 to S409 from the earlier code sketches is given below.
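The following sketch shows how steps S401 to S409 could be composed from the earlier sketches in this description; it assumes those sketches (filter_load_samples, simulated_annealing, AdaptiveStrategyModel, refine_strategy and the group-level strategy dictionaries) are in scope, and apply_action stands for a hypothetical hook into the data center management layer.

```python
def online_control_step(model, raw_samples, node_groups,
                        energy_threshold, performance_index,
                        init_strategy, energy_benefit, neighbour, apply_action):
    """One pass of the S401-S409 loop, composed from the earlier sketches.

    model is an AdaptiveStrategyModel; node_groups maps a group name to its
    list of Node objects; strategies are group-level dictionaries as in the
    coarse_strategy example above (all assumptions of this sketch).
    """
    # S402: filter the sampled load data down to the quadrants that need planning
    samples = filter_load_samples(raw_samples, energy_threshold, performance_index)

    # While the model has not converged it falls back to simulated annealing
    # (S405) and trains itself with the searched strategy; otherwise it answers
    # directly from its stored patterns (S403/S404).
    def search_new_strategy(_sample):
        return simulated_annealing(samples, init_strategy, energy_benefit, neighbour)

    for sample in samples:
        coarse = model.generate_strategy(sample, search_new_strategy)   # S404-S406
        for group_name, plan in coarse.items():                         # S407/S408
            refined = refine_strategy(node_groups[group_name],
                                      plan.shutdown_nodes, plan.downclock_nodes)
            for node in refined:                                        # S409
                if node.action != "keep":
                    apply_action(node)
```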
In an application example, as shown in Fig. 6, an online energy-saving strategy generation method for a cloud computing data center is also provided, comprising:
S601: inputting the load sample;
S602: computing the pattern to which the input sample belongs, and matching it against all stored patterns at the output layer to obtain the pattern with the highest matching degree;
S603: judging whether the similarity between the pattern to which the input sample belongs and the pattern with the highest matching degree reaches the reference threshold; if so, executing step S604, otherwise executing step S605;
S604: entering the learning phase: selecting, from all patterns whose similarity exceeds the reference threshold, the pattern with the highest similarity, adjusting the weights related to this pattern and storing this pattern, so that, when a later input sample similar to this input sample is matched against this stored pattern again, the pattern computed for it obtains a larger similarity;
S605: obtaining a new energy-saving strategy from the input sample with simulated annealing;
S606: outputting the energy-saving strategy.
The energy-saving control operation can then be performed on each node according to this energy-saving strategy.
The present embodiment also provides an online energy-saving control device for a cloud computing data center, comprising:
a data acquisition module, configured to monitor and collect the load energy consumption data of each node in the cloud computing data center;
a filtering module, configured to filter the load energy consumption data according to the preset performance constraint conditions, the filtered data serving as input samples of an adaptive neural network model;
a strategy generation module, configured to generate an energy-saving strategy from the input samples with the adaptive neural network model;
an execution module, configured to perform an energy-saving control operation on each node according to the energy-saving strategy.
Here, the filtering module being configured to filter the load energy consumption data according to the preset performance constraint conditions comprises:
filtering out, from the load energy consumption data, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, and/or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index.
Here, the device also comprises an adaptive neural network model building module connected to the strategy generation module and configured, before the strategy generation module generates the energy-saving strategy from the input samples with the adaptive neural network model, to judge whether the adaptive neural network model has converged; if it has not converged, to search for a near-optimal energy-saving strategy with a planning-type algorithm model according to the load energy consumption data of each node and to train the adaptive neural network model with the near-optimal energy-saving strategy as a strategy sample so as to establish the adaptive neural network model; if it has converged, to notify the strategy generation module to generate the energy-saving strategy with the current adaptive neural network model.
Preferably, the adaptive neural network model building module being configured to search for the near-optimal energy-saving strategy with the planning-type algorithm model according to the load energy consumption data of each node comprises:
filtering out, from the load energy consumption data of each node, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, and using the filtered data as input samples of the planning-type algorithm model to search for the near-optimal energy-saving strategy.
Preferably, the adaptive neural network model building module being configured to train the adaptive neural network model with the near-optimal energy-saving strategy as a sample also comprises:
applying coarse-grained pattern classification to the near-optimal energy-saving strategy, and then training the adaptive neural network model with the result as the strategy sample.
Preferably, the strategy generation module is also configured to apply a heuristic greedy algorithm to the energy-saving strategy to generate a final energy-saving strategy, which specifically comprises:
sorting the nodes in each node group in ascending order of resource utilization;
for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
As can be seen from the above embodiments, compared with the prior art, the online energy-saving control method and device for a cloud computing data center provided in the above embodiments match the load energy consumption data of the nodes against energy-saving strategies on the basis of an adaptive neural network model, select a suitable energy-saving strategy for the cloud computing data center, reduce the scale of the model input sample space, and improve the online energy-saving control efficiency of the data center;
in addition, the input samples (that is, the load energy consumption data) are filtered to exclude input data requiring no energy-saving planning, and coarse-grained clustering is further applied to the output samples (that is, the energy-saving strategies), which reduces the output strategy sample space, reduces the number of samples required for model training and accelerates model convergence;
a heuristic greedy algorithm is further used to plan the strategy output by the model and produce the final strategy, so that the samples required for model training are reduced and model convergence is accelerated without affecting the online energy-saving control efficiency;
moreover, by combining the adaptive neural network algorithm with an optimizing search algorithm, a near-optimal energy-saving strategy is searched for with simulated annealing before the model converges and is used as a sample to train the neural network model of the core module, which solves the problem that existing online energy-saving control of cloud data centers is limited by the lack of strategy samples for training the neural network model, so that the data center cannot perform adaptive energy-saving control.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method may be completed by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk or an optical disc. Alternatively, all or part of the steps of the above embodiments may also be implemented with one or more integrated circuits. Correspondingly, each module/unit in the above embodiments may be implemented in the form of hardware or in the form of a software functional module. The present invention is not restricted to any particular form of combination of hardware and software.
The above are only preferred embodiments of the present invention and are not intended to limit its scope of protection. According to the content of the invention, various other embodiments are possible; without departing from the spirit and essence of the invention, those of ordinary skill in the art can make various corresponding changes and variations, and any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall all fall within the scope of protection of the present invention.

Claims (12)

1. An online energy-saving control method for a cloud computing data center, comprising:
monitoring and collecting the load energy consumption data of each node in the cloud computing data center;
filtering the load energy consumption data according to preset performance constraint conditions, and using the filtered data as input samples of an adaptive neural network model;
generating an energy-saving strategy from the input samples with the adaptive neural network model, and performing an energy-saving control operation on each node according to the energy-saving strategy.
2. The method according to claim 1, characterized in that:
filtering the load energy consumption data according to the preset performance constraint conditions comprises:
filtering out, from the load energy consumption data, the data whose energy consumption is lower than a preset energy consumption threshold and whose actual performance is higher than a preset performance index, and/or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index.
3. The method according to claim 1, characterized in that, before generating the energy-saving strategy from the input samples with the adaptive neural network model, the method also comprises:
judging whether the adaptive neural network model has converged; if it has not converged, searching for a near-optimal energy-saving strategy with a planning-type algorithm model according to the load energy consumption data of each node, and training the adaptive neural network model with the near-optimal energy-saving strategy as a strategy sample so as to establish the adaptive neural network model; if it has converged, generating the energy-saving strategy with the current adaptive neural network model.
4. The method according to claim 3, characterized in that:
searching for the near-optimal energy-saving strategy with the planning-type algorithm model according to the load energy consumption data of each node comprises:
filtering out, from the load energy consumption data of each node, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, and using the filtered data as input samples of the planning-type algorithm model to search for the near-optimal energy-saving strategy.
5. The method according to claim 4, characterized in that:
training the adaptive neural network model with the near-optimal energy-saving strategy as a sample also comprises:
applying coarse-grained pattern classification to the near-optimal energy-saving strategy, and then training the adaptive neural network model with the result as the strategy sample.
6. The method according to any one of claims 1 to 5, characterized in that:
the method also comprises: applying a heuristic greedy algorithm to the energy-saving strategy to generate a final energy-saving strategy, which specifically comprises:
sorting the nodes in each node group in ascending order of resource utilization;
for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
7. An online energy-saving control device for a cloud computing data center, comprising:
a data acquisition module, configured to monitor and collect the load energy consumption data of each node in the cloud computing data center;
a filtering module, configured to filter the load energy consumption data according to preset performance constraint conditions, the filtered data serving as input samples of an adaptive neural network model;
a strategy generation module, configured to generate an energy-saving strategy from the input samples with the adaptive neural network model;
an execution module, configured to perform an energy-saving control operation on each node according to the energy-saving strategy.
8. The device according to claim 7, characterized in that:
the filtering module being configured to filter the load energy consumption data according to the preset performance constraint conditions comprises:
filtering out, from the load energy consumption data, the data whose energy consumption is lower than a preset energy consumption threshold and whose actual performance is higher than a preset performance index, and/or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index.
9. The device according to claim 7, characterized in that the device also comprises an adaptive neural network model building module connected to the strategy generation module and configured, before the strategy generation module generates the energy-saving strategy from the input samples with the adaptive neural network model, to judge whether the adaptive neural network model has converged; if it has not converged, to search for a near-optimal energy-saving strategy with a planning-type algorithm model according to the load energy consumption data of each node and to train the adaptive neural network model with the near-optimal energy-saving strategy as a strategy sample so as to establish the adaptive neural network model; if it has converged, to notify the strategy generation module to generate the energy-saving strategy with the current adaptive neural network model.
10. The device according to claim 9, characterized in that:
the adaptive neural network model building module being configured to search for the near-optimal energy-saving strategy with the planning-type algorithm model according to the load energy consumption data of each node comprises:
filtering out, from the load energy consumption data of each node, the data whose energy consumption is lower than the preset energy consumption threshold and whose actual performance is higher than the preset performance index, or whose energy consumption is higher than the preset energy consumption threshold and whose actual performance is lower than the preset performance index, and using the filtered data as input samples of the planning-type algorithm model to search for the near-optimal energy-saving strategy.
11. The device according to claim 10, characterized in that:
the adaptive neural network model building module being configured to train the adaptive neural network model with the near-optimal energy-saving strategy as a sample also comprises:
applying coarse-grained pattern classification to the near-optimal energy-saving strategy, and then training the adaptive neural network model with the result as the strategy sample.
12. The device according to any one of claims 7 to 11, characterized in that:
the strategy generation module is also configured to apply a heuristic greedy algorithm to the energy-saving strategy to generate a final energy-saving strategy, which specifically comprises:
sorting the nodes in each node group in ascending order of resource utilization;
for the sorted nodes, setting the shutdown strategy one by one until the number of shutdown nodes contained in the energy-saving strategy is met; and/or setting the frequency-reduction strategy one by one until the number of frequency-reduction nodes in the energy-saving strategy is met.
CN201310339585.5A 2013-08-06 2013-08-06 Online energy-saving control method and device for a cloud computing data center Active CN103428282B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310339585.5A CN103428282B (en) 2013-08-06 2013-08-06 Online energy-saving control method and device for a cloud computing data center

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310339585.5A CN103428282B (en) 2013-08-06 2013-08-06 Online energy-saving control method and device for a cloud computing data center

Publications (2)

Publication Number Publication Date
CN103428282A true CN103428282A (en) 2013-12-04
CN103428282B CN103428282B (en) 2016-05-18

Family

ID=49652446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310339585.5A Active CN103428282B (en) 2013-08-06 2013-08-06 Online energy-saving control method and device for a cloud computing data center

Country Status (1)

Country Link
CN (1) CN103428282B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331520A (en) * 2014-11-28 2015-02-04 北京奇艺世纪科技有限公司 Performance optimization method and device of Hadoop cluster and node state recognition method and device
CN104391560A (en) * 2014-09-28 2015-03-04 浪潮(北京)电子信息产业有限公司 Hopfield neural network-based server energy-saving method and device for cloud data center
CN104503847A (en) * 2015-01-22 2015-04-08 浪潮(北京)电子信息产业有限公司 Data center energy saving method and device
CN104866698A (en) * 2014-02-23 2015-08-26 倪成胜 Data center basic operation effectiveness judgment method
CN105045096A (en) * 2015-08-26 2015-11-11 福建恒天晨光节能服务有限公司 Intelligent energy saving method based on neural network and system thereof
CN105204978A (en) * 2015-06-23 2015-12-30 北京百度网讯科技有限公司 Data center operation data analysis system based on machine learning
CN105227410A (en) * 2015-11-04 2016-01-06 浪潮(北京)电子信息产业有限公司 Based on the method and system that the server load of adaptive neural network detects
CN105868073A (en) * 2016-03-25 2016-08-17 山东超越数控电子有限公司 Data center energy saving strategy implementation method based on time series analysis
CN107154143A (en) * 2017-06-21 2017-09-12 国网山东省电力公司诸城市供电公司 A kind of electric energy meter remote reading system of platform of internet of things
CN109871268A (en) * 2019-01-10 2019-06-11 暨南大学 A kind of energy-saving scheduling method based on air current composition at data-oriented center
CN110276496A (en) * 2019-06-27 2019-09-24 成都慧众云信息技术有限公司 Combustion gas energy consumption data processing method, system and gas appliance based on cloud computing
WO2020108371A1 (en) * 2018-11-30 2020-06-04 Alibaba Group Holding Limited Partitioning of deep learning inference with dynamic offloading
CN111260146A (en) * 2020-01-22 2020-06-09 华南理工大学 Method, device, equipment and medium for locating power system edge cloud data center
CN111786824A (en) * 2020-06-23 2020-10-16 中国电力科学研究院有限公司 Data center energy efficiency ratio optimization method, system, equipment and readable storage medium
CN112486767A (en) * 2020-11-25 2021-03-12 中移(杭州)信息技术有限公司 Intelligent monitoring method, system, server and storage medium for cloud resources
CN113007482A (en) * 2021-02-04 2021-06-22 中国科学院高能物理研究所 Automatic cooling method for long-distance low-temperature conveying pipeline
CN115586952A (en) * 2022-09-09 2023-01-10 上海交通大学 Computing process control system of space on-orbit data center
WO2023155904A1 (en) * 2022-02-21 2023-08-24 华为技术有限公司 Method and apparatus for adjusting operating state of network device, and related device


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1647107A (en) * 2002-04-19 2005-07-27 Computer Associates Think Inc Automatic neural-net model generation and maintenance
CN101782743A (en) * 2010-02-11 2010-07-21 Zhejiang University Neural network modeling method and system
CN102063327A (en) * 2010-12-15 2011-05-18 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences Power-consumption-aware application service scheduling method for data centers
CN103164742A (en) * 2013-04-02 2013-06-19 Nanjing University of Posts and Telecommunications Server performance prediction method based on particle swarm optimization neural network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ye Kejiang: "Energy Consumption Management of Virtualized Cloud Computing Platforms", Chinese Journal of Computers *
Wang Wenjun: "A Survey of Research on Adaptive Control Based on Neural Networks", Computer Simulation *

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866698A (en) * 2014-02-23 2015-08-26 Ni Chengsheng Data center basic operation effectiveness judgment method
CN104391560A (en) * 2014-09-28 2015-03-04 Inspur Beijing Electronic Information Industry Co Ltd Hopfield neural network-based server energy-saving method and device for cloud data center
CN104331520A (en) * 2014-11-28 2015-02-04 Beijing QIYI Century Science and Technology Co Ltd Performance optimization method and device of Hadoop cluster and node state recognition method and device
CN104331520B (en) * 2014-11-28 2018-08-07 Beijing QIYI Century Science and Technology Co Ltd Hadoop cluster performance optimization method and device, and node state recognition method and device
CN104503847A (en) * 2015-01-22 2015-04-08 Inspur Beijing Electronic Information Industry Co Ltd Data center energy saving method and device
CN105204978A (en) * 2015-06-23 2015-12-30 Beijing Baidu Netcom Science and Technology Co Ltd Data center operation data analysis system based on machine learning
CN105045096A (en) * 2015-08-26 2015-11-11 Fujian Hengtian Chenguang Energy Saving Service Co Ltd Intelligent energy saving method and system based on neural network
CN105227410A (en) * 2015-11-04 2016-01-06 Inspur Beijing Electronic Information Industry Co Ltd Method and system for server load detection based on adaptive neural network
CN105868073A (en) * 2016-03-25 2016-08-17 Shandong Chaoyue Numerical Control Electronics Co Ltd Data center energy saving strategy implementation method based on time series analysis
CN107154143A (en) * 2017-06-21 2017-09-12 State Grid Shandong Electric Power Company Zhucheng Power Supply Company Electric energy meter remote reading system for an Internet of Things platform
CN113169990A (en) * 2018-11-30 2021-07-23 Alibaba Group Holding Limited Partitioning of deep learning inference with dynamic offloading
CN113169990B (en) * 2018-11-30 2024-04-05 Alibaba Group Holding Limited Partitioning of deep learning inference with dynamic offloading
WO2020108371A1 (en) * 2018-11-30 2020-06-04 Alibaba Group Holding Limited Partitioning of deep learning inference with dynamic offloading
CN109871268A (en) * 2019-01-10 2019-06-11 Jinan University Energy-saving scheduling method for data centers based on airflow organization
CN110276496B (en) * 2019-06-27 2020-05-08 Chengdu Huizhong Cloud Information Technology Co Ltd Gas energy consumption data processing method and system based on cloud computing, and gas appliance
CN110276496A (en) * 2019-06-27 2019-09-24 Chengdu Huizhong Cloud Information Technology Co Ltd Gas energy consumption data processing method, system and gas appliance based on cloud computing
CN111260146A (en) * 2020-01-22 2020-06-09 South China University of Technology Method, device, equipment and medium for locating power system edge cloud data center
CN111260146B (en) * 2020-01-22 2022-03-25 South China University of Technology Method, device, equipment and medium for locating power system edge cloud data center
CN111786824A (en) * 2020-06-23 2020-10-16 China Electric Power Research Institute Co Ltd Data center energy efficiency ratio optimization method, system, equipment and readable storage medium
CN112486767A (en) * 2020-11-25 2021-03-12 China Mobile (Hangzhou) Information Technology Co Ltd Intelligent monitoring method, system, server and storage medium for cloud resources
CN113007482A (en) * 2021-02-04 2021-06-22 Institute of High Energy Physics, Chinese Academy of Sciences Automatic cooling method for long-distance low-temperature conveying pipeline
WO2023155904A1 (en) * 2022-02-21 2023-08-24 Huawei Technologies Co Ltd Method and apparatus for adjusting operating state of network device, and related device
CN115586952B (en) * 2022-09-09 2023-09-29 Shanghai Jiao Tong University Computing process control system for a space on-orbit data center
CN115586952A (en) * 2022-09-09 2023-01-10 Shanghai Jiao Tong University Computing process control system for a space on-orbit data center

Also Published As

Publication number Publication date
CN103428282B (en) 2016-05-18

Similar Documents

Publication Publication Date Title
CN103428282A (en) On-line energy-saving control method and device for cloud computing data center
CN103577926B (en) Method for real-time and high-accuracy calculation of theoretical line loss in large-scale power grids
CN107330056A (en) Wind power plant SCADA system and its operation method based on big data cloud computing platform
WO2016078268A1 (en) Energy saving method and device
CN103645795A (en) Cloud computing data center energy saving method based on ANN (artificial neural network)
CN103927693A (en) Distribution network line loss management system
Zhang et al. A new energy efficient VM scheduling algorithm for cloud computing based on dynamic programming
CN104794532A (en) Automatic demand response system and automatic demand response method based on cloud computing PaaS platform
Zhang et al. Smart DC: an AI and digital twin-based energy-saving solution for data centers
CN112884358B (en) Electric heating equipment ordered power utilization optimized scheduling method and terminal
Zhang et al. GreenDRL: managing green datacenters using deep reinforcement learning
CN106201658A (en) A kind of migration virtual machine destination host multiple-objection optimization system of selection
Pei et al. The real‐time state identification of the electricity‐heat system based on Borderline‐SMOTE and XGBoost
CN106780747B (en) Method for fast partitioning of CFD computational grids
Xu et al. Distributed power optimization of large wind farms using ADMM for real-time control
CN102509168A (en) Improved integer programming method-based optimum PMU (Power Management Unit) placement method
Bai et al. Automatic modeling and optimization for the digital twin of a regional multi-energy system
WO2023179076A1 (en) Mixed integer programming-based load decomposition method and apparatus for industrial facility
Carnerero et al. Particle based optimization for predictive energy efficient data center management
Ge et al. Improved harris hawks optimization for configuration of PV intelligent edge terminals
Wang Optimization of wireless network node deployment in smart city based on adaptive particle swarm optimization
CN103812120B (en) Reactive power optimization method for distribution networks based on a highway network design function
Li et al. Application of Energy Consumption Model and Energy Conservation Technology in New Infrastructure
Nie et al. Technological innovation management for sustainable energy development associated with industrial development and legal duties
Deng Resource sharing system of college English education based on wireless sensor network

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant