CN108256626A - Time series prediction method and device - Google Patents
Time series prediction method and device
- Publication number
- CN108256626A (application CN201611234162.7A)
- Authority
- CN
- China
- Prior art keywords
- result
- original time-series data
- time-series data
- preset feature
- calculation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/18—Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Physics (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Software Systems (AREA)
- Computational Linguistics (AREA)
- Computational Mathematics (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Pure & Applied Mathematics (AREA)
- Mathematical Optimization (AREA)
- General Health & Medical Sciences (AREA)
- Mathematical Analysis (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Operations Research (AREA)
- Probability & Statistics with Applications (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Algebra (AREA)
- Databases & Information Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The present invention, applicable to the field of data mining technology, provides a time series prediction method and device, including: extracting preset features from the original time-series data collected by multiple sensors; inputting the preset features into a long short-term memory (LSTM) recurrent neural network model and outputting a calculation result; and processing the calculation result based on a linear autoregression algorithm, taking the obtained processed result as the prediction result. The present invention extracts features from the original time-series data by vector autoregression, obtains a calculation result with an LSTM recurrent neural network, and finally makes the prediction by linear autoregression. Compared with the linear statistical models of the prior art, the prediction result of this scheme has higher reliability and accuracy.
Description
Technical field
The invention belongs to the field of data mining technology, and more particularly relates to a time series prediction method and device.
Background technology
With the rapid development of information technology, modern industrial systems employ a large number of intelligent technologies to keep production safe and efficient. Because such engineering systems are complex, a system must be established to monitor and predict the production state in real time, which requires deploying a large number of sensors. These sensors generate massive amounts of time-series data, and time series prediction is realized on the basis of these data.
In the prior art, time series are mostly predicted with linear statistical models. In actual production, however, many scenarios are not suited to modeling with a linear model, and using one can cause a large deviation between the predicted result and the actual result.
Invention content
In view of this, embodiments of the present invention provide a time series prediction method and device, to solve the prior-art problem that the predicted result of a time series deviates greatly from the actual result.
In a first aspect, a time series prediction method is provided, including:
Extracting preset features from the original time-series data collected by multiple sensors;
Inputting the preset features into a long short-term memory (LSTM) recurrent neural network model, and outputting a calculation result;
Processing the calculation result based on a linear autoregression algorithm, and taking the obtained processed result as the prediction result.
In a second aspect, a time series prediction device is provided, including:
An extraction unit, for extracting preset features from the original time-series data collected by multiple sensors;
An output unit, for inputting the preset features into a long short-term memory (LSTM) recurrent neural network model and outputting a calculation result;
A prediction unit, for processing the calculation result based on a linear autoregression algorithm and taking the obtained processed result as the prediction result.
Embodiments of the present invention extract features from the original time-series data by vector autoregression, obtain a calculation result with an LSTM recurrent neural network, and finally make the prediction by linear autoregression. Compared with the linear statistical models of the prior art, the prediction result of this scheme has higher reliability and accuracy.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the embodiments or the description of the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the time series prediction method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of step S102 of the time series prediction method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the LSTM recurrent neural network structure provided by an embodiment of the present invention;
Fig. 4 is a structural diagram of the time series prediction device provided by an embodiment of the present invention.
Specific embodiment
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, so that the embodiments of the present invention may be thoroughly understood. However, it will be clear to those skilled in the art that the present invention can also be practiced in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits and methods are omitted, so that unnecessary detail does not obscure the description of the invention.
Embodiments of the present invention use feature selection to analyze the correlation of multiple time series and remove redundant features. Among the different types of recurrent neural network models, the long short-term memory network can solve the long-term dependency problem, so an LSTM recurrent neural network is applied to each selected feature to calculate an index. Linear autoregression is then applied to all the calculated indices, and the autoregression result is taken as the final prediction result.
Fig. 1 shows the flow of the time series prediction method provided by an embodiment of the present invention, detailed as follows.
In S101, preset features are extracted from the original time-series data collected by multiple sensors.
In embodiments of the present invention, each sensor, together with its peripheral circuit, collects original time-series data as a subsystem.
When performing feature extraction on the collected original time-series data, preferably, embodiments of the present invention use a vector autoregression method: the standardized original time-series data are fed into a vector autoregression (VAR) model, and the current feature variable is regressed on several lagged values of all feature variables. The VAR model is built from the statistical properties of the data: each endogenous feature variable in the system is modeled as a function of the lagged values of all endogenous feature variables in the system, which generalizes the univariate autoregression model to a vector autoregression model formed by multivariate time-series variables.
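As a rough illustration, the VAR construction described above can be sketched with plain least squares. The function name `fit_var`, the lag order, and the data layout are illustrative choices; the patent does not specify a fitting procedure.

```python
import numpy as np

# Minimal vector-autoregression (VAR) fit, assuming standardized input:
# each variable at time t is regressed on the lagged values of ALL
# variables, as the description states. Plain least squares via lstsq.
def fit_var(data, p=1):
    """data: (T, k) array of k series; p: lag order.
    Returns the coefficient matrix A of shape (k*p, k) such that
    data[t] ~= [data[t-1], ..., data[t-p]] @ A."""
    T, k = data.shape
    # Regressor row t holds the p most recent lagged observations.
    X = np.hstack([data[p - lag:T - lag] for lag in range(1, p + 1)])
    Y = data[p:]
    A, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return A
```

On noise-free data generated by a known VAR, the fit recovers the generating coefficients exactly, which is a convenient sanity check before using it on real sensor data.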
Since data sources in an industrial system are sensitive, before S101, after the original time-series data are collected from the sensors of the industrial system, the data can be loaded anonymously; that is, the data source of the original time-series data is not recorded, so that it cannot be determined which sensor collected the original time-series data currently being processed.
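A minimal sketch of such anonymous loading, assuming the readings arrive as a mapping from sensor id to series. The function name `load_anonymized` and the dict format are illustrative, not from the patent.

```python
import random

def load_anonymized(readings):
    """Drop the sensor-id labels and shuffle the order, so that later
    processing stages cannot tell which sensor produced which series.
    readings: dict mapping sensor id -> list of values."""
    series = [list(v) for v in readings.values()]
    random.shuffle(series)
    return series
```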
In S102, the preset features are input into a long short-term memory (LSTM) recurrent neural network model, and a calculation result is output.
As shown in Fig. 2, S102 is specifically:
S201, calculating the importance ranking list of the original time-series data by the vector autoregression algorithm.
S202, inputting the preset features corresponding to the top-ranked preset quantity of original time-series data into the LSTM recurrent neural network model, and outputting a calculation result.
The number of features input into the LSTM recurrent neural network model is adjustable. During processing, the model takes multiple original time-series data as input and uses the hidden state as the memory of the network to capture information from past time periods. The output of the model is calculated from this history memory together with the sensor reading at the current timestamp; after the calculation is complete, the memory state is automatically updated with the newly input data in the following time period.
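Steps S201 and S202 can be illustrated as follows. The scoring rule (summed magnitudes of the fitted VAR coefficients attached to each series) is an assumed stand-in, since the patent does not state the exact importance criterion.

```python
import numpy as np

def rank_series(data, p=1):
    """Rank the k input series by importance, most important first.
    Importance here is the total magnitude of the fitted VAR
    coefficients attached to each series as a regressor (an assumption)."""
    T, k = data.shape
    X = np.hstack([data[p - lag:T - lag] for lag in range(1, p + 1)])
    A, *_ = np.linalg.lstsq(X, data[p:], rcond=None)
    # A reshaped to [lag, input series j, target series i]; summing over
    # lags and targets gives one score per input series.
    scores = np.abs(A).reshape(p, k, k).sum(axis=(0, 2))
    return list(np.argsort(-scores))

def select_top(data, order, n):
    """Keep only the n top-ranked series as LSTM input (S202)."""
    return data[:, order[:n]]
```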
In embodiments of the present invention, an LSTM neuron consists of three gates and a memory cell. Each gate uses a sigmoid activation function g, while the input transformation and the memory cell state usually use the tanh function. Specifically, the three gates of the LSTM model (input gate i_t, forget gate f_t and output gate o_t) are defined by the following equations:
i_t = g(W_xi x_t + W_hi h_{t-1} + b_i),
f_t = g(W_xf x_t + W_hf h_{t-1} + b_f),
o_t = g(W_xo x_t + W_ho h_{t-1} + b_o).
The input transformation of the LSTM model is defined by the following equation:
c_in_t = tanh(W_xc x_t + W_hc h_{t-1} + b_c_in).
The state update of the LSTM model is defined by the following equations:
c_t = f_t · c_{t-1} + i_t · c_in_t,
h_t = o_t · tanh(c_t).
The structure of the LSTM model is shown in Fig. 3.
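The gate, input-transformation, and state-update equations above can be written out directly as one time step. This NumPy sketch fixes illustrative shapes and weight names, which the patent does not specify.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step following the equations in the text.
    W['i'|'f'|'o'|'c'] = (W_x, W_h) weight pair; b[...] = bias vector."""
    i = sigmoid(W['i'][0] @ x_t + W['i'][1] @ h_prev + b['i'])     # input gate
    f = sigmoid(W['f'][0] @ x_t + W['f'][1] @ h_prev + b['f'])     # forget gate
    o = sigmoid(W['o'][0] @ x_t + W['o'][1] @ h_prev + b['o'])     # output gate
    c_in = np.tanh(W['c'][0] @ x_t + W['c'][1] @ h_prev + b['c'])  # input node
    c_t = f * c_prev + i * c_in      # state update
    h_t = o * np.tanh(c_t)           # new hidden state
    return h_t, c_t
```

With all weights and biases zero, every gate evaluates to sigmoid(0) = 0.5 and the input node to 0, so the cell state simply halves at each step; this is an easy check that the equations are wired correctly.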
In the above LSTM model, for each memory cell, the weights W of the three gates are obtained by training on the input. The complete hidden state of the previous time step is fed once to the input node c_in_t, once to the input gate i_t, and once to the output gate o_t; each node is associated with an activation function, usually the sigmoid function.
In S103, the calculation result is processed based on a linear autoregression algorithm, and the obtained processed result is taken as the prediction result.
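S103 can be sketched as fitting an ordinary least-squares AR(p) model to the sequence of calculation results and taking the one-step-ahead value as the prediction. The lag order and fitting method here are illustrative assumptions, not details from the patent.

```python
import numpy as np

def linear_ar_forecast(values, p=2):
    """Fit AR(p) to a 1-D sequence by least squares and return the
    one-step-ahead forecast (the final prediction result of S103)."""
    v = np.asarray(values, dtype=float)
    n = len(v)
    # Column 'lag' holds v[t - lag] for rows t = p .. n-1.
    X = np.column_stack([v[p - lag:n - lag] for lag in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, v[p:], rcond=None)
    recent = v[-1:-p - 1:-1]   # [v[n-1], v[n-2], ..., v[n-p]]
    return float(recent @ coef)
```

On a sequence that exactly satisfies an AR(2) recursion, the forecast reproduces the recursion's next value, which verifies the lag indexing.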
Embodiments of the present invention extract features from the original time-series data by vector autoregression, obtain a calculation result with an LSTM recurrent neural network, and finally make the prediction by linear autoregression. The original time-series data may hover around a relatively low value for one period of time and fluctuate violently in another; likewise, the viscosity of the data is relatively low in one period and higher in another. Therefore, in embodiments of the present invention, an LSTM recurrent neural network is used in place of a traditional neural network. Compared with the linear statistical models of the prior art, the prediction result of this scheme has higher reliability and accuracy.
Corresponding to the time series prediction method described in the foregoing embodiments, Fig. 4 shows a structural diagram of the time series prediction device provided by an embodiment of the present invention. For convenience of description, only the parts related to this embodiment are shown. Referring to Fig. 4, the prediction device includes:
An extraction unit 41, which extracts preset features from the original time-series data collected by multiple sensors;
An output unit 42, which inputs the preset features into the LSTM recurrent neural network model and outputs a calculation result;
A prediction unit 43, which processes the calculation result based on a linear autoregression algorithm and takes the obtained processed result as the prediction result.
Optionally, the prediction device further includes:
A processing unit, which performs anonymization on the sensors that collect the original time-series data.
Optionally, the extraction unit 41 is specifically used to:
extract preset features from the original time-series data collected by multiple sensors based on a vector autoregression model.
Optionally, the prediction device further includes:
A construction unit, which builds the vector autoregression model, the vector autoregression model taking each endogenous feature variable as a function of the lagged values of all endogenous feature variables.
Optionally, the output unit 42 includes:
A calculation subunit, which calculates the importance ranking list of the original time-series data by the vector autoregression algorithm;
An output subunit, which inputs the preset features corresponding to the top-ranked preset quantity of original time-series data into the LSTM recurrent neural network model and outputs a calculation result.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division of the above functional units and modules is only an example. In practical applications, the above functions may be assigned to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from one another and do not limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementation should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device and method may be implemented in other ways. For example, the system embodiments described above are merely illustrative. The division of the modules or units is only a division of logical functions; in actual implementation there may be other divisions, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the embodiments of the present invention, that is, the part that essentially contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of the technical features can be equivalently replaced. Such modifications or replacements do not depart the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be included within the protection scope of the present invention.
Claims (10)
1. A time series prediction method, characterized by including:
extracting preset features from the original time-series data collected by multiple sensors;
inputting the preset features into a long short-term memory (LSTM) recurrent neural network model, and outputting a calculation result;
processing the calculation result based on a linear autoregression algorithm, and taking the obtained processed result as the prediction result.
2. The prediction method according to claim 1, characterized in that before extracting the preset features from the original time-series data collected by multiple sensors, the prediction method further includes:
performing anonymization on the sensors that collect the original time-series data.
3. The prediction method according to claim 1, characterized in that extracting the preset features from the original time-series data collected by multiple sensors includes:
extracting the preset features from the original time-series data collected by multiple sensors based on a vector autoregression model.
4. The prediction method according to claim 3, characterized in that before extracting the preset features from the original time-series data collected by multiple sensors based on the vector autoregression model, the prediction method further includes:
building the vector autoregression model, the vector autoregression model taking each endogenous feature variable as a function of the lagged values of all endogenous feature variables.
5. The prediction method according to claim 1, characterized in that inputting the preset features into the LSTM recurrent neural network model and outputting the calculation result includes:
calculating the importance ranking list of the original time-series data by the vector autoregression algorithm;
inputting the preset features corresponding to the top-ranked preset quantity of original time-series data into the LSTM recurrent neural network model, and outputting the calculation result.
6. A time series prediction device, characterized by including:
an extraction unit, for extracting preset features from the original time-series data collected by multiple sensors;
an output unit, for inputting the preset features into a long short-term memory (LSTM) recurrent neural network model and outputting a calculation result;
a prediction unit, for processing the calculation result based on a linear autoregression algorithm and taking the obtained processed result as the prediction result.
7. The prediction device according to claim 6, characterized in that the prediction device further includes:
a processing unit, for performing anonymization on the sensors that collect the original time-series data.
8. The prediction device according to claim 6, characterized in that the extraction unit is specifically used to:
extract the preset features from the original time-series data collected by multiple sensors based on a vector autoregression model.
9. The prediction device according to claim 8, characterized in that the prediction device further includes:
a construction unit, for building the vector autoregression model, the vector autoregression model taking each endogenous feature variable as a function of the lagged values of all endogenous feature variables.
10. The prediction device according to claim 6, characterized in that the output unit includes:
a calculation subunit, for calculating the importance ranking list of the original time-series data by the vector autoregression algorithm;
an output subunit, for inputting the preset features corresponding to the top-ranked preset quantity of original time-series data into the LSTM recurrent neural network model and outputting the calculation result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611234162.7A CN108256626A (en) | 2016-12-28 | 2016-12-28 | Time series prediction method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611234162.7A CN108256626A (en) | 2016-12-28 | 2016-12-28 | Time series prediction method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108256626A true CN108256626A (en) | 2018-07-06 |
Family
ID=62719391
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611234162.7A Pending CN108256626A (en) | 2016-12-28 | 2016-12-28 | Time series prediction method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108256626A (en) |
- 2016-12-28: Application CN201611234162.7A filed in China; published as CN108256626A (status: Pending)
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109445970A (en) * | 2018-09-18 | 2019-03-08 | 北京工业大学 | A kind of software reliability Time Series Forecasting Methods and application |
CN109710488A (en) * | 2018-12-14 | 2019-05-03 | 北京工业大学 | A kind of time series generation method based on block chain technology |
CN110162987A (en) * | 2019-05-29 | 2019-08-23 | 华南师范大学 | Based on big data and the recursive information concealing method of dynamic time and robot system |
CN110162987B (en) * | 2019-05-29 | 2023-04-14 | 华南师范大学 | Information hiding method based on big data and dynamic time recursion and robot system |
CN110530876A (en) * | 2019-09-04 | 2019-12-03 | 西南交通大学 | Insulator dirty degree development prediction method based on shot and long term Memory Neural Networks |
CN110530876B (en) * | 2019-09-04 | 2020-08-18 | 西南交通大学 | Insulator pollution degree development prediction method based on long-term and short-term memory neural network |
CN110555413A (en) * | 2019-09-05 | 2019-12-10 | 第四范式(北京)技术有限公司 | method and device for processing time sequence signal, equipment and readable medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108256626A (en) | Time series prediction method and device | |
CN108665175A (en) | A kind of processing method, device and the processing equipment of insurance business risk profile | |
CN107818344A (en) | The method and system that user behavior is classified and predicted | |
Majhi et al. | New robust forecasting models for exchange rates prediction | |
CN110147911B (en) | Social influence prediction model and prediction method based on content perception | |
CN108090686B (en) | Medical event risk assessment analysis method and system | |
CN109492838A (en) | A kind of stock index price expectation method based on deep-cycle neural network | |
CN110264270A (en) | A kind of behavior prediction method, apparatus, equipment and storage medium | |
CN109829478A (en) | One kind being based on the problem of variation self-encoding encoder classification method and device | |
CN111666494A (en) | Clustering decision model generation method, clustering processing method, device, equipment and medium | |
CN107909141A (en) | A kind of data analysing method and device based on grey wolf optimization algorithm | |
CN111369258A (en) | Entity object type prediction method, device and equipment | |
CN112102011A (en) | User grade prediction method, device, terminal and medium based on artificial intelligence | |
Lee et al. | Hidden markov models for forex trends prediction | |
Pradeepkumar et al. | Forex rate prediction using chaos, neural network and particle swarm optimization | |
CN112767190B (en) | Method and device for identifying phase sequence of transformer area based on multilayer stacked neural network | |
CN113110961A (en) | Equipment abnormality detection method and device, computer equipment and readable storage medium | |
Nagamani et al. | Dissipativity and passivity analysis of Markovian jump impulsive neural networks with time delays | |
Chaturvedi | Soft computing techniques and their applications | |
CN115409262A (en) | Railway data center key performance index trend prediction method and abnormity identification method | |
CN114090797A (en) | Intelligent recommendation-based component retrieval method and device | |
CN107463564A (en) | The characteristic analysis method and device of data in server | |
Cyranka et al. | Unified Long-Term Time-Series Forecasting Benchmark | |
CN109145254A (en) | A kind of calculation method and calculating equipment of accuracy rate | |
Aguilar-Ruiz et al. | Generation of management rules through system dynamics and evolutionary computation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180706 |