CN112185104A - Traffic big data restoration method based on adversarial autoencoder - Google Patents

Traffic big data restoration method based on adversarial autoencoder

Info

Publication number
CN112185104A
Authority
CN
China
Prior art keywords
data
generator
matrix
traffic data
traffic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010855606.9A
Other languages
Chinese (zh)
Other versions
CN112185104B (en)
Inventor
张伟斌
张蒲璘
余英豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN202010855606.9A
Publication of CN112185104A
Application granted
Publication of CN112185104B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/0104 - Measuring and analyzing of parameters relative to traffic conditions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/0104 - Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 - Traffic data processing
    • G08G1/0129 - Traffic data processing for creating historical data or processing based on historical data

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a traffic big data restoration method based on an adversarial autoencoder, which comprises the following steps: determining a road section needing traffic data restoration and collecting historical traffic data of the road section; constructing a mask matrix based on the historical traffic data; constructing an adversarial neural network, comprising: constructing an autoencoder model combined with an attention mechanism as the generator, and constructing a discriminator combined with the attention mechanism; training the adversarial neural network based on the historical traffic data and the mask matrix to generate a data restoration model; and repairing the traffic data acquired in real time on the road section by using the data restoration model. The invention introduces structures such as an autoencoder, a multi-head attention mechanism and a clue matrix on the basis of the GAN, effectively learns the distribution characteristics of traffic big data by utilizing the adversarial network structure, generates complete traffic data from the missing traffic data by utilizing the autoencoder, and can effectively improve the accuracy of model data restoration by utilizing the multi-head attention mechanism and the clue matrix.

Description

Traffic big data restoration method based on adversarial autoencoder
Technical Field
The invention relates to the technical field of traffic data restoration, in particular to a traffic big data restoration method based on an adversarial autoencoder.
Background
Many practical traffic analysis and model deployments involve analyzing time series data with spatially distributed characteristics acquired by multiple sensors, such as geotagged air temperature data collected by temperature sensors, air pollutant monitoring data, and traffic data collected by road traffic sensors. Owing to sensor failures, communication errors, storage losses and the like, data acquired by sensors inevitably suffer from missing values. Missing data in turn degrades downstream tasks such as data classification, regression prediction and traffic control optimization. Because data can be missing in many different patterns, repairing the data is a very challenging task: a suitable algorithm must be designed to extract the variation law of the data from multidimensional data and, in particular, to discover the interdependencies among the data.
Traditional data restoration methods tend to have their own limitations. For example, statistical restoration methods that fill missing values with the mean, mode or median tend to ignore the interdependencies in the data, while statistics-based machine learning models impose strong constraints on the data, such as assumptions of linearity or smoothness.
Dalca et al. propose a variational approximate learning algorithm based on convolutional neural networks (CNN) and sparsity awareness. The HI-VAE algorithm proposed by Nazabal et al., an improved algorithm based on variational self-encoding, can accurately fill in many kinds of missing data. Fortuin et al. propose GP-VAE, a new deep sequential latent variable model for dimensionality reduction and data restoration. Models based on recurrent neural networks (RNN) often assume that the relationships between data are purely sequential; they cannot be processed in parallel, and it is difficult for them to directly model the interdependencies between input data with different time stamps.
Disclosure of Invention
The purpose of the present invention is to provide a method for repairing missing traffic flow data using an adversarial autoencoder based on the self-attention mechanism, so as to overcome the defects of the prior art. The method can effectively capture the correlations among different sensors and can generate repair data that approximates the real data distribution through adversarial training.
The technical solution for realizing the purpose of the invention is as follows: a traffic big data restoration method based on an adversarial autoencoder, comprising the following steps:
step 1, determining a road section needing traffic data restoration, and collecting historical traffic data of the road section;
step 2, constructing a mask matrix based on the historical traffic data;
step 3, constructing an adversarial neural network, comprising: constructing an autoencoder model combined with an attention mechanism as the generator, and constructing a discriminator combined with the attention mechanism;
step 4, training the adversarial neural network based on the historical traffic data and the mask matrix to generate a data restoration model;
step 5, repairing the traffic data acquired in real time on the road section by using the data restoration model.
Further, the step 2 of constructing a mask matrix based on the historical traffic data includes:
step 2-1, constructing a traffic data matrix, wherein the data of the ith row and the jth column of the matrix represent historical traffic data acquired by a jth traffic data acquisition device at the ith time point or time period;
and 2-2, constructing a mask matrix, wherein the dimension of the matrix is the same as that of the traffic data matrix, if data are missing at a certain position in the traffic data matrix, setting the corresponding position in the mask matrix as 0, and otherwise, setting the corresponding position as 1.
Further, the generator in step 3 comprises a position coding module and N groups of first modules arranged in sequence, wherein each first module comprises a multi-head attention structure and a fully-connected neural network connected in sequence; N ≥ 1; the input and output of the multi-head attention structure are added and the sum is input into the fully-connected neural network;
the position coding module uses sampled values of sine and cosine functions with different frequencies as the position coding information:
PE(pos, 2i) = sin(pos / 10000^(2i/d_model))
PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))
where pos is the position of the input data, i represents the dimension, and d_model represents the length of the time dimension of the input; PE(pos, 2i) and PE(pos, 2i+1) respectively represent the 2i-th and (2i+1)-th position codes;
the multi-head attention structure comprises a plurality of scaled dot-product attention mechanisms;
the discriminator comprises M groups of first modules and two fully-connected layers arranged in sequence, and outputs a score value representing the attribute of the input; M ≥ 1.
Further, in step 4, the adversarial neural network is trained based on the historical traffic data and the mask matrix to generate a data restoration model, and the specific process includes:
step 4-1, setting the loss function of the discriminator: the loss is formed from the expectation (denoted E) of the discriminator score D(x) over the real data x, drawn from the real data distribution P_r, and the expectation of the discriminator score over the repaired data, drawn from the repaired data distribution P_c, where the repaired data is the traffic data matrix repaired by the generator G and m is the mask matrix; during the training of the discriminator the learning parameters of the generator's neural network are frozen, and the discriminator is optimized to raise the score it assigns to real data and to lower the score it assigns to repaired data;
step 4-2, setting the loss function of the generator: the loss combines a reconstruction loss and a distribution loss, where the reconstruction loss is the mean squared error (MSE) between the truly collected data x' = x ⊙ m in the traffic data matrix and the data produced by the generator at the corresponding positions, and the distribution loss is the score value output by the discriminator for the generator's repaired data; during the training of the generator the learning parameters of the discriminator's neural network are frozen; ⊙ denotes the Hadamard (element-wise) product;
step 4-3, training the generator: inputting real data, namely a traffic data matrix and a mask matrix, into a generator, and outputting a repaired traffic data matrix;
step 4-4, training a discriminator: adding noise to data in the traffic data matrix to be used as a positive sample, using the repaired traffic data matrix as a negative sample, inputting the positive sample and the negative sample into a discriminator, and outputting a judgment result, namely a score value output to a generator;
and repeating steps 4-3 to 4-4 so that the generator and the discriminator continuously play against each other and are optimized, until the data restoration precision reaches a preset threshold value.
Compared with the prior art, the invention has the following remarkable advantages: 1) the GAN network structure is used to discriminate between the generated traffic data and the real data, forcing the generator to produce data closer to the real distribution; 2) compared with other data restoration methods, the method achieves higher restoration accuracy under three data missing modes: time-dimension missing, space-dimension missing and block missing; 3) the unsupervised learning mode effectively handles the incomplete and missing historical data encountered in practice; 4) the attention mechanism assigns different weights to sensors at different positions and to data at different times, thereby achieving accurate restoration.
The present invention is described in further detail below with reference to the attached drawing figures.
Drawings
FIG. 1 is a flow chart of the traffic big data restoration method based on an adversarial autoencoder in one embodiment.
FIG. 2 is a diagram of an exemplary embodiment of an autoencoder model.
FIG. 3 is a diagram of a scaled dot product attention mechanism in one embodiment.
FIG. 4 is a diagram of a multi-headed attention structure in one embodiment.
Fig. 5 is a diagram showing the structures of the generator and the discriminator in one embodiment, in which (a) and (b) show the structure of the generator and the structure of the discriminator, respectively.
FIG. 6 is a graph comparing the results of the repair model in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, in conjunction with fig. 1, there is provided a traffic big data restoration method based on an adversarial autoencoder, the method comprising the following steps:
step 1, determining a road section needing traffic data restoration, and collecting historical traffic data of the road section;
here, the data may be collected at a fixed time step.
Here, the historical traffic data includes road flow, speed, and occupancy.
Step 2, constructing a mask matrix based on the historical traffic data;
step 3, constructing an adversarial neural network, comprising: constructing an autoencoder model combined with an attention mechanism as the generator, and constructing a discriminator combined with the attention mechanism;
step 4, training the adversarial neural network based on the historical traffic data and the mask matrix to generate a data restoration model;
step 5, repairing the traffic data acquired in real time on the road section by using the data restoration model.
Further, in one embodiment, with reference to fig. 2, step 2 of constructing a mask matrix based on the historical traffic data includes the following sub-steps (an illustrative construction sketch is given after step 2-2):
step 2-1, constructing a traffic data matrix, wherein the data of the ith row and the jth column of the matrix represent historical traffic data acquired by a jth traffic data acquisition device at the ith time point or time period;
and 2-2, constructing a mask matrix, wherein the dimension of the matrix is the same as that of the traffic data matrix, if data are missing at a certain position in the traffic data matrix, setting the corresponding position in the mask matrix as 0, and otherwise, setting the corresponding position as 1.
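As an illustration of steps 2-1 and 2-2, the following minimal Python sketch builds the traffic data matrix and the mask matrix from a small array of sensor readings. The array contents, the NaN convention for missing measurements and the variable names are illustrative assumptions, not taken from the patent.

import numpy as np

# Rows = acquisition time points, columns = traffic data acquisition devices.
# NaN marks a missing measurement (illustrative convention, not from the patent).
raw_readings = np.array([
    [120.0,  98.0, np.nan],
    [115.0, np.nan, 87.0],
    [130.0, 101.0,  90.0],
])

# Step 2-2: mask matrix with the same dimensions as the traffic data matrix,
# 0 where data are missing and 1 otherwise.
mask = (~np.isnan(raw_readings)).astype(np.float32)

# Step 2-1: traffic data matrix, with missing positions filled with 0 so that
# the generator receives a complete numeric input.
traffic_matrix = np.nan_to_num(raw_readings, nan=0.0)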
Further, in one embodiment, with reference to fig. 5(a), the generator in step 3 comprises a position coding module and N groups of first modules arranged in sequence, wherein each first module comprises a multi-head attention structure and a fully-connected neural network connected in sequence; N ≥ 1; the input and output of the multi-head attention structure are added and the sum is input into the fully-connected neural network;
here, N is preferably 3.
The position coding module uses sampled values of sine and cosine functions with different frequencies as the position coding information:
PE(pos, 2i) = sin(pos / 10000^(2i/d_model))
PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))
where pos is the position of the input data, i represents the dimension, and d_model represents the length of the time dimension of the input; PE(pos, 2i) and PE(pos, 2i+1) respectively represent the 2i-th and (2i+1)-th position codes;
this function was chosen because it allows the model to easily learn to participate in the computation by relative position, for any fixed offset k, PE(pos+k,i)Can be expressed as PE(pos,i)A function of (a). In the traffic data restoration problem, different position codes are applied to different acquisition moments in one day, so that the self-coding network can use the information of the extracted position codes for constructing missing parts of input data, the position input information can be used as one type of condition information generated by model data to be input, and the missing data generation problem is converted into a similar condition data generation problem.
With reference to fig. 4, the multi-head attention structure includes a plurality of scaled dot-product attention mechanisms; here it preferably includes 6 attention functions. The scaled dot-product attention mechanism, shown in FIG. 3, is a special attention mechanism: from the input, "query" and "key" vectors of dimension d_k and "value" vectors of dimension d_v are computed. The dot products of the "query" and "key" vectors are divided by √d_k to obtain the attention of the current input to the inputs at each other time point, a softmax function is applied to normalize them into attention weights, and finally the corresponding attention weights are multiplied by the value vectors to obtain the attention output.
In actual computation, the attention over the whole set of input vectors can be computed in parallel, which is represented as:
Attention(Q, K, V) = softmax(QK^T / √d_k) V
For each mapped query, key and value, the model executes the attention functions in parallel, and each attention function produces an output value of dimension d_v. These values are concatenated and projected again with a weight matrix to obtain the final attention feature map: the queries, keys and values are mapped with h different linear transformations to d_k, d_k and d_v dimensions respectively, and finally the outputs are concatenated and mapped back to d_model dimensions by a weighted projection. The calculation process can be expressed as:
MultiHead(Q, K, V) = Concat(head_1, head_2, ..., head_h) W^O
head_i = Attention(Q W_i^Q, K W_i^K, V W_i^V)
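The scaled dot-product attention and multi-head attention described above follow the standard Transformer formulation. The PyTorch sketch below illustrates the two computations; the head count, layer sizes and tensor shapes are illustrative assumptions rather than values fixed by the patent.

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q, k: (..., seq_len, d_k); v: (..., seq_len, d_v)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5   # query-key dot products scaled by sqrt(d_k)
    weights = F.softmax(scores, dim=-1)             # attention weights over the time positions
    return weights @ v                              # weighted sum of the value vectors

class MultiHeadAttention(torch.nn.Module):
    def __init__(self, d_model, n_heads):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.w_q = torch.nn.Linear(d_model, d_model)   # h stacked W_i^Q projections
        self.w_k = torch.nn.Linear(d_model, d_model)   # h stacked W_i^K projections
        self.w_v = torch.nn.Linear(d_model, d_model)   # h stacked W_i^V projections
        self.w_o = torch.nn.Linear(d_model, d_model)   # output projection W^O

    def forward(self, x):
        b, t, _ = x.shape
        def split(y):   # (b, t, d_model) -> (b, n_heads, t, d_head)
            return y.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        heads = scaled_dot_product_attention(split(self.w_q(x)), split(self.w_k(x)), split(self.w_v(x)))
        concat = heads.transpose(1, 2).reshape(b, t, -1)   # Concat(head_1, ..., head_h)
        return self.w_o(concat)

For example, MultiHeadAttention(d_model=64, n_heads=4) applied to a tensor of shape (batch, time, 64) returns a tensor of the same shape.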
with reference to fig. 5(b), the discriminator includes M groups of first modules and two layers of fully connected networks, the output is a score value, which represents the attribute of the input, and the larger the output value is, the more the discriminator deems the input at this time is the more "true"; m is more than or equal to 1.
Here, preferably, M takes 3.
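Putting the pieces together, the sketch below outlines one possible PyTorch arrangement of the generator (position coding followed by N first modules) and the discriminator (M first modules followed by two fully connected layers that output a score). It relies on torch.nn.MultiheadAttention for brevity; the hidden sizes, the ReLU activation and the temporal mean-pooling before the score head are assumptions that the patent does not specify.

import torch
import torch.nn as nn

class FirstModule(nn.Module):
    # One "first module": multi-head attention whose input and output are added,
    # with the sum fed into a fully connected neural network.
    def __init__(self, d_model, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                 nn.Linear(d_model, d_model))

    def forward(self, x):
        attn_out, _ = self.attn(x, x, x)
        return self.ffn(x + attn_out)

class Generator(nn.Module):
    # Position coding followed by N first modules (N = 3 in the preferred embodiment).
    def __init__(self, d_model, n_blocks=3):
        super().__init__()
        self.blocks = nn.Sequential(*[FirstModule(d_model) for _ in range(n_blocks)])

    def forward(self, x, pe):
        # x: (batch, time, d_model) traffic data matrix; pe: precomputed position codes
        return self.blocks(x + pe)

class Discriminator(nn.Module):
    # M first modules followed by two fully connected layers producing one score per input.
    def __init__(self, d_model, n_blocks=3):
        super().__init__()
        self.blocks = nn.Sequential(*[FirstModule(d_model) for _ in range(n_blocks)])
        self.head = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                  nn.Linear(d_model, 1))

    def forward(self, x):
        return self.head(self.blocks(x).mean(dim=1))   # pool over time, then score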
Further, in one embodiment, the specific process of training the adversarial neural network based on the historical traffic data and the mask matrix in step 4 to generate a data restoration model includes:
step 4-1, in a standard GAN the output of the generator is judged by the discriminator to be either completely real or completely fake, whereas in the data restoration problem the output is composed of a part of real components and a part of generated components. The discriminator therefore attempts to tell which inputs are completely real (observed) and which contain a generated component (repaired). To reflect this, the loss function of the discriminator is formed from the expectation (denoted E) of the discriminator score D(x) over the real data x, drawn from the real data distribution P_r, and the expectation of the discriminator score over the repaired data, drawn from the repaired data distribution P_c, where the repaired data is the traffic data matrix repaired by the generator G and m is the mask matrix; during the training of the discriminator the learning parameters of the generator's neural network are frozen. The goal of the discriminator is thus to raise as much as possible the score it gives to real samples (toward a maximum of 1) and to lower the score it gives to the repaired samples from the generator (toward a minimum of 0).
step 4-2, setting the loss function of the generator: the loss combines a reconstruction loss and a distribution loss. The reconstruction loss is the mean squared error (MSE) between the truly collected data x' = x ⊙ m in the traffic data matrix and the data produced by the generator at the corresponding positions, which measures how faithfully the known values are reconstructed; the distribution loss is the score value output by the discriminator for the generator's repaired data. During the training of the generator the learning parameters of the discriminator's neural network are frozen; ⊙ denotes the Hadamard (element-wise) product;
step 4-3, training the generator: inputting real data, namely a traffic data matrix and a mask matrix, into a generator, and outputting a repaired traffic data matrix;
step 4-4, training a discriminator: adding noise to data in the traffic data matrix to be used as a positive sample, using the repaired traffic data matrix as a negative sample, inputting the positive sample and the negative sample into a discriminator, and outputting a judgment result, namely a score value output to a generator;
and repeating steps 4-3 to 4-4 so that the generator and the discriminator continuously play against each other and are optimized, until the data restoration precision reaches a preset threshold value.
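To make the alternating optimization of steps 4-3 and 4-4 concrete, the following Python/PyTorch sketch shows one training iteration. The exact loss expressions, the adversarial weighting factor lambda_adv, the noise scale used for the positive samples, and the generator/discriminator interfaces (for example, the sketches given earlier) are assumptions made for illustration only, not the patent's exact formulas.

import torch

def train_step(G, D, opt_g, opt_d, x, m, pe, lambda_adv=0.1):
    # x: traffic data matrix with zeros at missing positions, m: mask matrix (1 = observed).
    # ---- step 4-3: train the generator while the discriminator's parameters are frozen ----
    for p in D.parameters():
        p.requires_grad_(False)
    x_gen = G(x, pe)                                     # generator output
    x_rep = m * x + (1 - m) * x_gen                      # repaired traffic data matrix
    recon = (((x - x_gen) * m) ** 2).sum() / m.sum()     # MSE on the truly observed entries
    adv = -D(x_rep).mean()                               # push the discriminator score on repaired data upward
    loss_g = recon + lambda_adv * adv
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    for p in D.parameters():
        p.requires_grad_(True)

    # ---- step 4-4: train the discriminator while the generator is frozen ----
    with torch.no_grad():
        x_rep = m * x + (1 - m) * G(x, pe)
    positives = x + 0.01 * torch.randn_like(x)           # real data with added noise (positive samples)
    loss_d = D(x_rep).mean() - D(positives).mean()       # lower score on repaired data, raise score on real data
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    return loss_g.item(), loss_d.item()

Repeating this step over the historical data realizes the alternating game between the generator and the discriminator described above.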
As a specific example, the model of the present invention is compared with several other data restoration models (historical average model, K-nearest-neighbor model, deep autoencoder, bidirectional recurrent neural network); the restoration accuracy is shown in fig. 6. As can be seen from fig. 6, the restoration accuracy of the method of the present invention is higher than that of the other data restoration models for different data missing rates.
The invention provides an attention-mechanism-based unsupervised method for repairing missing traffic data, which estimates the missing values in a multivariate time series by means of a generative adversarial network. The self-attention autoencoder generates a new complete sample closest to the original incomplete sample under the guidance of the discriminator loss and the squared-error loss on the known data, so the missing values can be estimated from the newly generated complete samples within a short training time. In addition, the introduction of the attention mechanism effectively improves the traffic flow data restoration effect: it captures the interdependencies between input data at different time stamps in a parallel computing manner, while the squared-error loss on the known partial data ensures the accuracy of the data produced by the generator.
The foregoing illustrates and describes the principles, general features, and advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the embodiments and the description only illustrate the principle of the present invention, and various changes and modifications may be made without departing from the spirit and scope of the present invention, all of which fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.

Claims (5)

1. A traffic big data restoration method based on an adversarial autoencoder, characterized by comprising the following steps:
step 1, determining a road section needing traffic data restoration, and collecting historical traffic data of the road section;
step 2, constructing a mask matrix based on the historical traffic data;
step 3, constructing an adversarial neural network, comprising: constructing an autoencoder model combined with an attention mechanism as the generator, and constructing a discriminator combined with the attention mechanism;
step 4, training the adversarial neural network based on the historical traffic data and the mask matrix to generate a data restoration model;
step 5, repairing the traffic data acquired in real time on the road section by using the data restoration model.
2. The traffic big data restoration method based on the adversarial autoencoder according to claim 1, wherein the historical traffic data in step 1 comprises road flow, speed and occupancy.
3. The traffic big data restoration method based on the adversarial autoencoder according to claim 1 or 2, characterized in that step 2 constructs a mask matrix based on the historical traffic data, and the specific process comprises:
step 2-1, constructing a traffic data matrix, wherein the data of the ith row and the jth column of the matrix represent historical traffic data acquired by a jth traffic data acquisition device at the ith time point or time period;
and 2-2, constructing a mask matrix, wherein the dimension of the matrix is the same as that of the traffic data matrix, if data are missing at a certain position in the traffic data matrix, setting the corresponding position in the mask matrix as 0, and otherwise, setting the corresponding position as 1.
4. The traffic big data restoration method based on the adversarial autoencoder according to claim 3, wherein the generator in step 3 comprises a position coding module and N groups of first modules arranged in sequence, wherein each first module comprises a multi-head attention structure and a fully-connected neural network connected in sequence; N ≥ 1; the input and output of the multi-head attention structure are added and the sum is input into the fully-connected neural network;
the position coding module uses sampled values of sine and cosine functions with different frequencies as the position coding information:
PE(pos, 2i) = sin(pos / 10000^(2i/d_model))
PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))
where pos is the position of the input data, i represents the dimension, and d_model represents the length of the time dimension of the input; PE(pos, 2i) and PE(pos, 2i+1) respectively represent the 2i-th and (2i+1)-th position codes;
the multi-head attention structure comprises a plurality of scaled dot-product attention mechanisms;
the discriminator comprises M groups of first modules and two fully-connected layers arranged in sequence, and outputs a score value representing the attribute of the input; M ≥ 1.
5. The traffic big data restoration method based on the adversarial autoencoder according to claim 4, wherein in step 4 the adversarial neural network is trained based on the historical traffic data and the mask matrix to generate a data restoration model, and the specific process comprises:
step 4-1, setting the loss function of the discriminator: the loss is formed from the expectation (denoted E) of the discriminator score D(x) over the real data x, drawn from the real data distribution P_r, and the expectation of the discriminator score over the repair data generated by the generator G, drawn from the repaired data distribution P_c, with m being the mask matrix; during the training of the discriminator the learning parameters of the generator's neural network are frozen;
step 4-2, setting the loss function of the generator: the loss combines a reconstruction loss and a distribution loss, where the reconstruction loss is the mean squared error (MSE) between the truly collected data x' = x ⊙ m in the traffic data matrix and the data produced by the generator at the corresponding positions, and the distribution loss is the score value output by the discriminator for the generator's repaired data; during the training of the generator the learning parameters of the discriminator's neural network are frozen; ⊙ denotes the Hadamard (element-wise) product;
step 4-3, training the generator: inputting real data, namely a traffic data matrix and a mask matrix, into a generator, and outputting a repaired traffic data matrix;
step 4-4, training a discriminator: adding noise to data in the traffic data matrix to be used as a positive sample, using the repaired traffic data matrix as a negative sample, inputting the positive sample and the negative sample into a discriminator, and outputting a judgment result, namely a score value output to a generator;
and repeating steps 4-3 to 4-4 so that the generator and the discriminator continuously play against each other and are optimized, until the data restoration precision reaches a preset threshold value.
CN202010855606.9A 2020-08-22 2020-08-22 Traffic big data restoration method based on countermeasure autoencoder Active CN112185104B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010855606.9A CN112185104B (en) 2020-08-22 2020-08-22 Traffic big data restoration method based on countermeasure autoencoder

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010855606.9A CN112185104B (en) 2020-08-22 2020-08-22 Traffic big data restoration method based on countermeasure autoencoder

Publications (2)

Publication Number Publication Date
CN112185104A true CN112185104A (en) 2021-01-05
CN112185104B CN112185104B (en) 2021-12-10

Family

ID=73925361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010855606.9A Active CN112185104B (en) 2020-08-22 2020-08-22 Traffic big data restoration method based on countermeasure autoencoder

Country Status (1)

Country Link
CN (1) CN112185104B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180293496A1 (en) * 2017-04-06 2018-10-11 Pixar Denoising monte carlo renderings using progressive neural networks
WO2019090213A1 (en) * 2017-11-03 2019-05-09 Siemens Aktiengesellschaft Segmenting and denoising depth images for recognition applications using generative adversarial neural networks
US20200151081A1 (en) * 2017-11-13 2020-05-14 The Charles Stark Draper Laboratory, Inc. Automated Repair Of Bugs And Security Vulnerabilities In Software
CN108520503A (en) * 2018-04-13 2018-09-11 湘潭大学 A method for restoring incomplete face images based on an autoencoder and a generative adversarial network
CN109492232A (en) * 2018-10-22 2019-03-19 内蒙古工业大学 A Mongolian-Chinese machine translation method with enhanced semantic feature information based on Transformer
CN109493599A (en) * 2018-11-16 2019-03-19 南京航空航天大学 A short-time traffic flow forecasting method based on a generative adversarial network
CN110018927A (en) * 2019-01-28 2019-07-16 北京工业大学 Traffic data restoration method based on a generative adversarial network
CN110288537A (en) * 2019-05-20 2019-09-27 湖南大学 Facial image completion method based on a deep generative adversarial network with self-attention
CN110597963A (en) * 2019-09-23 2019-12-20 腾讯科技(深圳)有限公司 Expression question-answer library construction method, expression search method, device and storage medium
CN110942624A (en) * 2019-11-06 2020-03-31 浙江工业大学 Road network traffic data restoration method based on SAE-GAN-SAD
CN110838288A (en) * 2019-11-26 2020-02-25 杭州博拉哲科技有限公司 Voice interaction method and system and dialogue equipment
CN111311729A (en) * 2020-01-18 2020-06-19 西安电子科技大学 Natural scene three-dimensional human body posture reconstruction method based on bidirectional projection network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
王万良, 李卓蓉: "Research progress of generative adversarial networks" (生成式对抗网络研究进展), Journal on Communications (《通信学报》) *
王力 et al.: "Road network traffic flow data completion method based on generative adversarial networks" (基于生成式对抗网络的路网交通流数据补全方法), Journal of Transportation Systems Engineering and Information Technology (《交通运输系统工程与信息》) *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112905379A (en) * 2021-03-10 2021-06-04 南京理工大学 Traffic big data restoration method based on graph self-encoder of self-attention mechanism
CN113190997A (en) * 2021-04-29 2021-07-30 贵州数据宝网络科技有限公司 Big data terminal data restoration method and system
CN113643564A (en) * 2021-07-27 2021-11-12 中国科学院深圳先进技术研究院 Parking data restoration method and device, computer equipment and storage medium
CN113643564B (en) * 2021-07-27 2022-08-26 中国科学院深圳先进技术研究院 Parking data restoration method and device, computer equipment and storage medium
CN114996625A (en) * 2022-04-26 2022-09-02 西南石油大学 Logging data completion method based on Bayesian optimization and self-encoder
CN115019510A (en) * 2022-06-29 2022-09-06 华南理工大学 Traffic data restoration method based on dynamic self-adaptive generation countermeasure network
CN115019510B (en) * 2022-06-29 2024-01-30 华南理工大学 Traffic data restoration method based on dynamic self-adaptive generation countermeasure network
CN115659797A (en) * 2022-10-24 2023-01-31 大连理工大学 Self-learning method of a generative adversarial multi-head attention neural network for aero-engine data reconstruction
CN115659797B (en) * 2022-10-24 2023-03-28 大连理工大学 Self-learning method of a generative adversarial multi-head attention neural network for aero-engine data reconstruction
WO2024087129A1 (en) * 2022-10-24 2024-05-02 大连理工大学 Generative adversarial multi-head attention neural network self-learning method for aero-engine data reconstruction
CN116542438A (en) * 2023-03-28 2023-08-04 大连海事大学 Bus passenger starting and stopping point estimation and repair method based on non-reference real phase
CN116542438B (en) * 2023-03-28 2024-01-30 大连海事大学 Bus passenger starting and stopping point estimation and repair method based on non-reference real phase

Also Published As

Publication number Publication date
CN112185104B (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN112185104B (en) Traffic big data restoration method based on adversarial autoencoder
Li et al. The emerging graph neural networks for intelligent fault diagnostics and prognostics: A guideline and a benchmark study
CN108960303B (en) Unmanned aerial vehicle flight data anomaly detection method based on LSTM
CN112101480B (en) Multivariate clustering and fused time sequence combined prediction method
JP2022524244A (en) Predictive classification of future behavior
CN110110809B (en) Fuzzy automaton construction method based on machine fault diagnosis
CN111768000A (en) Industrial process data modeling method for online adaptive fine-tuning deep learning
CN113485261B (en) CAEs-ACNN-based soft measurement modeling method
CN104634265B (en) A kind of mineral floating froth bed soft measurement method of thickness based on multiplex images Fusion Features
CN117784710B (en) Remote state monitoring system and method for numerical control machine tool
Yang et al. Remaining useful life prediction based on normalizing flow embedded sequence-to-sequence learning
CN115051929B (en) Network fault prediction method and device based on self-supervision target perception neural network
Chen et al. Discovering state variables hidden in experimental data
Mudronja et al. Data-based modelling of significant wave height in the Adriatic sea
Zhang et al. ES-Net: An integration model based on encoder–decoder and Siamese time series difference network for grade monitoring of zinc tailings and concentrate
CN113984389A (en) Rolling bearing fault diagnosis method based on multi-receptive-field and improved capsule map neural network
CN114239397A (en) Soft measurement modeling method based on dynamic feature extraction and local weighted deep learning
CN111061151B (en) Distributed energy state monitoring method based on multivariate convolutional neural network
CN117371321A (en) Internal plasticity depth echo state network soft measurement modeling method based on Bayesian optimization
CN116432359A (en) Variable topology network tide calculation method based on meta transfer learning
CN116068520A (en) Cognitive radar joint modulation recognition and parameter estimation method based on transducer
CN116662925A (en) Industrial process soft measurement method based on weighted sparse neural network
CN116403054A (en) Image optimization classification method based on brain-like network model
CN115359197A (en) Geological curved surface reconstruction method based on spatial autocorrelation neural network
CN114841063A (en) Aero-engine residual life prediction method based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant