Fully embedded time series generative adversarial networks

Research output: Contribution to journal › Article › peer-review


Abstract

Generative adversarial networks should produce synthetic data that fit the underlying distribution of the data being modeled. For real-valued time series data, this implies the need to capture not only the static distribution of the data but also its full temporal distribution over any potential time horizon. This temporal element produces a more complex problem that can leave current solutions under-constrained, unstable during training, or prone to varying degrees of mode collapse. In FETSGAN, entire sequences are translated directly to the generator's sampling space using a seq2seq style adversarial autoencoder, where adversarial training is used to match the training distribution in both the feature space and the lower-dimensional sampling space. This additional constraint provides a loose assurance that the temporal distribution of the synthetic samples will not collapse. In addition, the First Above Threshold operator is introduced to supplement the reconstruction of encoded sequences, which improves training stability and the overall quality of the synthetic data being generated. These contributions yield a significant improvement over the current state of the art for adversarial learners, in both qualitative measures of temporal similarity and the quantitative predictive ability of data generated through FETSGAN.
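The abstract describes an adversarial autoencoder that matches the training distribution at two levels: on full sequences in the feature space and on encoded sequences in a lower-dimensional sampling space. A minimal NumPy sketch of that two-level structure follows; all module names, dimensions, and the use of fixed random linear maps in place of trained seq2seq networks are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

T, F, Z = 24, 3, 8  # sequence length, feature dim, latent (sampling) dim

def linear(in_dim, out_dim):
    """Toy 'network': a fixed random linear map standing in for a trained model."""
    W = rng.standard_normal((in_dim, out_dim)) / np.sqrt(in_dim)
    return lambda x: x @ W

# Encoder maps a whole (flattened) sequence into the sampling space;
# decoder reconstructs the sequence from the latent code.
encode = linear(T * F, Z)
decode = linear(Z, T * F)

# Two critics: one on full sequences (feature space), one on latent codes
# (sampling space). Per the abstract, the extra sampling-space constraint is
# what loosely prevents the temporal distribution from collapsing.
critic_seq = linear(T * F, 1)
critic_lat = linear(Z, 1)

x = rng.standard_normal((16, T * F))    # batch of real sequences (flattened)
z = encode(x)                           # codes in the sampling space
x_hat = decode(z)                       # reconstructed sequences

recon_loss = np.mean((x - x_hat) ** 2)  # reconstruction term
d_seq = critic_seq(x_hat)               # adversarial signal, feature space
d_lat = critic_lat(z)                   # adversarial signal, sampling space

print(z.shape, x_hat.shape, d_seq.shape, d_lat.shape)
```

In a real training loop the two critics and the autoencoder would be optimized adversarially; here the forward pass only illustrates where the two distribution-matching constraints attach.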

Original language: English
Pages (from-to): 14885-14894
Number of pages: 10
Journal: Neural Computing and Applications
Volume: 36
Issue number: 24
DOIs
State: Published - Aug 2024
Externally published: Yes

Funding

Funding for this work was partially provided by the Collaborative Sciences Center for Road Safety (CSCRS), as well as the University of Tennessee, Knoxville.

Keywords

  • Adversarial autoencoder
  • Generative adversarial networks (GANs)
  • Synthetic time series data
