Latent Diffusion Transformer for Probabilistic Time Series Forecasting

Citations: 35
Readers: 32 (Mendeley users who have this article in their library)

Abstract

Probabilistic forecasting of multivariate time series is a notoriously challenging yet practical task. This work proposes to recast high-dimensional multivariate time series forecasting as a problem of time series generation in a latent space, which improves the expressiveness of each timestamp and makes forecasting more manageable. To address the difficulty that existing approaches do not scale well to high-dimensional multivariate time series, we present the Latent Diffusion Transformer (LDT), a latent multivariate time series diffusion framework consisting of a symmetric statistics-aware autoencoder and a diffusion-based conditional generator. Through careful design, the time series autoencoder compresses multivariate timestamp patterns into concise latent representations while accounting for dynamic statistics. The diffusion-based conditional generator then efficiently produces realistic multivariate timestamp values in this continuous latent space under a novel self-conditioning guidance, modeled in a non-autoregressive way. Extensive experiments demonstrate that our model achieves state-of-the-art performance on many popular high-dimensional multivariate time series datasets.
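For readers who want a concrete picture of the pipeline the abstract describes (compress timestamps into a latent space, denoise conditionally in that space with self-conditioning, then decode), the following is a minimal, illustrative PyTorch sketch. The module names, dimensions, noise schedule, and conditioning scheme are assumptions made for demonstration only; they are not the authors' LDT implementation.

```python
import torch
import torch.nn as nn

class SeriesAutoencoder(nn.Module):
    """Compresses each multivariate timestamp into a low-dimensional latent vector."""
    def __init__(self, n_vars: int, latent_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_vars, 128), nn.GELU(), nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.GELU(), nn.Linear(128, n_vars))

    def encode(self, x):   # x: (batch, time, n_vars)
        return self.encoder(x)

    def decode(self, z):   # z: (batch, time, latent_dim)
        return self.decoder(z)

class LatentDenoiser(nn.Module):
    """Transformer that predicts clean latents from noisy latents, conditioned on a
    summary of the encoded history and on its own previous estimate (self-conditioning)."""
    def __init__(self, latent_dim: int, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Inputs: noisy latent + history summary + self-conditioning estimate + diffusion time.
        self.in_proj = nn.Linear(3 * latent_dim + 1, latent_dim)
        layer = nn.TransformerEncoderLayer(latent_dim, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.out_proj = nn.Linear(latent_dim, latent_dim)

    def forward(self, z_noisy, history_summary, z_self_cond, t):
        t_emb = t.view(-1, 1, 1).expand(-1, z_noisy.size(1), 1)
        h = torch.cat([z_noisy, history_summary, z_self_cond, t_emb], dim=-1)
        return self.out_proj(self.backbone(self.in_proj(h)))

if __name__ == "__main__":
    batch, hist_len, horizon, n_vars, latent_dim = 8, 96, 24, 100, 16
    ae = SeriesAutoencoder(n_vars, latent_dim)
    denoiser = LatentDenoiser(latent_dim)

    history = torch.randn(batch, hist_len, n_vars)
    future = torch.randn(batch, horizon, n_vars)

    # Encode history and future into the latent space; the whole horizon is
    # generated jointly (non-autoregressively), conditioned on the history.
    z_hist = ae.encode(history)
    cond = z_hist.mean(dim=1, keepdim=True).expand(-1, horizon, -1)
    z0 = ae.encode(future)

    # One DDPM-style training step: corrupt the target latents and predict the
    # clean ones, feeding the model's first-pass estimate back in as self-conditioning.
    t = torch.rand(batch)                                  # continuous diffusion time in [0, 1]
    noise = torch.randn_like(z0)
    alpha = (1.0 - t).view(-1, 1, 1)
    z_t = alpha.sqrt() * z0 + (1.0 - alpha).sqrt() * noise

    with torch.no_grad():
        z_est = denoiser(z_t, cond, torch.zeros_like(z_t), t)   # first pass, no self-conditioning
    z_pred = denoiser(z_t, cond, z_est, t)                      # second pass uses the estimate
    loss = nn.functional.mse_loss(z_pred, z0) + nn.functional.mse_loss(ae.decode(z0), future)
    print(loss.item())
```

At sampling time the same denoiser would be applied iteratively from pure noise, reusing the previous step's clean-latent estimate as the self-conditioning input before decoding the final latents back to timestamp values.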

Citation (APA)

Feng, S., Miao, C., Zhang, Z., & Zhao, P. (2024). Latent Diffusion Transformer for Probabilistic Time Series Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 38, pp. 11979–11987). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v38i11.29085
