TEDformer: Temporal Feature Enhanced Decomposed Transformer for Long-Term Series Forecasting

Abstract

In recent years, Transformer-based models have achieved strong results in time series analysis and its applications. In particular, the introduction of Autoformer further improved model performance in long-term sequence forecasting. However, Transformer-based models such as Autoformer do not fully exploit the local temporal features of a sequence, nor do they address the impact of sequence anomalies on decomposition or the handling of the trend component. To address these issues, we combine the strong performance of the temporal convolutional network (TCN) on time series data with the advantages of STL's inner-outer loop decomposition to design TEDformer, a Transformer forecasting model enhanced with global and local temporal features. The model decomposes the time series into trend and seasonal components using STL and extracts temporal features from each. Experiments on six real-world datasets show that our model improves on state-of-the-art models by 10.8% on multivariate datasets and 15.7% on univariate datasets.

Citation (APA)

Fan, J., Wang, B., & Bian, D. (2025). TEDformer: Temporal Feature Enhanced Decomposed Transformer for Long-Term Series Forecasting. IEEE Access, 13, 120821–120829. https://doi.org/10.1109/ACCESS.2023.3287893
