Robust LSTM With Tuned-PSO and Bifold-Attention Mechanism for Analyzing Multivariate Time-Series

Citations: 57
Mendeley readers: 109

This article is free to access.

Abstract

Accurate time-series forecasting is in high demand. LSTM has been applied to forecast time series, which are generated when variables are observed at discrete, equally spaced time intervals. However, choosing hyperparameters largely at random degrades prediction accuracy. This paper proposes an LSTM enhanced with a tuned PSO and a bifold-attention mechanism: PSO optimizes the LSTM hyperparameters, while the bifold-attention mechanism selects the optimal inputs for the LSTM. The main contribution is an accurate, adaptive, and robust time-series forecasting model, evaluated against ARIMA, MLP, LSTM, PSO-LSTM, A-LSTM, and PSO-A-LSTM. The comparison is based on each model's accuracy in forecasting the Beijing PM2.5, Beijing Multi-Site, Air Quality, Appliances Energy, Wind Speed, and Traffic Flow datasets. The proposed model, LSTM with tuned PSO and the bifold-attention mechanism, achieves lower MAPE and RMSE than the baselines; in other words, it outperforms all LSTM-based models in this study. Its accuracy holds across daily, weekly, and monthly multivariate time-series datasets. This innovation is valuable for time-series analysis research, particularly the application of deep learning to time-series forecasting.
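The abstract describes using PSO to tune LSTM hyperparameters. Below is a minimal, hypothetical sketch of that idea in Python, assuming a toy fitness function that stands in for the LSTM's validation RMSE; the paper's actual fitness function, search ranges, PSO variant, and bifold-attention details are not given here and are purely illustrative.

```python
# Minimal PSO sketch for LSTM hyperparameter tuning (illustrative only;
# the paper's exact fitness function, bounds, and PSO settings are assumed).
import numpy as np

rng = np.random.default_rng(0)

def fitness(params):
    """Stand-in for the validation RMSE of an LSTM trained with these
    hyperparameters (hidden units, learning rate). In practice this would
    train and evaluate an attention-LSTM and return its error."""
    units, lr = params
    # Hypothetical smooth surrogate with a minimum near units=64, lr=1e-3.
    return (units - 64) ** 2 / 1e3 + (np.log10(lr) + 3) ** 2

# Search space: [hidden units, learning rate]
lo, hi = np.array([8.0, 1e-4]), np.array([256.0, 1e-1])

n_particles, n_iter = 20, 50
w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, and social weights (assumed)

pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(f"best hyperparameters: units={gbest[0]:.0f}, lr={gbest[1]:.4g}")
```

Replacing the surrogate `fitness` with a real train-and-validate call on an attention-LSTM would reproduce the general PSO-tuning workflow the abstract refers to, though the specific hyperparameters searched by the authors are not stated here.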

Citation (APA)

Pranolo, A., Mao, Y., Wibawa, A. P., Utama, A. B. P., & Dwiyanto, F. A. (2022). Robust LSTM With Tuned-PSO and Bifold-Attention Mechanism for Analyzing Multivariate Time-Series. IEEE Access, 10, 78423–78434. https://doi.org/10.1109/ACCESS.2022.3193643
