THE MEMORY CONCEPT BEHIND DEEP NEURAL NETWORK MODELS: AN APPLICATION IN TIME SERIES FORECASTING IN THE E-COMMERCE SECTOR

Citations of this article: 3 · Mendeley readers: 12

Abstract

A good command of computational and statistical tools has proven advantageous when modelling and forecasting time series. According to recent literature, neural networks with long memory (e.g., Long Short-Term Memory) are a promising option among deep learning methods. However, few works also consider the computational cost of these architectures compared with simpler ones (e.g., the Multilayer Perceptron). This work aims to provide insight into the memory behaviour of several Deep Neural Network architectures and their computational complexity. Another goal is to evaluate whether choosing more complex architectures with higher computational costs is justified. Error metrics are used to assess the forecasting models' performance, alongside their computational cost. Two time series related to e-commerce retail sales in the US were selected: (i) sales volume; (ii) e-commerce sales as a percentage of total sales. Although both series exhibit changes in data dynamics, their other characteristics lead to different conclusions. "Long memory" yields significantly better forecasts for one of the time series, but not for the other.
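The comparison the abstract describes, pitting a "long memory" recurrent network against a simpler feed-forward baseline while tracking both forecast error and computational cost, can be illustrated with a minimal sketch. The snippet below is not the authors' code: it uses Keras, a synthetic stand-in series, and illustrative window length, layer sizes, and epoch counts, all of which are assumptions rather than details taken from the paper.

```python
# Minimal sketch (assumed setup, not the authors' implementation):
# compare an LSTM ("long memory") with an MLP baseline on a univariate
# series, recording forecast error (RMSE) and wall-clock training time.
import time
import numpy as np
import tensorflow as tf

def make_windows(series, window=12):
    """Turn a 1-D series into sliding-window (X, y) pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Synthetic stand-in for a monthly sales series (trend + seasonality + noise).
rng = np.random.default_rng(0)
t = np.arange(240, dtype="float32")
series = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(240).astype("float32")

X, y = make_windows(series, window=12)
X_train, y_train = X[:-24], y[:-24]
X_test, y_test = X[-24:], y[-24:]

def build_mlp(window=12):
    # Simple feed-forward baseline on the flattened window.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),
    ])

def build_lstm(window=12):
    # Recurrent model; expects input shaped (timesteps, features).
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, 1)),
        tf.keras.layers.LSTM(32),
        tf.keras.layers.Dense(1),
    ])

for name, model, Xtr, Xte in [
    ("MLP", build_mlp(), X_train, X_test),
    ("LSTM", build_lstm(), X_train[..., None], X_test[..., None]),
]:
    model.compile(optimizer="adam", loss="mse")
    start = time.perf_counter()
    model.fit(Xtr, y_train, epochs=50, verbose=0)
    elapsed = time.perf_counter() - start
    preds = model.predict(Xte, verbose=0).ravel()
    rmse = float(np.sqrt(np.mean((preds - y_test) ** 2)))
    print(f"{name}: RMSE={rmse:.3f}, training time={elapsed:.1f}s")
```

Running a loop like this makes the trade-off the paper investigates explicit: the recurrent model may or may not improve the error metric enough to justify its longer training time, depending on the characteristics of the series.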

Citation (APA)
Ramos, F. R., Pereira, M. T., Oliveira, M., & Rubio, L. (2023). THE MEMORY CONCEPT BEHIND DEEP NEURAL NETWORK MODELS: AN APPLICATION IN TIME SERIES FORECASTING IN THE E-COMMERCE SECTOR. Decision Making: Applications in Management and Engineering, 6(2), 668–690. https://doi.org/10.31181/dmame622023695
