Time-Aware Multi-Scale RNNs for Time Series Modeling

Abstract

Multi-scale information is crucial for modeling time series. Although most existing methods consider multiple scales in time-series data, they assume all scales are equally important for each sample, making them unable to capture the dynamic temporal patterns of time series. To this end, we propose Time-Aware Multi-Scale Recurrent Neural Networks (TAMS-RNNs), which disentangle representations of different scales and adaptively select the most important scale for each sample at each time step. First, the hidden state of the RNN is disentangled into multiple independently updated small hidden states, each updated at a different frequency to model the multi-scale information in the time series. Then, at each time step, the temporal context is used to modulate the features of different scales, selecting the most important scale. The proposed model can therefore adaptively capture the multi-scale information of each time series at each time step. Extensive experiments demonstrate that the model outperforms state-of-the-art methods on multivariate time series classification and human motion prediction tasks. Furthermore, visualized analysis on music genre recognition verifies the effectiveness of the model.
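The mechanism described in the abstract can be illustrated with a minimal sketch: one small hidden state per scale, each updated on its own clock, with a context-dependent soft selection over scales. Note that the update periods, parameter shapes, and the attention-style gate below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the real model's dimensions are task-dependent.
input_dim, hidden_dim, n_scales = 4, 8, 3
periods = [1, 2, 4]  # assumed: scale k updates every periods[k] steps

# Random parameters stand in for learned weights in this sketch.
W_in = [rng.standard_normal((hidden_dim, input_dim)) * 0.1 for _ in range(n_scales)]
W_h = [rng.standard_normal((hidden_dim, hidden_dim)) * 0.1 for _ in range(n_scales)]
W_att = rng.standard_normal((n_scales, n_scales * hidden_dim)) * 0.1

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def tams_rnn_step(t, x_t, h):
    """One time step: update each disentangled sub-state on its own
    schedule, then weight the scales using the temporal context."""
    for k in range(n_scales):
        if t % periods[k] == 0:  # scale k fires only on its clock ticks
            h[k] = np.tanh(W_in[k] @ x_t + W_h[k] @ h[k])
    # Soft selection over scales, conditioned on the concatenated sub-states.
    alpha = softmax(W_att @ np.concatenate(h))
    output = sum(alpha[k] * h[k] for k in range(n_scales))
    return h, alpha, output

h = [np.zeros(hidden_dim) for _ in range(n_scales)]
for t in range(8):
    x_t = rng.standard_normal(input_dim)
    h, alpha, output = tams_rnn_step(t, x_t, h)

print(output.shape, alpha)
```

Slower scales update less often and so retain longer-horizon information, while the softmax weights `alpha` let each time step emphasize whichever scale the current context makes most relevant.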

Citation (APA)

Chen, Z., Ma, Q., & Lin, Z. (2021). Time-Aware Multi-Scale RNNs for Time Series Modeling. In IJCAI International Joint Conference on Artificial Intelligence (pp. 2285–2291). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/315
