Enhancing Multivariate Time Series Classifiers Through Self-Attention and Relative Positioning Infusion


Abstract

Time Series Classification (TSC) is an important and challenging task for many visual computing applications. Despite the extensive range of methods developed for TSC, only a few are based on Deep Neural Networks (DNNs). In this paper, we present two novel attention blocks, Global Temporal Attention (GTA) and Temporal Pseudo-Gaussian augmented Self-attention (TPS), that can enhance deep learning-based TSC approaches, even when such approaches are designed and optimized for specific datasets or tasks. We validate the performance of the proposed blocks using multiple state-of-the-art deep learning-based TSC models on the University of East Anglia (UEA) benchmark, a standardized collection of 30 Multivariate Time Series Classification (MTSC) datasets. We demonstrate that adding the proposed attention blocks increases the base models' average accuracy by up to 3.6%. Additionally, the proposed TPS block uses a new injection module to incorporate relative positional information into transformers. As a standalone unit with lower computational complexity, TPS outperforms most state-of-the-art DNN-based TSC methods. The source code for our setups and the attention blocks is publicly available (https://github.com/mehryar72/TimeSeriesClassification-TPS).
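The core idea of injecting relative positional information into self-attention can be sketched as an additive pseudo-Gaussian bias on the attention logits, so that time steps attend more strongly to their temporal neighbors. The function names, the single-head NumPy formulation, and the fixed `sigma` below are illustrative assumptions for a minimal sketch, not the paper's actual TPS implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pseudo_gaussian_bias(T, sigma=2.0):
    # Bias matrix over relative positions i - j: zero on the diagonal,
    # decaying like the log of an (unnormalized) Gaussian with distance.
    idx = np.arange(T)
    rel = idx[None, :] - idx[:, None]
    return -(rel ** 2) / (2.0 * sigma ** 2)

def attention_with_relative_bias(X, Wq, Wk, Wv, sigma=2.0):
    # Single-head self-attention over a (T, d) series; the relative
    # positional bias is injected additively into the logits before softmax.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    logits = Q @ K.T / np.sqrt(K.shape[-1])
    logits = logits + pseudo_gaussian_bias(X.shape[0], sigma)
    return softmax(logits) @ V
```

Because the bias is added before the softmax rather than learned per pair of absolute positions, it adds no parameters in this sketch and keeps the attention computation's complexity unchanged.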

Citation (APA)

Abbasi, M., & Saeedi, P. (2024). Enhancing Multivariate Time Series Classifiers Through Self-Attention and Relative Positioning Infusion. IEEE Access, 12, 67273–67290. https://doi.org/10.1109/ACCESS.2024.3397783
