Attention-Based Neural Bag-of-Features Learning for Sequence Data


Abstract

In this paper, we propose 2D-Attention (2DA), a generic attention formulation for sequence data that acts as a complementary computation block, detecting and focusing on the sources of information most relevant to the given learning objective. The proposed attention module is incorporated into the recently proposed Neural Bag-of-Features (NBoF) model to enhance its learning capacity. Since 2DA acts as a plug-in layer, injecting it at different computation stages of the NBoF model yields different 2DA-NBoF architectures, each with its own interpretation. We conducted extensive experiments on financial forecasting, audio analysis, and medical diagnosis problems to benchmark the proposed formulations against existing methods, including the widely used Gated Recurrent Units. Our empirical analysis shows that the proposed attention formulations not only improve the performance of NBoF models but also make them resilient to noisy data.
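The abstract does not reproduce the paper's equations, so the following is a minimal PyTorch sketch of the general idea only: an attention mask computed over one axis of a (features x time) input, softmax-normalized and blended with an identity path, feeding a standard soft-quantization bag-of-features layer. All names (Attention2D, NBoF, attend_time) and the specific scoring and blending choices are illustrative assumptions, not the authors' exact formulation or code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Attention2D(nn.Module):
    """Sketch of a generic 2D attention block over a (features x time) input.

    A learned square matrix scores the input, a softmax along the chosen
    axis turns the scores into a mask, and a learnable coefficient blends
    the masked input with the identity path. Illustrative assumption, not
    the paper's exact math.
    """

    def __init__(self, n_features: int, attend_time: bool = True):
        super().__init__()
        self.W = nn.Parameter(0.01 * torch.randn(n_features, n_features))
        self.mix = nn.Parameter(torch.zeros(1))  # blend weight, squashed to (0, 1)
        self.attend_time = attend_time

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, D, T) -- D features observed over T time steps
        scores = torch.einsum('de,bet->bdt', self.W, x)        # (batch, D, T)
        mask = F.softmax(scores, dim=-1 if self.attend_time else 1)
        lam = torch.sigmoid(self.mix)
        return lam * (x * mask) + (1.0 - lam) * x


class NBoF(nn.Module):
    """Minimal neural bag-of-features layer: soft-assign each time step to
    K learned codewords, then average the assignments over time into a
    K-dimensional histogram."""

    def __init__(self, n_features: int, n_codewords: int):
        super().__init__()
        self.codewords = nn.Parameter(torch.randn(n_codewords, n_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, D, T) -> distances of each time step to each codeword
        cw = self.codewords.unsqueeze(0).expand(x.size(0), -1, -1)
        dist = torch.cdist(x.transpose(1, 2), cw)   # (batch, T, K)
        memberships = F.softmax(-dist, dim=-1)      # soft quantization
        return memberships.mean(dim=1)              # (batch, K) histogram


# Toy 2DA-NBoF pipeline: attention on the raw sequence ("plug-in" stage),
# then the BoF histogram, then a linear classifier head.
model = nn.Sequential(
    Attention2D(n_features=40, attend_time=True),
    NBoF(n_features=40, n_codewords=64),
    nn.Linear(64, 3),
)
logits = model(torch.randn(8, 40, 15))  # batch of 8 -> (8, 3) class scores
```

Because the attention block preserves the input's shape, the same module could in principle be injected before or after other stages of the pipeline, which is the sense in which the paper describes different 2DA-NBoF variants.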

Citation (APA)

Tran, D. T., Passalis, N., Tefas, A., Gabbouj, M., & Iosifidis, A. (2022). Attention-Based Neural Bag-of-Features Learning for Sequence Data. IEEE Access, 10, 45542–45552. https://doi.org/10.1109/ACCESS.2022.3169776
