Learning Deep Attention Network from Incremental and Decremental Features for Evolving Features

Abstract

In many real-world machine learning problems, the feature set changes over time: some old features vanish, some new features are augmented, and the remaining features survive. In this paper, we propose the cross-feature attention network to handle such incremental and decremental features. The network is composed of multiple cross-feature attention encoding-decoding layers. In each layer, each data sample is first encoded as a combination of the other samples' vanished/augmented features, weighted by attention weights calculated from the survived features. The samples are then encoded as a combination of the survived features, weighted by attention weights calculated from the encoded vanished/augmented feature data. The encoded vanished/augmented/survived features are decoded and fed to the next cross-feature attention layer. In this way, the incremental and decremental features are bridged by paying attention to each other, and the gap between data samples with different feature sets is filled by the attention mechanism. The outputs of the cross-feature attention network are concatenated and fed to class-specific attention and global attention networks for classification. We evaluate the proposed network on benchmark data sets from computer vision, IoT, and bioinformatics with incremental and decremental features. Encouraging experimental results show the effectiveness of our algorithm.
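The sketch below illustrates one possible reading of a single cross-feature attention encoding-decoding layer in PyTorch. The layer dimensions, the scaled dot-product form of the attention, and all module and variable names are illustrative assumptions for exposition, not the authors' exact formulation.

```python
# Minimal sketch of one cross-feature attention encoding-decoding layer.
# Assumption: attention is computed as scaled dot products across samples;
# the paper's exact weighting scheme may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossFeatureAttentionLayer(nn.Module):
    def __init__(self, dim_survived, dim_changed, dim_hidden):
        super().__init__()
        # Encoders project survived and vanished/augmented features
        # into a shared hidden space.
        self.enc_survived = nn.Linear(dim_survived, dim_hidden)
        self.enc_changed = nn.Linear(dim_changed, dim_hidden)
        # Decoders map the attended representations back to feature space,
        # so layers can be stacked.
        self.dec_survived = nn.Linear(dim_hidden, dim_survived)
        self.dec_changed = nn.Linear(dim_hidden, dim_changed)

    def forward(self, x_survived, x_changed):
        # x_survived: (n, dim_survived) survived features of all n samples
        # x_changed:  (n, dim_changed)  vanished or augmented features
        h_s = self.enc_survived(x_survived)  # (n, d)
        h_c = self.enc_changed(x_changed)    # (n, d)

        # Step 1: attention weights computed from the survived features
        # re-encode each sample as a combination of the other samples'
        # vanished/augmented features.
        attn_s = F.softmax(h_s @ h_s.t() / h_s.size(-1) ** 0.5, dim=-1)  # (n, n)
        z_changed = attn_s @ h_c                                         # (n, d)

        # Step 2: attention weights computed from the encoded
        # vanished/augmented features re-encode the survived features.
        attn_c = F.softmax(z_changed @ z_changed.t() / h_c.size(-1) ** 0.5, dim=-1)
        z_survived = attn_c @ h_s                                        # (n, d)

        # Decode both streams so the next layer sees feature-shaped inputs.
        return self.dec_survived(z_survived), self.dec_changed(z_changed)
```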

Cite (APA)

Wang, C., & Mo, H. (2021). Learning Deep Attention Network from Incremental and Decremental Features for Evolving Features. Scientific Programming, 2021. https://doi.org/10.1155/2021/1492828
