Ensemble of Networks for Multilabel Classification

Abstract

Multilabel learning goes beyond standard supervised learning models by associating a sample with more than one class label. Among the many techniques developed in the last decade to handle multilabel learning, the best approaches are those harnessing the power of ensembles and deep learners. This work proposes merging both methods by combining a set of gated recurrent units, temporal convolutional neural networks, and long short-term memory networks trained with variants of the Adam optimization approach. We examine many Adam variants, each fundamentally based on the difference between present and past gradients, with the step size adjusted for each parameter. We also combine this ensemble with Incorporating Multiple Clustering Centers (IMCC) and a bootstrap-aggregated decision tree ensemble, which is shown to further boost classification performance. In addition, we provide an ablation study assessing the performance improvement that each module of our ensemble produces. Multiple experiments on a large set of datasets representing a wide variety of multilabel tasks demonstrate the robustness of our best ensemble, which is shown to outperform the state-of-the-art.
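
To make the Adam-variant idea concrete, the sketch below shows one illustrative update rule in the spirit the abstract describes: the per-parameter step size is modulated by the difference between the present and past gradients (as in diffGrad-style optimizers). The function name, the sigmoid modulation, and the state layout are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def adam_diff_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One update of an Adam variant whose per-parameter step size is scaled by
    the difference between the present and past gradients (a diffGrad-style
    'friction' term). Illustrative sketch only, not the paper's exact rule."""
    state["t"] += 1
    t = state["t"]

    # Standard Adam first and second moment estimates with bias correction.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)

    # Per-parameter modulation: a large gradient change gives a step near full
    # size; a small change (flat region) damps the step toward half size.
    xi = 1.0 / (1.0 + np.exp(-np.abs(grad - state["prev_grad"])))

    param = param - lr * xi * m_hat / (np.sqrt(v_hat) + eps)
    state["prev_grad"] = grad.copy()
    return param, state

# Usage (hypothetical parameter vector w and gradient g):
# state = {"m": np.zeros_like(w), "v": np.zeros_like(w),
#          "prev_grad": np.zeros_like(w), "t": 0}
# w, state = adam_diff_step(w, g, state)
```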

Citation (APA)

Nanni, L., Trambaiollo, L., Brahnam, S., Guo, X., & Woolsey, C. (2022). Ensemble of Networks for Multilabel Classification. Signals, 3(4), 911–931. https://doi.org/10.3390/signals3040054
