Deep N-ary Error Correcting Output Codes

Abstract

Ensemble learning consistently improves the performance of multi-class classification by aggregating a set of base classifiers. To this end, data-independent ensemble methods such as Error Correcting Output Codes (ECOC) have attracted increasing attention due to their ease of implementation and parallelization. Specifically, traditional ECOC and its generalization, N-ary ECOC, decompose the original multi-class classification problem into a series of independent, simpler classification subproblems. Unfortunately, integrating ECOC, and especially N-ary ECOC, with deep neural networks (termed Deep N-ary ECOC) is not straightforward and has not yet been fully explored in the literature, because training deep base learners is expensive. To make training N-ary ECOC with deep base learners practical, we propose three variants of parameter-sharing architectures for Deep N-ary ECOC. To verify its generalization ability, we conduct experiments with different deep neural network backbones on both image and text classification tasks. Furthermore, extensive ablation studies show that Deep N-ary ECOC outperforms other deep data-independent ensemble methods.
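To make the idea concrete, below is a minimal sketch, in PyTorch, of how an N-ary ECOC ensemble with a shared deep backbone might be wired together. The coding matrix assigns each original class a meta-class in {0, ..., N-1} for every column, so each column defines an N-way subproblem; a shared trunk with one lightweight head per column is one plausible parameter-sharing variant, not necessarily any of the paper's three architectures. The names `random_nary_coding_matrix`, `SharedBackboneNaryECOC`, and `decode`, as well as the decoding rule, are illustrative assumptions rather than the authors' exact implementation.

```python
# Sketch of N-ary ECOC with a shared backbone (assumed PyTorch API).
import numpy as np
import torch
import torch.nn as nn


def random_nary_coding_matrix(num_classes, num_columns, n, seed=0):
    """Sample a random N-ary coding matrix of shape (num_classes, num_columns).
    Each column must contain at least two distinct meta-classes so that its
    N-way subproblem is non-trivial (a simple sanity check, not the paper's rule)."""
    rng = np.random.default_rng(seed)
    cols = []
    while len(cols) < num_columns:
        col = rng.integers(0, n, size=num_classes)
        if len(np.unique(col)) >= 2:
            cols.append(col)
    return np.stack(cols, axis=1)


class SharedBackboneNaryECOC(nn.Module):
    """One shared feature extractor with a small N-way head per coding column
    (one possible parameter-sharing variant for deep base learners)."""

    def __init__(self, backbone, feat_dim, num_columns, n):
        super().__init__()
        self.backbone = backbone  # shared trunk, e.g. a CNN or text encoder
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, n) for _ in range(num_columns)
        )

    def forward(self, x):
        feats = self.backbone(x)
        # One (batch, n) logit tensor per coding column.
        return [head(feats) for head in self.heads]


def decode(column_probs, coding_matrix):
    """Decode by scoring each original class with the total probability its
    assigned meta-class receives across all columns, then taking the argmax."""
    batch = column_probs[0].shape[0]
    num_classes, num_columns = coding_matrix.shape
    scores = torch.zeros(batch, num_classes)
    for j in range(num_columns):
        idx = torch.as_tensor(coding_matrix[:, j], dtype=torch.long)
        scores += column_probs[j][:, idx]  # gather each class's meta-class prob
    return scores.argmax(dim=1)
```

In this sketch, training would apply a cross-entropy loss per column, with the meta-labels for column j given by `coding_matrix[y, j]` for true labels y; at test time the per-column softmax outputs are passed to `decode`.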

Cite

APA

Zhang, H., Zhou, J. T., Wang, T., Tsang, I. W., & Goh, R. S. M. (2020). Deep N-ary Error Correcting Output Codes. In International Conference on Mobile Multimedia Communications (MobiMedia) (Vol. 2020-August). ICST. https://doi.org/10.4108/eai.27-8-2020.2299197
