A generalized mixture framework for multi-label classification


Abstract

We develop a novel probabilistic ensemble framework for multi-label classification that is based on the mixtures-of-experts architecture. In this framework, we combine multi-label classification models in the classifier chains family that decompose the class posterior distribution P(Y1, …, Yd | X) using a product of posterior distributions over components of the output space. Our approach captures different input-output and output-output relations that tend to change across data. As a result, we can recover a rich set of dependency relations among inputs and outputs that a single multi-label classification model cannot capture due to its modeling simplifications. We develop and present algorithms for learning the mixtures-of-experts models from data and for performing multi-label predictions on unseen data instances. Experiments on multiple benchmark datasets demonstrate that our approach achieves highly competitive results and outperforms the existing state-of-the-art multi-label classification methods.
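The chain-rule factorization the abstract refers to writes the joint posterior as P(Y1, …, Yd | X) = P(Y1 | X) · P(Y2 | X, Y1) · … · P(Yd | X, Y1, …, Yd−1). The sketch below illustrates this decomposition on a toy problem with one binary input and two binary labels, using Laplace-smoothed count estimates for each conditional. This is an illustrative reconstruction of the classifier-chains idea only, not the paper's actual models (the paper learns a mixture of such chains with a gating function); the dataset and helper names are made up.

```python
from collections import Counter
from itertools import product

# Toy binary data: each row is (x, y1, y2). Purely illustrative.
data = [(0, 0, 0), (0, 0, 0), (0, 1, 1), (1, 1, 1),
        (1, 1, 0), (1, 0, 0), (1, 1, 1), (0, 0, 1)]

def cond_prob(joint_counts, cond_counts, joint_key, cond_key, smooth=1.0, k=2):
    # Laplace-smoothed estimate of P(joint_key | cond_key) over k outcomes.
    return (joint_counts[joint_key] + smooth) / (cond_counts[cond_key] + smooth * k)

# Counts needed for P(y1 | x) and P(y2 | x, y1) in the chain order (y1, y2).
c_x = Counter(x for x, _, _ in data)
c_xy1 = Counter((x, y1) for x, y1, _ in data)
c_xy1y2 = Counter(data)

def joint_posterior(x):
    # Chain-rule factorization: P(y1, y2 | x) = P(y1 | x) * P(y2 | x, y1).
    post = {}
    for y1, y2 in product([0, 1], repeat=2):
        p1 = cond_prob(c_xy1, c_x, (x, y1), x)
        p2 = cond_prob(c_xy1y2, c_xy1, (x, y1, y2), (x, y1))
        post[(y1, y2)] = p1 * p2
    return post

def predict(x):
    # Joint MAP prediction over the full label vector, not per-label thresholding.
    post = joint_posterior(x)
    return max(post, key=post.get)

print(predict(1))  # → (1, 1) on this toy data
```

Because each conditional is a proper distribution, the four joint probabilities sum to one for every x; a mixture-of-experts model, as in the paper, would combine several such chain posteriors (e.g., with different chain orders) under input-dependent gating weights.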

Citation (APA)

Hong, C., Batal, I., & Hauskrecht, M. (2015). A generalized mixture framework for multi-label classification. In SIAM International Conference on Data Mining 2015, SDM 2015 (pp. 712–720). Society for Industrial and Applied Mathematics Publications. https://doi.org/10.1137/1.9781611974010.80
