Random multiclass classification: generalizing random forests to random MNL and random NB


Abstract

Random Forests (RF) is a successful classifier exhibiting performance comparable to AdaBoost, but more robust. Its exploitation of two sources of randomness, random inputs (bagging) and random feature selection, makes RF an accurate classifier in several domains. We hypothesize that methods other than classification or regression trees could also benefit from injecting randomness. This paper generalizes the RF framework to other multiclass classification algorithms, namely the well-established MultiNomial Logit (MNL) and Naive Bayes (NB). We propose Random MNL (RMNL), a new bagged classifier combining a forest of MNLs, each estimated on a randomly selected feature subset. Analogously, we introduce Random Naive Bayes (RNB). We benchmark the predictive performance of RF, RMNL and RNB against state-of-the-art SVM classifiers; RF, RMNL and RNB all outperform SVM. Moreover, generalizing RF seems promising, as reflected by the improved predictive performance of RMNL. © Springer-Verlag Berlin Heidelberg 2007.
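
To make the idea concrete, the sketch below builds bagged ensembles of multinomial logit and Naive Bayes base models, each trained on a bootstrap sample and its own random feature subset, in the spirit of RMNL and RNB. It is a minimal sketch, not the authors' implementation: it assumes scikit-learn's BaggingClassifier as the randomization wrapper, and the digits dataset, the 50% feature fraction, 100 estimators, and GaussianNB (standing in for whichever NB variant the paper uses) are all illustrative choices.

```python
# Illustrative sketch of Random MNL (RMNL) and Random Naive Bayes (RNB)
# via bagging + per-estimator random feature subsets (assumes scikit-learn).
from sklearn.datasets import load_digits
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_digits(return_X_y=True)  # illustrative multiclass dataset

# RMNL-style ensemble: a "forest" of multinomial logit models, each fit on
# a bootstrap sample (random inputs) and a random subset of the features.
rmnl = BaggingClassifier(
    LogisticRegression(max_iter=1000),  # fits a multinomial logit for multiclass y
    n_estimators=100,
    bootstrap=True,       # random inputs (bagging)
    max_features=0.5,     # random feature subset per base model (illustrative)
    random_state=0,
)

# RNB-style ensemble: the same two sources of randomness applied to NB.
rnb = BaggingClassifier(
    GaussianNB(),
    n_estimators=100,
    bootstrap=True,
    max_features=0.5,
    random_state=0,
)

rf = RandomForestClassifier(n_estimators=100, random_state=0)

for name, clf in [("RF", rf), ("RMNL", rmnl), ("RNB", rnb)]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

One design note: RF randomizes the feature choice at every split, whereas this bagging wrapper draws a single random feature subset per base model, which matches how the abstract describes RMNL and RNB (each MNL or NB in the forest is estimated on its own randomly selected features).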

Citation (APA)

Prinzie, A., & Van Den Poel, D. (2007). Random multiclass classification: generalizing random forests to random MNL and random NB. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4653 LNCS, pp. 349–358). Springer Verlag. https://doi.org/10.1007/978-3-540-74469-6_35
