Multi criteria wrapper improvements to Naive Bayes learning


Abstract

Feature subset selection using a wrapper means performing a search for an optimal set of attributes, using the machine learning algorithm as a black box. The Naive Bayes classifier is based on the assumption that attribute values are independent given the class value. Consequently, its effectiveness may decrease when the attributes are interdependent. We present FBL, a wrapper that uses information about dependencies to guide the search for the optimal subset of features, with the Naive Bayes classifier as the black-box machine learning algorithm. Experimental results show that FBL allows the Naive Bayes classifier to achieve higher accuracy, and that FBL performs better than other classical filters and wrappers. © Springer-Verlag Berlin Heidelberg 2006.
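To illustrate the general wrapper idea described above (not the paper's FBL dependency-guided search), the following is a minimal sketch of greedy forward feature subset selection with a categorical Naive Bayes classifier as the black box, scored by leave-one-out accuracy. All function names, the toy data, and the Laplace-smoothing choice are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: wrapper feature selection with Naive Bayes as a
# black box. Greedy forward search, NOT the FBL algorithm from the paper.
from collections import Counter, defaultdict

def nb_accuracy(rows, labels, feats):
    """Leave-one-out accuracy of a categorical Naive Bayes restricted to `feats`."""
    correct = 0
    for i in range(len(rows)):
        train = [(rows[j], labels[j]) for j in range(len(rows)) if j != i]
        prior = Counter(y for _, y in train)            # class counts
        counts = {y: defaultdict(Counter) for y in prior}
        for x, y in train:                              # per-class value counts
            for f in feats:
                counts[y][f][x[f]] += 1
        best, best_p = None, -1.0
        for y in prior:
            # P(y) * prod_f P(x_f | y), Laplace-smoothed (binary values assumed)
            p = prior[y] / len(train)
            for f in feats:
                p *= (counts[y][f][rows[i][f]] + 1) / (prior[y] + 2)
            if p > best_p:
                best, best_p = y, p
        correct += best == labels[i]
    return correct / len(rows)

def forward_wrapper(rows, labels, n_feats):
    """Greedily add the feature that most improves black-box accuracy."""
    selected, score = [], 0.0
    while True:
        candidates = [f for f in range(n_feats) if f not in selected]
        if not candidates:
            return selected, score
        best_s, best_f = max((nb_accuracy(rows, labels, selected + [f]), f)
                             for f in candidates)
        if best_s <= score:                             # no improvement: stop
            return selected, score
        selected, score = selected + [best_f], best_s

# Toy data: feature 0 determines the class, feature 1 is noise.
rows = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (1, 1)]
labels = [0, 0, 1, 1, 0, 1]
sel, acc = forward_wrapper(rows, labels, 2)
```

On this toy data the wrapper keeps only the informative feature, which is the behavior the abstract relies on: the search evaluates each candidate subset by running the learner itself rather than by a filter score computed independently of it.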


Cortizo, J. C., & Giraldez, I. (2006). Multi criteria wrapper improvements to Naive Bayes learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4224 LNCS, pp. 419–427). Springer Verlag. https://doi.org/10.1007/11875581_51
