CDEBMTE: Creation of diverse ensemble based on manipulation of training examples


Abstract

Ensemble methods such as Bagging and Boosting, which combine the decisions of multiple hypotheses, are among the strongest existing machine learning methods. The diversity of the members of an ensemble is known to be an important factor in determining its generalization error. We present a new method for generating ensembles, named CDEBMTE (Creation of Diverse Ensemble Based on Manipulation of Training Examples), that directly constructs diverse hypotheses by manipulating the training examples in three ways: (1) sub-sampling training examples, (2) decreasing/increasing error-prone training examples, and (3) decreasing/increasing neighbor samples of error-prone training examples. Experimental results using two well-known classifiers as base learners demonstrate that this approach consistently achieves higher predictive accuracy than the base classifier, AdaBoost, and Bagging. CDEBMTE also outperforms AdaBoost more prominently as the training set grows larger. © 2012 Springer-Verlag.
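The three manipulations listed in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm (the paper's base learners, weighting scheme, and neighbor definition are not reproduced here); it uses a toy one-dimensional threshold classifier and hypothetical helper names purely to show how each manipulation produces a different training set for each ensemble member:

```python
import random

def train_threshold(data):
    """Toy base learner: threshold at the midpoint between class means.

    Stands in for the paper's base classifiers, which are not shown here."""
    xs0 = [x for x, y in data if y == 0]
    xs1 = [x for x, y in data if y == 1]
    t = (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2
    flip = sum(xs1) / len(xs1) < sum(xs0) / len(xs0)
    return lambda x: int((x > t) != flip)

def subsample(data, frac, rng):
    """Manipulation (1): random sub-sample of the training examples."""
    return rng.sample(data, max(2, int(frac * len(data))))

def boost_errors(data, clf):
    """Manipulation (2): increase (duplicate) examples the current
    model misclassifies, shifting emphasis toward error-prone regions."""
    return data + [(x, y) for x, y in data if clf(x) != y]

def boost_error_neighbors(data, clf, radius):
    """Manipulation (3): increase neighbors of misclassified examples
    (here 'neighbor' means within a fixed 1-D radius, an assumption)."""
    err_xs = [x for x, y in data if clf(x) != y]
    near = [(x, y) for x, y in data
            if any(abs(x - e) <= radius for e in err_xs)]
    return data + near

def build_ensemble(data, n_members, rng):
    """Train each member on a differently manipulated training set,
    then combine members by unweighted majority vote."""
    base = train_threshold(data)
    makers = [
        lambda: subsample(data, 0.6, rng),
        lambda: boost_errors(data, base),
        lambda: boost_error_neighbors(data, base, 0.5),
    ]
    members = [train_threshold(makers[i % 3]()) for i in range(n_members)]
    return lambda x: int(sum(m(x) for m in members) >= len(members) / 2)

# Tiny separable 1-D demo: class 0 near 0, class 1 near 2.
rng = random.Random(0)
data = ([(x / 10, 0) for x in range(0, 10)]
        + [(x / 10, 1) for x in range(11, 21)])
vote = build_ensemble(data, 9, rng)
print(vote(0.2), vote(1.8))  # low inputs -> class 0, high inputs -> class 1
```

Cycling the three manipulations across members is one simple way to obtain diverse training sets; the paper's actual scheme for combining them may differ.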


CITATION STYLE

APA

Parvin, H., Parvin, S., Rezaei, Z., & Mohamadi, M. (2012). CDEBMTE: Creation of diverse ensemble based on manipulation of training examples. In Advances in Intelligent and Soft Computing (Vol. 151 AISC, pp. 113–120). https://doi.org/10.1007/978-3-642-28765-7_15
