In this work we propose a new method for creating neural network ensembles. Our methodology builds on the conventional technique of bagging, in which multiple classifiers are trained on a single training data set by generating multiple bootstrap samples from it. We propose a new sampling method based on k-nearest neighbor density estimates. Our sampling technique introduces more variability into the data sets than bagging does. We validate our method on several real data sets and show that it outperforms bagging. © 2009 Springer-Verlag Berlin Heidelberg.
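The abstract contrasts standard bootstrap resampling with sampling guided by k-nearest neighbor density estimates. A minimal sketch of one plausible reading is below; the density proxy (inverse distance to the k-th neighbor) and the choice to weight draws by that density are assumptions for illustration only, since the abstract does not specify the paper's exact scheme.

```python
import numpy as np

def knn_density(X, k=5):
    # Crude k-NN density proxy: inverse of the distance to the k-th
    # nearest neighbor (index 0 after sorting is the point itself).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    kth = np.sort(np.sqrt(d2), axis=1)[:, k]
    return 1.0 / (kth + 1e-12)

def density_weighted_sample(X, y, rng, k=5):
    # Draw a bootstrap-sized sample with replacement, with per-point
    # probabilities proportional to the k-NN density estimate
    # (hypothetical weighting; the paper's scheme may differ).
    p = knn_density(X, k)
    p = p / p.sum()
    idx = rng.choice(len(X), size=len(X), replace=True, p=p)
    return X[idx], y[idx]

def bootstrap_sample(X, y, rng):
    # Plain bagging draw for comparison: uniform with replacement.
    idx = rng.integers(0, len(X), size=len(X))
    return X[idx], y[idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] > 0).astype(int)
Xs, ys = density_weighted_sample(X, y, rng)
```

Each ensemble member would then be trained on its own such sample, exactly as in bagging, with only the sampling distribution changed.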
CITATION STYLE
Chakraborty, D. (2009). Neural network ensembles from training set expansions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5856 LNCS, pp. 629–636). https://doi.org/10.1007/978-3-642-10268-4_74