Exploiting diversity of neural network ensembles based on extreme learning machine

Abstract

Extreme learning machine (ELM) is an emerging method for training single hidden layer feedforward neural networks (SLFNs) that offers extremely fast training, easy implementation and good generalization performance. This work presents effective ensemble procedures for combining ELMs by exploiting diversity. A large number of ELMs are initially trained in three different scenarios: the original feature input space, a feature subset obtained by forward selection, and different random subsets of features. The best combination of ELMs is constructed according to an exact ranking of the trained models, and networks that do not contribute to the ensemble are discarded. Experimental results on several regression problems show that robust ensemble approaches that exploit diversity can effectively improve performance compared with the standard ELM algorithm and other recent ELM extensions. © CTU FTS 2013.
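To make the procedure concrete, the following is a minimal sketch of the core ideas, not the authors' exact method: an ELM fixes random hidden-layer weights and solves the output weights by least squares (Moore–Penrose pseudoinverse), and a simple ensemble trains many such networks, ranks them by validation error, discards the weaker ones, and averages the predictions of the rest. The paper's actual procedure uses an exact ranking and feature-subset scenarios; the validation-error sort and the 50% cutoff here are illustrative assumptions.

```python
import numpy as np

def train_elm(X, y, n_hidden, rng):
    """Train one ELM: random hidden weights, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                 # output weights via pseudoinverse
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy regression problem: learn a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=200)
X_tr, y_tr, X_val, y_val = X[:150], y[:150], X[150:], y[150:]

# Train a pool of diverse ELMs (diversity comes from the random hidden layers).
models = [train_elm(X_tr, y_tr, n_hidden=20, rng=rng) for _ in range(30)]

# Rank by validation MSE, discard the weaker half, average the survivors.
errors = [np.mean((elm_predict(m, X_val) - y_val) ** 2) for m in models]
best = [models[i] for i in np.argsort(errors)[:15]]
y_hat = np.mean([elm_predict(m, X_val) for m in best], axis=0)
ens_mse = np.mean((y_hat - y_val) ** 2)
print(ens_mse)
```

By the ambiguity decomposition for squared error, the averaged predictor's MSE is never worse than the mean MSE of the ensemble members, so pruning to the best-ranked models before averaging can only tighten that bound.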

Citation (APA)

García-Laencina, P. J., Roca-González, J. L., Bueno-Crespo, A., & Sancho-Gómez, J. L. (2013). Exploiting diversity of neural network ensembles based on extreme learning machine. Neural Network World, 23(5), 395–409. https://doi.org/10.14311/NNW.2013.23.024
