Extending extreme learning machine with combination layer

Abstract

We consider the Extreme Learning Machine model for accurate regression estimation and the related problem of selecting the appropriate number of neurons for the model. Selection strategies that choose "the best" model from a set of candidate network structures neglect the issue of model selection uncertainty. To alleviate this problem, we propose to replace the selection phase with a combination layer that takes all considered models into account. The method proposed in this paper is the Extreme Learning Machine combined with Jackknife Model Averaging, where Jackknife Model Averaging is a combination method based on the leave-one-out residuals of linear models. The combination approach is shown to have better predictive performance on several real-world data sets. © 2013 Springer-Verlag Berlin Heidelberg.
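The sketch below illustrates the idea described in the abstract, under assumptions drawn from the standard ELM and Jackknife Model Averaging formulations rather than from the paper itself: each candidate ELM (one per hidden-layer size) is a linear model in its random hidden features, its leave-one-out residuals are obtained via the hat-matrix shortcut, and the combination weights minimize the squared norm of the weighted leave-one-out residuals on the probability simplex. All function names, data, and candidate sizes are illustrative, and the paper's exact procedure may differ.

```python
import numpy as np
from scipy.optimize import minimize


def elm_features(X, n_hidden, rng):
    """Random sigmoid hidden layer: the ELM feature map (weights kept fixed)."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))


def fit_linear_with_loo(H, y):
    """Least-squares output weights plus leave-one-out residuals
    via the hat-matrix identity e_loo_i = e_i / (1 - h_ii)."""
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    G = np.linalg.pinv(H.T @ H)
    h = np.einsum("ij,jk,ik->i", H, G, H)   # diagonal of the hat matrix
    e = y - H @ beta
    return beta, e / np.clip(1.0 - h, 1e-8, None)


def jma_weights(E):
    """Solve min_w ||E w||^2  s.t.  w >= 0, sum(w) = 1  (a small QP)."""
    M = E.shape[1]
    obj = lambda w: np.sum((E @ w) ** 2)
    cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
    res = minimize(obj, np.full(M, 1.0 / M), bounds=[(0.0, 1.0)] * M,
                   constraints=cons, method="SLSQP")
    return res.x


# Toy usage on synthetic data; candidate neuron counts are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

candidate_sizes = [5, 10, 20, 40]
loo_residuals = []
for n_hidden in candidate_sizes:
    H = elm_features(X, n_hidden, rng)      # in practice, store (W, b, beta)
    _, e_loo = fit_linear_with_loo(H, y)    # to form the combined prediction
    loo_residuals.append(e_loo)

w = jma_weights(np.column_stack(loo_residuals))  # one weight per candidate ELM
print("JMA combination weights:", np.round(w, 3))
```

In this reading, the "combination layer" simply applies the simplex-constrained weights to the candidate networks' outputs, so no single hidden-layer size has to be selected.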

Citation (APA)

Sovilj, D., Lendasse, A., & Simula, O. (2013). Extending extreme learning machine with combination layer. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7902 LNCS, pp. 417–426). https://doi.org/10.1007/978-3-642-38679-4_41
