A connection between extreme learning machine and neural network kernel

Abstract

We study a connection between the extreme learning machine (ELM) and the neural network kernel (NNK). The NNK is derived from a neural network with an infinite number of hidden units. We interpret the ELM as an approximation to this infinite network. We show that ELM and NNK can, to a certain extent, replace each other: an ELM can be used to form a kernel, and the NNK can be decomposed into feature vectors to be used in the hidden layer of an ELM. The connection reveals the possible importance of weight variance as a parameter of the ELM. Based on our experiments, we recommend that model selection for ELM should consider not only the number of hidden units, as is the current practice, but also the variance of the weights. We also study the interaction between the variance and the number of hidden units, and discuss some properties of ELM that may have been interpreted too strongly in previous work. © 2013 Springer-Verlag Berlin Heidelberg.
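As a rough illustration of the connection described above (our sketch, not code from the paper), the following Python snippet builds a basic ELM whose random input weights have an explicit standard deviation parameter, and forms the finite-rank kernel K = HH^T / m induced by the hidden-layer features, which approximates a neural network kernel as the number of hidden units grows. The tanh activation, the ridge-regularized output weights, and all function names are our assumptions.

import numpy as np

def elm_features(X, n_hidden, weight_std, rng):
    """Random hidden-layer feature map of an ELM.

    weight_std is the standard deviation of the random input weights and
    biases, i.e. the weight-variance parameter whose role the paper studies.
    """
    d = X.shape[1]
    W = rng.normal(0.0, weight_std, size=(d, n_hidden))  # random input weights
    b = rng.normal(0.0, weight_std, size=n_hidden)       # random biases
    return np.tanh(X @ W + b)

def elm_fit(H, y, reg=1e-3):
    """Solve the ELM output weights by regularized least squares."""
    n_hidden = H.shape[1]
    return np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)

# Toy usage: regression on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sin(X[:, 0])

H = elm_features(X, n_hidden=200, weight_std=1.0, rng=rng)
beta = elm_fit(H, y)
y_hat = H @ beta

# The same hidden-layer features also induce a kernel: averaging over the
# random units gives a Monte Carlo estimate of the infinite-network kernel.
K = H @ H.T / H.shape[1]

Increasing n_hidden tightens the kernel approximation, while weight_std changes the kernel itself, which is one way to read the paper's recommendation to tune both the number of hidden units and the weight variance.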

Citation (APA)
Parviainen, E., & Riihimäki, J. (2013). A connection between extreme learning machine and neural network kernel. In Communications in Computer and Information Science (Vol. 272 CCIS, pp. 122–135). Springer Verlag. https://doi.org/10.1007/978-3-642-29764-9_8
