A Fast Incremental Method Based on Regularized Extreme Learning Machine

  • Xu Z
  • Yao M

Abstract

Extreme Learning Machine (ELM), proposed by Huang et al., is a simple algorithm for training single-hidden-layer feedforward neural networks (SLFNs) with extremely fast learning speed and good generalization performance. When new hidden nodes are added to an existing network, retraining the whole network is time consuming, so EM-ELM was proposed to compute the output weights incrementally. However, EM-ELM still has two issues: (1) the initial hidden-layer output matrix may be nearly singular, so the computation loses accuracy; (2) the algorithm does not always achieve good generalization performance because of overfitting. We therefore propose an improved version of EM-ELM based on regularization, called Incremental Regularized Extreme Learning Machine (IR-ELM). When new hidden nodes are added one by one, IR-ELM updates the output weights recursively in a fast way. Empirical studies on benchmark regression and classification data sets show that IR-ELM consistently achieves better generalization performance than EM-ELM with similar training time.
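To make the regularized ELM idea concrete, the following is a minimal NumPy sketch of a batch regularized ELM: random hidden weights, a sigmoid hidden layer, and output weights solved by ridge regression, beta = (H^T H + lambda*I)^{-1} H^T T. It is not the paper's recursive IR-ELM update for incrementally added nodes; the function names, the sigmoid activation, and the regularization strength `lam` are illustrative assumptions.

```python
import numpy as np

def regularized_elm(X, T, n_hidden, lam=1e-3, seed=None):
    """Train an SLFN with random hidden nodes and a ridge-regularized
    output layer (generic regularized ELM, not the paper's IR-ELM update)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(d, n_hidden))   # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)        # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer output matrix
    # Regularized least-squares output weights:
    # beta = (H^T H + lam * I)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass through the random hidden layer and learned output weights."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

In this sketch, adding a hidden node would require re-solving the full regularized system; the point of IR-ELM, per the abstract, is to avoid that by updating the output weights recursively as nodes are added, while the regularization term keeps the solution stable when the hidden-layer output matrix is nearly singular.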

Cite

APA

Xu, Z., & Yao, M. (2015). A Fast Incremental Method Based on Regularized Extreme Learning Machine (pp. 15–30). https://doi.org/10.1007/978-3-319-14063-6_2
