Extreme Learning Machine Ensemble Classifier for Large-Scale Data

  • Wang H
  • He Q
  • Shang T
  • et al.

Abstract

For classification problems, the extreme learning machine (ELM) can achieve good generalization performance at a much faster learning speed than conventional methods. Nevertheless, a single ELM is unstable in data classification. The Bagging-based ensemble classifier, Bagging-ELM, has been studied widely and shown to improve ELM's accuracy significantly; however, it is ill-suited to large-scale datasets because of its heavy computational cost. In this study, we propose a novel ELM ensemble classifier, b-ELM, which leverages the Bag of Little Bootstraps technique to obtain a scalable and efficient classifier for large-scale data. Classification is efficient because repeated training is performed only on subsamples that can be much smaller than the original training set. Furthermore, b-ELM is well suited to implementation on modern parallel and distributed computing platforms. The experimental results demonstrate that b-ELM can handle large-scale data efficiently while maintaining good prediction accuracy.
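The abstract does not include code, so the following is only a minimal, hypothetical sketch of the general idea it describes: combine the Bag of Little Bootstraps (draw small subsamples of size b = n^γ, resample each back to the full data size via multinomial weights) with single-hidden-layer ELMs (random input weights, least-squares output weights), and aggregate member predictions by majority vote. It is not the authors' b-ELM implementation; the class and function names (SimpleELM, blb_elm_fit, blb_elm_predict) and all hyper-parameters (n_hidden, n_subsamples, γ = 0.7) are assumptions made for illustration.

```python
import numpy as np

class SimpleELM:
    """Minimal single-hidden-layer ELM (illustrative sketch, not the paper's code):
    random input weights and biases, output weights fitted by (weighted) least squares."""

    def __init__(self, n_hidden=100, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng or np.random.default_rng()

    def fit(self, X, y, sample_weight=None):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                          # one-hot targets
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                  # hidden-layer activations
        if sample_weight is not None:
            # Multinomial resample counts act as weights, so the "size-n" bootstrap
            # is never materialised -- only the small subsample is touched.
            Hw = H * sample_weight[:, None]
            self.beta = np.linalg.pinv(Hw.T @ H) @ (Hw.T @ T)
        else:
            self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta).argmax(axis=1)


def blb_elm_fit(X, y, n_subsamples=10, gamma=0.7, n_hidden=100, seed=0):
    """Train one ELM per small BLB subsample; each subsample has size b = n**gamma."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    b = int(n ** gamma)
    members = []
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=b, replace=False)            # small subsample
        counts = rng.multinomial(n, np.full(b, 1.0 / b))       # resample to "size n" as weights
        members.append(
            SimpleELM(n_hidden, rng).fit(X[idx], y[idx], sample_weight=counts.astype(float))
        )
    return members


def blb_elm_predict(members, X):
    """Combine member predictions by majority vote."""
    votes = np.stack([m.predict(X) for m in members])          # (n_members, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

In this sketch a hypothetical call would be `members = blb_elm_fit(X_train, y_train)` followed by `y_pred = blb_elm_predict(members, X_test)`; because each member trains only on a subsample of size n^γ, the loop over members parallelises naturally across cores or machines, which is the property the abstract attributes to b-ELM.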

Citation (APA)

Wang, H., He, Q., Shang, T., Zhuang, F., & Shi, Z. (2015). Extreme Learning Machine Ensemble Classifier for Large-Scale Data (pp. 151–161). https://doi.org/10.1007/978-3-319-14063-6_14
