Ensemble one-class extreme learning machine based on overlapping data partition


Abstract

One-class classification (data description) plays a key role in numerous applications such as anomaly detection. This paper presents a novel ensemble one-class extreme learning machine (EOCELM), which not only yields sound performance but also facilitates parallel training and testing. Instead of training on the entire training dataset, EOCELM first partitions the training data into overlapping clusters by k-medoids clustering and a simple Minimum Spanning Tree (MST) based heuristic rule. The proposed overlapping data partition makes it possible to describe the sub-structures within one-class training data more precisely, without the risk of creating a "cluster gap" that may degrade generalization performance. In addition, the data partition alleviates the matrix inversion problem of the one-class extreme learning machine (OCELM) when dealing with massive training data. Next, an OCELM is trained on each data cluster as a sub-classifier, which can be done in parallel. Finally, the OCELMs are combined into EOCELM by the simple maximum combining rule. Experiments on synthetic datasets, UCI datasets and the MNIST dataset demonstrate the effectiveness of EOCELM compared with other state-of-the-art one-class learning approaches.
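The pipeline the abstract describes — overlapping k-medoids partition, one one-class sub-classifier per cluster, maximum combining rule — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the OCELM sub-classifier is replaced by scikit-learn's `OneClassSVM`, k-medoids is a naive NumPy implementation, and the MST-based overlap heuristic is replaced by a simple distance-ratio rule. All class and parameter names here are hypothetical.

```python
import numpy as np
from sklearn.svm import OneClassSVM


def k_medoids(X, k, n_iter=20, seed=0):
    """Naive k-medoids: alternate nearest-medoid assignment and
    medoid update (point minimizing total intra-cluster distance)."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - X[medoids][None], axis=2)
        labels = d.argmin(axis=1)
        new = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if len(members) == 0:
                continue
            intra = np.linalg.norm(X[members][:, None] - X[members][None], axis=2)
            new[j] = members[intra.sum(axis=1).argmin()]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids


def overlapping_partition(X, medoids, overlap=1.2):
    """Assign each point to its nearest medoid, plus any medoid within
    `overlap` times that nearest distance — a stand-in for the paper's
    MST-based heuristic that lets neighboring clusters share points."""
    d = np.linalg.norm(X[:, None] - X[medoids][None], axis=2)
    nearest = d.min(axis=1)
    return [np.where(d[:, j] <= overlap * nearest)[0] for j in range(len(medoids))]


class EnsembleOneClass:
    """Ensemble one-class classifier in the spirit of EOCELM
    (OneClassSVM used in place of OCELM for illustration)."""

    def __init__(self, k=2, overlap=1.2, seed=0):
        self.k, self.overlap, self.seed = k, overlap, seed

    def fit(self, X):
        medoids = k_medoids(X, self.k, seed=self.seed)
        # Each sub-classifier sees only its (overlapping) cluster;
        # these fits are independent and could run in parallel.
        self.models_ = [
            OneClassSVM(gamma="scale", nu=0.1).fit(X[idx])
            for idx in overlapping_partition(X, medoids, self.overlap)
        ]
        return self

    def score(self, X):
        # Maximum combining rule: a point is accepted as normal
        # if any sub-classifier accepts it.
        return np.max([m.decision_function(X) for m in self.models_], axis=0)
```

On data with two well-separated normal modes, each sub-classifier describes one mode tightly, and the max rule keeps both modes normal while a distant point scores low under every sub-classifier.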

Citation (APA)

Wang, S., Zhao, L., Zhu, E., Yin, J., & Yang, H. (2017). Ensemble one-class extreme learning machine based on overlapping data partition. In Communications in Computer and Information Science (Vol. 710, pp. 408–416). Springer Verlag. https://doi.org/10.1007/978-981-10-5230-9_40
