Incremental one-class learning with bounded computational complexity

Abstract

An incremental one-class learning algorithm is proposed for the purpose of outlier detection. Outliers are identified by estimating, and thresholding, the probability distribution of the training data. In the early stages of training, a non-parametric estimate of the training data distribution is obtained using kernel density estimation. Once the number of training examples reaches the maximum computationally feasible limit for kernel density estimation, we treat the kernel density estimate as a maximally complex Gaussian mixture model, and keep the model complexity constant by merging a pair of components for each new kernel added. This method is shown to outperform a current state-of-the-art incremental one-class learning algorithm (incremental SVDD [5]) on a variety of datasets, while requiring only an upper limit on model complexity to be specified. © Springer-Verlag Berlin Heidelberg 2007.
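
The abstract describes a concrete mechanism: grow a kernel density estimate one kernel per example until a complexity cap K is reached, then hold the model at exactly K Gaussian components by merging one pair per new kernel. The sketch below illustrates that idea under stated assumptions only: a fixed kernel bandwidth, diagonal covariances, and a merge rule that moment-matches the pair whose merge least increases within-component scatter. The paper's actual bandwidth selection and merge criterion are not given in the abstract, so these choices, and all names (`BoundedKDE`, `add`, `logpdf`), are illustrative.

```python
# Minimal sketch of a bounded-complexity incremental density model in the
# spirit of the abstract. The merge criterion (moment-matched merge of the
# pair with the smallest scatter increase) is an illustrative assumption,
# not necessarily the criterion used in the paper.
import numpy as np

class BoundedKDE:
    def __init__(self, max_components, bandwidth=0.5):
        self.K = max_components            # upper limit on model complexity
        self.h2 = bandwidth ** 2           # fixed kernel variance (assumption)
        self.w, self.mu, self.var = [], [], []   # weights, means, variances

    def add(self, x):
        """Add one training example as a new Gaussian kernel."""
        x = np.asarray(x, dtype=float)
        self.w.append(1.0)
        self.mu.append(x)
        self.var.append(np.full_like(x, self.h2))
        if len(self.w) > self.K:           # complexity bound exceeded:
            self._merge_closest_pair()     # merge one pair per new kernel

    def _merge_closest_pair(self):
        # Pick the pair (i, j) whose moment-matched merge least increases
        # within-component scatter: cost = w_i*w_j/(w_i+w_j) * ||mu_i - mu_j||^2.
        best, bi, bj = np.inf, 0, 1
        for i in range(len(self.mu)):
            for j in range(i + 1, len(self.mu)):
                wi, wj = self.w[i], self.w[j]
                d = (wi * wj / (wi + wj)) * np.sum((self.mu[i] - self.mu[j]) ** 2)
                if d < best:
                    best, bi, bj = d, i, j
        # Moment-matched merge: preserves the pair's total weight, mean, variance.
        wi, wj = self.w[bi], self.w[bj]
        w = wi + wj
        mu = (wi * self.mu[bi] + wj * self.mu[bj]) / w
        var = (wi * (self.var[bi] + (self.mu[bi] - mu) ** 2)
               + wj * (self.var[bj] + (self.mu[bj] - mu) ** 2)) / w
        for idx in sorted((bi, bj), reverse=True):
            del self.w[idx]; del self.mu[idx]; del self.var[idx]
        self.w.append(w); self.mu.append(mu); self.var.append(var)

    def logpdf(self, x):
        """Log density of x under the mixture (diagonal covariances)."""
        x = np.asarray(x, dtype=float)
        W = np.array(self.w); W = W / W.sum()
        logs = []
        for wk, mk, vk in zip(W, self.mu, self.var):
            q = -0.5 * np.sum((x - mk) ** 2 / vk + np.log(2 * np.pi * vk))
            logs.append(np.log(wk) + q)
        return np.logaddexp.reduce(logs)
```

In use, a new point x would be flagged as an outlier when model.logpdf(x) falls below a threshold chosen from the training data, matching the estimate-and-threshold scheme the abstract describes.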

Citation (APA)

Sillito, R. R., & Fisher, R. B. (2007). Incremental one-class learning with bounded computational complexity. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4668 LNCS, pp. 58–67). Springer Verlag. https://doi.org/10.1007/978-3-540-74690-4_7
