Taming the curse of dimensionality in kernels and novelty detection


Abstract

The curse of dimensionality is a well-known but not entirely well-understood phenomenon. More data, in terms of the number of input variables, is not always a good thing. This is especially true when the problem involves unsupervised learning, or supervised learning with unbalanced data (many negative observations but few positive ones). This paper addresses two issues involving high-dimensional data. The first explores the behavior of kernels in high-dimensional data: it is shown that variance, especially when contributed by meaningless noisy variables, confounds learning methods. The second part illustrates methods to overcome dimensionality problems in unsupervised learning using subspace models. The modeling approach involves novelty detection with the one-class SVM. © 2006 Springer.
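The abstract's first claim — that variance contributed by meaningless noisy variables confounds kernel methods — can be illustrated with a small NumPy sketch (not taken from the paper; the kernel width, sample size, and dimensions below are illustrative choices). As noise dimensions are added with a fixed Gaussian kernel width, pairwise kernel values collapse toward zero, so every point looks equally dissimilar to every other and the kernel carries little usable information:

```python
import numpy as np

def mean_rbf(d, n=200, sigma=1.0, seed=0):
    """Mean off-diagonal Gaussian (RBF) kernel value among n
    standard-normal points in d dimensions."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n, d))
    # pairwise squared Euclidean distances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))
    off = ~np.eye(n, dtype=bool)  # exclude the diagonal, where k(x, x) = 1
    return K[off].mean()

# With a fixed kernel width, adding noisy dimensions inflates pairwise
# distances, so average similarity shrinks rapidly toward zero:
low_d = mean_rbf(2)    # roughly 0.3
high_d = mean_rbf(50)  # essentially zero
```

The squared distance between two independent standard-normal points grows linearly in the dimension, so for a fixed kernel width the kernel matrix degenerates toward the identity — one way of seeing why noisy variables with nonzero variance degrade kernel-based learners such as the one-class SVM.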

Citation (APA)

Evangelista, P. F., Embrechts, M. J., & Szymanski, B. K. (2006). Taming the curse of dimensionality in kernels and novelty detection. Advances in Soft Computing, 34, 425–438. https://doi.org/10.1007/3-540-31662-0_33
