Sparse Kernel Feature Analysis

  • Smola, A. J.
  • Mangasarian, O. L.
  • Schölkopf, B.

Abstract

Kernel Principal Component Analysis (KPCA) has proven to be a versatile tool for unsupervised learning, albeit at a high computational cost due to the dense expansions in terms of kernel functions. We overcome this problem by proposing a new class of feature extractors employing ℓ1 norms in coefficient space instead of the norm of the Reproducing Kernel Hilbert Space in which KPCA was originally formulated. Moreover, the modified setting allows us to efficiently extract features which maximize criteria other than the variance, in a way similar to projection pursuit.
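To make the abstract's idea concrete, the sketch below is a minimal, illustrative Python example rather than the authors' algorithm: it relies on the observation that maximizing projection variance under an ℓ1 constraint on the coefficient vector attains its optimum at a vertex of the ℓ1 ball, so each extracted feature reduces to a single kernel function k(x_i, ·). The Gaussian RBF kernel, the `gamma` bandwidth, the greedy deflation scheme, and all function names are assumptions made for this example.

```python
import numpy as np


def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y (assumed kernel choice)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def sparse_kernel_features(X, n_features=2, gamma=1.0):
    """Greedily pick single kernel functions k(x_i, .) as sparse features.

    Illustrative sketch only: with an l1 constraint on the coefficients,
    the variance-maximizing solution sits at a vertex of the l1 ball,
    i.e. it is a single kernel function.  Each round picks the centered
    kernel column with the largest empirical variance, then deflates.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = H @ K @ H                        # doubly centered kernel matrix
    K_work = Kc.copy()
    chosen = []
    for _ in range(n_features):
        variances = (K_work ** 2).mean(axis=0)   # columns are zero-mean
        i = int(np.argmax(variances))
        chosen.append(i)
        col = K_work[:, i].copy()
        norm2 = np.dot(col, col)
        if norm2 > 0:
            # Gram-Schmidt style deflation: remove the chosen direction.
            K_work = K_work - np.outer(col, col @ K_work) / norm2
    # The extracted features are projections onto the chosen kernel functions.
    return chosen, Kc[:, chosen]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    idx, feats = sparse_kernel_features(X, n_features=3, gamma=0.5)
    print("selected support points:", idx, "feature shape:", feats.shape)
```

In contrast to KPCA, where every feature is a dense expansion over all training points, each feature here depends on a single training point, which is what makes evaluating the features on new data cheap.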

Cite

APA: Smola, A. J., Mangasarian, O. L., & Schölkopf, B. (2002). Sparse Kernel Feature Analysis (pp. 167–178). https://doi.org/10.1007/978-3-642-55991-4_18
