Accelerated kernel feature analysis

Abstract

A fast algorithm, Accelerated Kernel Feature Analysis (AKFA), that discovers salient features evidenced in a sample of n unclassified patterns, is presented. Like earlier kernel-based feature selection algorithms, AKFA implicitly embeds each pattern into a Hilbert space, H, induced by a Mercer kernel. An ℓ-dimensional linear subspace of H is iteratively constructed by maximizing a variance condition for the nonlinearly transformed sample. This linear subspace can then be used to define more efficient data representations and pattern classifiers. AKFA requires O(ℓn²) operations, compared to O(n³) for Schölkopf, Smola, and Müller's Kernel Principal Component Analysis (KPCA), and O(ℓ²n²) for Smola, Mangasarian, and Schölkopf's Sparse Kernel Feature Analysis (SKFA). Numerical experiments show that AKFA can generate more concise feature representations than both KPCA and SKFA, and demonstrate that AKFA obtains classification performance similar to KPCA on a face recognition problem. © 2006 IEEE.
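For context, the following is a minimal sketch of the KPCA baseline referenced in the abstract: an eigendecomposition of the centered Gram matrix, whose O(n³) cost is what AKFA's iterative subspace construction is designed to avoid. This is an illustrative sketch assuming an RBF (Mercer) kernel and NumPy; it is not the authors' AKFA implementation, and the function name and parameters are chosen here for illustration only.

import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    # Sketch of the standard KPCA baseline (Schölkopf, Smola, Müller),
    # shown for context only; this is NOT the paper's AKFA algorithm.
    n = X.shape[0]
    # RBF (Mercer) kernel Gram matrix over the sample
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * (X @ X.T)
    K = np.exp(-gamma * sq_dists)
    # Center the Gram matrix in feature space
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Full eigendecomposition: the O(n^3) step AKFA avoids
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # Keep the leading components (eigh returns ascending order)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Project the sample onto the extracted kernel features
    return Kc @ alphas

# Example usage on synthetic data
X = np.random.default_rng(0).normal(size=(200, 5))
features = kernel_pca(X, n_components=3)
print(features.shape)  # (200, 3)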

Cite

CITATION STYLE

APA

Jiang, X., Snapp, R. R., Motai, Y., & Zhu, X. (2006). Accelerated kernel feature analysis. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Vol. 1, pp. 109–116). https://doi.org/10.1109/CVPR.2006.43
