Non-Euclidean principal component analysis and Oja's learning rule - Theoretical aspects


Abstract

Principal component analysis based on Hebbian learning was originally designed for data processing in Euclidean spaces. In this contribution we present an extension of Oja's online learning approach to non-Euclidean spaces. First we review the kernel principal component approach and show that, for differentiable kernels, it can be formulated as an online learning scheme. Hence, PCA can be carried out explicitly in the data space, now equipped with a non-Euclidean metric. Moreover, the theoretical framework can be extended to principal component learning in Banach spaces based on semi-inner products, which becomes particularly important when learning in l_p-norm spaces with p ≠ 2 is considered. In this contribution we focus on the mathematics and the theoretical justification of the approach. © 2013 Springer-Verlag.
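The Euclidean starting point of the paper, Oja's online learning rule for the first principal component, can be sketched as follows. The update is w ← w + η·y·(x − y·w) with y = wᵀx, which drives w toward the leading eigenvector of the data covariance while keeping ‖w‖ near 1. The synthetic data, step size, and iteration count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Zero-mean synthetic data with dominant variance along the first axis
# (illustrative assumption; any zero-mean data set would do).
X = rng.normal(size=(5000, 2)) * np.array([3.0, 0.5])

# Random unit-norm initialization of the weight vector.
w = rng.normal(size=2)
w /= np.linalg.norm(w)

eta = 0.01  # learning rate (assumed, not from the paper)
for x in X:
    y = w @ x                  # neuron output y = w^T x
    w += eta * y * (x - y * w) # Oja's rule: Hebbian term minus normalization term

# Compare with the leading eigenvector of the sample covariance matrix.
eigvals, eigvecs = np.linalg.eigh(X.T @ X / len(X))
pc1 = eigvecs[:, -1]  # eigenvector of the largest eigenvalue
alignment = abs(w @ pc1) / np.linalg.norm(w)
```

After one pass over the data, `alignment` is close to 1 (w points along the first principal component, up to sign), and ‖w‖ stays close to 1 without any explicit normalization step; it is this self-normalizing online scheme that the paper transfers to kernel-induced metrics and to semi-inner products in Banach spaces.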

Citation (APA)

Biehl, M., Kästner, M., Lange, M., & Villmann, T. (2013). Non-Euclidean principal component analysis and Oja’s learning rule - Theoretical aspects. In Advances in Intelligent Systems and Computing (Vol. 198 AISC, pp. 23–33). Springer Verlag. https://doi.org/10.1007/978-3-642-35230-0_3
