From Covariance Matrices to Covariance Operators: Data Representation from Finite to Infinite-Dimensional Settings


Abstract

This chapter presents recent developments in generalizing the data representation framework based on finite-dimensional covariance matrices to infinite-dimensional covariance operators in Reproducing Kernel Hilbert Spaces (RKHS). We show that the proper mathematical setting for covariance operators is the infinite-dimensional Riemannian manifold of positive definite Hilbert–Schmidt operators, which generalize symmetric positive definite (SPD) matrices. We then give closed-form formulas for the affine-invariant and Log-Hilbert–Schmidt distances between RKHS covariance operators on this manifold, which generalize the affine-invariant and Log-Euclidean distances, respectively, between SPD matrices. The Log-Hilbert–Schmidt distance in particular can be used to design a two-layer kernel machine that applies directly to practical tasks such as image classification. Experimental results illustrate the power of this paradigm for data representation.
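For orientation, here is a minimal finite-dimensional sketch of the two SPD-matrix distances that the chapter generalizes to covariance operators: the Log-Euclidean distance ||log(A) − log(B)||_F and the affine-invariant distance ||log(A^{−1/2} B A^{−1/2})||_F. This is an illustrative implementation using a plain eigendecomposition, not code from the chapter; the infinite-dimensional (Log-Hilbert–Schmidt) versions require the regularized operator formulas developed in the text.

```python
import numpy as np

def spd_func(A, f):
    """Apply a scalar function f to an SPD matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: Frobenius norm of log(A) - log(B)."""
    return np.linalg.norm(spd_func(A, np.log) - spd_func(B, np.log), "fro")

def affine_invariant_distance(A, B):
    """Affine-invariant distance: ||log(A^{-1/2} B A^{-1/2})||_F."""
    A_inv_sqrt = spd_func(A, lambda w: w ** -0.5)
    return np.linalg.norm(spd_func(A_inv_sqrt @ B @ A_inv_sqrt, np.log), "fro")

# Example: two small SPD matrices built as X X^T + eps I (eps ensures positivity).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))
A = X @ X.T + 1e-3 * np.eye(3)
B = Y @ Y.T + 1e-3 * np.eye(3)
print(log_euclidean_distance(A, B), affine_invariant_distance(A, B))
```

Both distances vanish only when A = B and, unlike the plain Euclidean distance on matrix entries, respect the curved geometry of the SPD manifold; the Log-Euclidean version is cheaper because the matrix logarithms can be precomputed once per matrix.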

Citation (APA)
Minh, H. Q., & Murino, V. (2016). From Covariance Matrices to Covariance Operators: Data Representation from Finite to Infinite-Dimensional Settings. In Advances in Computer Vision and Pattern Recognition (pp. 115–143). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-319-45026-1_5
