Positive Definite Matrices: Data Representation and Applications to Computer Vision

Abstract

Numerous applications in computer vision and machine learning rely on representations of data that are compact, discriminative, and robust while satisfying several desirable invariances. One such recently successful representation is offered by symmetric positive definite (SPD) matrices. However, the modeling power of SPD matrices comes at a price: rather than a flat Euclidean view, SPD matrices are more naturally viewed through curved geometry (Riemannian or otherwise), which often complicates matters. We focus on models and algorithms that rely on the geometry of SPD matrices, and make our discussion concrete by casting it in terms of covariance descriptors for images. We summarize various commonly used distance metrics on SPD matrices, before highlighting formulations and algorithms for solving sparse coding and dictionary learning problems involving SPD data. Through empirical results, we showcase the benefits of mathematical models that exploit the curved geometry of SPD data across a diverse set of computer vision applications.
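
To make the abstract concrete, the following is a minimal sketch, not taken from the chapter, of a covariance descriptor and two widely used distance metrics on SPD matrices of the kind the abstract alludes to: the affine-invariant Riemannian metric and the log-Euclidean metric. The function names, the eps ridge, and the toy data are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def covariance_descriptor(features, eps=1e-6):
    """Covariance descriptor of per-pixel feature vectors (n_pixels x d).
    A small ridge eps*I (assumed choice) keeps the matrix strictly positive definite."""
    C = np.cov(features, rowvar=False)
    return C + eps * np.eye(C.shape[0])

def _spd_apply(X, f):
    """Apply a scalar function f to the eigenvalues of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(X)
    return (V * f(w)) @ V.T

def airm_distance(X, Y):
    """Affine-invariant Riemannian metric: ||log(X^{-1/2} Y X^{-1/2})||_F."""
    X_isqrt = _spd_apply(X, lambda w: 1.0 / np.sqrt(w))
    return np.linalg.norm(_spd_apply(X_isqrt @ Y @ X_isqrt, np.log), 'fro')

def log_euclidean_distance(X, Y):
    """Log-Euclidean metric: ||log(X) - log(Y)||_F."""
    return np.linalg.norm(_spd_apply(X, np.log) - _spd_apply(Y, np.log), 'fro')

# Toy usage: random "pixel feature" matrices stand in for real image features.
rng = np.random.default_rng(0)
C1 = covariance_descriptor(rng.standard_normal((500, 5)))
C2 = covariance_descriptor(2.0 * rng.standard_normal((500, 5)))
print(airm_distance(C1, C2), log_euclidean_distance(C1, C2))
```

Working through the eigendecomposition keeps the matrix logarithm and inverse square root real and symmetric for SPD inputs, which is why the sketch avoids general-purpose matrix-function routines.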

Cite

APA

Cherian, A., & Sra, S. (2016). Positive Definite Matrices: Data Representation and Applications to Computer Vision. In Advances in Computer Vision and Pattern Recognition (pp. 93–114). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-319-45026-1_4
