Latent Space Geometric Statistics


Abstract

Deep generative models, e.g., variational autoencoders and generative adversarial networks, produce latent representations of observed data. The low dimensionality of the latent space provides an ideal setting for analysing high-dimensional data that would otherwise often be infeasible to handle statistically. The linear Euclidean geometry of the high-dimensional data space pulls back to a nonlinear Riemannian geometry on the latent space, where classical linear statistical techniques are no longer applicable. We show how analysis of data in their latent space representation can be performed using techniques from the field of geometric statistics. Geometric statistics provides generalisations of Euclidean statistical notions, including means, principal component analysis, and maximum likelihood estimation of parametric distributions. We consider estimation procedures on the latent space and address the computational complexity of applying geometric algorithms to high-dimensional data by training a separate neural network to approximate the Riemannian metric and cometric tensor that capture the shape of the learned data manifold.
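The central construction in the abstract is the pullback of the Euclidean data-space metric to the latent space: if g is the decoder, the induced Riemannian metric is M(z) = J(z)ᵀJ(z), where J(z) is the Jacobian of g at z. The following is a minimal sketch of this idea with a hypothetical toy decoder and a numerical Jacobian; the decoder and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical toy decoder g: R^2 -> R^3, standing in for a trained
# generative model's decoder (illustrative, not from the paper).
def decoder(z):
    x, y = z
    return np.array([x, y, x**2 + y**2])  # embeds the latent plane as a paraboloid

def jacobian(f, z, eps=1e-6):
    # Numerical Jacobian of f at z via central differences.
    z = np.asarray(z, dtype=float)
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f(z - dz)) / (2 * eps)
    return J

def pullback_metric(z):
    # The Euclidean metric on data space pulls back to M(z) = J(z)^T J(z),
    # a (symmetric positive-definite) Riemannian metric on the latent space.
    J = jacobian(decoder, z)
    return J.T @ J

M = pullback_metric([1.0, 0.0])
# For this paraboloid decoder, M(z) = I + 4 z z^T, so M((1, 0)) = diag(5, 1).
```

In practice the decoder is a neural network, J(z) comes from automatic differentiation, and — as the abstract notes — repeatedly evaluating M(z) and its inverse (the cometric) in high dimensions is the cost that motivates training a second network to approximate them.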

Citation (APA)

Kühnel, L., Fletcher, T., Joshi, S., & Sommer, S. (2021). Latent Space Geometric Statistics. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12666 LNCS, pp. 163–178). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-68780-9_16
