Global Divergences Between Measures: From Hausdorff Distance to Optimal Transport


Abstract

The data fidelity term is a key component of shape registration pipelines: computed at every step, its gradient is the vector field that drives a deformed model towards its target. Unfortunately, most classical formulas are at most semi-local: their gradients saturate and stop being informative beyond a given distance, with dire consequences for the robustness of shape analysis pipelines. In this paper, we build on recent theoretical advances on Sinkhorn entropies and divergences [6] to present a unified view of three fidelities between measures that alleviate this problem: the Energy Distance from statistics; the (weighted) Hausdorff distance from computer graphics; and the Wasserstein distance from Optimal Transport theory. The Hausdorff and Sinkhorn divergences are positive fidelities that interpolate between these three quantities, and we implement them through efficient, freely available GPU routines. They should allow the shape analyst to handle large deformations without hassle.
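To make the interpolation concrete, here is a minimal NumPy sketch of an entropy-regularized Sinkhorn divergence between two weighted point clouds. This is only an illustrative reconstruction under standard conventions (squared-Euclidean cost, symmetric debiasing S_eps(a,b) = OT_eps(a,b) - OT_eps(a,a)/2 - OT_eps(b,b)/2), not the paper's GPU implementation; all function names are hypothetical.

```python
import numpy as np

def sinkhorn_cost(a, x, b, y, eps=0.1, n_iters=200):
    """Entropic OT cost OT_eps between the weighted clouds (a, x) and (b, y),
    computed with naive fixed-point Sinkhorn updates on the dual potentials."""
    # Squared-Euclidean cost matrix C[i, j] = |x_i - y_j|^2 / 2.
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1) / 2
    f = np.zeros(len(a))          # dual potential on the x-cloud
    g = np.zeros(len(b))          # dual potential on the y-cloud
    log_a, log_b = np.log(a), np.log(b)
    for _ in range(n_iters):
        # Soft-min (log-sum-exp) updates of the dual potentials.
        f = -eps * np.log(np.sum(np.exp(log_b + (g - C) / eps), axis=1))
        g = -eps * np.log(np.sum(np.exp(log_a + (f - C.T) / eps), axis=1))
    # At convergence, the dual objective equals <a, f> + <b, g>.
    return np.dot(a, f) + np.dot(b, g)

def sinkhorn_divergence(a, x, b, y, eps=0.1):
    """Debiased Sinkhorn divergence: positive, and zero iff the measures match."""
    return (sinkhorn_cost(a, x, b, y, eps)
            - 0.5 * sinkhorn_cost(a, x, a, x, eps)
            - 0.5 * sinkhorn_cost(b, y, b, y, eps))
```

As eps grows large this quantity tends towards an Energy-Distance-like kernel fidelity, while eps -> 0 recovers the (unregularized) Wasserstein cost; the paper's point is that the debiased divergence stays positive and globally informative across this whole range.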

Citation (APA)

Feydy, J., & Trouvé, A. (2018). Global Divergences Between Measures: From Hausdorff Distance to Optimal Transport. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11167 LNCS, pp. 102–115). Springer Verlag. https://doi.org/10.1007/978-3-030-04747-4_10
