On a variational definition for the Jensen-Shannon symmetrization of distances based on the information radius


Abstract

We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson’s information radius. The variational definition applies to any arbitrary distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to belong to prescribed families of probability measures, we get relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances that generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks of probability measures, including statistical mixtures.
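
As a rough illustration of the variational definition described in the abstract (not code from the paper), the Python sketch below checks numerically that minimizing the average Kullback-Leibler divergence from p and q to a common distribution c recovers the closed-form Jensen-Shannon divergence, whose minimizer is the arithmetic mixture (p + q)/2. The example distributions, the softmax parameterization of c, and the choice of optimizer are illustrative assumptions.

```python
# Minimal sketch: the Jensen-Shannon divergence as the minimum average
# KL "radius" over a center distribution c (assumed discrete distributions).
import numpy as np
from scipy.optimize import minimize

def kl(p, q):
    """Kullback-Leibler divergence KL(p : q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def js_closed_form(p, q):
    """Closed-form Jensen-Shannon divergence via the arithmetic mixture."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_variational(p, q):
    """JS divergence obtained by minimizing the average KL radius over c."""
    def objective(logits):
        c = np.exp(logits) / np.sum(np.exp(logits))  # softmax keeps c on the simplex
        return 0.5 * kl(p, c) + 0.5 * kl(q, c)
    res = minimize(objective, x0=np.zeros(len(p)), method="Nelder-Mead")
    return res.fun

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.2, 0.5, 0.3])
print(js_closed_form(p, q))   # closed-form value ...
print(js_variational(p, q))   # ... approximately recovered by the variational optimization
```

Replacing the Kullback-Leibler divergence by another distance in the objective above yields, in the same spirit, a Jensen-Shannon symmetrization of that distance, which is the generalization the paper develops.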

Citation (APA)
Nielsen, F. (2021). On a variational definition for the Jensen-Shannon symmetrization of distances based on the information radius. Entropy, 23(4). https://doi.org/10.3390/e23040464
