Entropy and divergence associated with power function and the statistical application

Abstract

In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To address this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust to outliers, and a local learning property associated with the method is discussed. © 2010 by the authors; licensee Molecular Diversity Preservation International, Basel, Switzerland.
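As a sketch of the family the abstract points to (the normalization below follows the β-power divergence standard in this literature, e.g. the density power divergence of Basu et al.; the paper's own constants may differ), the power cross entropy between densities p and q can be written

\[
C_\beta(p, q) = -\frac{1}{\beta}\int p(x)\, q(x)^{\beta}\, dx + \frac{1}{1+\beta}\int q(x)^{1+\beta}\, dx,
\qquad
D_\beta(p, q) = C_\beta(p, q) - C_\beta(p, p) \ge 0.
\]

As β → 0, C_β(p, q) tends to −∫ p(x) log q(x) dx up to an additive constant, so D_β recovers the Kullback-Leibler divergence and minimizing its empirical version reproduces maximum likelihood. For β > 0, the gradient of the empirical loss weights each observation's score by f(x_i; θ)^β, so points lying in low-density regions of the fitted model (outliers) are downweighted, which is the mechanism behind the robustness and the local learning property mentioned above.

The following minimal Python sketch illustrates this for a Gaussian location model with known scale (a hypothetical illustration under the above assumptions, not code from the paper; with σ fixed, the model-integral term is constant in μ, so the estimating equation reduces to an iteratively reweighted mean):

import numpy as np

def power_mean(x, beta=0.5, sigma=1.0, iters=50):
    # Fixed-point iteration for the beta-power-divergence estimate of a
    # Gaussian mean with known scale: mu = sum(w_i * x_i) / sum(w_i),
    # where w_i = f(x_i; mu, sigma)^beta up to a constant factor.
    mu = np.median(x)  # robust starting point
    for _ in range(iters):
        w = np.exp(-beta * (x - mu) ** 2 / (2.0 * sigma ** 2))
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95),   # clean data
                    rng.normal(10.0, 1.0, 5)])  # 5% gross outliers
print("sample mean (MLE):", x.mean())       # pulled toward the outliers
print("power estimate:   ", power_mean(x))  # stays near 0

Setting beta = 0 recovers the ordinary mean (the MLE); larger beta trades efficiency at the model for stronger downweighting of outliers.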

Citation (APA)

Eguchi, S., & Kato, S. (2010). Entropy and divergence associated with power function and the statistical application. Entropy, 12(2), 262–274. https://doi.org/10.3390/e12020262
