Abstract
In statistical physics, the Boltzmann–Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback–Leibler divergence connects the Boltzmann–Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation is supported by its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To address this problem, a new statistical method closely related to the Tsallis entropy is proposed and shown to be robust against outliers, and a local learning property associated with the method is discussed. © 2010 by the authors; licensee Molecular Diversity Preservation International, Basel, Switzerland.
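For context, the connection between entropy, divergence, and likelihood mentioned above can be written out explicitly. The following display is a standard identity, not a formula quoted from the paper: for a data density p and a model density q_θ,

\[
D_{\mathrm{KL}}(p \,\|\, q_\theta)
\;=\; \int p(x)\,\log\frac{p(x)}{q_\theta(x)}\,dx
\;=\; -H(p) \;-\; \mathbb{E}_p\!\left[\log q_\theta(X)\right],
\qquad
H(p) \;=\; -\int p(x)\,\log p(x)\,dx,
\]

so minimizing the Kullback–Leibler divergence over θ is equivalent to maximizing the expected log-likelihood, whose empirical counterpart is the maximum likelihood method.

Similarly, the Tsallis entropy and a power-type divergence of the kind the abstract alludes to are commonly written as follows; the divergence studied by Eguchi and Kato may differ in form or normalization, so this should be read as a representative example (the density power divergence of Basu et al.) rather than the paper's exact definition:

\[
S_\beta(p) \;=\; \frac{1}{\beta}\left(1 - \int p(x)^{1+\beta}\,dx\right),
\qquad
D_\beta(p \,\|\, q)
\;=\; \frac{1}{\beta}\int p(x)\left\{p(x)^{\beta} - q(x)^{\beta}\right\}dx
\;-\; \frac{1}{1+\beta}\int \left\{p(x)^{1+\beta} - q(x)^{1+\beta}\right\}dx.
\]

As β → 0 this recovers the Kullback–Leibler divergence, while for β > 0 the resulting estimating equation weights each observation by q_θ(x)^β, downweighting points with small model density; this weighting is the source of the robustness against outliers.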
Citation
Eguchi, S., & Kato, S. (2010). Entropy and divergence associated with power function and the statistical application. Entropy, 12(2), 262–274. https://doi.org/10.3390/e12020262