Abstract
We show that the derivative of the relative entropy with respect to its parameters is bounded below and above. We characterize the conditions under which this derivative can reach zero. We use these results to explain when the minimum relative entropy and maximum log-likelihood approaches are valid. We show that these approaches naturally come into effect in the presence of large data sets, and that they are inherent properties of any density estimation process involving large numbers of random variables.
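To make the abstract's central connection concrete, the following standard identities (a sketch, not the paper's own derivation, whose precise bounds appear only in the full text) show why minimizing relative entropy coincides with maximizing log-likelihood. Here $p$ denotes the underlying data distribution, $q_\theta$ a parametric model, and $x_1,\dots,x_n$ an i.i.d. sample; all of these symbols are illustrative assumptions, not notation taken from the paper.

% Illustrative notation: p (data distribution), q_theta (model), x_i (samples);
% these are assumed for this sketch and may differ from the paper's notation.
\[
D_{\mathrm{KL}}(p \,\|\, q_\theta)
  = \mathbb{E}_{x \sim p}\!\left[\log p(x)\right]
  - \mathbb{E}_{x \sim p}\!\left[\log q_\theta(x)\right],
\]
and since the first term does not depend on $\theta$,
\[
\arg\min_\theta D_{\mathrm{KL}}(p \,\|\, q_\theta)
  = \arg\max_\theta \mathbb{E}_{x \sim p}\!\left[\log q_\theta(x)\right]
  \approx \arg\max_\theta \frac{1}{n}\sum_{i=1}^{n} \log q_\theta(x_i),
\]
where the approximation improves as $n$ grows by the law of large numbers, matching the abstract's claim that these approaches come into effect for large data sets. Under standard regularity conditions permitting differentiation under the expectation, the derivative the paper bounds is
\[
\frac{\partial}{\partial \theta}\, D_{\mathrm{KL}}(p \,\|\, q_\theta)
  = -\,\mathbb{E}_{x \sim p}\!\left[\frac{\partial}{\partial \theta}\log q_\theta(x)\right].
\]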
Zegers, P., Fuentes, A., & Alarcón, C. (2013). Relative entropy derivative bounds. Entropy, 15(7), 2861–2873. https://doi.org/10.3390/e15072861