Interval entropy and informative distance

Abstract

The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for doubly (two-sided) truncated random variables. In this paper, we show that the interval entropy uniquely determines the distribution function. Furthermore, based on Kullback-Leibler discrimination information, we propose a measure of discrepancy between two lifetime distributions over an interval of time. We study various properties of this measure, including its connections with the residual and past measures of discrepancy and with interval entropy, and we obtain upper and lower bounds for it. © 2012 by the authors.
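The two quantities described above can be computed numerically for any distribution with a known density and CDF. The sketch below is illustrative only, assuming the standard definitions: the interval entropy is the Shannon entropy of the density renormalized to (t1, t2), and the interval discrepancy is the Kullback-Leibler divergence between two densities each renormalized to that interval. The function names and the trapezoidal integration scheme are this example's choices, not the paper's.

```python
import math

def interval_entropy(pdf, cdf, t1, t2, n=20000):
    """Interval entropy H(t1, t2) = -integral over (t1, t2) of g(x) log g(x),
    where g(x) = pdf(x) / (cdf(t2) - cdf(t1)) is the doubly truncated density."""
    mass = cdf(t2) - cdf(t1)
    h = (t2 - t1) / n
    total = 0.0
    for i in range(n + 1):
        x = t1 + i * h
        g = pdf(x) / mass
        term = -g * math.log(g) if g > 0 else 0.0
        w = 0.5 if i in (0, n) else 1.0  # trapezoidal end-point weights
        total += w * term
    return total * h

def interval_kl(pdf_f, cdf_f, pdf_g, cdf_g, t1, t2, n=20000):
    """Interval discrepancy: KL divergence between the two lifetime densities,
    each renormalized to (t1, t2)."""
    mf = cdf_f(t2) - cdf_f(t1)
    mg = cdf_g(t2) - cdf_g(t1)
    h = (t2 - t1) / n
    total = 0.0
    for i in range(n + 1):
        x = t1 + i * h
        a = pdf_f(x) / mf
        b = pdf_g(x) / mg
        term = a * math.log(a / b) if a > 0 and b > 0 else 0.0
        w = 0.5 if i in (0, n) else 1.0
        total += w * term
    return total * h
```

For an exponential distribution with rate 1 on (0, 1), the interval entropy has the closed form log(1 - e^{-1}) + (1 - 2e^{-1})/(1 - e^{-1}), which the numerical routine reproduces; the KL discrepancy between two exponentials on the same interval is nonnegative, as expected.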

Citation (APA)

Misagh, F., & Yari, G. (2012). Interval entropy and informative distance. Entropy, 14(3), 480–490. https://doi.org/10.3390/e14030480
