Log likelihood spectral distance, entropy rate power, and mutual information with applications to speech coding


Abstract

We provide a new derivation of the log likelihood spectral distance measure for signal processing as the logarithm of the ratio of entropy rate powers. Using this interpretation, we show that the log likelihood ratio is equivalent to the difference of two differential entropies, and further that it can be written as the difference of two mutual informations. These two expressions extend the analysis of signals via the log likelihood ratio beyond spectral matching to the study of differential entropy and mutual information, making the measure of interest in applications beyond speech spectral matching. Examples from speech coding are presented to illustrate the utility of these results.
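
As a brief sketch of the relation the abstract describes (using Shannon's standard definition of entropy power; the paper's exact notation and normalization may differ), the entropy rate power of a stationary process with differential entropy rate \(\bar{h}\) is

\[
Q \;=\; \frac{1}{2\pi e}\, e^{2\bar{h}},
\qquad\text{so}\qquad
\ln\frac{Q_1}{Q_2} \;=\; 2\,(\bar{h}_1 - \bar{h}_2),
\]

i.e., the log ratio of entropy rate powers is, up to a factor of two, a difference of differential entropy rates. For comparison, a standard form of the speech-coding log likelihood (Itakura) ratio between linear prediction coefficient vectors \(\mathbf{a}_1\) and \(\mathbf{a}_2\), with \(R_1\) the autocorrelation matrix of the first signal, is

\[
d_{\mathrm{LLR}}(\mathbf{a}_1, \mathbf{a}_2)
\;=\; \ln\!\left(\frac{\mathbf{a}_2^{T} R_1\, \mathbf{a}_2}{\mathbf{a}_1^{T} R_1\, \mathbf{a}_1}\right).
\]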

Citation (APA)

Gibson, J. D., & Mahadevan, P. (2017). Log likelihood spectral distance, entropy rate power, and mutual information with applications to speech coding. Entropy, 19(9). https://doi.org/10.3390/E19090496
