Probability and Information Theory

Abstract

This chapter serves as an introduction to concepts from elementary probability theory and information theory in the concrete context of the real line and multi-dimensional Euclidean space. The probabilistic concepts of mean, variance, expected value, marginalization, conditioning, and conditional expectation are reviewed. This part of the presentation overlaps somewhat with the previous chapter, which has some pedagogical benefit. There will be no mention of Borel measurability, σ-algebras, filtrations, or martingales, as these are treated in numerous other books on probability theory and stochastic processes such as [1, 14, 15, 27, 32, 48]. The presentation here, while drawing from these excellent works, is restricted to those topics required either in the mathematical and computational modeling of stochastic physical systems or in determining properties of solutions to the equations in these models. Basic concepts of information theory are addressed, such as measures of distance, or "divergence," between probability density functions (pdfs), and the properties of "information" and entropy. All pdfs treated here are differentiable functions on R^n; therefore, the entropy and information measures addressed in this chapter are those referred to in the literature as the "differential" or "continuous" versions.

Citation (APA)
Chirikjian, G. S. (2009). Probability and information theory. In Stochastic Models, Information Theory, and Lie Groups, Volume 1: Classical Results and Geometric Methods (Applied and Numerical Harmonic Analysis, pp. 63–99). Birkhäuser Boston. https://doi.org/10.1007/978-0-8176-4803-9_3
