Theory of information and its value

Abstract

This English version of Ruslan L. Stratonovich's Theory of Information (1975) presents not only the theory but also the methods, techniques, and concepts needed to apply it to critical applications. By unifying information theory, optimization, and statistical physics, the theory of the value of information has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, advances in machine learning and artificial intelligence algorithms, and growing computational resources, understanding information has become essential, and this book is even more relevant today than when it was first published in 1975. It extends the classic work of R. L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and of filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas supported by clear examples, and the material then advances in detail and depth; the reader is not required to be familiar with the more difficult and specialized material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics, and computer science who specialize in information theory, data analysis, or machine learning.

Cite

CITATION STYLE

APA

Belavkin, R. V., Pardalos, P. M., & Principe, J. C. (Eds.). (2020). Theory of information and its value (pp. 1–419). Springer International Publishing. https://doi.org/10.1007/978-3-030-22833-0
