Information theory underlies the design of codes. Claude Shannon founded the field with a seminal article (1948), in which he defined a measure of information: the entropy. In this chapter, we introduce essential concepts in information theory: entropy, optimal coding, cross entropy, and perplexity. Entropy is a versatile measure of the average information content of symbol sequences, and we will explore how it can help us design efficient encodings.
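As a first illustration (a sketch, not taken from the chapter itself), the entropy of a symbol sequence can be estimated directly from its empirical symbol frequencies; the function name and example string below are assumptions for this sketch.

```python
from collections import Counter
from math import log2

def entropy(sequence):
    """Shannon entropy H = -sum p(x) * log2 p(x), in bits per symbol.
    Probabilities are estimated from symbol counts in the sequence."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A sequence over four equiprobable symbols needs log2(4) = 2 bits per symbol.
print(entropy("abcd"))  # → 2.0
```

A lower entropy means the sequence is more predictable, so it can be encoded with fewer bits per symbol on average.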
Nugues, P. M. (2014). Topics in information theory and machine learning. In Cognitive Technologies (pp. 87–121). Springer Verlag. https://doi.org/10.1007/978-3-642-41464-0_4