Topics in information theory and machine learning

Abstract

Information theory underlies the design of codes. Claude Shannon arguably founded the field with a seminal article (1948), in which he defined a measure of information: the entropy. In this chapter, we introduce essential concepts in information theory: entropy, optimal coding, cross entropy, and perplexity. Entropy is a versatile measure of the average information content of symbol sequences, and we will explore how it can help us design efficient encodings.
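
The quantities named in the abstract are simple to compute. The sketch below is not from the chapter; it is a minimal Python illustration, with hypothetical function names and a toy string of my choosing, of how entropy can be estimated from symbol frequencies, and how cross entropy and perplexity score a sequence under an assumed model distribution.

    import math
    from collections import Counter

    def entropy(text):
        """Shannon entropy in bits per symbol, estimated from the
        relative frequencies of the symbols in `text`."""
        counts = Counter(text)
        total = len(text)
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    def cross_entropy(text, model_probs):
        """Cross entropy in bits per symbol of `text` under a model that
        assigns probability model_probs[s] to each symbol s."""
        return -sum(math.log2(model_probs[s]) for s in text) / len(text)

    def perplexity(text, model_probs):
        """Perplexity: 2 raised to the cross entropy."""
        return 2 ** cross_entropy(text, model_probs)

    if __name__ == "__main__":
        sample = "abracadabra"  # toy data, not from the chapter
        print(f"entropy: {entropy(sample):.3f} bits/symbol")

        # An assumed model: uniform over the symbols seen in the sample.
        uniform = {s: 1 / len(set(sample)) for s in set(sample)}
        print(f"cross entropy (uniform): "
              f"{cross_entropy(sample, uniform):.3f} bits/symbol")
        print(f"perplexity (uniform): {perplexity(sample, uniform):.3f}")

On this toy string, the uniform model's cross entropy (log2 5, about 2.322 bits per symbol) exceeds the entropy of the sample (about 2.040 bits), illustrating that entropy is a lower bound on the average code length any model can achieve, which is the intuition behind using it to design efficient encodings.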

Cite

APA

Nugues, P. M. (2014). Topics in information theory and machine learning. In Cognitive Technologies (pp. 87–121). Springer-Verlag. https://doi.org/10.1007/978-3-642-41464-0_4
