Mutual information gain and linear/nonlinear redundancy for agent learning, sequence analysis, and modeling


Abstract

In many applications, intelligent agents need to identify any structure or apparent randomness in an environment and respond appropriately. We use the relative entropy to separate and quantify the presence of both linear and nonlinear redundancy in a sequence, and we introduce the new quantities of total mutual information gain and incremental mutual information gain. We illustrate how these new quantities can be used to analyze and characterize the structure and apparent randomness of purely autoregressive sequences and of speech signals with long- and short-term linear redundancies. The mutual information gain is shown to be an important new tool for capturing and quantifying learning in sequence modeling and analysis.
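The contrast the abstract draws between redundant and apparently random sequences can be illustrated with a simple lag-based mutual information estimate. The sketch below is not the paper's method; it is a minimal plug-in (histogram) estimator of I(X_n; X_{n-1}), assuming an AR(1) process as the example of purely linear redundancy and an i.i.d. Gaussian sequence as the structureless baseline. All function names and parameter choices (quantile binning, 8-level alphabet) are illustrative assumptions.

```python
import numpy as np

def plugin_entropy(symbols):
    """Plug-in (histogram) entropy estimate in bits."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information_lag1(x, n_bins=8):
    """Estimate I(X_n; X_{n-1}): the per-sample information gained
    about the present by conditioning on the immediate past."""
    # Quantize to a small alphabet so histogram estimates are stable.
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    q = np.digitize(x, edges)
    cur, prev = q[1:], q[:-1]
    joint = cur * n_bins + prev          # encode each pair as one symbol
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return plugin_entropy(cur) + plugin_entropy(prev) - plugin_entropy(joint)

rng = np.random.default_rng(0)
n = 200_000

# AR(1) sequence x_n = a*x_{n-1} + w_n: purely linear redundancy.
a, w = 0.9, rng.standard_normal(n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = a * x[i - 1] + w[i]

iid = rng.standard_normal(n)             # no redundancy at all

print(f"AR(1)  lag-1 MI: {mutual_information_lag1(x):.3f} bits")
print(f"i.i.d. lag-1 MI: {mutual_information_lag1(iid):.3f} bits")
```

For the AR(1) sequence the estimate is well above zero (the Gaussian value for a = 0.9 is about 1.2 bits before quantization loss), while for the i.i.d. sequence it is near zero, matching the distinction the abstract draws between structured and apparently random sequences.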

Citation (APA)

Gibson, J. D. (2020). Mutual information gain and linear/nonlinear redundancy for agent learning, sequence analysis, and modeling. Entropy, 22(6). https://doi.org/10.3390/E22060608
