Abstract
In this paper the principle of minimum relative entropy (PMRE) is proposed as a fundamental principle and idea for the field of AGI. It is shown to have a strong mathematical foundation, to be more fundamental than Bayes' rule or MaxEnt alone, and to be relatable to neuroscience. Hierarchical structures, hierarchies of timescales, and the learning and generation of sequences of sequences are among the aspects of cognitive architectures that Friston (Fri09) describes with his free-energy principle, and they agree with the foundations of hierarchical memory prediction frameworks (GH09). The PMRE is very similar, and often equivalent, to Friston's free-energy principle (Fri09); however, the two differ in their treatment of actions and in their definitions of surprise. We propose relative entropy as the standard definition of surprise, as experiments have shown it to be the best current indicator of human surprise (IB09). The learning rate, or interestingness, can be defined as the rate of decrease of relative entropy, so curiosity can be implemented as seeking out situations with the highest learning rate.
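The abstract's definition of surprise as relative entropy can be made concrete with a small sketch. The function below is a minimal illustration, not the authors' implementation: it computes the Kullback-Leibler divergence between a posterior and a prior belief distribution, which is the "Bayesian surprise" quantity the abstract attributes to (IB09). The example distributions are hypothetical.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, for discrete
    distributions given as sequences of probabilities."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Surprise of an observation: relative entropy between the beliefs
# after the observation (posterior) and before it (prior).
prior = [0.25, 0.25, 0.25, 0.25]      # hypothetical beliefs before observing
posterior = [0.70, 0.10, 0.10, 0.10]  # hypothetical beliefs after observing
surprise = relative_entropy(posterior, prior)

# The learning rate ("interestingness") would then be the rate at which
# this relative entropy decreases over successive updates; a curious
# agent prefers situations where that decrease is largest.
```

An unchanged belief yields zero surprise, so `relative_entropy(p, p) == 0`, while any genuine belief revision yields a strictly positive value.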
Van De Ven, A., & Schouten, B. A. M. (2010). A minimum relative entropy principle for AGI. In Artificial General Intelligence - Proceedings of the Third Conference on Artificial General Intelligence, AGI 2010 (pp. 198–199). https://doi.org/10.2991/agi.2010.26