Abstract
Hierarchical Temporal Memory (HTM) is a biologically inspired framework for learning invariant representations of patterns. Classical HTM learning is mainly unsupervised, and once training is complete the network structure is frozen, which makes further training problematic. In this paper we develop a novel technique for incremental supervised HTM learning based on error minimization. We prove that error backpropagation can be naturally and elegantly implemented through native HTM message passing based on Belief Propagation. Our experimental results show that a two-stage training scheme, unsupervised pre-training followed by supervised refinement, is very effective, in line with recent findings on other deep architectures.
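To make the two-stage scheme concrete, here is a minimal, purely illustrative sketch in Python/NumPy. It is not the paper's method: online k-means stands in for HTM's unsupervised coincidence learning, a softmax read-out stands in for the network's class posterior, and plain gradient descent stands in for the Belief Propagation based error backpropagation the authors derive. All names (`bottom_up`, `centers`, `W`) and hyperparameters are assumptions introduced for illustration.

```python
# Hypothetical sketch of two-stage training: (1) unsupervised pre-training
# of a feature layer, (2) supervised refinement by error minimization.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two labeled Gaussian blobs in 4 dimensions.
X = np.vstack([rng.normal(-1, 0.3, (100, 4)), rng.normal(1, 0.3, (100, 4))])
y = np.array([0] * 100 + [1] * 100)

# --- Stage 1: unsupervised pre-training (stand-in for coincidence learning)
K = 8
centers = X[rng.choice(len(X), K, replace=False)].copy()
for _ in range(10):                      # a few online k-means passes
    for x in X:
        j = np.argmin(((centers - x) ** 2).sum(1))
        centers[j] += 0.1 * (x - centers[j])

def bottom_up(x):
    """Bottom-up message: soft evidence for each learned prototype."""
    d2 = ((centers - x) ** 2).sum(1)
    e = np.exp(-d2)
    return e / e.sum()

# --- Stage 2: supervised refinement by error minimization
W = np.zeros((K, 2))                     # read-out weights, tuned by gradient descent
for _ in range(200):
    for x, t in zip(X, y):
        u = bottom_up(x)
        logits = u @ W
        p = np.exp(logits - logits.max())
        p /= p.sum()
        err = p - np.eye(2)[t]           # output error (cross-entropy gradient)
        W -= 0.5 * np.outer(u, err)      # top-down correction of the read-out

acc = np.mean([np.argmax(bottom_up(x) @ W) == t for x, t in zip(X, y)])
print(f"training accuracy after refinement: {acc:.2f}")
```

The sketch only mirrors the training pipeline's shape (frozen unsupervised features, then supervised error-driven refinement); the paper's contribution is that the refinement step can be carried out within HTM's own message-passing machinery rather than by a separate backpropagation pass.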
Citation
Maltoni, D., & Rehn, E. M. (2012). Incremental learning by message passing in hierarchical temporal memory. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7477 LNAI, pp. 24–35). https://doi.org/10.1007/978-3-642-33212-8_3