A New Hierarchical Temporal Memory Algorithm Based on Activation Intensity

Citations: 6 · Mendeley readers: 9

Abstract

As a human-cortex-inspired computing model, hierarchical temporal memory (HTM) has shown great promise in sequence learning and has been applied to various time-series tasks. HTM uses a combination of columns and neurons to learn the temporal patterns within a sequence. However, the conventional HTM model compresses the input into two coarse column states, active and inactive, and uses a fixed learning strategy. This simplicity limits the representational capability of HTM and ignores the influence of active columns on learning the temporal context. To address these issues, we propose a new HTM algorithm based on activation intensity. By introducing column activation intensity, more useful and fine-grained information from the input is retained for sequence learning. Furthermore, a self-adaptive nonlinear learning strategy is proposed in which synaptic connections are dynamically adjusted according to the activation intensity of the columns. Extensive experiments are carried out on two real-world time-series datasets. Compared with conventional HTM and LSTM models, our method achieves higher accuracy with lower time overhead.
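The two ideas in the abstract — a graded column activation intensity in place of the binary active/inactive state, and permanence updates scaled by that intensity — can be sketched roughly as follows. This is a minimal toy illustration, not the paper's algorithm: the dimensions, the per-column normalization, and the `tanh` scaling used for the "nonlinear" adjustment are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions; the paper's actual configuration is not given here.
n_columns, n_inputs = 8, 32

# Each column has potential synapses to every input bit, each with a scalar permanence.
permanence = rng.uniform(0.0, 1.0, size=(n_columns, n_inputs))
CONNECTED_THRESHOLD = 0.5  # permanences above this count as connected synapses

def activation_intensity(x):
    """Per-column activation intensity: overlap with the binary input vector,
    normalized by the column's connected-synapse count so it lies in [0, 1]."""
    connected = permanence > CONNECTED_THRESHOLD
    overlap = connected @ x.astype(float)          # raw overlap per column
    n_conn = connected.sum(axis=1)
    return np.where(n_conn > 0, overlap / np.maximum(n_conn, 1), 0.0)

def adapt(x, intensity, p_inc=0.05, p_dec=0.02):
    """Self-adaptive update sketch: scale the usual Hebbian-style permanence
    change nonlinearly by each column's intensity (tanh is an assumption here)."""
    global permanence
    scale = np.tanh(intensity)[:, None]            # saturating, nonlinear weight
    delta = np.where(x > 0, p_inc, -p_dec)         # reinforce active bits, decay the rest
    permanence = np.clip(permanence + scale * delta, 0.0, 1.0)

# Example: one sparse binary input and one learning step.
x = np.zeros(n_inputs)
x[:6] = 1
inten = activation_intensity(x)
adapt(x, inten)
```

The point of the sketch is the contrast with standard HTM: columns here carry a real-valued intensity rather than a 0/1 state, and columns that match the input more strongly make proportionally larger synaptic adjustments.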


Citation (APA)

Niu, D., Yang, L., Cai, T., Li, L., Wu, X., & Wang, Z. (2022). A New Hierarchical Temporal Memory Algorithm Based on Activation Intensity. Computational Intelligence and Neuroscience, 2022. https://doi.org/10.1155/2022/6072316

Readers' Seniority

PhD / Postgrad / Masters / Doc: 3 (50%)
Lecturer / Post doc: 2 (33%)
Researcher: 1 (17%)

Readers' Discipline

Computer Science: 5 (83%)
Economics, Econometrics and Finance: 1 (17%)
