Unsupervised Progressive Learning and the STAM Architecture


Abstract

We first pose the Unsupervised Progressive Learning (UPL) problem: an online representation learning problem in which the learner observes a non-stationary and unlabeled data stream, learning a growing number of features that persist over time even though the data is not stored or replayed. To solve the UPL problem we propose the Self-Taught Associative Memory (STAM) architecture. Layered hierarchies of STAM modules learn through a combination of online clustering, novelty detection, forgetting of outliers, and storing only prototypical features rather than specific examples. We evaluate STAM representations using clustering and classification tasks. While there are no existing learning scenarios that are directly comparable to UPL, we compare the STAM architecture with two recent continual learning models, Memory Aware Synapses (MAS) and Gradient Episodic Memory (GEM), after adapting them to the UPL setting.
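The abstract names the ingredients of a STAM module (online clustering, novelty detection, forgetting of outliers, prototype-only storage) without implementation details. The sketch below is only a rough illustration of how those ingredients can fit together in a single layer; the class name, distance metric, thresholds, and update rule are assumptions for illustration, not the paper's actual STAM module.

```python
import numpy as np

class OnlinePrototypeLayer:
    """Hypothetical single layer: online clustering with novelty detection,
    storing only prototype (centroid) vectors rather than raw inputs."""

    def __init__(self, novelty_threshold, learning_rate=0.05, max_prototypes=256):
        self.novelty_threshold = novelty_threshold  # distance above which an input counts as novel
        self.learning_rate = learning_rate          # how fast a matched prototype drifts toward the input
        self.max_prototypes = max_prototypes
        self.prototypes = []                        # stored centroid vectors
        self.usage = []                             # match counts, used for forgetting outliers

    def observe(self, x):
        """Process one unlabeled input from the stream; return the matched prototype index."""
        if not self.prototypes:
            self.prototypes.append(x.copy())
            self.usage.append(1)
            return 0
        dists = [np.linalg.norm(x - p) for p in self.prototypes]
        j = int(np.argmin(dists))
        if dists[j] > self.novelty_threshold and len(self.prototypes) < self.max_prototypes:
            # Novelty detection: the input is far from every stored prototype,
            # so create a new prototype instead of distorting an existing one.
            self.prototypes.append(x.copy())
            self.usage.append(1)
            return len(self.prototypes) - 1
        # Online clustering update: move the nearest prototype toward the input.
        self.prototypes[j] += self.learning_rate * (x - self.prototypes[j])
        self.usage[j] += 1
        return j

    def forget_outliers(self, min_usage=2):
        """Drop rarely matched prototypes so memory stays bounded."""
        keep = [i for i, u in enumerate(self.usage) if u >= min_usage]
        self.prototypes = [self.prototypes[i] for i in keep]
        self.usage = [self.usage[i] for i in keep]


# Usage example: a small synthetic non-stationary stream (distribution shift halfway through).
rng = np.random.default_rng(0)
layer = OnlinePrototypeLayer(novelty_threshold=1.5)
stream = np.concatenate([rng.normal(0, 0.3, (200, 8)),
                         rng.normal(3, 0.3, (200, 8))])
for x in stream:
    layer.observe(x)
layer.forget_outliers()
print(len(layer.prototypes), "prototypes retained")
```

Because prototypes are created on novelty and pruned only when rarely used, features learned in the first phase of the stream can persist after the shift, which is the behavior the UPL setting asks for.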

Citation (APA)

Smith, J., Taylor, C., Baer, S., & Dovrolis, C. (2021). Unsupervised Progressive Learning and the STAM Architecture. In IJCAI International Joint Conference on Artificial Intelligence (pp. 2979–2987). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/410
