Modeling Hebb learning rule for unsupervised learning

Abstract

This paper models the Hebb learning rule and proposes a neuron learning machine (NLM). The Hebb learning rule describes the plasticity of the connection between presynaptic and postsynaptic neurons and is itself unsupervised; it formulates the updating gradient of the connecting weight in artificial neural networks. In this paper, we construct an objective function by modeling the Hebb rule. We make a hypothesis to simplify the model and, based on this hypothesis and the stability of solutions, introduce a correlation-based constraint. Analyzing the model from the perspectives of maintaining abstract information and increasing the energy-based probability of the observed data, we find that this biologically inspired model is capable of learning useful features. NLM can also be stacked to learn hierarchical features, and it can be reformulated into a convolutional version to extract features from two-dimensional data. Experiments on single-layer and deep networks demonstrate the effectiveness of NLM in unsupervised feature learning.
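The abstract refers to the Hebb rule as formulating the updating gradient of a connecting weight from the correlated activity of presynaptic and postsynaptic neurons. As a minimal sketch of that classic update (not the paper's NLM objective; the function name, learning rate `eta`, and dimensions below are illustrative assumptions):

```python
import numpy as np

def hebbian_update(W, x, eta=0.1):
    """One Hebbian step: compute the postsynaptic response y = W x,
    then strengthen each weight in proportion to the correlation
    between its presynaptic and postsynaptic activations:
    W <- W + eta * outer(y, x)."""
    y = W @ x                         # postsynaptic activations
    return W + eta * np.outer(y, x)   # "cells that fire together wire together"

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 4))  # 4 presynaptic -> 3 postsynaptic units
x = rng.normal(size=4)                  # presynaptic activations
W_new = hebbian_update(W, x)
print(W_new.shape)  # (3, 4)
```

Note that this raw update is unstable (weights grow without bound), which is one motivation for the stability-driven, correlation-based constraint the abstract mentions.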

Citation (APA)
Liu, J., Gong, M., & Miao, Q. (2017). Modeling Hebb learning rule for unsupervised learning. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 0, pp. 2315–2321). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/322