The Hebbian learning rule is one of the earliest and simplest learning rules for neural networks. It was proposed by Donald Hebb, who suggested that if two interconnected neurons are both "on" at the same time, the weight between them should be increased. A Hebbian network is a single-layer neural network consisting of one input layer with many input units and one output layer with a single output unit. This architecture is usually used for pattern classification. The bias, which increases the net input, has the value 1. This chapter presents the Hebbian learning algorithm, and MATLAB code is given for several classification problems.
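As a rough illustration of the update described above, the sketch below trains a single output unit with the Hebb rule (w_i(new) = w_i(old) + x_i·y, b(new) = b(old) + y) on the AND function with bipolar inputs and targets. This is a common textbook example, not necessarily the one used in the chapter, and it is written in Python rather than the chapter's MATLAB.

```python
# Hebbian learning sketch: one pass over the bipolar AND training set.
# (Illustrative example; the chapter's own examples are in MATLAB.)
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0.0, 0.0]  # weights, initialized to zero
b = 0.0         # bias; the bias input itself is fixed at 1

for x, y in samples:
    # Hebb rule: increase each weight by the product of its input and the target
    w = [wi + xi * y for wi, xi in zip(w, x)]
    b = b + y     # bias update uses the fixed bias input of 1

print(w, b)  # -> [2.0, 2.0] -2.0
```

After one pass the separating line 2·x1 + 2·x2 − 2 = 0 classifies all four AND patterns correctly, which is the standard result for this example.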
Chakraverty, S., Sahoo, D. M., & Mahato, N. R. (2019). Hebbian Learning Rule. In Concepts of Soft Computing (pp. 175–182). Springer Singapore. https://doi.org/10.1007/978-981-13-7430-2_12