Hebbian Learning Rule

  • Chakraverty, S.
  • Sahoo, D. M.
  • Mahato, N. R.

Abstract

The Hebbian learning rule is one of the earliest and simplest learning rules for neural networks. It was proposed by Donald Hebb, who postulated that if two interconnected neurons are both "on" at the same time, the weight between them should be increased. A Hebbian network is a single-layer neural network consisting of one input layer with many input units and one output layer with a single output unit; this architecture is usually used for pattern classification. The bias, which increases the net input, has the value 1. This chapter presents the Hebbian learning algorithm, and MATLAB codes are developed for different classification problems.
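The weight update described in the abstract can be illustrated with a short sketch. The chapter's own examples use MATLAB and are not reproduced here; the Python example below is a minimal sketch under assumed data (the bipolar AND-gate patterns are a standard textbook illustration, not taken from the abstract). It trains a single Hebbian output unit with a bias input fixed at 1, adding x·t to each weight and t to the bias after every training pattern.

```python
import numpy as np

# Assumed example data: bipolar AND-gate training pairs (not from the abstract itself)
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])  # input patterns
t = np.array([1, -1, -1, -1])                        # bipolar targets

w = np.zeros(2)  # connection weights, initialised to zero
b = 0.0          # weight on the bias input, whose value is fixed at 1

# Hebb rule: after each pattern, w <- w + x*t and b <- b + t
for x, target in zip(X, t):
    w += x * target
    b += target

print("weights:", w, "bias:", b)

# Classify with the learned weights: output = sign(w . x + b)
for x in X:
    print(x, "->", int(np.sign(w @ x + b)))
```

Bipolar (+1/−1) inputs and targets are commonly used with the plain Hebb rule, since a 0 input or target contributes no weight change at all.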

Cite (APA style)

Chakraverty, S., Sahoo, D. M., & Mahato, N. R. (2019). Hebbian Learning Rule. In Concepts of Soft Computing (pp. 175–182). Springer Singapore. https://doi.org/10.1007/978-981-13-7430-2_12
