Perceptrons

  • Du, K.-L.
  • Swamy, M. N. S.
Abstract

The perceptron [38], also referred to as a McCulloch–Pitts neuron or linear threshold gate, is the earliest and simplest neural network model. Rosenblatt used a single-layer perceptron for the classification of linearly separable patterns. For a one-neuron perceptron, the network topology is shown in Fig. 1.2, and the net input to the neuron is given by

\[
\mathrm{net} = \sum_{i=1}^{J_1} w_i x_i - \theta = \mathbf{w}^T \mathbf{x} - \theta, \tag{3.1}
\]

where all the symbols are as explained in Sect. 1.2. The one-neuron perceptron using the hard-limiter activation function is useful for the classification of the vector \(\mathbf{x}\) into two classes. The two decision regions are separated by the hyperplane

\[
\mathbf{w}^T \mathbf{x} - \theta = 0, \tag{3.2}
\]

where the threshold \(\theta\) is a parameter used to shift the decision boundary away from the origin. The three popular activation functions are the hard-limiter (threshold) function

\[
\phi(x) = \begin{cases} 1, & x \ge 0 \\ -1 \ (\text{or } 0), & x < 0 \end{cases}, \tag{3.3}
\]

the logistic function

\[
\phi(x) = \frac{1}{1 + e^{-\beta x}}, \tag{3.4}
\]
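As a minimal sketch (not from the chapter itself), the one-neuron perceptron of Eqs. (3.1)–(3.4) can be expressed as follows; the function names and the example weights and threshold are illustrative assumptions, not material from the source:

```python
import numpy as np

def hard_limiter(net):
    """Threshold activation, Eq. (3.3): 1 for net >= 0, -1 otherwise."""
    return 1.0 if net >= 0 else -1.0

def logistic(net, beta=1.0):
    """Logistic activation, Eq. (3.4): 1 / (1 + exp(-beta * net))."""
    return 1.0 / (1.0 + np.exp(-beta * net))

def perceptron_output(w, x, theta, phi=hard_limiter):
    """One-neuron perceptron: net input w^T x - theta (Eq. 3.1),
    passed through the activation function phi."""
    net = np.dot(w, x) - theta
    return phi(net)

# Illustrative example: the hyperplane x1 + x2 - 1 = 0 (Eq. 3.2)
# separates the plane into two decision regions.
w = np.array([1.0, 1.0])
theta = 1.0
print(perceptron_output(w, np.array([2.0, 2.0]), theta))  # 1.0  (net = 3 >= 0)
print(perceptron_output(w, np.array([0.0, 0.0]), theta))  # -1.0 (net = -1 < 0)
```

With the hard limiter the output is a class label; substituting `phi=logistic` instead gives a smooth value in (0, 1), which is what makes gradient-based training possible in later chapters.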

APA

Du, K.-L., & Swamy, M. N. S. (2014). Perceptrons. In Neural Networks and Statistical Learning (pp. 67–81). Springer London. https://doi.org/10.1007/978-1-4471-5571-3_3
