Simulation of Learning Logical Functions Using Single-Layer Perceptron


Abstract

As the simplest neural network, the perceptron computes a linear combination of its inputs and, after training on real-valued labeled samples, predicts the classes of unseen test samples. Two sets of samples are called linearly separable if they can be separated by a straight line (more generally, a hyperplane). The perceptron can thus be viewed as a binary classifier trained under the supervised machine learning paradigm. The standard perceptron learning algorithm takes a fixed learning rate as input, a choice that can affect the efficiency of the learning process. This work implements the logical operations OR, AND, NOR, and NAND using a single-layer perceptron. A modified perceptron algorithm is proposed that finds the optimal learning rate for the training process: given a range of candidate learning rates, it counts the number of epochs each rate needs to converge and selects the rate that requires the fewest.
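The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a standard single-layer perceptron with a step activation is trained on the four linearly separable gates, and a helper picks the candidate learning rate that converges in the fewest epochs, as the abstract describes. All function names, the candidate-rate list, and the zero weight initialization are assumptions for illustration.

```python
import numpy as np

def train_perceptron(X, y, lr, max_epochs=100):
    """Perceptron learning rule with step activation.
    Returns (weights, bias, epochs_until_convergence)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for epoch in range(1, max_epochs + 1):
        errors = 0
        for xi, target in zip(X, y):
            pred = 1 if np.dot(w, xi) + b >= 0 else 0  # step activation
            update = lr * (target - pred)
            if update != 0:
                w = w + update * xi
                b = b + update
                errors += 1
        if errors == 0:          # a full pass with no mistakes: converged
            return w, b, epoch
    return w, b, max_epochs

def best_learning_rate(X, y, rates):
    """Pick the rate needing the fewest epochs (the selection criterion
    described in the abstract)."""
    return min(rates, key=lambda lr: train_perceptron(X, y, lr)[2])

# Truth tables for the four gates covered in the paper.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
gates = {
    "AND":  np.array([0, 0, 0, 1]),
    "OR":   np.array([0, 1, 1, 1]),
    "NAND": np.array([1, 1, 1, 0]),
    "NOR":  np.array([1, 0, 0, 0]),
}

for name, y in gates.items():
    lr = best_learning_rate(X, y, [0.01, 0.1, 0.5, 1.0])
    w, b, epochs = train_perceptron(X, y, lr)
    preds = [1 if np.dot(w, xi) + b >= 0 else 0 for xi in X]
    print(f"{name}: lr={lr}, epochs={epochs}, predictions={preds}")
```

Because all four gates are linearly separable, the perceptron convergence theorem guarantees each run terminates; XOR, by contrast, is not linearly separable and would never converge with a single-layer perceptron.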

Citation (APA)

Ahamad, M. V., Ali, R., Naz, F., & Fatima, S. (2020). Simulation of Learning Logical Functions Using Single-Layer Perceptron. In Advances in Intelligent Systems and Computing (Vol. 1097, pp. 121–133). Springer. https://doi.org/10.1007/978-981-15-1518-7_10
