Design of Fully Analogue Artificial Neural Network with Learning Based on Backpropagation

Abstract

A fully analogue implementation of training algorithms would speed up the training of artificial neural networks. A common choice for training feedforward networks is backpropagation with stochastic gradient descent. However, the circuit design that would enable its analogue implementation is still an open problem. This paper proposes a fully analogue training circuit block concept based on backpropagation for neural networks without clock control. Capacitors are used as memory elements in the presented example. The XOR problem serves as an example for concept-level system validation.
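For reference, the digital algorithm that the paper maps onto analogue circuitry is standard backpropagation with stochastic gradient descent, validated on XOR. The sketch below is a minimal NumPy illustration of that algorithm, not the paper's circuit: the network size (2-4-1), learning rate, and epoch count are assumptions chosen only so the toy example converges.

```python
import numpy as np

# Hedged sketch: plain digital backpropagation with per-sample SGD on XOR,
# the task the paper uses for concept-level validation. Hyperparameters
# (2-4-1 topology, lr, epochs) are illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Weights and biases are the trainable state; in the analogue design,
# capacitors play this role of memory elements.
W1 = rng.normal(0.0, 1.0, (4, 2)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (1, 4)); b2 = np.zeros(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

lr = 1.0
for epoch in range(10000):
    for i in rng.permutation(4):          # stochastic: update on one sample at a time
        x, t = X[i], y[i]
        h = sigmoid(W1 @ x + b1)          # forward pass, hidden layer
        o = sigmoid(W2 @ h + b2)[0]       # forward pass, output
        d_o = (o - t) * o * (1.0 - o)     # output delta (squared-error loss)
        d_h = (W2[0] * d_o) * h * (1.0 - h)  # backpropagated hidden deltas
        W2 -= lr * d_o * h[None, :]; b2 -= lr * d_o
        W1 -= lr * np.outer(d_h, x);  b1 -= lr * d_h

pred = [sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)[0] for x in X]
print([round(p) for p in pred])  # typically converges to the XOR truth table
```

The analogue circuit proposed in the paper performs these same forward and backward signal computations continuously, without the clocked, discrete update loop shown here.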

Citation (APA)

Paulu, F., & Hospodka, J. (2021). Design of Fully Analogue Artificial Neural Network with Learning Based on Backpropagation. Radioengineering, 30(2), 357–363. https://doi.org/10.13164/re.2021.0357
