A fully analogue implementation of training algorithms could substantially speed up the training of artificial neural networks. A common choice for training feedforward networks is backpropagation with stochastic gradient descent; however, a circuit design enabling its fully analogue implementation remains an open problem. This paper proposes a concept for a fully analogue, clockless training circuit block for neural networks based on backpropagation. In the presented example, capacitors serve as the memory elements, and the XOR problem is used for concept-level validation of the system.
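For reference, the algorithm the paper realises in analogue hardware can be sketched in its conventional digital form: a small feedforward network trained on XOR with backpropagation and stochastic gradient descent. The network size, learning rate, and iteration count below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The XOR truth table: the classic non-linearly-separable benchmark.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units and one sigmoid output unit
# (layer width chosen for reliable convergence, not from the paper).
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

lr = 1.0
for epoch in range(20000):
    for i in rng.permutation(4):          # stochastic: one sample at a time
        x, t = X[i:i + 1], y[i:i + 1]
        h = sigmoid(x @ W1 + b1)          # forward pass, hidden layer
        out = sigmoid(h @ W2 + b2)        # forward pass, output layer
        # Backpropagate the squared-error gradient through the sigmoids.
        d_out = (out - t) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient-descent weight updates.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * x.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(pred).ravel())
```

In the paper's analogue counterpart, the weight values updated above would be held as charge on capacitors and adjusted continuously rather than in discrete clocked steps.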
Paulu, F., & Hospodka, J. (2021). Design of Fully Analogue Artificial Neural Network with Learning Based on Backpropagation. Radioengineering, 30(2), 357–363. https://doi.org/10.13164/re.2021.0357