Sequential Bayesian estimation of recurrent neural networks

Abstract

This is a short overview of the authors' research on the sequential (recursive) Bayesian estimation of recurrent neural networks. Our approach is founded on the joint estimation of the synaptic weights, neuron outputs, and structure of a recurrent neural network. Joint estimation makes it possible to generalize teacher forcing, a training heuristic that improves training speed, to sequential training on noisy data. By applying a Gaussian mixture approximation to the relevant probability density functions, we have derived training algorithms capable of dealing with non-Gaussian (multimodal or heavy-tailed) noise on the training samples. Finally, we have used statistics that are recursively updated during sequential Bayesian estimation to derive criteria for growing and pruning synaptic connections and hidden neurons in recurrent neural networks.
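
To make the joint-estimation idea concrete, the sketch below runs an extended Kalman filter over an augmented state that stacks the hidden-neuron outputs together with the flattened synaptic weights of a small recurrent network, so both are estimated from each noisy observation. This is a minimal illustrative sketch, not the paper's algorithm: the finite-difference Jacobians, the random-walk weight model, and all names (num_jac, joint_ekf_step, make_rnn) and parameter values are assumptions introduced here for illustration.

    import numpy as np

    def num_jac(fun, z, eps=1e-5):
        # Finite-difference Jacobian of fun at z (illustrative; analytic
        # Jacobians or automatic differentiation would be used in practice).
        f0 = np.atleast_1d(fun(z))
        J = np.zeros((f0.size, z.size))
        for i in range(z.size):
            dz = np.zeros_like(z); dz[i] = eps
            J[:, i] = (np.atleast_1d(fun(z + dz)) - f0) / eps
        return J

    def joint_ekf_step(z, P, x, y, trans, obs, Q, R):
        # One predict/update cycle of an EKF on the augmented state
        # z = [hidden neuron outputs, flattened weights].
        F = num_jac(lambda s: trans(s, x), z)
        z_pred = trans(z, x)
        P_pred = F @ P @ F.T + Q
        H = num_jac(obs, z_pred)
        y_pred = obs(z_pred)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        z_new = z_pred + K @ (y - y_pred)
        P_new = (np.eye(z.size) - K @ H) @ P_pred
        return z_new, P_new

    def make_rnn(n_h, n_x, n_y):
        # One-layer tanh RNN with a linear readout; the weights are packed
        # behind the hidden state in the augmented vector.
        n_rec = n_h * (n_h + n_x)
        n_w = n_rec + n_y * n_h
        def unpack(z):
            h, w = z[:n_h], z[n_h:]
            W = w[:n_rec].reshape(n_h, n_h + n_x)
            C = w[n_rec:].reshape(n_y, n_h)
            return h, W, C
        def trans(z, x):
            h, W, _ = unpack(z)
            h_new = np.tanh(W @ np.concatenate([h, x]))
            return np.concatenate([h_new, z[n_h:]])  # weights: random walk
        def obs(z):
            h, _, C = unpack(z)
            return C @ h
        return trans, obs, n_h + n_w

    # Usage: track a noisy sine wave one step ahead.
    rng = np.random.default_rng(0)
    trans, obs, n = make_rnn(n_h=3, n_x=1, n_y=1)
    z, P = 0.1 * rng.standard_normal(n), np.eye(n)
    Q, R = 1e-4 * np.eye(n), 0.01 * np.eye(1)
    for k in range(200):
        x = np.array([np.sin(0.1 * k)])
        y = np.array([np.sin(0.1 * (k + 1)) + 0.1 * rng.standard_normal()])
        z, P = joint_ekf_step(z, P, x, y, trans, obs, Q, R)

Because the hidden outputs are part of the estimated state, each update feeds the filtered (rather than the noisy observed) outputs back into the recursion, which is the sense in which joint estimation generalizes teacher forcing to noisy data. The paper goes further than this sketch: it replaces the single Gaussian with a Gaussian mixture approximation and derives growing/pruning criteria from the recursively updated statistics.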

Citation

Todorović, B., Moraga, C., & Stanković, M. (2017). Sequential Bayesian estimation of recurrent neural networks. In Studies in Fuzziness and Soft Computing (Vol. 349, pp. 173–199). Springer Verlag. https://doi.org/10.1007/978-3-319-48317-7_11
