This is a short overview of the authors' research on sequential (recursive) Bayesian estimation of recurrent neural networks. Our approach is founded on the joint estimation of the synaptic weights, neuron outputs, and structure of the recurrent neural network. Joint estimation makes it possible to generalize the training heuristic known as teacher forcing, which improves training speed, to sequential training on noisy data. By applying a Gaussian mixture approximation of the relevant probability density functions, we have derived training algorithms capable of dealing with non-Gaussian (multi-modal or heavy-tailed) noise on the training samples. Finally, we have used statistics, recursively updated during sequential Bayesian estimation, to derive criteria for growing and pruning synaptic connections and hidden neurons in recurrent neural networks.
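The joint estimation idea above can be illustrated with a minimal sketch: augment the state vector with the unknown synaptic weight and run an extended Kalman filter over it, so that each noisy observation jointly corrects the neuron output estimate and the weight. This is an illustrative toy example, not the authors' exact algorithm; the scalar recurrent neuron, the noise covariances, and all constants below are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = 0.5  # true synaptic weight of the toy neuron (hypothetical value)

def step(h, w, u):
    """One step of a scalar recurrent neuron: h' = tanh(w*h + u)."""
    return np.tanh(w * h + u)

def f(z, u):
    """Augmented-state transition: the weight follows a random-walk model."""
    h, w = z
    return np.array([step(h, w, u), w])

def jac_f(z, u, eps=1e-6):
    """Numerical Jacobian of f with respect to the augmented state z."""
    J = np.zeros((2, 2))
    for j in range(2):
        dz = np.zeros(2)
        dz[j] = eps
        J[:, j] = (f(z + dz, u) - f(z - dz, u)) / (2 * eps)
    return J

H = np.array([[1.0, 0.0]])   # we observe the neuron output h only
Q = np.diag([1e-4, 1e-4])    # process noise: state and weight drift (assumed)
R = np.array([[1e-3]])       # measurement noise covariance (assumed)

z = np.array([0.0, 0.0])     # initial guess: output 0, weight unknown (w = 0)
P = np.eye(2)                # initial augmented-state covariance

h_true = 0.0
for _ in range(300):
    u = rng.uniform(-1.0, 1.0)                 # persistently exciting input
    h_true = step(h_true, w_true, u)           # true system evolves
    y = h_true + rng.normal(0.0, np.sqrt(R[0, 0]))  # noisy training sample

    # EKF predict: propagate augmented state and covariance
    F = jac_f(z, u)
    z = f(z, u)
    P = F @ P @ F.T + Q

    # EKF update: jointly corrects the output estimate and the weight
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    z = z + (K @ (np.array([[y]]) - H @ z.reshape(2, 1))).ravel()
    P = (np.eye(2) - K @ H) @ P

w_est = z[1]  # the filter's weight estimate after sequential training
```

Because the filter corrects the output estimate at every step using the observation, the sketch also shows how teacher forcing generalizes to noisy data: the "taught" output is the filtered estimate rather than the raw noisy sample.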
Todorović, B., Moraga, C., & Stanković, M. (2017). Sequential Bayesian estimation of recurrent neural networks. In Studies in Fuzziness and Soft Computing (Vol. 349, pp. 173–199). Springer Verlag. https://doi.org/10.1007/978-3-319-48317-7_11