We investigate the functioning of a classifying biological neural network from the perspective of statistical learning theory, modelled, in a simplified setting, as a continuous-time stochastic recurrent neural network (RNN) with the identity activation function. In the purely stochastic (robust) regime, we give a generalisation error bound that holds with high probability, thus showing that the empirical risk minimiser is the best-in-class hypothesis. We show that RNNs retain a partial signature of the paths they are fed as the sole information exploited for training and classification tasks. We argue that these RNNs are easy to train and robust, and we support these observations with numerical experiments on both synthetic and real data. We also show a trade-off phenomenon between accuracy and robustness.
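To make the setting described above concrete, the sketch below discretises a linear (identity activation) continuous-time stochastic RNN driven by an input path, with additive noise and a linear readout used for binary classification. This is a rough illustration only: the state equation, the noise model, the dimensions, and all variable names here are assumptions for the sake of the example, not the model or code used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and step size (not taken from the paper)
d_in, d_hid, n_steps, dt = 2, 16, 100, 0.01
sigma = 0.1  # noise level of the "purely stochastic" regime

# Random (untrained) weights of the linear RNN
W = rng.normal(scale=1.0 / np.sqrt(d_hid), size=(d_hid, d_hid))  # recurrent map
U = rng.normal(size=(d_hid, d_in))                               # input map
w_out = rng.normal(size=d_hid)                                   # linear readout

def hidden_state(path_increments):
    """Euler-Maruyama discretisation of an assumed linear stochastic RNN
    driven by a path x:  dh_t = W h_t dt + U dx_t + sigma dB_t."""
    h = np.zeros(d_hid)
    for dx in path_increments:
        noise = sigma * np.sqrt(dt) * rng.normal(size=d_hid)
        h = h + dt * (W @ h) + U @ dx + noise
        # identity activation: no nonlinearity is applied to h
    return h

def classify(path_increments):
    """Binary label from the sign of a linear readout of the final state."""
    return int(w_out @ hidden_state(path_increments) > 0)

# Example: a toy 2-dimensional path represented by its increments
increments = rng.normal(scale=np.sqrt(dt), size=(n_steps, d_in))
print(classify(increments))
```

In practice the weights would be fitted by empirical risk minimisation on labelled paths; the random weights above merely show how a path is mapped to a hidden state and then to a class label.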
Boutaib, Y., Bartolomaeus, W., Nestler, S., & Rauhut, H. (2022). Path classification by stochastic linear recurrent neural networks. Advances in Continuous and Discrete Models, 2022(1). https://doi.org/10.1186/s13662-022-03686-9