In this paper, two aspects of numerical dynamics are applied to the analysis of artificial neural networks (ANNs). It is shown that the topological conjugacy of gradient dynamical systems, together with the shadowing and inverse shadowing properties, has nontrivial implications for the analysis of a perceptron learning process. The main result is that, generically, any such learning process is stable under numerical discretization and robust. Implementation aspects are discussed as well. The analysis is based on a theorem on the global topological conjugacy of cascades generated by a gradient flow on a compact manifold without boundary. © 2011 Springer Science+Business Media, LLC.
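To make the discretization concrete, the following is a minimal sketch (not taken from the paper) of a perceptron gradient learning step viewed as the Euler cascade of the gradient flow w' = -∇E(w); the squared-error loss, sigmoid activation, data, and step size h are illustrative assumptions, not the authors' construction.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's setup): a perceptron's gradient
# learning step seen as the Euler discretization of the gradient flow
#   w'(t) = -grad E(w(t)).

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_grad(w, X, y):
    """Gradient of the squared-error loss E(w) = 0.5 * sum (sigmoid(Xw) - y)^2."""
    p = sigmoid(X @ w)
    return X.T @ ((p - y) * p * (1.0 - p))

def euler_learning_step(w, X, y, h=0.1):
    """One step of the cascade w_{n+1} = w_n - h * grad E(w_n),
    i.e. the numerical approximation of the gradient flow."""
    return w - h * loss_grad(w, X, y)

# Usage: iterating the cascade produces the discrete learning trajectory whose
# relation to exact flow trajectories (shadowing, conjugacy) the paper studies.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
w = np.zeros(3)
for _ in range(200):
    w = euler_learning_step(w, X, y)
```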
CITATION STYLE
Bielecki, A., & Ombach, J. (2011). Dynamical properties of a perceptron learning process: Structural stability under numerics and shadowing. Journal of Nonlinear Science, 21(4), 579–593. https://doi.org/10.1007/s00332-011-9094-1