Dynamical properties of a perceptron learning process: Structural stability under numerics and shadowing


Abstract

In this paper, two aspects of numerical dynamics are applied to the analysis of artificial neural networks (ANNs). It is shown that topological conjugacy of gradient dynamical systems, together with the shadowing and inverse shadowing properties, has nontrivial implications for the analysis of a perceptron learning process. The main result is that, generically, any such process is stable under numerics and robust. Implementation aspects are discussed as well. The analysis is based on a theorem on the global topological conjugacy of cascades generated by a gradient flow on a compact manifold without boundary. © 2011 Springer Science+Business Media, LLC.
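The setting the abstract describes can be made concrete: perceptron training by gradient descent is the Euler discretization of the gradient flow w' = -∇E(w), and shadowing asks whether such a numerically computed pseudo-orbit stays close to a true orbit of the underlying system. The sketch below is illustrative only, assuming a toy sigmoid perceptron with a squared-error loss; the data, loss, and step size h are not taken from the paper.

```python
import numpy as np

def loss(w, X, y):
    """Mean squared error of a single-layer sigmoid perceptron."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return np.mean((p - y) ** 2)

def grad(w, X, y):
    """Gradient of the loss E(w); defines the gradient flow w' = -grad E(w)."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return (2.0 / len(y)) * X.T @ ((p - y) * p * (1.0 - p))

def train(w0, X, y, h=0.2, steps=300):
    """Euler discretization of the gradient flow: w_{k+1} = w_k - h * grad E(w_k).

    The resulting sequence of iterates is a pseudo-orbit of the time-h map of
    the flow; the shadowing property concerns whether a true orbit stays near it.
    """
    w = w0.copy()
    for _ in range(steps):
        w = w - h * grad(w, X, y)
    return w

# Illustrative data: linearly separable labels from a hypothetical weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)

w0 = np.zeros(3)
w_final = train(w0, X, y)
```

Each iterate here is one step of the numerical cascade discussed in the abstract; stability under numerics means the qualitative behavior of this discretized process matches that of the continuous gradient flow.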

Citation (APA)
Bielecki, A., & Ombach, J. (2011). Dynamical properties of a perceptron learning process: Structural stability under numerics and shadowing. Journal of Nonlinear Science, 21(4), 579–593. https://doi.org/10.1007/s00332-011-9094-1
