Differential and algebraic geometry of multilayer perceptrons

ISSN: 0916-8508

Abstract

Information geometry is applied to the manifold of neural networks known as multilayer perceptrons. It is important to study the total family of networks as a geometric manifold, because learning is represented by a trajectory in this space. The manifold of perceptrons has a rich differential-geometric structure, characterized by a Riemannian metric and by singularities. An efficient learning method is proposed based on this structure. The parameter space of perceptrons contains many algebraic singularities, which affect the trajectories of learning. Such singularities are studied using simple models. This poses an interesting problem of statistical inference and learning in hierarchical models with singularities.
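The learning method the abstract alludes to preconditions the ordinary gradient by the inverse of the Riemannian (Fisher information) metric, which is known as natural gradient descent in Amari's work. A minimal sketch for a single tanh neuron follows; the data set, noise level, learning rate, and damping constant are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data for a single tanh neuron y ~ tanh(w . x).
# Data, noise level, and damping constant are illustrative assumptions.
X = rng.normal(size=(200, 2))
w_true = np.array([1.5, -0.8])
y = np.tanh(X @ w_true) + 0.1 * rng.normal(size=200)


def per_example_grads(w):
    """Per-example gradients of the squared error for the tanh neuron."""
    u = X @ w
    err = np.tanh(u) - y               # residuals
    dtanh = 1.0 - np.tanh(u) ** 2      # tanh'(u)
    return (err * dtanh)[:, None] * X  # shape (n_samples, 2)


w = np.zeros(2)
for _ in range(200):
    g = per_example_grads(w)
    grad = g.mean(axis=0)
    # Empirical Fisher metric G = E[g g^T]; the small damping term keeps
    # G invertible near singular regions of the parameter space, where
    # the metric degenerates.
    G = g.T @ g / len(X) + 1e-4 * np.eye(2)
    w -= 0.1 * np.linalg.solve(G, grad)  # natural-gradient step
```

The damping term is one simple way to handle the singularities the abstract mentions: where the Fisher metric becomes degenerate, the undamped natural gradient is ill-defined, and learning trajectories are attracted to or slowed near such regions.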

Amari, S. I., & Ozeki, T. (2001). Differential and algebraic geometry of multilayer perceptrons. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, E84-A(1), 31–38.
