Capacity of structured multilayer networks with shared weights

Abstract

The capacity, or Vapnik-Chervonenkis dimension, of a feedforward neural architecture is the maximum number of input patterns that can be mapped correctly to arbitrary fixed outputs. It is known that the upper bound on the capacity of two-layer feedforward architectures with independent weights depends on the number of connections in the architecture [1]. In this paper we focus on the capacity of multilayer feedforward networks structured by shared weights. We show that these structured architectures can be transformed into equivalent conventional multilayer feedforward architectures. Known capacity estimates are extended to obtain upper bounds for the capacity of these general multilayer feedforward architectures. As a result, an upper bound for the capacity of structured architectures is derived that grows with the number of independent network parameters. This means that weight sharing in a fixed neural architecture leads to a significant reduction of the upper bound on the capacity.
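
As a rough illustration of why counting independent parameters rather than connections matters, the sketch below evaluates a capacity bound of the classical form 2·W·log2(e·N) for feedforward threshold networks (assumed here to match the type of result cited as [1]) on a small one-dimensional shared-weight, convolution-like layer: once counting every connection and once counting only the shared kernel weights. The layer sizes and all names are illustrative assumptions, not taken from the paper.

```python
import math

def capacity_upper_bound(num_params: int, num_units: int) -> float:
    """Upper bound of the form 2 * W * log2(e * N) on the VC dimension
    of a feedforward threshold network, where W is the number of
    independent weights and N the number of computational units.
    (Assumed to be the kind of bound the abstract refers to as [1].)
    """
    return 2 * num_params * math.log2(math.e * num_units)

# Hypothetical 1-D shared-weight layer: one hidden unit per window
# position, all units sharing the same kernel. Sizes are illustrative.
input_size = 32
kernel_size = 5
num_hidden = input_size - kernel_size + 1   # one unit per window position

connections = num_hidden * kernel_size      # every connection counted once
shared_params = kernel_size                 # only the shared kernel weights

print("bound counting all connections:    ",
      capacity_upper_bound(connections, num_hidden))
print("bound counting independent params: ",
      capacity_upper_bound(shared_params, num_hidden))
```

With these illustrative sizes the bound drops by more than an order of magnitude when only the shared parameters are counted, which mirrors the abstract's conclusion that weight sharing significantly reduces the capacity upper bound of a fixed architecture.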

Citation (APA)

Kröner, S., & Moratz, R. (1996). Capacity of structured multilayer networks with shared weights. In Lecture Notes in Computer Science (Vol. 1112, pp. 544–550). Springer. https://doi.org/10.1007/3-540-61510-5_93
