Radial basis function neural networks have superlinear VC dimension

Abstract

We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension Ω(W log k), where W is the number of parameters and k the number of nodes. This significantly improves the previously known linear bound. We also derive superlinear lower bounds for networks of discrete and continuous variants of center-surround neurons. The constants in all bounds are larger than those obtained thus far for sigmoidal neural networks with constant depth. The results have several implications with regard to the computational power and learning capabilities of neural networks with local receptive fields. In particular, they imply that the pseudo-dimension and the fat-shattering dimension of these networks are superlinear as well, and they yield lower bounds even when the input dimension is fixed. The methods developed here appear suitable for obtaining similar results for other kernel-based function classes.
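
For orientation, a standard RBF network of the kind studied here consists of one hidden layer of k Gaussian units with local receptive fields feeding a linear output unit. The sketch below is illustrative only (it is not taken from the paper; the function and parameter names are chosen here) and shows how the node count k relates to the parameter count W when centers, widths, output weights, and the bias are all adjustable.

```python
import numpy as np

def rbf_network(x, centers, widths, weights, bias):
    """Standard RBF network with one hidden layer of k Gaussian units.

    x:       input vector of dimension d
    centers: (k, d) array of receptive-field centers
    widths:  (k,) array of positive width parameters
    weights: (k,) array of output weights
    bias:    scalar output bias

    With all of these adjustable, the parameter count in this sketch is
    W = k*d + k + k + 1 = k*(d + 2) + 1.
    """
    # Gaussian activation of each hidden unit (local receptive field)
    sq_dists = np.sum((centers - x) ** 2, axis=1)
    hidden = np.exp(-sq_dists / (2.0 * widths ** 2))
    # Linear output layer
    return weights @ hidden + bias

# Example: d = 2 inputs, k = 3 hidden units
rng = np.random.default_rng(0)
y = rbf_network(rng.standard_normal(2),
                centers=rng.standard_normal((3, 2)),
                widths=np.ones(3),
                weights=rng.standard_normal(3),
                bias=0.0)
```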

Citation (APA)

Schmitt, M. (2001). Radial basis function neural networks have superlinear VC dimension. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2111, pp. 14–30). Springer Verlag. https://doi.org/10.1007/3-540-44581-1_2
