Techniques from differential topology are used to give polynomial bounds on the VC-dimension of sigmoidal neural networks. The bounds are quadratic in ω, the dimension of the space of weights. Similar results are obtained for a wide class of Pfaffian activation functions. The obstruction (in differential topology) to improving the bound to an optimal bound O(ω log ω) is discussed, and attention is paid to the role of other parameters involved in the network architecture.
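The two growth rates contrasted in the abstract can be summarized as follows (a sketch in the abstract's own notation; the hidden constants, and any dependence on architecture parameters other than ω, are left implicit):

```latex
\mathrm{VCdim}(\mathcal{N}) = O(\omega^{2}) \quad \text{(proved, sigmoidal and Pfaffian activations)}
\qquad\text{vs.}\qquad
O(\omega \log \omega) \quad \text{(conjectured optimal)}.
```

Here 𝒩 stands for the network's hypothesis class; the symbol is introduced only for this display and does not appear in the original abstract.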
CITATION STYLE
Karpinski, M., & Macintyre, A. (1995). Bounding VC-dimension for neural networks: Progress and prospects. In Lecture Notes in Computer Science (Vol. 904, pp. 337–341). Springer Verlag. https://doi.org/10.1007/3-540-59119-2_189