Complexity of shallow networks representing functions with large variations

Abstract

Model complexities of networks representing multivariable functions are studied in terms of variational norms tailored to the types of network units. It is shown that the size of the variational norm reflects both the number of hidden units and the sizes of output weights. Lower bounds on the growth of variational norms with increasing input dimension d are derived for Gaussian units and perceptrons. It is proven that the variation of the d-dimensional parity with respect to Gaussian support vector machines grows exponentially with d, and that for large values of d, almost every randomly chosen Boolean function has variation with respect to perceptrons that depends exponentially on d. © 2014 Springer International Publishing Switzerland.
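One intuition behind parity's large variation with respect to localized units such as Gaussians is its global character: over {-1, 1}^d, the d-dimensional parity has all of its Walsh–Fourier mass concentrated on the single degree-d character, so no low-degree or localized combination can capture it cheaply. A minimal pure-Python sketch (not taken from the paper; the function names are illustrative) checks this concentration for small d:

```python
from itertools import product

def parity(x):
    # d-dimensional parity on {-1, 1}^d: the product of all coordinates
    p = 1
    for xi in x:
        p *= xi
    return p

def walsh_coefficient(f, S, d):
    # Walsh-Fourier coefficient f_hat(S) = E_x[f(x) * chi_S(x)],
    # where chi_S(x) = prod_{i in S} x_i and x is uniform on {-1, 1}^d
    total = 0
    for x in product((-1, 1), repeat=d):
        chi = 1
        for i in S:
            chi *= x[i]
        total += f(x) * chi
    return total / 2 ** d

d = 4
full = tuple(range(d))
print(walsh_coefficient(parity, full, d))   # coefficient on the top character
print(walsh_coefficient(parity, (0, 1), d)) # any proper subset gives 0
```

The top coefficient equals 1 while every other coefficient vanishes, which is the spectral counterpart of the abstract's claim: representing parity by units with localized (Gaussian) responses forces costs that grow exponentially with d.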


Kůrková, V., & Sanguineti, M. (2014). Complexity of shallow networks representing functions with large variations. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 331–338). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_42
