The model complexity of networks representing multivariable functions is studied in terms of variational norms tailored to the types of network units. It is shown that the size of the variational norm reflects both the number of hidden units and the sizes of output weights. Lower bounds on the growth of variational norms with increasing input dimension d are derived for Gaussian units and perceptrons. It is proven that the variation of the d-dimensional parity function with respect to Gaussian Support Vector Machines grows exponentially with d, and that for large values of d, almost any randomly chosen Boolean function has variation with respect to perceptrons that grows exponentially with d. © 2014 Springer International Publishing Switzerland.
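The exponential growth of the parity function's variation can be observed numerically. The sketch below (not from the paper) fits a Gaussian-kernel SVM to the d-dimensional parity on the Boolean cube {-1, 1}^d and tracks the l1 norm of the dual coefficients, a rough proxy for the total size of the output weights in the kernel expansion; the kernel width gamma and penalty C are arbitrary assumptions, and scikit-learn is assumed available.

```python
# Minimal illustration: Gaussian-kernel SVM on d-dimensional parity.
# The l1 norm of the dual coefficients serves as a proxy for the
# variational norm with respect to Gaussian units; it is expected to
# grow rapidly with d. Hyperparameters (gamma, C) are arbitrary choices.
import itertools
import numpy as np
from sklearn.svm import SVC

for d in range(2, 9):
    # All 2^d points of the Boolean cube {-1, 1}^d.
    X = np.array(list(itertools.product([-1.0, 1.0], repeat=d)))
    # d-dimensional parity: the product of the coordinates.
    y = np.prod(X, axis=1)
    # Large C approximates a hard-margin SVM with a Gaussian kernel.
    clf = SVC(kernel="rbf", gamma=1.0, C=1e6).fit(X, y)
    # dual_coef_ holds y_i * alpha_i for the support vectors; its l1
    # norm reflects the total size of the output weights.
    l1 = np.abs(clf.dual_coef_).sum()
    print(f"d={d}: support vectors={clf.n_support_.sum()}, sum|alpha|={l1:.1f}")
```

Under these assumptions, every point of the cube typically becomes a support vector and the printed l1 norm increases sharply with d, consistent with the exponential lower bound stated in the abstract.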
CITATION STYLE
Kůrková, V., & Sanguineti, M. (2014). Complexity of shallow networks representing functions with large variations. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 331–338). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_42