An improved VC dimension bound for sparse polynomials

Abstract

We show that the function class consisting of k-sparse polynomials in n variables has Vapnik-Chervonenkis (VC) dimension at least nk + 1. This result supersedes the previously known lower bound via k-term monotone disjunctive normal form (DNF) formulas obtained by Littlestone (1988). Moreover, it implies that the VC dimension for k-sparse polynomials is strictly larger than the VC dimension for k-term monotone DNF. The new bound is achieved by introducing an exponential approach that employs Gaussian radial basis function (RBF) neural networks to obtain classifications of points in terms of sparse polynomials.
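The lower bound rests on the notion of shattering: a class has VC dimension at least d if some set of d points can be labeled in all 2^d ways by members of the class. The sketch below is a hypothetical illustration of this definition, not the paper's RBF-based construction: it brute-force checks that a toy family of 1-sparse univariate polynomials (sign of c·x^d, over a small grid of coefficients c and degrees d) shatters nk + 1 = 2 points for n = k = 1.

```python
def is_shattered(points, hypotheses):
    """Return True if the boolean-valued functions in `hypotheses`
    realize every one of the 2**len(points) dichotomies of `points`."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

# Toy class: sign classifications x -> (c * x**d >= 0) induced by
# 1-sparse univariate polynomials, enumerated over a small grid.
# (Illustrative only; the paper treats k-sparse polynomials in n
# variables with a far more involved construction.)
hypotheses = [
    (lambda x, c=c, d=d: c * x**d >= 0)
    for c in (-2, -1, 1, 2)
    for d in range(4)
]

# Two points of opposite sign are shattered: the coefficient sign and
# the degree parity together realize all four labelings.
print(is_shattered([-1.0, 2.0], hypotheses))
```

Note that adding a second positive point defeats this toy class, since a monomial assigns every positive input the same sign: `is_shattered([-1.0, 1.0, 2.0], hypotheses)` is False.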

Citation (APA)

Schmitt, M. (2004). An improved VC dimension bound for sparse polynomials. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3120, pp. 393–407). Springer Verlag. https://doi.org/10.1007/978-3-540-27819-1_27
