We address the problem of architecture selection for an RBF network designed for classification. Given a training set, the RBF network produces an estimate of the Probability Density Function (PDF) as a mixture of l uncorrelated Gaussian functions, where l is the number of hidden neurons. Using uncorrelated Gaussians alleviates the heavy computational burden of estimating the full covariance matrix. However, the simplicity of such building blocks is paid for by the relatively large number of units needed to approximate the density of correlated data. We define two scalar parameters to describe the complexity of the data to be modelled and study the relationship between the complexity of the data and the complexity of the best approximating network.
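As a minimal sketch of the density model described above: the network output is a weighted mixture of l axis-aligned (diagonal-covariance) Gaussian components, so the density factorises over dimensions and no full covariance matrix needs to be estimated or inverted. The function and parameter names below are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_density(x, centers, widths, weights):
    """Evaluate a mixture of uncorrelated (axis-aligned) Gaussians at x.

    x:       (d,) query point
    centers: (l, d) hidden-unit centres
    widths:  (l, d) per-dimension standard deviations
    weights: (l,) mixing coefficients, assumed to sum to 1
    """
    d = centers.shape[1]
    # With a diagonal covariance, each component's normalisation is just
    # the product of the per-dimension widths -- no matrix inversion.
    diff = (x - centers) / widths                          # (l, d)
    norm = (2 * np.pi) ** (d / 2) * np.prod(widths, axis=1)
    comp = np.exp(-0.5 * np.sum(diff ** 2, axis=1)) / norm  # (l,) components
    return float(np.dot(weights, comp))                     # mixture density
```

For correlated data, each such axis-aligned component covers an elongated cloud poorly, which is why the number of units l must grow; quantifying that growth is the subject of the paper.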
Sardo, L., & Kittler, J. (1996). Asymptotic complexity of an RBF NN for correlated data representation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1112 LNCS, pp. 71–76). Springer Verlag. https://doi.org/10.1007/3-540-61510-5_16