Combining data reduction and parameter selection for improving RBF-DDA performance

Abstract

The Dynamic Decay Adjustment (DDA) algorithm is a fast constructive algorithm for training RBF neural networks for classification tasks. In a previous work, we proposed a method for improving the generalization performance of RBF-DDA by adequately selecting the value of one of its training parameters (θ⁻). Unfortunately, that method generates much larger networks than RBF-DDA with default parameters. This paper proposes a method for improving RBF-DDA generalization performance by combining a data reduction technique with the parameter selection technique. The proposed method has been evaluated on four classification tasks from the UCI repository, including three optical character recognition datasets. The results show that the proposed method considerably improves the performance of RBF-DDA without producing larger networks. The results are compared to MLP and k-NN results obtained in previous works, and the method proposed in this paper is shown to outperform MLPs and to obtain results comparable to k-NN on these tasks.
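For context, a minimal sketch of the standard RBF-DDA training rule is given below (in Python/NumPy); it shows where the θ⁻ threshold enters the algorithm. The function names, data layout, and the default values 0.4 / 0.1 are illustrative assumptions, and the paper's data reduction step and its θ⁻-selection procedure are not reproduced here.

```python
import numpy as np

def rbf_activation(x, center, sigma):
    """Gaussian RBF response of one prototype at input x."""
    return np.exp(-np.sum((x - center) ** 2) / sigma ** 2)

def dda_train_epoch(X, y, prototypes, theta_plus=0.4, theta_minus=0.1):
    """One pass of the DDA constructive rule over (X, y).

    prototypes: list of dicts {'center', 'sigma', 'weight', 'label'}.
    theta_plus / theta_minus: the two DDA thresholds; theta_minus is the
    parameter whose value the paper selects instead of using the default.
    (The full algorithm also resets prototype weights at the start of each epoch.)
    """
    for x, label in zip(X, y):
        x = np.asarray(x, dtype=float)
        # 1. Commit: if a same-class prototype already covers x with
        #    activation >= theta_plus, just increase its weight.
        covered = False
        for p in prototypes:
            if p['label'] == label and rbf_activation(x, p['center'], p['sigma']) >= theta_plus:
                p['weight'] += 1.0
                covered = True
                break
        # 2. Otherwise introduce a new prototype centred on x, shrunk so it
        #    responds below theta_minus at conflicting-class centres.
        if not covered:
            sigma_new = np.inf
            for q in prototypes:
                if q['label'] != label:
                    d = np.linalg.norm(q['center'] - x)
                    if d > 0:
                        sigma_new = min(sigma_new, d / np.sqrt(-np.log(theta_minus)))
            prototypes.append({'center': x, 'sigma': sigma_new,
                               'weight': 1.0, 'label': label})
        # 3. Shrink: every conflicting-class prototype must respond to x
        #    with activation < theta_minus.
        for p in prototypes:
            if p['label'] != label:
                d = np.linalg.norm(p['center'] - x)
                if d > 0:
                    p['sigma'] = min(p['sigma'], d / np.sqrt(-np.log(theta_minus)))
    return prototypes

def dda_classify(x, prototypes):
    """Winner-takes-all over the class-wise sums of weighted activations."""
    scores = {}
    for p in prototypes:
        scores[p['label']] = scores.get(p['label'], 0.0) + \
            p['weight'] * rbf_activation(np.asarray(x, dtype=float), p['center'], p['sigma'])
    return max(scores, key=scores.get)
```

In this sketch, smaller values of θ⁻ force conflicting prototypes to shrink further, which tends to increase the number of committed prototypes; this is consistent with the abstract's remark that θ⁻ selection alone produces much larger networks, and motivates combining it with data reduction.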

Citation (APA)

Oliveira, A. L. I., Melo, B. J. M., Neto, F. B. L., & Meira, S. R. L. (2004). Combining data reduction and parameter selection for improving RBF-DDA performance. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3315, pp. 778–787). Springer Verlag. https://doi.org/10.1007/978-3-540-30498-2_78
