On generalization and K-fold cross validation performance of MLP trained with EBPDT

Abstract

This paper examines the generalization capability of multilayer perceptrons (MLPs). The learning algorithm combines dynamic tunneling with error backpropagation (EBPDT), which enables the network to escape local minima during training. Generalization capability is evaluated on three standard datasets, and k-fold cross validation results are reported for two of them. A comparative study of the proposed method against standard EBP demonstrates the benefit of tunneling applied in conjunction with EBP-type learning.
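As an illustration of the k-fold cross validation protocol mentioned in the abstract, the following is a minimal sketch using scikit-learn's MLPClassifier as a stand-in network. It uses ordinary gradient-based backpropagation, not the paper's EBPDT algorithm, and the dataset, fold count, and hyperparameters are illustrative assumptions rather than the paper's experimental setup.

```python
# Minimal sketch of k-fold cross validation for an MLP classifier.
# NOTE: MLPClassifier uses standard gradient-based training, not the paper's
# EBPDT (backpropagation + dynamic tunneling); dataset and hyperparameters
# below are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)               # a standard benchmark dataset
kf = KFold(n_splits=5, shuffle=True, random_state=0)

fold_accuracies = []
for train_idx, test_idx in kf.split(X):
    # Fit scaling on the training fold only, to avoid leaking test statistics.
    scaler = StandardScaler().fit(X[train_idx])
    X_train = scaler.transform(X[train_idx])
    X_test = scaler.transform(X[test_idx])

    mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    mlp.fit(X_train, y[train_idx])
    fold_accuracies.append(mlp.score(X_test, y[test_idx]))

print("per-fold accuracy:", np.round(fold_accuracies, 3))
print("mean accuracy:", np.mean(fold_accuracies))
```

The per-fold scores and their mean give the generalization estimate that this kind of study compares across training algorithms.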

Citation (APA)

Chowdhury, P. R., & Shukla, K. K. (2002). On generalization and K-fold cross validation performance of MLP trained with EBPDT. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2275, pp. 352–359). Springer Verlag. https://doi.org/10.1007/3-540-45631-7_47
