This paper investigates the generalization capability of multilayer perceptrons (MLPs). The learning algorithm combines dynamic tunneling with error backpropagation (EBPDT), which enables escape from local minima. The generalization capability is evaluated on three standard datasets, and k-fold cross-validation results are reported for two of them. A comparative study of the proposed method against EBP clearly demonstrates the benefit of tunneling applied in conjunction with EBP-type learning.
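The k-fold cross-validation protocol mentioned above partitions the data into k folds, training on k-1 of them and testing on the held-out fold in turn. A minimal sketch of the index-splitting step is shown below; the function name `k_fold_splits` and its parameters are illustrative assumptions, not code from the paper, and the MLP training itself is omitted.

```python
import random

def k_fold_splits(n_samples, k, seed=0):
    """Yield (train_idx, test_idx) index pairs for k-fold cross validation.

    Hypothetical helper for illustration only; the paper does not
    specify its splitting procedure.
    """
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)  # shuffle once for random folds
    fold_size = n_samples // k
    for i in range(k):
        start = i * fold_size
        # the last fold absorbs any remainder samples
        end = (i + 1) * fold_size if i < k - 1 else n_samples
        test_idx = idx[start:end]
        train_idx = idx[:start] + idx[end:]
        yield train_idx, test_idx

# Example: 5-fold split of 10 samples; each sample appears in exactly
# one test fold, and every train/test pair covers the full dataset.
folds = list(k_fold_splits(10, 5))
```

Each of the k train/test pairs would be used to fit a fresh MLP (with EBPDT or plain EBP) and the k test errors averaged to estimate generalization performance.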
CITATION STYLE
Chowdhury, P. R., & Shukla, K. K. (2002). On generalization and K-fold cross validation performance of MLP trained with EBPDT. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2275, pp. 352–359). Springer Verlag. https://doi.org/10.1007/3-540-45631-7_47