Batch-sequential algorithm for neural networks trained with entropic criteria

Abstract

Using entropy as a cost function in the neural network learning phase usually implies that training with the back-propagation algorithm is done in batch mode. Apart from the higher complexity of the algorithm in batch mode, this approach is known to have some limitations compared with the sequential mode. In this paper we present a way of combining both modes when using entropic criteria. We present experiments that validate the proposed method and compare it with the pure batch-mode algorithm. © Springer-Verlag Berlin Heidelberg 2005.
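The core idea — splitting the training set into subsets and applying batch-mode entropy updates to each subset in turn — can be sketched as follows. This is a minimal illustration for a single linear unit trained by gradient ascent on Renyi's quadratic information potential (maximizing it minimizes the error entropy); the subset count, learning rate, kernel width, and the use of a linear unit instead of an MLP are all hypothetical choices, not the paper's exact setup.

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Renyi quadratic information potential: the mean Gaussian kernel
    over all pairwise error differences.  Maximizing it is equivalent
    to minimizing the quadratic Renyi entropy of the errors."""
    diffs = errors[:, None] - errors[None, :]
    return np.mean(np.exp(-diffs**2 / (4.0 * sigma**2)))

def batch_sequential_train(X, y, n_subsets=4, epochs=100, lr=1.0,
                           sigma=1.0, seed=0):
    """Batch-sequential sketch: partition the training set into subsets
    and, for each epoch, run one batch-mode entropy update per subset.
    All hyperparameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    subsets = np.array_split(rng.permutation(len(X)), n_subsets)
    for _ in range(epochs):
        for sub in subsets:                 # sequential pass over subsets,
            Xs, ys = X[sub], y[sub]         # each updated in batch mode
            e = ys - Xs @ w
            d = e[:, None] - e[None, :]
            # k_ij = G'(d_ij) terms: d/(2*sigma^2) * exp(-d^2/(4*sigma^2))
            k = np.exp(-d**2 / (4.0 * sigma**2)) * d / (2.0 * sigma**2)
            n = len(sub)
            # dV/dw = (1/n^2) * sum_ij k_ij * (x_i - x_j)
            grad = (k.sum(axis=1) @ Xs - k.sum(axis=0) @ Xs) / n**2
            w += lr * grad                  # ascend V = descend entropy
    return w
```

Because the entropy estimator involves all pairwise error differences within a batch, its cost grows quadratically with batch size; updating on smaller subsets keeps each batch update cheap while retaining some of the frequent-update character of sequential training.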

Citation (APA)

Santos, J. M., De Sá, J. M., & Alexandre, L. A. (2005). Batch-sequential algorithm for neural networks trained with entropic criteria. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3697 LNCS, pp. 91–96). https://doi.org/10.1007/11550907_15
