Generalized entropy cost function in neural networks


Abstract

Artificial neural networks are capable of constructing complex decision boundaries, and in recent years they have been widely used in practical applications ranging from business and medical diagnosis to technical problems. A large number of error functions have been proposed in the literature to achieve better predictive power. However, only a few works employ Tsallis statistics, which has been applied successfully in other fields. This paper examines the q-generalized error function based on Tsallis statistics as an alternative error measure in neural networks. The results indicate that the Tsallis entropy error function can be applied successfully in neural networks, yielding satisfactory results.
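The abstract does not spell out the exact form of the q-generalized error function. As a rough sketch: Tsallis statistics generalizes the natural logarithm to the q-logarithm, ln_q(x) = (x^(1-q) - 1)/(1 - q), which recovers ln(x) as q approaches 1. The Python snippet below illustrates one plausible construction, binary cross-entropy with ln replaced by ln_q; it is an assumption-laden illustration, not necessarily the paper's formulation, and the function names and the value of q are hypothetical.

```python
import numpy as np

def tsallis_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x**(1 - q) - 1) / (1 - q).
    Recovers the natural logarithm in the limit q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_cross_entropy(y_true, y_pred, q=1.5, eps=1e-12):
    """Binary cross-entropy with ln replaced by the Tsallis q-logarithm
    (a hypothetical stand-in for the paper's q-generalized error function)."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid 0**negative / log(0)
    return -np.mean(y_true * tsallis_log(y_pred, q)
                    + (1.0 - y_true) * tsallis_log(1.0 - y_pred, q))

# Example: at q = 1 this reduces to the standard log-loss.
y_true = np.array([1.0, 0.0, 1.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7, 0.6])
print(q_cross_entropy(y_true, y_pred, q=1.5))
print(q_cross_entropy(y_true, y_pred, q=1.0))
```

Varying q away from 1 changes how strongly confident misclassifications are penalized relative to the ordinary cross-entropy, which is one motivation for treating q as a tunable hyperparameter.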

Citation (APA)

Gajowniczek, K., Chmielewski, L. J., Orłowski, A., & Ząbkowski, T. (2017). Generalized entropy cost function in neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10614 LNCS, pp. 128–136). Springer Verlag. https://doi.org/10.1007/978-3-319-68612-7_15
