Artificial neural networks can construct complex decision boundaries and, in recent years, have been widely used in practical applications ranging from business to medical diagnosis and technical problems. A large number of error functions have been proposed in the literature to achieve better predictive power. However, only a few works employ Tsallis statistics, which has been applied successfully in other fields. This paper examines a q-generalized error function based on Tsallis statistics as an alternative error measure in neural networks. The results indicate that the Tsallis entropy error function can be applied successfully in neural networks, yielding satisfactory results.
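The abstract does not spell out the loss itself, but a common q-generalization replaces the natural logarithm in cross-entropy with the Tsallis q-logarithm, ln_q(x) = (x^(1-q) - 1)/(1-q), which recovers ln(x) as q → 1. The sketch below is illustrative only (the function names and the choice of q are assumptions, not taken from the paper):

```python
import math

def q_log(x, q):
    """Tsallis q-logarithm: (x**(1-q) - 1)/(1-q); tends to ln(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_cross_entropy(y_true, y_pred, q=1.5, eps=1e-12):
    """Illustrative q-generalized cross-entropy for one sample:
    -sum_k y_k * ln_q(p_k), clipping predictions away from zero."""
    return -sum(t * q_log(max(p, eps), q) for t, p in zip(y_true, y_pred))

# At q = 1 this reduces to the ordinary cross-entropy of the prediction.
loss = q_cross_entropy([1.0, 0.0, 0.0], [0.7, 0.2, 0.1], q=1.0)
```

Here q acts as a tunable parameter: values of q different from 1 reweight how heavily confident mistakes are penalized relative to the standard logarithmic loss.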
CITATION STYLE
Gajowniczek, K., Chmielewski, L. J., Orłowski, A., & Ząbkowski, T. (2017). Generalized entropy cost function in neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10614 LNCS, pp. 128–136). Springer Verlag. https://doi.org/10.1007/978-3-319-68612-7_15