On virtually binary nature of probabilistic neural networks

Abstract

A sequential design of multilayer probabilistic neural networks is considered in the framework of statistical decision-making. Parameters and interconnection structure are optimized layer by layer by estimating unknown probability distributions on the input space in the form of finite distribution mixtures. The components of the mixtures correspond to neurons which perform an information-preserving transform between consecutive layers. Simultaneously, the entropy of the transformed distribution is minimized. It is argued that in multidimensional spaces, and particularly at higher levels of multilayer feed-forward neural networks, the output variables of probabilistic neurons tend to be binary. It is shown that the information loss caused by the binary approximation of neurons can be suppressed by increasing the approximation accuracy.
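The "virtually binary" tendency described above can be illustrated numerically. The following sketch (not the authors' code; the two-component Gaussian mixture, the per-dimension separation, and all parameter values are illustrative assumptions) computes the posterior component probabilities — the outputs of probabilistic neurons — for samples drawn from one mixture component, and shows that as dimensionality grows the posteriors saturate toward 0 or 1:

```python
# Illustrative sketch: posterior outputs of mixture-component ("probabilistic")
# neurons saturate toward 0/1 as the input dimension grows.
import numpy as np

def mean_max_posterior(d, n=1000, sep=1.0, seed=0):
    """Average of max(p, 1-p) over samples, where p = posterior probability
    of component 0 in a two-component isotropic unit-variance Gaussian
    mixture in d dimensions (equal priors, means separated by `sep` per
    dimension). Samples are drawn from component 0."""
    rng = np.random.default_rng(seed)
    m0 = np.zeros(d)
    m1 = np.full(d, sep)
    x = m0 + rng.standard_normal((n, d))      # samples from component 0
    # Log-likelihoods under unit-variance Gaussians (constants cancel).
    ll0 = -0.5 * np.sum((x - m0) ** 2, axis=1)
    ll1 = -0.5 * np.sum((x - m1) ** 2, axis=1)
    post0 = 1.0 / (1.0 + np.exp(ll1 - ll0))   # posterior of component 0
    return float(np.mean(np.maximum(post0, 1.0 - post0)))

low = mean_max_posterior(d=2)     # low-dimensional: posteriors often ambiguous
high = mean_max_posterior(d=50)   # high-dimensional: posteriors nearly binary
```

In low dimensions many samples receive genuinely graded posteriors, while in 50 dimensions almost every sample is assigned to one component with probability near 1 — the sense in which the neuron outputs become effectively binary.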

Citation (APA)

Grim, J., & Pudil, P. (1998). On virtually binary nature of probabilistic neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1451, pp. 765–774). Springer Verlag. https://doi.org/10.1007/bfb0033301
