Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution

Abstract

This article proposes a procedure for automatically determining the elements of the covariance matrix of the Gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, are computed by analyzing the local environment of each training pattern; their combination forms the covariance matrix of that pattern. This automation has two advantages: first, it frees the neural network designer from specifying the complete covariance matrix, and second, it yields a network with better generalization ability than the original model. On a variation of the well-known two-spiral problem and on real-world examples from the UCI Machine Learning Repository, the method achieves a classification rate better than that of the original probabilistic neural network and can also outperform other well-known classification techniques.
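
To make the described construction concrete, the following is a minimal sketch in Python/NumPy of a probabilistic neural network whose Gaussian kernels carry a per-pattern covariance matrix assembled from a rotation matrix and a matrix of variances. Estimating these from the k nearest neighbours via a local eigendecomposition (eigenvectors as the rotation, eigenvalues as the variances) is an assumption standing in for the paper's exact local-environment analysis, and the function names local_covariances and pnn_predict are hypothetical.

```python
import numpy as np

def local_covariances(X, k=5, reg=1e-3):
    """Estimate a full covariance matrix for each training pattern from its
    k nearest neighbours: the local eigenvectors act as the rotation matrix
    and the (regularised) eigenvalues as the per-direction variances."""
    n, d = X.shape
    covs = []
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dists)[1:k + 1]      # k nearest neighbours, excluding the pattern itself
        local = X[idx] - X[i]
        C = local.T @ local / k               # local scatter around the pattern
        vals, vecs = np.linalg.eigh(C)        # rotation (vecs) and variances (vals)
        vals = np.maximum(vals, reg)          # guard against zero variances
        covs.append(vecs @ np.diag(vals) @ vecs.T)
    return np.array(covs)

def pnn_predict(X_train, y_train, covs, X_test):
    """Classify test patterns with a PNN whose Gaussian kernels use the
    per-pattern covariance matrices computed above (equal class priors)."""
    classes = np.unique(y_train)
    d = X_train.shape[1]
    inv_covs = np.array([np.linalg.inv(C) for C in covs])
    dets = np.array([np.linalg.det(C) for C in covs])
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * dets)
    scores = np.zeros((len(X_test), len(classes)))
    for j, x in enumerate(X_test):
        diff = X_train - x                                        # (n, d)
        mahal = np.einsum('ni,nij,nj->n', diff, inv_covs, diff)   # per-pattern Mahalanobis distance
        kern = norm * np.exp(-0.5 * mahal)                        # one kernel value per training pattern
        for c_idx, c in enumerate(classes):
            scores[j, c_idx] = kern[y_train == c].mean()          # Parzen density estimate per class
    return classes[np.argmax(scores, axis=1)]
```

In use, pnn_predict(X_train, y_train, local_covariances(X_train), X_test) assigns each test pattern to the class whose kernel mixture gives it the highest estimated density, which is the standard PNN decision rule with equal priors.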

Citation (APA)
Galleske, I., & Castellanos, J. (2002). Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution. Neural Computation, 14(5), 1183–1194. https://doi.org/10.1162/089976602753633448
