Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution

  • I. Galleske
  • J. Castellanos

Abstract

This article proposes a procedure for the automatic determination of the elements of the covariance matrix of the Gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, are calculated by analyzing the local environment of each training pattern; their combination forms the covariance matrix of that pattern. This automation has two advantages: first, it frees the neural network designer from specifying the complete covariance matrix, and second, it yields a network with better generalization ability than the original model. A variation of the classic two-spiral problem and real-world examples from the UCI Machine Learning Repository show not only a classification rate better than that of the original probabilistic neural network but also that this model can outperform other well-known classification techniques.
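Since the abstract only describes the method at a high level, the sketch below illustrates the general idea in Python. It assumes the per-pattern covariance is estimated from the scatter of the k nearest same-class neighbours, with the eigenvectors of that scatter acting as the rotation matrix and the eigenvalues as the variances; the neighbourhood size k, the regularization floor, and the helper names local_covariances and pnn_predict are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def local_covariances(X, y, k=5, reg=1e-6):
    """One covariance matrix per training pattern, built from the scatter of
    its k nearest same-class neighbours (eigenvectors = rotation matrix,
    eigenvalues = variances). This is an assumed stand-in for the paper's
    local-environment analysis."""
    covs = []
    for i, x in enumerate(X):
        same = X[y == y[i]]
        dist = np.linalg.norm(same - x, axis=1)
        neigh = same[np.argsort(dist)[:k + 1]]          # the pattern itself plus k neighbours
        centred = neigh - neigh.mean(axis=0)
        scatter = centred.T @ centred / max(len(neigh) - 1, 1)
        variances, rotation = np.linalg.eigh(scatter)   # rotation matrix and variances
        variances = np.maximum(variances, reg)          # keep the matrix invertible
        covs.append(rotation @ np.diag(variances) @ rotation.T)
    return np.array(covs)

def pnn_predict(X_train, y_train, covs, X_test):
    """Standard PNN decision rule: sum one anisotropic Gaussian kernel per
    training pattern within each class and pick the class with the largest
    summed activation."""
    classes = np.unique(y_train)
    inv_covs = np.linalg.inv(covs)
    norm = 1.0 / np.sqrt((2 * np.pi) ** X_train.shape[1] * np.linalg.det(covs))
    preds = []
    for x in X_test:
        diff = X_train - x                               # shape (n_train, d)
        expo = -0.5 * np.einsum("ni,nij,nj->n", diff, inv_covs, diff)
        activations = norm * np.exp(expo)
        scores = [activations[y_train == c].sum() for c in classes]
        preds.append(classes[np.argmax(scores)])
    return np.array(preds)

# Illustrative usage on synthetic 2-D data
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = (X[:, 0] > 0).astype(int)
covs = local_covariances(X, y, k=5)
print(pnn_predict(X, y, covs, X[:5]))
```

In this sketch each kernel adapts its shape to the local pattern distribution, which is the property the abstract credits for the improved generalization; the paper itself should be consulted for the exact construction of the rotation and variance matrices.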
