Efficient approximation of the conditional relative entropy with applications to discriminative learning of Bayesian network classifiers


Abstract

We propose a minimum variance unbiased approximation to the conditional relative entropy of the distribution induced by the observed frequency estimates, for multi-classification tasks. This approximation extends a decomposable scoring criterion, named approximate conditional log-likelihood (aCLL), primarily used for discriminative learning of augmented Bayesian network classifiers. Our contribution is twofold: (i) it addresses multi-classification tasks and not only binary-classification ones; and (ii) it covers broader stochastic assumptions than a uniform distribution over the parameters. Specifically, we consider a Dirichlet distribution over the parameters, which is experimentally shown to be a very good approximation to the CLL. In addition, for Bayesian network classifiers, a closed-form equation is found for the parameters that maximize the scoring criterion. © 2013 by the authors.
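For context, the quantity the aCLL score approximates is the conditional log-likelihood (CLL), the sum over samples of log P(y | x). The following is a minimal sketch, not the authors' aCLL criterion or closed-form estimator: it computes the CLL of a naive Bayes classifier whose parameters are Dirichlet-smoothed observed frequency estimates. The function name, the NumPy-based implementation, and the assumption that labels and feature values are encoded as integers starting at 0 are illustrative choices, not part of the paper.

```python
import numpy as np

def conditional_log_likelihood(X, y, alpha=1.0):
    """Sketch: CLL of a naive Bayes classifier with parameters set to the
    observed frequency estimates, smoothed by a Dirichlet pseudo-count alpha.

    X : (n_samples, n_features) array of discrete feature values (0..K-1)
    y : (n_samples,) array of class labels (0..C-1)
    """
    n, d = X.shape
    C = int(y.max()) + 1

    # Class priors from observed frequencies, with Dirichlet smoothing.
    prior = np.array([(np.sum(y == c) + alpha) / (n + alpha * C)
                      for c in range(C)])

    # log_joint[i, c] accumulates log P(x_i, y = c) under class c's parameters.
    log_joint = np.tile(np.log(prior), (n, 1))
    for j in range(d):
        values = np.unique(X[:, j])
        K = len(values)
        for c in range(C):
            Xc = X[y == c, j]
            for v in values:
                # Smoothed frequency estimate of P(X_j = v | y = c).
                p = (np.sum(Xc == v) + alpha) / (len(Xc) + alpha * K)
                log_joint[X[:, j] == v, c] += np.log(p)

    # CLL = sum_i log P(y_i | x_i) = sum_i [log P(x_i, y_i) - log P(x_i)].
    log_evidence = np.logaddexp.reduce(log_joint, axis=1)
    return np.sum(log_joint[np.arange(n), y] - log_evidence)

# Example on a tiny synthetic dataset with binary features and two classes.
X = np.array([[0, 1], [1, 1], [0, 0], [1, 0], [0, 1]])
y = np.array([0, 1, 0, 1, 0])
print(conditional_log_likelihood(X, y))
```

Unlike the log-likelihood, the CLL does not decompose over the network structure, which is what motivates decomposable approximations such as aCLL.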


CITATION STYLE

APA

Carvalho, A. M., Adão, P., & Mateus, P. (2013). Efficient approximation of the conditional relative entropy with applications to discriminative learning of Bayesian network classifiers. Entropy, 15(7), 2716–2735. https://doi.org/10.3390/e15072716
