Entropic risk minimization for nonparametric estimation of mixing distributions

Abstract

We discuss a nonparametric estimation method for the mixing distributions in mixture models. The problem is formalized as the minimization of a one-parameter objective functional, which reduces to maximum likelihood estimation or kernel vector quantization in special cases. Generalizing the existence theorem for nonparametric maximum likelihood estimation, we prove the existence and discreteness of the optimal mixing distribution and provide an algorithm to compute it. We demonstrate that, with an appropriate choice of the parameter, the proposed method is less prone to overfitting than the maximum likelihood method. We further discuss the connection between this unifying estimation framework and the rate-distortion problem.
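
To connect the abstract to concrete objects: the entropic risk of the log-loss -log f(x|theta), averaged over a mixing distribution G, can be written (under one common convention; the paper's exact parameterization may differ) as

  \rho_\beta(x; G) = \frac{1}{\beta} \log \int f(x \mid \theta)^{-\beta}\, dG(\theta), \qquad \rho_{-1}(x; G) = -\log \int f(x \mid \theta)\, dG(\theta),

so that at beta = -1 the per-observation risk is the negative log mixture density, and summing over the data recovers the maximum likelihood special case mentioned above.

The sketch below illustrates only that maximum likelihood special case: estimating a discrete mixing distribution supported on a fixed grid of candidate components with an EM-style fixed-point update on the weights. It is a minimal illustration under stated assumptions (Gaussian kernels with known scale, grid-restricted support; names such as npmle_mixing_weights, candidate_means and n_iter are invented here), not the algorithm derived in the paper.

  # Sketch: grid-based nonparametric ML estimation of a mixing distribution
  # (illustrative only; not the authors' entropic-risk algorithm).
  import numpy as np
  from scipy.stats import norm

  def npmle_mixing_weights(x, candidate_means, sigma=1.0, n_iter=500):
      """EM-style fixed-point updates for the weights of a discrete mixing
      distribution supported on `candidate_means` (Gaussian kernel, known sigma)."""
      # Likelihood matrix: L[i, k] = f(x_i | theta_k)
      L = norm.pdf(x[:, None], loc=candidate_means[None, :], scale=sigma)
      K = len(candidate_means)
      w = np.full(K, 1.0 / K)                  # uniform initial weights
      for _ in range(n_iter):
          mix = L @ w                          # mixture density at each observation
          resp = (L * w) / mix[:, None]        # posterior responsibilities
          w = resp.mean(axis=0)                # fixed-point (EM) weight update
      return w

  # Example: data from a two-component mixture; the estimated weights
  # typically concentrate on a few grid points near theta = -2 and theta = 2.
  rng = np.random.default_rng(0)
  x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])
  grid = np.linspace(-5, 5, 41)
  w = npmle_mixing_weights(x, grid, sigma=1.0)
  print(grid[w > 0.05], w[w > 0.05])

On data like the two-component example above, most of the estimated weight concentrates on a handful of grid points, which illustrates (on a grid) the discreteness of the optimal mixing distribution that the abstract mentions.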

Citation (APA)

Watanabe, K., & Ikeda, S. (2015). Entropic risk minimization for nonparametric estimation of mixing distributions. Machine Learning, 99(1), 119–136. https://doi.org/10.1007/s10994-014-5467-7
