Some Further Results on the Minimum Error Entropy Estimation

Abstract

The minimum error entropy (MEE) criterion has been receiving increasing attention due to its promising prospects for applications in signal processing and machine learning. In the context of Bayesian estimation, the MEE criterion seeks an estimate of one random variable based on another such that the entropy of the estimation error is minimized. Several theoretical results on this topic have been reported. In this work, we present some further results on MEE estimation. The contributions are twofold: (1) we extend a recent result on the minimum entropy of a mixture of unimodal and symmetric distributions to a more general case, and prove that if the conditional distributions are generalized uniformly dominated (GUD), the dominant alignment is the MEE estimator; (2) we show by examples that the MEE estimator (not limited to singular cases) may be non-unique even if the error distribution is restricted to be zero-mean (unbiased). © 2012 by the authors.
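A toy sketch (not from the paper; the model, estimators, and histogram entropy estimator below are illustrative assumptions) of why the zero-mean restriction in contribution (2) matters: the entropy of the error is invariant to adding a constant to the estimator, so without an unbiasedness constraint every shifted estimator attains the same error entropy.

```python
import math
import random

def hist_entropy(samples, bins=30):
    """Plug-in differential Shannon entropy (nats) from a histogram density estimate."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    n = len(samples)
    # H ≈ -sum_i (c_i/n) * log( (c_i/n) / width ), skipping empty bins
    return -sum(c / n * math.log(c / (n * width)) for c in counts if c)

random.seed(0)
# Toy additive model: X = Y + N with Gaussian noise; estimate X from Y.
ys = [random.gauss(0.0, 1.0) for _ in range(20000)]
xs = [y + random.gauss(0.0, 0.5) for y in ys]

# Error samples for the estimator g(Y) = Y and the shifted g(Y) = Y + 1.
err0 = [x - y for x, y in zip(xs, ys)]
err1 = [x - (y + 1.0) for x, y in zip(xs, ys)]

h0, h1 = hist_entropy(err0), hist_entropy(err1)
print(h0, h1)  # nearly identical: entropy ignores the constant bias
```

The two estimates coincide because shifting every error sample by a constant leaves the histogram shape, and hence the entropy, unchanged; the paper's stronger point is that non-uniqueness can persist even after restricting to zero-mean errors.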

Citation (APA)

Chen, B., & Principe, J. C. (2012). Some Further Results on the Minimum Error Entropy Estimation. Entropy, 14(5), 966–977. https://doi.org/10.3390/e14050966
