Bounds on Rényi and Shannon entropies for finite mixtures of multivariate skew-normal distributions: Application to swordfish (Xiphias gladius Linnaeus)

Abstract

Mixture models are in high demand for machine-learning analysis due to their computational tractability and because they serve as good approximations for continuous densities. Predominantly, entropy applications have been developed in the context of mixtures of normal densities. In this paper, we consider a novel class of skew-normal mixture models whose components capture skewness thanks to their flexibility. We find upper and lower bounds for the Shannon and Rényi entropies of this model. Using such a pair of bounds, a confidence interval for the approximate entropy value can be calculated. In addition, an asymptotic expression for the Rényi entropy is obtained via Stirling's approximation, and upper and lower bounds are reported using multinomial coefficients and some properties and inequalities of L_p metric spaces. Simulation studies are then applied to a swordfish (Xiphias gladius Linnaeus) length dataset.
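To give a concrete feel for the quantities being bounded, the sketch below is a minimal illustration, not the authors' code and not the paper's bound formulas. It builds a hypothetical two-component univariate skew-normal mixture with scipy.stats.skewnorm and estimates its Shannon entropy H(f) = -E[log f(X)] and Rényi entropy H_alpha(f) = (1 - alpha)^(-1) log E[f(X)^(alpha - 1)] by Monte Carlo; all weights, locations, scales, and shape parameters are illustrative assumptions.

```python
# Minimal illustrative sketch (hypothetical parameters, not the paper's method):
# Monte Carlo estimation of the Shannon and Renyi entropies of a univariate
# skew-normal mixture, the kind of quantity bracketed by the paper's bounds.
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(0)

# Hypothetical two-component skew-normal mixture: weights, shape (skewness),
# location, and scale. Values loosely mimic fish-length data in cm (illustrative only).
weights = np.array([0.6, 0.4])
shapes = np.array([4.0, -2.0])
locs = np.array([90.0, 150.0])
scales = np.array([15.0, 20.0])

def mixture_pdf(x):
    """Density of the skew-normal mixture evaluated at x."""
    comps = [w * skewnorm.pdf(x, a, loc=m, scale=s)
             for w, a, m, s in zip(weights, shapes, locs, scales)]
    return np.sum(comps, axis=0)

# Sample from the mixture: pick a component, then draw from its skew-normal law.
n = 200_000
comp = rng.choice(len(weights), size=n, p=weights)
samples = skewnorm.rvs(shapes[comp], loc=locs[comp], scale=scales[comp],
                       random_state=rng)

# Shannon entropy H(f) = -E[log f(X)], estimated by the sample mean.
H_hat = -np.mean(np.log(mixture_pdf(samples)))

# Renyi entropy of order alpha: H_alpha(f) = (1 - alpha)^(-1) * log E[f(X)^(alpha - 1)].
alpha = 2.0
R_hat = np.log(np.mean(mixture_pdf(samples) ** (alpha - 1))) / (1 - alpha)

print(f"Shannon entropy estimate: {H_hat:.4f} nats")
print(f"Renyi entropy (alpha={alpha}) estimate: {R_hat:.4f} nats")
```

Because the skew-normal mixture has no closed-form entropy, simulation estimates of this kind are exactly the values that the analytical upper and lower bounds in the paper are designed to bracket.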

Citation (APA)

Contreras-Reyes, J. E., & Cortés, D. D. (2016). Bounds on Rényi and Shannon entropies for finite mixtures of multivariate skew-normal distributions: Application to swordfish (Xiphias gladius Linnaeus). Entropy, 18(11), 382. https://doi.org/10.3390/e18110382
