Mixture models are in high demand in machine-learning analysis due to their computational tractability and because they serve as a good approximation for continuous densities. Entropy applications have predominantly been developed in the context of mixtures of normal densities. In this paper, we consider a novel class of skew-normal mixture models, whose flexible components are able to capture skewness. We derive upper and lower bounds on the Shannon and Rényi entropies of this model; from such a pair of bounds, a confidence interval for the approximate entropy value can be calculated. In addition, an asymptotic expression for the Rényi entropy is obtained via Stirling's approximation, and upper and lower bounds are reported using multinomial coefficients and some properties and inequalities of Lp metric spaces. Simulation studies are then applied to a swordfish (Xiphias gladius Linnaeus) length dataset.
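As a rough illustration of the quantities discussed in the abstract, the Python sketch below estimates the Shannon and Rényi entropies of a finite skew-normal mixture by Monte Carlo; such estimates are the kind of values the analytical bounds are intended to bracket. The two-component univariate mixture, its parameter values, and the use of scipy.stats.skewnorm are illustrative assumptions, not the authors' multivariate setup or implementation.

import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(0)

# Illustrative (hypothetical) 2-component skew-normal mixture:
# each tuple is (weight, location xi, scale omega, shape/skewness a).
components = [(0.6, 0.0, 1.0, 4.0), (0.4, 3.0, 2.0, -2.0)]

def mixture_pdf(x):
    """Density of the skew-normal mixture evaluated at points x."""
    return sum(w * skewnorm.pdf(x, a, loc=xi, scale=omega)
               for w, xi, omega, a in components)

def sample_mixture(n):
    """Draw n samples: pick a component by its weight, then sample from it."""
    weights = np.array([c[0] for c in components])
    idx = rng.choice(len(components), size=n, p=weights)
    out = np.empty(n)
    for k, (_w, xi, omega, a) in enumerate(components):
        mask = idx == k
        out[mask] = skewnorm.rvs(a, loc=xi, scale=omega,
                                 size=mask.sum(), random_state=rng)
    return out

def shannon_entropy_mc(n=100_000):
    """Shannon entropy H = -E[log f(X)], estimated by Monte Carlo."""
    x = sample_mixture(n)
    return -np.mean(np.log(mixture_pdf(x)))

def renyi_entropy_mc(alpha, n=100_000):
    """Rényi entropy of order alpha: (1/(1-alpha)) * log E[f(X)^(alpha-1)]."""
    x = sample_mixture(n)
    return np.log(np.mean(mixture_pdf(x) ** (alpha - 1))) / (1.0 - alpha)

print("Shannon entropy (MC estimate):", shannon_entropy_mc())
print("Renyi entropy, alpha = 2 (MC estimate):", renyi_entropy_mc(2.0))

The Monte Carlo estimates converge slowly, which is one reason closed-form upper and lower bounds of the kind developed in the paper are useful: together they bracket the true entropy without requiring sampling.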
CITATION
Contreras-Reyes, J. E., & Cortés, D. D. (2016). Bounds on Rényi and Shannon entropies for finite mixtures of multivariate skew-normal distributions: Application to swordfish (Xiphias gladius Linnaeus). Entropy, 18(11), 382. https://doi.org/10.3390/e18110382