Gaussian mixtures: Entropy and geometric inequalities


Abstract

A symmetric random variable is called a Gaussian mixture if it has the same distribution as the product of two independent random variables, one being positive and the other a standard Gaussian random variable. Examples of Gaussian mixtures include random variables with densities proportional to e^{-|t|^p} and symmetric p-stable random variables, where p ∈ (0, 2]. We obtain various sharp moment and entropy comparison estimates for weighted sums of independent Gaussian mixtures and investigate extensions of the B-inequality and the Gaussian correlation inequality in the context of Gaussian mixtures. We also obtain a correlation inequality for symmetric geodesically convex sets in the unit sphere equipped with the normalized surface area measure. We then apply these results to derive sharp constants in Khinchine inequalities for vectors uniformly distributed on the unit balls with respect to p-norms and provide short proofs of new and old comparison estimates for geometric parameters of sections and projections of such balls.
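As a concrete illustration of the definition, the standard Laplace distribution (density proportional to e^{-|t|}, the case p = 1 above) is a well-known Gaussian mixture: if W is exponential with mean 2 and G is an independent standard Gaussian, then sqrt(W)·G has the Laplace(0, 1) law. The following minimal sketch (not from the paper; function name and sample size are illustrative) checks this by Monte Carlo, comparing the empirical mean and variance to the Laplace values 0 and 2:

```python
import random

def sample_laplace_as_gaussian_mixture(n, rng):
    """Draw n samples of sqrt(W) * G, where W ~ Exp(rate 1/2)
    (so E[W] = 2) and G ~ N(0, 1) are independent.
    This product has the standard Laplace(0, 1) distribution."""
    return [rng.expovariate(0.5) ** 0.5 * rng.gauss(0.0, 1.0)
            for _ in range(n)]

rng = random.Random(0)
xs = sample_laplace_as_gaussian_mixture(200_000, rng)

# Laplace(0, 1) has mean 0 and variance 2.
mean = sum(xs) / len(xs)
var = sum(x * x for x in xs) / len(xs)
```

The identity can be verified via characteristic functions: E[e^{itX}] = E[e^{-W t²/2}] = 1/(1 + t²), which is the Laplace(0, 1) characteristic function.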

APA

Eskenazis, A., Nayar, P., & Tkocz, T. (2018). Gaussian mixtures: Entropy and geometric inequalities. Annals of Probability, 46(5), 2908–2945. https://doi.org/10.1214/17-AOP1242
