Self-normalized processes arise naturally in statistical applications. Being unit free, they are not affected by scale changes. Moreover, self-normalization often eliminates or weakens moment assumptions. In this paper we present several exponential and moment inequalities, particularly those related to laws of the iterated logarithm, for self-normalized random variables, including martingales. Tail probability bounds are also derived. For random variables $B_t > 0$ and $A_t$, let $Y_t(\lambda) = \exp\{\lambda A_t - \lambda^2 B_t^2/2\}$. We develop inequalities for the moments of $A_t/B_t$ or $\sup_{t>0} A_t/\{B_t(\log\log B_t)^{1/2}\}$ and variants thereof, when $\mathbb{E}\,Y_t(\lambda) \le 1$ or when $Y_t(\lambda)$ is a supermartingale, for all $\lambda$ belonging to some interval. Our results are valid for a wide class of random processes, including continuous martingales with $A_t = M_t$ and $B_t = \sqrt{\langle M\rangle_t}$, and sums of conditionally symmetric variables $d_i$ with $A_t = \sum_{i=1}^{t} d_i$ and $B_t = \sqrt{\sum_{i=1}^{t} d_i^2}$. A sharp maximal inequality for conditionally symmetric random variables and for continuous local martingales with values in $\mathbb{R}^m$, $m \ge 1$, is also established. Another development in this paper is a bounded law of the iterated logarithm for general adapted sequences that are centered at certain truncated conditional expectations and self-normalized by the square root of the sum of squares. The key ingredient in this development is a new exponential supermartingale involving $\sum_{i=1}^{t} d_i$ and $\sum_{i=1}^{t} d_i^2$. A compact law of the iterated logarithm for self-normalized martingales is also derived in this connection.
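For orientation, the two canonical instances mentioned above can be written out explicitly in the abstract's notation; this is a minimal sketch of how the condition $\mathbb{E}\,Y_t(\lambda) \le 1$ is instantiated, and the precise hypotheses are those stated in the paper.

\[
  Y_t(\lambda) = \exp\Bigl\{\lambda M_t - \tfrac{\lambda^2}{2}\langle M\rangle_t\Bigr\},
  \qquad
  Y_t(\lambda) = \exp\Bigl\{\lambda \sum_{i=1}^{t} d_i - \tfrac{\lambda^2}{2}\sum_{i=1}^{t} d_i^2\Bigr\},
\]

each of which is a nonnegative supermartingale with $\mathbb{E}\,Y_t(\lambda) \le 1$ for every $\lambda \in \mathbb{R}$: the first for a continuous local martingale $M$ with $M_0 = 0$, the second for conditionally symmetric $d_i$ (that is, the conditional distribution of $d_i$ given $d_1,\dots,d_{i-1}$ is symmetric).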
De La Peña, V. H., Klass, M. J., & Lai, T. L. (2004). Self-normalized processes: Exponential inequalities, moment bounds and iterated logarithm laws. Annals of Probability, 32(3A), 1902–1933. https://doi.org/10.1214/009117904000000397