Uniform integrability of a squared sum of i.i.d. variables


I'm trying to prove that if $X_i$ are independent, identically distributed random variables with $E X_i = 0$ and $E X_i^2 < \infty$, then the sequence $\frac{(\sum_{i=1}^{n} X_i)^2}{n}$ is uniformly integrable. In fact, I've been told that something stronger holds, namely that $$\frac{\max_{k\leq n}(\sum_{i=1}^{k} X_i)^2}{n}$$ is uniformly integrable. Can someone please give me a hint or a reference to a proof? I've been told that the Hoffmann–Jørgensen inequality might come in handy, but I suppose that is only needed for the generalization with the $\max$. The statement would be trivial for $\frac{\sum_{i=1}^{n} X_i^2}{n}$ (by the strong law of large numbers), but here the whole sum is squared, not each variable separately.

Thank you very much for your help.


There are 2 answers below.


I think there are two ways.

  1. Assume you know that $S_n/\sqrt{n}$ converges in distribution to $N(0,\sigma^2)$, where $\sigma^2 = E X_1^2$ (that is, you know that the central limit theorem holds). Then $P(S_n^2>nR)\to P(N^2>R)$ for all $R>0$, since the limit law has no atoms. Conclude using a $2\varepsilon$-argument and the fact that for a non-negative random variable $Y$, $$E(Y\chi_{\{Y>R\}})=R\cdot P(Y>R)+\int_R^{+\infty}P(Y>t)\,dt.$$

  2. Use a truncation argument together with a fourth-moment inequality. This method also yields the desired uniform integrability for the maximum, after an application of Kolmogorov's inequality, for example.
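As a sanity check on the tail-expectation identity used in the first approach, here is a small numerical sketch. The choice of $Y \sim \mathrm{Exp}(1)$ is just a convenient test case of my own (not from the argument above), since there $E(Y\chi_{\{Y>R\}}) = (R+1)e^{-R}$ in closed form:

```python
import numpy as np

# Check E[Y; Y > R] = R*P(Y > R) + integral_R^inf P(Y > t) dt on simulated data.
# Test case (an assumption for illustration): Y ~ Exp(1), where the
# left-hand side equals (R + 1) * exp(-R) exactly.
rng = np.random.default_rng(0)
n = 10**6
Y = np.sort(rng.exponential(size=n))
R = 1.5

lhs = np.mean(Y * (Y > R))  # Monte Carlo estimate of E[Y; Y > R]

ts = np.linspace(R, 25.0, 4000)  # grid for the tail integral
tail = 1.0 - np.searchsorted(Y, ts, side="right") / n  # empirical P(Y > t)
integral = np.sum(0.5 * (tail[1:] + tail[:-1]) * np.diff(ts))  # trapezoid rule
rhs = R * np.mean(Y > R) + integral

exact = (R + 1.0) * np.exp(-R)  # closed form for Exp(1)
print(lhs, rhs, exact)
```

Both estimates agree with the closed form to Monte Carlo accuracy, which is reassuring before relying on the identity in the $2\varepsilon$-argument.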


OK, here's what I've got so far. Davide, I didn't really see how to carry out your truncation idea (I couldn't figure out how to use a fourth-moment inequality in a sensible way), so I tried your first approach instead, slightly modified so that I'd get uniform integrability with the max. I'd be grateful if someone took a look at it.

I'm using Kolmogorov's inequality,
$$ P\left(\frac{1}{n}\max_{k \leq n} S_k^2 \geq M\right) = P\left(\max_{k \leq n} |S_k| \geq \sqrt{Mn}\right) \leq \frac{\operatorname{Var}(S_n)}{Mn} = \frac{\sigma^2}{M},$$
and the following Hoffmann–Jørgensen inequality: if the $X_i$ are independent, then for all nonnegative $s$, $r$, $t$,
$$P\left(\max_{k \leq n} |S_k| > s + r + t\right) \leq P\left(\max_{k \leq n} |X_k| > s\right) + 2P\left(\max_{k \leq n} |S_k| > t\right)P\left(\max_{k \leq n} |S_n - S_k| > r/2\right). $$
Taking $s = t = r/2$, and noting that $\max_{k \leq n} |S_n - S_k|$ has the same distribution as $\max_{k \leq n} |S_k|$ (reverse the order of the $X_i$; each $S_n - S_k$ is distributed like $S_{n-k}$), we get
$$P\left(\max_{k \leq n} |S_k| > 4s\right) \leq P\left(\max_{k \leq n} |X_k| > s\right) + 2\left[P\left(\max_{k \leq n} |S_k| > s\right)\right]^2. $$
So, using
$$P\left(\frac{1}{n}\max_{k \leq n} S_k^2 \geq s\right) = P\left(\max_{k \leq n} |S_k| \geq \sqrt{sn}\right)$$
(and absorbing the factor $4$ from the Hoffmann–Jørgensen threshold, which only rescales $s$ and changes constants, so it doesn't affect uniform integrability), we obtain
$$ P\left(\frac{1}{n}\max_{k \leq n} S_k^2 \geq s\right) \leq P\left(\max_{k \leq n} |X_k| > \sqrt{sn}\right) + 2\left[P\left(\max_{k \leq n} |S_k| > \sqrt{sn}\right)\right]^2, $$
and applying Kolmogorov's inequality to the second summand,
$$P\left(\frac{1}{n}\max_{k \leq n} S_k^2 \geq s\right) \leq P\left(\max_{k \leq n} |X_k| > \sqrt{sn}\right) + \frac{2\sigma^4}{s^2}. $$

This is promising, because the $1/s^2$ part is nice and integrable in $s$, so we can use the formula for the expectation suggested by Davide, and that ingredient is less than $\epsilon$ for $R$ sufficiently large. Now (if I didn't make a mistake along the way) we just need a sensible estimate on $P(\max_{k \leq n} |X_k| > \sqrt{sn})$, which shouldn't be that hard. By independence,
$$P\left(\max_{k \leq n} |X_k| > \sqrt{sn}\right) = 1 - \left[P\left(|X_1| \leq \sqrt{sn}\right)\right]^n,$$
and by Chebyshev's inequality,
$$P\left(|X_1| \geq \sqrt{sn}\right) \leq \frac{\sigma^2}{ns},$$
so
$$P\left(\max_{k \leq n} |X_k| > \sqrt{sn}\right) \leq 1 - \left(1 - \frac{\sigma^2}{ns}\right)^n.$$
I guess we could work with that, but I was hoping for something nicer. Does anyone have an idea?
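For what it's worth, the claimed uniform integrability can also be seen numerically. The sketch below is my own illustration, not a proof: it takes Rademacher $X_i$ (that is, $\pm 1$ with probability $1/2$, an assumed test case with mean $0$ and variance $1$) and Monte Carlo estimates $E\big[Y_n \chi_{\{Y_n > R\}}\big]$ for $Y_n = \frac{1}{n}\max_{k\le n} S_k^2$; the tail expectation shrinks as $R$ grows, roughly uniformly over the simulated values of $n$:

```python
import numpy as np

rng = np.random.default_rng(1)

def tail_expectation(n, R, reps=5000):
    """Monte Carlo estimate of E[Y; Y > R] for Y = max_{k<=n} S_k^2 / n,
    where S_k = X_1 + ... + X_k and X_i = +/-1 (mean 0, variance 1)."""
    X = rng.choice([-1.0, 1.0], size=(reps, n))  # reps independent paths
    S = np.cumsum(X, axis=1)                     # partial sums S_1, ..., S_n
    Y = np.max(S * S, axis=1) / n                # max_k S_k^2 / n per path
    return np.mean(Y * (Y > R))

for R in (1.0, 4.0, 9.0):
    ests = [tail_expectation(n, R) for n in (10, 100, 1000)]
    print(R, [round(e, 4) for e in ests])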