Bounding the moments of sample-mean of i.i.d. zero-mean r.v.


Let $X_1, X_2, \dots$ be i.i.d. r.v.'s. (For simplicity let $X_i$ take real values; the complex case seems easy to generalize.) Suppose $\mathbb{E}(X_i) = 0$ and, w.l.o.g., $\mathbb{E}(|X_i|^2) = 1$. Assume moreover that $\mathbb{E}(|X_i|^k) < \infty$ for all $k = 3, 4, 5, \dots$.

Denote $$Y_N :=N^{-1/2} (X_1 + \dotsc + X_N)$$

Let $\chi$ be standard Normal. By the central limit theorem, their distributions are approximately equal:

$$F_{Y_N} \approx F_\chi$$

Is there, then, a bound

$$\mathbb{E} (|Y_N|^k) \leq \eta(N) \mathbb{E} (|\chi|^k)?$$

where $\eta$ does not depend on the distribution of $X$ (it is universal), and

$$0 \leq \limsup_{N \to \infty} \eta (N) \leq M <\infty?$$
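As a sanity check on the conjecture, here is a small Monte Carlo sketch (not part of the question itself). It takes Rademacher variables $X_i = \pm 1$ as a hypothetical example distribution with mean $0$ and variance $1$, estimates $\mathbb{E}(|Y_N|^4)$ empirically, and compares it with the Gaussian absolute moment $\mathbb{E}(|\chi|^4) = 3$:

```python
# Monte Carlo sketch: compare E|Y_N|^k for Rademacher X_i (an example
# choice, not forced by the question) against the Gaussian moment E|chi|^k.
import math
import numpy as np

rng = np.random.default_rng(0)

def gaussian_abs_moment(k):
    # E|chi|^k = 2^(k/2) * Gamma((k+1)/2) / sqrt(pi) for standard normal chi
    return 2 ** (k / 2) * math.gamma((k + 1) / 2) / math.sqrt(math.pi)

def sample_mean_abs_moment(N, k, trials=200_000):
    # Y_N = N^{-1/2} (X_1 + ... + X_N) with X_i = +/-1 equiprobable
    X = rng.choice([-1.0, 1.0], size=(trials, N))
    Y = X.sum(axis=1) / math.sqrt(N)
    return float(np.mean(np.abs(Y) ** k))

for N in (10, 100, 1000):
    ratio = sample_mean_abs_moment(N, 4) / gaussian_abs_moment(4)
    print(f"N={N:5d}  E|Y_N|^4 / E|chi|^4 ~ {ratio:.3f}")
```

For this particular distribution one can even compute $\mathbb{E}(Y_N^4) = 3 - 2/N$ exactly, so the ratio stays below $1$; the question is whether some universal $\eta(N)$ controls the ratio for *every* admissible distribution of $X$.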

What I have looked up so far (not sure whether these help):

  • Marcinkiewicz–Zygmund inequality (How can we bound the constants $B_p$?)

  • In Durrett, *Probability: Theory and Examples*, p. 71, there is a related theorem (2.5.7): for any $\epsilon > 0$, a.s.,

$$\lim_{N \to \infty} \frac {Y_N} {(\log N)^{1/2+\epsilon}} = 0$$
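For reference, the Marcinkiewicz–Zygmund inequality mentioned in the first bullet can be stated as follows (for independent mean-zero $X_i$ with finite $p$-th moments; the constants depend only on $p$, which is exactly why bounding $B_p$ matters here):

```latex
% Marcinkiewicz–Zygmund: for p >= 1 there exist constants A_p, B_p,
% depending only on p, such that
A_p\, \mathbb{E}\Big[\Big(\sum_{i=1}^{N} X_i^2\Big)^{p/2}\Big]
\;\le\; \mathbb{E}\Big|\sum_{i=1}^{N} X_i\Big|^{p}
\;\le\; B_p\, \mathbb{E}\Big[\Big(\sum_{i=1}^{N} X_i^2\Big)^{p/2}\Big].
% Dividing through by N^{p/2} gives a bound on E|Y_N|^p in terms of
% E[(N^{-1} sum X_i^2)^{p/2}], which is N-independent for bounded X_i.
```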