Bounds on moments of sample mean


Let the $X_i$ be i.i.d. zero-mean random variables whose $p$-th moments are finite. Prove that $$E\left[\left(\sum_{i=1}^{n}X_i\right)^p\right]\leq C_p n^{p/2},$$ where $C_p$ is a constant independent of $n$.

My effort: I think we could expand the sum, bound each term by the appropriate moments, and count the number of terms of each type, but this seems messy. Is there a cleaner solution? Or is this a named theorem?


Accepted answer:

Let's assume that $p\geq 2$.

By the Marcinkiewicz–Zygmund inequality, there exists a constant $B_p$ such that $$E\left( \left\vert \sum_{i=1}^{n}X_{i}\right\vert ^{p}\right)\leq B_{p}\,E\left( \left( \sum_{i=1}^{n}\left\vert X_{i}\right\vert ^{2}\right)^{p/2}\right)=B_p\,n^{p/2}\,E\left( \left( \frac 1n \sum_{i=1}^{n}\left\vert X_{i}\right\vert ^{2}\right)^{p/2}\right)$$

Since $x\mapsto x^{p/2}$ is convex on $[0,\infty)$ for $p\geq 2$, Jensen's inequality applied to the average gives $$\left(\frac 1n \sum_{i=1}^{n}\left\vert X_{i}\right\vert ^{2}\right)^{p/2}\leq \frac 1n \sum_{i=1}^n |X_i|^p, $$ hence

$$E\left( \left\vert \sum_{i=1}^{n}X_{i}\right\vert ^{p}\right)\leq B_pn^{p/2}E\left( \frac 1n \sum_{i=1}^n |X_i|^p \right) = B_pn^{p/2} E(|X_1|^p) $$
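As a quick numerical sanity check (not part of the proof), one can estimate the ratio $E[(\sum_{i=1}^n X_i)^p]/n^{p/2}$ by Monte Carlo for a concrete choice of distribution. Here Rademacher $X_i$ and $p=4$ are an illustrative assumption; for this choice one can expand and get $E[S_n^4]=3n^2-2n$ exactly, so the ratio should stay bounded and approach $3$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4  # even integer, so no absolute values are needed

# Monte Carlo estimate of E[(sum_i X_i)^p] / n^(p/2) for Rademacher X_i.
# For Rademacher variables E[S_n^4] = 3n^2 - 2n exactly, so the
# ratio should approach 3 as n grows.
for n in [10, 50, 200]:
    X = rng.choice([-1.0, 1.0], size=(50_000, n))
    S = X.sum(axis=1)
    ratio = np.mean(S**p) / n ** (p / 2)
    print(n, ratio)  # bounded in n, roughly 3 - 2/n
```

Any other zero-mean distribution with finite fourth moment (e.g. uniform on $[-1,1]$) gives a bounded ratio as well, with a different limiting constant.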

Another answer:

The way I see around it is to use the Khintchine inequality. The key point is a symmetrization argument: if $F$ is a convex function and $X'_i$ is an independent copy of $X_i$, then by Jensen's inequality $$ E\left[F\bigg(\sum_{i=1}^n X_i\bigg)\right]=E\left[F\bigg(\sum_{i=1}^n \big(X_i-E(X'_i)\big)\bigg)\right]\leq E\left[F\bigg(\sum_{i=1}^n (X_i-X'_i)\bigg)\right]. $$ The differences $X_i-X'_i$ are symmetric, so $\sum_i (X_i-X'_i)$ has the same distribution as $\sum_i \epsilon_i(X_i-X'_i)$, where the $\epsilon_i$ are i.i.d. Rademacher signs independent of everything else. Hence, for $p\geq 1$, $$ E\left[\left|\sum_{i=1}^n X_i\right|^p\right]\leq E\left[\left|\sum_{i=1}^n \epsilon_i(X_i-X'_i)\right|^p\right]\leq 2^p\, E\left[\left|\sum_{i=1}^n \epsilon_iX_i\right|^p\right], $$ where the last step is the triangle inequality in $L^p$. (If $p$ is not an integer, the absolute values are needed for the moments to be well defined.)

By the Khintchine inequality (applied conditionally on the $X_i$), there is a constant $A_p$ such that $$ E\left[\left|\sum_{i=1}^n \epsilon_i X_i\right|^p\right]\leq A_p\, E\left[\left(\sum_{i=1}^n |\epsilon_i X_i|^2\right)^{p/2}\right] = A_p\, E\left[\left(\sum_{i=1}^n X_i^2\right)^{p/2}\right], $$ since $|\epsilon_i|^2=1$. The last step is Hölder's inequality with exponents $p/2$ and $q$, where $\frac 2p+\frac 1q=1$: $$ \sum_{i=1}^n X_i^2\leq \left(\sum_{i=1}^n |X_i|^p\right)^{2/p}\left(\sum_{i=1}^n 1^q\right)^{1/q} = \left(\sum_{i=1}^n |X_i|^p\right)^{2/p} n^{1/q}. $$ Raising to the power $p/2$ and taking expectations gives $$ E\left[\left(\sum_{i=1}^n X_i^2\right)^{p/2}\right]\leq E\left[\sum_{i=1}^n |X_i|^p\right] n^{p/2q} = n^{1+p/2q}\, E(|X_1|^p). $$ Since $\frac{p}{2q}=\frac p2-1$, the claimed bound follows with $C_p=2^p A_p\, E(|X_1|^p)$. Note that to use Hölder's inequality in this form, $p$ should be at least $2$.
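As an aside, the Khintchine inequality is easy to check numerically in the simplest non-trivial case $p=4$: expanding $(\sum_i \epsilon_i a_i)^4$ and using $E[\epsilon_i]=0$, $\epsilon_i^2=1$ gives the exact value $3(\sum_i a_i^2)^2-2\sum_i a_i^4$, so $A_4=3$ works. A short Monte Carlo sketch (the coefficients $a_i$ below are an arbitrary illustrative choice):

```python
import numpy as np

# Monte Carlo check of the Khintchine inequality for p = 4:
#   E[(sum_i eps_i a_i)^4] <= A_4 * (sum_i a_i^2)^2   with A_4 = 3.
# Expanding the fourth power gives 3(sum a_i^2)^2 - 2 sum a_i^4 exactly.
rng = np.random.default_rng(1)
a = np.array([0.5, -1.0, 2.0, 0.1, 1.5])  # arbitrary fixed coefficients

eps = rng.choice([-1.0, 1.0], size=(500_000, a.size))  # Rademacher signs
lhs = np.mean((eps @ a) ** 4)
rhs = 3 * (a @ a) ** 2
print(lhs, rhs)  # lhs falls below rhs, by roughly 2 * sum(a**4)
```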

Another answer:

The problem is correct as stated for $p \in \Bbb N$. However, if $p$ is non-integer then there should be absolute-value signs within the expectation (i.e., $|\sum X_i|^p$) or else the random variable is not well-defined. Assuming this, I would just like to remark that the statement is false for $p \in (0,2)$.

An easy counterexample is given as follows: let $X_i$ be i.i.d. symmetric $\alpha$-stable random variables for some $\alpha \in (0,2)$, i.e., they are characterized by the fact that $\Bbb E[e^{itX_1}] = e^{-|t|^{\alpha}}$.

It is checked easily enough (by computing characteristic functions) that $$a_1X_1+\cdots+a_nX_n \stackrel{d}{=} (|a_1|^{\alpha}+\cdots+|a_n|^{\alpha})^{1/\alpha} X_1.$$
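For completeness, this identity follows from a one-line characteristic-function computation, using only $\Bbb E[e^{itX_1}] = e^{-|t|^{\alpha}}$ and independence:

```latex
\[
\Bbb E\left[e^{it(a_1X_1+\cdots+a_nX_n)}\right]
  = \prod_{i=1}^n \Bbb E\left[e^{i(ta_i)X_i}\right]
  = \prod_{i=1}^n e^{-|a_i t|^{\alpha}}
  = e^{-\left(\sum_{i=1}^n |a_i|^{\alpha}\right)|t|^{\alpha}},
\]
which is exactly the characteristic function of
\(\left(\sum_{i=1}^n |a_i|^{\alpha}\right)^{1/\alpha} X_1\).
```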

Now, it is known that $\Bbb E|X_1| < \infty$ as long as $\alpha>1$. Thus if we set $a_i=1$ in the above distributional equality, we get: $$\Bbb E \bigg| \sum_1^n X_i \bigg| = n^{1/\alpha} \Bbb E|X_1|,$$ which, since $1/\alpha > 1/2$, gives a counterexample for $p=1$. One may use the same random variables (choosing $\alpha \in (p,2)$ so that $\Bbb E|X_1|^p < \infty$) to get counterexamples for every $p \in (0,2)$.