Bounding the $p$th moment of a sum of random variables by the sum of the $p$th moments


This is problem 6.3 from Resnick:

Suppose that $X,Y\ge0 $ are random variables and that $p\ge 0$. Prove that

(a) $E[(X+Y)^p] \le 2^p (E[X^p] + E[Y^p])$;

(b) if $p>1$, then $E[(X+Y)^p] \le 2^{p-1} (E[X^p] + E[Y^p])$;

(c) if $p \in [0,1]$, then $E[(X+Y)^p] \le E[X^p] + E[Y^p]$.

Here are my thoughts — and a few questions.

If either $E[X^p]=\infty$ or $E[Y^p]=\infty$, then all the inequalities are true. So assume that $E[X^p], E[Y^p] < \infty$ — i.e., that $X,Y\in L_p$.

It seems as though it would suffice to show (b) and (c); perhaps (a) is included to shed light on the other parts? Alas, (a) is the only part I have anything for.

So. I adapted part of the author's discussion of Minkowski's inequality: $X+Y \le 2\max\{X,Y\}$ implies that $\left(X+Y\right)^{p}\le\left[2\max\left\{ X,Y\right\} \right]^{p}=\max\left\{ 2^{p}X^{p},2^{p}Y^{p}\right\} \le2^{p}\left(X^{p}+Y^{p}\right)$; then take expectations.
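Not a proof, of course, but the pointwise estimate above is easy to spot-check numerically. Here is a minimal sketch (the helper name `check_power_bound` is mine, not from the text) that samples nonnegative $x, y$ and verifies $(x+y)^p \le 2^p(x^p + y^p)$:

```python
import random

def check_power_bound(p: float, trials: int = 10_000, seed: int = 0) -> bool:
    """Spot-check the pointwise bound (x + y)^p <= 2^p * (x^p + y^p)
    for random x, y >= 0.  A numerical sanity check, not a proof."""
    rng = random.Random(seed)
    return all(
        (x + y) ** p <= 2 ** p * (x ** p + y ** p) + 1e-9  # small float tolerance
        for x, y in ((rng.uniform(0, 100), rng.uniform(0, 100))
                     for _ in range(trials))
    )
```

For instance, `check_power_bound(0.5)` and `check_power_bound(3.0)` both come back `True` on random samples, consistent with the deterministic estimate holding for every $\omega$.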

One thing that seems strange about this is that it seems $X,Y$ are being treated as numbers rather than as functions: Why should it be true that $X+Y\le 2\max\{X,Y\}$ for all $\omega$?

Beyond that, I haven't made any progress. I've tried looking at Minkowski's and Jensen's inequalities for (b), and Lyapunov's inequality for (b) or (c), but I can't get anything to work.

Can anyone offer ideas on how to start these two parts? (This is homework, so I'm not looking for a complete solution or anything...)

Best answer:

(b) Hölder's inequality is a good idea: $$\left| \sum_{j=1}^n x_j y_j \right| \leq \|x\|_p \|y\|_q$$ for any $x,y \in \mathbb{R}^n$, where the conjugate index is $q = \frac{p}{p-1}$. Apply this inequality with $n=2$, $(x_1,x_2) = (X(\omega),Y(\omega))$, and $(y_1,y_2) := (1,1)$.
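Carrying the hint one step further (this is just the deterministic estimate; expectations come afterwards): with $(y_1,y_2)=(1,1)$, Hölder gives, for each $\omega$,

$$X + Y \le \left(X^p + Y^p\right)^{1/p}\left(1^q + 1^q\right)^{1/q} = 2^{1/q}\left(X^p + Y^p\right)^{1/p}.$$

Raising both sides to the $p$th power and using $\frac{p}{q} = p - 1$ yields

$$(X+Y)^p \le 2^{p-1}\left(X^p + Y^p\right),$$

and taking expectations gives (b).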

(c) This follows if we can show $$(x+y)^p \leq x^p + y^p$$ for $x,y \geq 0$, i.e. if we can prove that the mapping $[0,\infty) \ni x \mapsto x^p$ is subadditive. There is a general result: any concave function $f$ with $f(0) \geq 0$ is subadditive; see e.g. Wikipedia for a proof. This means it suffices to show that $[0,\infty) \ni x \mapsto x^p$ is concave for $p \in [0,1]$.
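As with (a), the deterministic inequality can be spot-checked numerically. A minimal sketch (the helper name `check_subadditive` is mine) that verifies subadditivity of $x \mapsto x^p$ on random samples:

```python
import random

def check_subadditive(p: float, trials: int = 10_000, seed: int = 1) -> bool:
    """Spot-check that x -> x^p is subadditive on [0, inf):
    (x + y)^p <= x^p + y^p.  Expected to hold for 0 <= p <= 1
    and to fail for p > 1.  A sanity check, not a proof."""
    rng = random.Random(seed)
    return all(
        (x + y) ** p <= x ** p + y ** p + 1e-9  # small float tolerance
        for x, y in ((rng.uniform(0, 100), rng.uniform(0, 100))
                     for _ in range(trials))
    )
```

Running `check_subadditive(0.5)` returns `True`, while `check_subadditive(2.0)` returns `False` (for $p>1$ the map is superadditive on positive reals, e.g. $(x+y)^2 = x^2 + 2xy + y^2 > x^2 + y^2$ when $x,y>0$), which matches why (c) is restricted to $p \in [0,1]$.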

In all three cases we use "deterministic" estimates for $(x+y)^p$, $x,y \geq 0$, and then take expectation on both sides to prove the claimed inequalities.