Expand the inequality $E(|X|^r)^\frac{1}{r}\le E(|X|^s)^\frac{1}{s}$ to $-\infty<r<s<\infty$


For a random variable $X$ and numbers $0<r<s<\infty$ it holds that

$$E(|X|^r)^\frac{1}{r}\le E(|X|^s)^\frac{1}{s}$$

This follows almost immediately from Hölder's inequality: for random variables $Y,Z$ and conjugate exponents $p,q\ge 1$ with $\frac{1}{p}+\frac{1}{q}=1$,

$$E(|ZY|)\le E(|Z|^p)^\frac{1}{p}E(|Y|^q)^\frac{1}{q}.$$ Setting $Y=1$ (so that the second factor equals $1$) we end up with

$$E(|Z|)\le E(|Z|^p)^\frac{1}{p}.$$

With $|Z|=|X|^r$ we get

$$E(|X|^r)\le E(|X|^{rp})^\frac{1}{p}.$$

Now, since $p \ge 1$ we have $rp \ge r$, and choosing $p=s/r\ge 1$ (i.e. $rp=s$) we get

$$E(|X|^r)\le E(|X|^{s})^\frac{r}{s}$$

or

$$E(|X|^r)^\frac{1}{r}\le E(|X|^{s})^\frac{1}{s}.$$
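The derivation above can be sanity-checked numerically. The sketch below (the discrete distribution and the exponent pairs are arbitrary choices for illustration) verifies $E(|X|^r)^{1/r}\le E(|X|^s)^{1/s}$ for several pairs $0<r<s$:

```python
# Numerical check of the moment (Lyapunov) inequality
# (E|X|^r)^(1/r) <= (E|X|^s)^(1/s) for 0 < r < s,
# for a discrete random variable X with values xs and probabilities ps.

def moment_norm(xs, ps, t):
    """Compute (E|X|^t)^(1/t) for a discrete distribution, t != 0."""
    return sum(p * abs(x) ** t for x, p in zip(xs, ps)) ** (1.0 / t)

# Arbitrary illustrative distribution (probabilities sum to 1).
xs = [0.5, 1.0, 2.0, 3.0]
ps = [0.1, 0.4, 0.3, 0.2]

for r, s in [(1, 2), (0.5, 3), (2, 7)]:
    lhs = moment_norm(xs, ps, r)
    rhs = moment_norm(xs, ps, s)
    assert lhs <= rhs + 1e-12, (r, s)
    print(f"r={r}, s={s}: {lhs:.4f} <= {rhs:.4f}")
```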

Is there a way to expand this inequality to $-\infty<r<s<\infty$?


Yes, assuming the relevant expectations exist (for $r=0$ the quantity is interpreted as the limit $\lim_{r\to 0}(\mathbb{E}[|X|^r])^{1/r}=\exp(\mathbb{E}[\log |X|])$). In the discrete case this is just the power mean inequality, so the lazy way to prove it is simply to take a limit of the discrete case. A more honest way is to first compare $r=0$ with $s>0$ (Jensen's inequality with $\varphi(x)=-\log x$ applied to the random variable $|X|^s$), and then use the substitution $Y=X^{-1}$, first for the case $r=-1$, $s=0$, and finally for general $r,s$.
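A quick numerical illustration of the extended statement, including negative exponents and the $r=0$ case interpreted as $\exp(\mathbb{E}[\log|X|])$. The distribution below is an arbitrary choice that avoids $0$, so that negative powers and $\log|X|$ are finite:

```python
import math

# Arbitrary illustrative distribution with strictly positive values.
xs = [0.5, 1.0, 2.0, 3.0]
ps = [0.1, 0.4, 0.3, 0.2]

def power_mean(t):
    """(E|X|^t)^(1/t) for t != 0; exp(E[log|X|]) for t == 0."""
    if t == 0:
        return math.exp(sum(p * math.log(abs(x)) for x, p in zip(xs, ps)))
    return sum(p * abs(x) ** t for x, p in zip(xs, ps)) ** (1.0 / t)

# Monotonicity across the full range -inf < r < s < inf:
ts = [-3, -1, -0.5, 0, 0.5, 1, 2]
vals = [power_mean(t) for t in ts]
assert all(a <= b + 1e-12 for a, b in zip(vals, vals[1:]))

# The r -> 0 limit agrees with the geometric-mean interpretation:
assert abs(power_mean(1e-6) - power_mean(0)) < 1e-4
```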