How to show that for any random variable $X$ and $t > 0$, $\Pr(X - E[X] \geq t\sigma[X]) \leq \frac{1}{1 + t^2}$?


The question in the title is from exercise 3.18 in the first edition of Mitzenmacher and Upfal's Probability and Computing. It's essentially asking to prove a bound slightly tighter than Chebyshev's when we only need to bound how much larger $X$ can be than its mean. I'm completely at a loss on how to proceed with proving it. I would appreciate a helpful hint more than an outright answer.


BEST ANSWER

For any $u \ge 0$, applying Markov's inequality to the nonnegative random variable $(X-E[X]+u)^2$ (noting $t\sigma + u > 0$) gives $$P(X-E[X] \ge t \sigma) = P(X - E[X] + u \ge t \sigma + u) \le \frac{E[(X-E[X]+u)^2]}{(t\sigma + u)^2} = \frac{\sigma^2+u^2}{(t\sigma + u)^2}.$$ Choosing $u = \sigma/t$ minimizes the right-hand side, yielding the value $\frac{1}{1+t^2}$.


Substituting $u=\sigma/t$ in: $$\frac{\sigma^2 + u^2}{(t\sigma + u)^2} = \frac{\sigma^2 + \sigma^2/t^2}{(t\sigma + \sigma/t)^2} = \frac{1+1/t^2}{(t + 1/t)^2} = \frac{t^2 + 1}{(t^2+1)^2} = \frac{1}{t^2 + 1}.$$
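As a sanity check of the derivation above, here is a small Monte Carlo sketch (my choice of example distribution, an Exponential(1), which has mean and standard deviation both equal to 1) comparing the empirical one-sided tail against the bound $\frac{1}{1+t^2}$:

```python
import random
import statistics

# Monte Carlo sanity check of the one-sided bound
#   P(X - E[X] >= t*sigma) <= 1/(1 + t^2)
# using an Exponential(1) distribution as an example.
random.seed(0)
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]
mean = statistics.fmean(samples)
sigma = statistics.pstdev(samples)

for t in (0.5, 1.0, 2.0, 3.0):
    empirical = sum(x - mean >= t * sigma for x in samples) / n
    bound = 1.0 / (1.0 + t * t)
    # The bound must hold (here with a wide margin for the exponential).
    assert empirical <= bound
    print(f"t={t}: empirical tail {empirical:.4f} <= bound {bound:.4f}")
```

For the exponential the bound is far from tight (the true tail decays exponentially), which is expected: Cantelli's bound is distribution-free.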

4
On

This is a particular instance of Cantelli's inequality, which states:

If $X$ is a square-integrable random variable with mean $m$ and variance $\sigma^2$, then $$ \mathbb{P}[X-m>\alpha]\leq\frac{\sigma^2}{\sigma^2 +\alpha^2}. $$

Taking $\alpha = t\sigma$ gives exactly the bound in the question.
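It is worth noting that Cantelli's bound is tight (with $\geq$ in place of $>$): for fixed $t > 0$, the two-point distribution taking the value $t\sigma$ with probability $\frac{1}{1+t^2}$ and $-\sigma/t$ with probability $\frac{t^2}{1+t^2}$ has mean $0$, variance $\sigma^2$, and attains the bound with equality. A quick exact-arithmetic check (the specific values of $t$ and $\sigma$ below are arbitrary):

```python
from fractions import Fraction

# Two-point distribution achieving equality in Cantelli's bound
# (with >= in place of >):
#   X = t*sigma   with probability 1/(1+t^2)
#   X = -sigma/t  with probability t^2/(1+t^2)
t = Fraction(3, 2)      # arbitrary t > 0
sigma = Fraction(2)     # arbitrary sigma > 0

p_hi = 1 / (1 + t**2)           # P(X = t*sigma)
p_lo = t**2 / (1 + t**2)        # P(X = -sigma/t)
x_hi, x_lo = t * sigma, -sigma / t

mean = p_hi * x_hi + p_lo * x_lo
var = p_hi * (x_hi - mean) ** 2 + p_lo * (x_lo - mean) ** 2

assert mean == 0               # E[X] = 0
assert var == sigma**2         # Var(X) = sigma^2
assert p_hi == 1 / (1 + t**2)  # P(X - E[X] >= t*sigma) equals the bound
```

This extremal example also explains why the strict-inequality form $\mathbb{P}[X-m>\alpha]$ is how Cantelli's inequality is usually stated.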