Cantelli's inequality proof


If $X$ is a random variable with finite mean $m$ and finite variance $v$, then for $\alpha > 0$,

$$P(X-m\geq \alpha) \leq \dfrac{v}{v+ \alpha^2}$$

I am trying to do the exercise by following the two hints I am given:

i) First show that $P(X-m \geq \alpha) \leq P((X-m+y)^2 \geq (\alpha + y)^2)$ for all $y>0$.

ii) Then use Markov's inequality, and minimise the resulting bound over choice of $y > 0$.

The problem I have is showing that $P(X-m \geq \alpha) \leq P((X-m+y)^2 \geq (\alpha + y)^2)$ for all $y>0$.

Assuming that part (i) has been shown, my proof goes as follows:

The random variable $(X-m+y)^2$ is now non-negative so we may apply Markov's inequality to get

\begin{equation*} \begin{split} \pmb{P}((X-m+y)^2 \geq (\alpha + y)^2) & \leq \dfrac{\mathrm{E}((X-m+y)^2)}{(\alpha + y)^2} \\ & = \dfrac{\mathrm{E}((X-m)^2) + 2y\,\mathrm{E}(X-m) + y^2}{(\alpha + y)^2} \\ & = \dfrac{v + y^2}{(\alpha + y)^2} \end{split} \end{equation*}

using $\mathrm{E}(X-m)=0$ and $\mathrm{E}((X-m)^2)=v$.

Now let $ g(y):= \dfrac{v+y^2}{(\alpha + y)^2}$ and we seek the value of $y>0$ that minimises $g$.

Differentiating $g$ gives $g^\prime(y)=2(\alpha y - v) / (\alpha+y)^3$, and setting $g^\prime(y) = 0$ yields $y=\frac{v}{\alpha}$, which is positive and hence admissible; since $g^\prime$ changes sign from negative to positive there, this value minimises $g$.
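For completeness, the quotient-rule computation behind this derivative is:

\begin{equation*} \begin{split} g^\prime(y) & = \dfrac{2y(\alpha+y)^2 - 2(\alpha+y)(v+y^2)}{(\alpha+y)^4} \\ & = \dfrac{2\bigl(y(\alpha+y) - (v+y^2)\bigr)}{(\alpha+y)^3} = \dfrac{2(\alpha y - v)}{(\alpha+y)^3}. \end{split} \end{equation*}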

Now plugging this value into $g$ gives

$$\pmb{P}(X-m \geq \alpha) \leq \dfrac{v+(\frac{v}{\alpha})^2}{(\alpha + \frac{v}{\alpha})^2} = \dfrac{v \alpha^{-2}(\alpha^2 + v)}{\alpha^{-2}(\alpha^2 + v)^2} = \dfrac{v}{\alpha^2 + v} $$
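As a sanity check (not part of the proof), here is a short numerical sketch. It uses a hypothetical Bernoulli example, chosen because the tail probability is exact, to confirm both that the bound holds and that no other $y > 0$ beats $y = v/\alpha$:

```python
# Numerical sanity check of Cantelli's bound and the minimising choice
# y = v / alpha, using a Bernoulli(p) example (values chosen for
# illustration) where P(X - m >= alpha) can be computed exactly.

def g(y, v, alpha):
    """The Markov bound (v + y^2) / (alpha + y)^2 as a function of y."""
    return (v + y**2) / (alpha + y) ** 2

p, alpha = 0.2, 0.5            # X ~ Bernoulli(p); threshold alpha
m, v = p, p * (1 - p)          # mean and variance of X

# Exact tail probability: X - m = 1 - p >= alpha happens only when X = 1.
tail = p if 1 - p >= alpha else 0.0

cantelli = v / (v + alpha**2)  # the claimed bound, g(v / alpha)

# Scan y > 0 on a grid: no choice of y should beat y = v / alpha.
ys = [k / 1000 for k in range(1, 5000)]
best = min(g(y, v, alpha) for y in ys)

assert tail <= cantelli        # the bound holds
assert abs(best - cantelli) < 1e-6  # y = v / alpha attains the minimum
```

The grid search is only illustrative; the calculus argument above shows the minimum exactly.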

and I claim the proof is complete.

Any advice is appreciated.

Answer:

Since $\alpha + y > 0$, whenever $X-m+y \geqslant \alpha+y$ both sides are positive and may be squared, so the first event below is contained in the second:

$$\mathbb{P}(X-m\geqslant\alpha)=\mathbb{P}(X-m+y\geqslant\alpha+y)\leqslant\mathbb{P}((X-m+y)^2\geqslant(\alpha+y)^2)$$