Chebyshev's inequality problem


Let $X$ be a random variable with $\mathbb{E}[X]=0$ and $\operatorname{Var}[X]=\sigma^2$, and let $a>0$. Prove that

$$P(X \ge a) \le \frac{\sigma^2}{\sigma^2+a^2}.$$

My attempt: I started with Chebyshev's inequality, which gives $P(|X|\ge a)\le\frac{\sigma^2}{a^2}$. I then considered the random variable $Y=X+t$ and ended up with $P(|X+t|\ge a+t)\le\frac{\sigma^2(1+2t)}{(a+t)^2}$.
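As a sanity check on the target inequality (the one-sided Cantelli bound), here is a quick Monte Carlo sketch. The choice of a standard normal $X$ (so $\mathbb{E}[X]=0$, $\sigma^2=1$) is an illustrative assumption, not part of the problem:

```python
import numpy as np

# Monte Carlo sanity check of the one-sided bound
# P(X >= a) <= sigma^2 / (sigma^2 + a^2), where E[X] = 0.
# A standard normal X is an illustrative assumption (sigma^2 = 1).
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # E[X] = 0, Var[X] = 1
sigma2 = 1.0

for a in (0.5, 1.0, 2.0):
    empirical = np.mean(x >= a)       # estimate of P(X >= a)
    bound = sigma2 / (sigma2 + a**2)  # claimed upper bound
    print(f"a={a}: P(X>=a) ~ {empirical:.4f} <= bound {bound:.4f}")
    assert empirical <= bound
```

The empirical tail probabilities sit well below the bound for a normal distribution; the bound is tight only for a particular two-point distribution.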

Any help is appreciated