Doubt about the Markov Inequality


I have a random variable $X$ that takes values in $\left \{ 0,1,2 \right \}$, with $E[X]=1$.

If I apply Markov's inequality I get $P(X\geq 2)\leq \frac{E[X]}{2}=\frac{1}{2}$.

Now, if I assign probability $\frac{1}{2}$ to the value $2$ and the remaining $\frac{1}{2}$ to the value $1$, I get an expected value $E[X]=1.5$, which differs from the initial expected value. Why doesn't Markov's inequality work here? Where am I wrong?


2 Answers


"If i use the Markov Inequality i get that $P(X≥2)≤\frac{1}{2}$"

This bound already assumes $\mathbb{E}[X]=1$: Markov's inequality gives $P(X\geq 2)\leq \mathbb{E}[X]/2$, which equals $\frac{1}{2}$ only because $\mathbb{E}[X]=1$.

"Now, if i assign $\frac{1}{2}$ probability to $2$, and the remaining $\frac{1}{2}$ probability to $1$, i get an expected value $\mathbb{E}[X]=1.5$, that is different from the initial expected value"

Here you are changing the distribution, and with it the expectation. Markov's inequality still holds, but the bound it gives for the new distribution is $\mathbb{E}[X]/2 = 1.5/2 = 0.75$, not $\frac{1}{2}$.
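A quick numeric check of this point (a sketch in Python; the second distribution, with $P(X=1)=P(X=2)=\frac12$, is the one proposed in the question):

```python
# Markov's inequality for a non-negative random variable X:
# P(X >= a) <= E[X] / a.
def markov_bound(mean, a):
    return mean / a

# Distribution proposed in the question: P(X=1) = P(X=2) = 1/2.
probs = {0: 0.0, 1: 0.5, 2: 0.5}
mean = sum(x * p for x, p in probs.items())

assert mean == 1.5                       # the mean is no longer 1
# With this mean, Markov gives P(X >= 2) <= 1.5 / 2 = 0.75,
# and the actual P(X >= 2) = 0.5 does satisfy that bound.
assert probs[2] <= markov_bound(mean, 2)
```

So the inequality is never violated; only the number on its right-hand side changes when the distribution changes.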


To have $E[X]=1$, you need $P(X=0)=P(X=2)$ and $P(X=1)=1-2P(X=2)$ (this follows from $P(X=1)+2P(X=2)=1$ together with the probabilities summing to $1$).

  • so if you assign $P(X=2)=\frac12$ then you have $P(X=0)=\frac12$ and so $P(X=1)=0$
  • while if you assign $P(X=2)=0$ then you have $P(X=0)=0$ and so $P(X=1)=1$
  • and if you assign $P(X=2)=\frac13$ then you have $P(X=0)=\frac13$ and so $P(X=1)=\frac13$

You can in fact assign $P(X=2)$ any value in $[0,\frac12]$, and the Markov bound you found is consistent with all of these choices: it says only that $P(X\geq 2)$ cannot exceed $\frac12$, not that it equals $\frac12$.
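This whole family of distributions can be verified directly. A minimal sketch (the parameter name `t` for $P(X=2)$ is my own):

```python
# For support {0, 1, 2} with E[X] = 1, the answer above gives
# P(X=0) = t, P(X=1) = 1 - 2t, P(X=2) = t, for any t in [0, 1/2].
def distribution(t):
    return {0: t, 1: 1 - 2 * t, 2: t}

for t in [0.0, 0.1, 0.25, 1 / 3, 0.5]:
    p = distribution(t)
    assert all(v >= 0 for v in p.values())        # valid probabilities
    assert abs(sum(p.values()) - 1) < 1e-12       # they sum to 1
    mean = sum(x * q for x, q in p.items())
    assert abs(mean - 1) < 1e-12                  # E[X] = 1 in every case
    assert p[2] <= 0.5 + 1e-12                    # Markov bound P(X>=2) <= 1/2 holds
```

At $t=\frac12$ the bound is tight (the case $P(X=0)=P(X=2)=\frac12$, $P(X=1)=0$), which is exactly the extremal distribution Markov's inequality allows.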