I have a random variable $X$ that can take 3 values, $\left \{ 0,1,2 \right \}$, and $E[X]=1$.
If I use Markov's inequality I get that $P(X\geq 2)\leq \frac{1}{2}$.
Now, if I assign probability $\frac{1}{2}$ to $2$ and the remaining probability $\frac{1}{2}$ to $1$, I get an expected value $E[X]=1.5$, which differs from the initial expected value. Why doesn't Markov's inequality work? Where am I wrong?
"If i use the Markov Inequality i get that $P(X≥2)≤\frac{1}{2}$"
This already uses $\mathbb{E}[X]=1$
"Now, if i assign $\frac{1}{2}$ probability to $2$, and the remaining $\frac{1}{2}$ probability to $1$, i get an expected value $\mathbb{E}[X]=1.5$, that is different from the initial expected value"
In this you are changing the expectation, hence the bound above when using Markov's inequality will change
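A quick numerical check of both cases makes this concrete. The sketch below (in Python, using exact fractions to avoid rounding) compares the actual tail probability $P(X\geq 2)$ with the Markov bound $\mathbb{E}[X]/2$ for the distribution from the question and for the one that keeps $\mathbb{E}[X]=1$:

```python
from fractions import Fraction as F

def expect(dist):
    # E[X] = sum over x of x * P(X = x)
    return sum(x * p for x, p in dist.items())

def tail(dist, a):
    # P(X >= a)
    return sum(p for x, p in dist.items() if x >= a)

def markov_bound(dist, a):
    # Markov's inequality for nonnegative X: P(X >= a) <= E[X] / a
    return expect(dist) / a

# Distribution from the question: P(X=1) = P(X=2) = 1/2, so E[X] = 3/2
modified = {1: F(1, 2), 2: F(1, 2)}
# Distribution that keeps E[X] = 1: P(X=0) = P(X=2) = 1/2
original = {0: F(1, 2), 2: F(1, 2)}

for name, dist in [("modified", modified), ("original", original)]:
    print(name, expect(dist), tail(dist, 2), markov_bound(dist, 2))
# modified: E[X] = 3/2, P(X>=2) = 1/2 <= bound 3/4
# original: E[X] = 1,   P(X>=2) = 1/2 <= bound 1/2 (tight)
```

In both cases the tail probability stays below (or equal to) the bound computed from that distribution's own expectation, which is all Markov's inequality claims.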