Let $X$ be a random variable. Then $\operatorname{var}(X) =0$ implies $p( X = E (X)) = 1.$
Actually, I need both directions. I was able to show that $p( X = E (X)) = 1$ implies $\operatorname{var}(X) = 0$, so all I need is the other direction.
I think we have to use the definition of the variance and set $\operatorname{var}(X) = 0.$ But why does that force $p(X = E(X)) = 1$?
Suppose for the sake of contradiction that $P(X \ne E[X]) > 0$. Since $\{X \ne E[X]\} = \bigcup_{n \ge 1} \{|X - E[X]| > 1/n\}$, continuity of measure gives some $\epsilon > 0$ such that $$P(|X - E[X]| > \epsilon) > 0.$$ But then $$\text{Var}(X) = E[(X - E[X])^2] \ge E[(X - E[X])^2 \mathbf{1}_{|X - E[X]| > \epsilon}] \ge \epsilon^2 P(|X - E[X]| > \epsilon) > 0,$$ a contradiction.
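As a quick numerical sanity check of the chain of inequalities above, here is a short sketch for $X \sim \text{Uniform}(0,1)$ (the choice of distribution and of $\epsilon$ are mine, purely for illustration):

```python
# X ~ Uniform(0, 1): E[X] = 1/2, Var(X) = 1/12.
mu = 0.5
var = 1.0 / 12.0

eps = 0.25
# P(|X - mu| > eps) = P(X < 0.25) + P(X > 0.75) = 0.5, computed exactly.
tail = 0.5

# The bound Var(X) >= eps^2 * P(|X - mu| > eps) from the proof:
assert var >= eps**2 * tail > 0
```

Here the right-hand side is $0.25^2 \cdot 0.5 = 0.03125$, comfortably below $\operatorname{Var}(X) = 1/12 \approx 0.0833$, and strictly positive because the tail probability is.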
If the above is hard to understand, the case of a discrete random variable $X$ is helpful for intuition. If $P(X \ne E[X]) > 0$, then there is some value $c \ne E[X]$ such that $P(X=c) > 0$. Then $$\text{Var}(X) = E[(X - E[X])^2] \ge (c - E[X])^2 P(X=c) > 0.$$ This argument does not carry over verbatim to continuous random variables, though, since every individual value may have probability zero.
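The discrete bound can also be checked concretely. A minimal sketch, using a two-point distribution of my own choosing (not from the argument above): take $P(X=0)=0.75$ and $P(X=4)=0.25$, so $E[X]=1$ and $c=4$ is a value with positive probability that differs from the mean.

```python
# Two-point discrete distribution: P(X=0) = 0.75, P(X=4) = 0.25.
values = [0.0, 4.0]
probs = [0.75, 0.25]

mean = sum(v * p for v, p in zip(values, probs))               # E[X] = 1.0
var = sum((v - mean) ** 2 * p for v, p in zip(values, probs))  # Var(X) = 3.0

# The bound Var(X) >= (c - E[X])^2 * P(X = c) with c = 4:
c, p_c = 4.0, 0.25
bound = (c - mean) ** 2 * p_c  # (4 - 1)^2 * 0.25 = 2.25

assert var >= bound > 0
```

Dropping every term of the sum except the one at $c$ is exactly the step $E[(X-E[X])^2] \ge (c-E[X])^2 P(X=c)$, which is why the bound holds with room to spare here ($3.0 \ge 2.25$).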