I was reading the slides of this lecture: https://economics.mit.edu/files/4622 when I saw (page 7) this inequality: $var(X) \geq (0-\Bbb E[X])^2\Bbb P(X=0)$, but I don't understand how to prove it.
The problem: Let $I_i$ for $1\leq i \leq n$ be $n$ random variables, each following the same Bernoulli distribution (with parameter $q$), but not necessarily independent.
Let $X = \sum_{i=1}^n I_i$. Show that $var(X) \geq (0 - \Bbb E[X])^2\Bbb P(X=0)$.
What I've tried: I thought about the Markov/Bienaymé-Chebyshev inequalities, but I cannot obtain $P(X=0)$. I wonder why there is "$(0 - \Bbb E[X])^2$"...
Note that $\displaystyle X = \sum_{i=1}^n I_i$ is a discrete random variable with support $\{0, 1, \ldots, n\}$. Therefore,
$$Var[X] = E[(X - E[X])^2]=\sum_{x=0}^n (x - E[X])^2\Pr\{X =x\} \geq (0-E[X])^2\Pr\{X = 0\}$$
As all the summands are non-negative, the last inequality is obtained by dropping every term except the one with $x = 0$.
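As a quick numerical sanity check, here is a sketch in Python with a hypothetical joint pmf for $n = 2$ dependent indicators, each marginally Bernoulli with $q = 0.4$ (the pmf values are my own illustrative choice, not from the slides); it computes $Var(X)$, $E[X]$, and $P(X=0)$ exactly and verifies the bound:

```python
# Sanity check of Var(X) >= (0 - E[X])^2 * P(X = 0) for a small example.
# Hypothetical joint pmf for two dependent indicators I1, I2:
# pmf[(i1, i2)] = P(I1 = i1, I2 = i2). Both marginals are Bernoulli(0.4),
# but P(I1=0, I2=0) = 0.5 != 0.6 * 0.6, so they are not independent.
pmf = {(0, 0): 0.5, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.3}

# X = I1 + I2; compute its first two moments from the joint pmf.
e_x = sum((i1 + i2) * p for (i1, i2), p in pmf.items())        # E[X]
e_x2 = sum((i1 + i2) ** 2 * p for (i1, i2), p in pmf.items())  # E[X^2]
var_x = e_x2 - e_x ** 2                                        # Var(X)
p_zero = pmf[(0, 0)]                                           # P(X = 0)

print("Var(X) =", var_x)                    # 0.76
print("(E[X])^2 * P(X=0) =", e_x ** 2 * p_zero)  # 0.64 * 0.5 = 0.32
print("bound holds:", var_x >= e_x ** 2 * p_zero)
```

Here $Var(X) = 0.76 \geq 0.32 = (E[X])^2\,P(X=0)$, consistent with the inequality; the slack is exactly the dropped terms $\sum_{x\geq 1}(x - E[X])^2\Pr\{X=x\}$.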