Let $X$ be a random variable that takes values in $\{0, 1\}$. Then I have
$$ \mathbb{V}(X) = \mathbb{E}\big((X - \mathbb{E}(X))^2\big) = \frac{1}{2}\mathbb{E}(|X - \mathbb{E}(X)|). $$
My question is: why does the last equality hold?
In this scenario, I can write the variance as a function of the expectation:
$$ \mathbb{V}(X) = \mathbb{E}(X^2) - \mathbb{E}^2(X) = \mathbb{E}(X) - \mathbb{E}^2(X) = \mathbb{E}(X)(1 - \mathbb{E}(X)), $$
since $X$ takes values in $\{0, 1\}$.
However, this strategy does not help me that much with my problem. Any suggestions?
You can compute it directly. Let $\mathbb{P}(X=1)=p$. Then $E[X]=p$. Note that $X^2=X$ since $X$ takes values in $\{0,1\}$, so $E[X^2]=p$ as well, and hence $V(X)=E[X^2]-E^2[X]=p-p^2=p(1-p)$.
We can also find the distribution of $|X-E[X]|$. It equals $1-p$ when $X=1$ and $p$ when $X=0$. So $\mathbb{P}(|X-E[X]|=1-p)=\mathbb{P}(X=1)=p$ and $\mathbb{P}(|X-E[X]|=p)=1-p$. Hence $E[|X-E[X]|]=(1-p)\cdot p+p\cdot(1-p)=2p(1-p)=2V(X)$.
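As a quick sanity check, the two exact formulas above can be compared numerically for a few values of $p$. This is just a sketch; the helper name `bernoulli_stats` is my own, not from the answer:

```python
def bernoulli_stats(p):
    """Exact variance and mean absolute deviation of X ~ Bernoulli(p)."""
    var = p * (1 - p)                 # V(X) = E[X^2] - E[X]^2 = p - p^2
    # |X - p| equals 1-p with probability p, and p with probability 1-p,
    # so E|X - E[X]| = p(1-p) + (1-p)p = 2p(1-p).
    mad = p * (1 - p) + (1 - p) * p
    return var, mad

for p in (0.1, 0.3, 0.5, 0.9):
    var, mad = bernoulli_stats(p)
    # the identity V(X) = (1/2) E|X - E[X]| holds exactly
    assert abs(var - mad / 2) < 1e-12
```

Note that this identity relies on $X$ being $\{0,1\}$-valued; for a general random variable the variance and the mean absolute deviation are not proportional.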