Suppose we have two random variables: $X$ is a continuous r.v.; $Y$ is a discrete r.v. taking values $0$ and $1$. Is the following expression true?
$E\big[(E[X|Y])^{2}\big] = (E[X|Y=1])^{2}\, P(Y=1) + (E[X|Y=0])^{2}\, P(Y=0)$ ?
Recall that the conditional expectation is a function of the conditioning variable: $E[X|Y]=g(Y)$
Further, the expectation of any function of a random variable, in terms of the density function, is $E[h(Y)] = \int h(y) \, f_Y(y) \, dy$ or, for a discrete variable, $E[h(Y)] = \sum_y h(y) P(Y=y)$. (Note that taking $h(y)=y$ recovers the original definition of $E[Y]$ as a particular case.)
In your case (Bernoulli variable) $E[h(Y)] = h(0) \, P(Y=0) + h(1) \, P(Y=1)$.
Just plug $h(Y)= g(Y)^2 = E[X|Y]^2$ and you get your formula.
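A quick Monte Carlo sanity check of the resulting formula; the model below (Bernoulli $Y$ with $p=0.3$, $X\mid Y=y \sim N(2y,1)$) is an assumed toy example, not something from the question:

```python
import numpy as np

# Monte Carlo check of
#   E[(E[X|Y])^2] = (E[X|Y=0])^2 P(Y=0) + (E[X|Y=1])^2 P(Y=1)
# under an assumed toy model: Y ~ Bernoulli(p), X|Y=y ~ N(2y, 1).
rng = np.random.default_rng(0)
p = 0.3
n = 1_000_000

y = rng.random(n) < p                   # Y = 1 with probability p
x = rng.normal(loc=2.0 * y, scale=1.0)  # X | Y=y ~ N(2y, 1)

# g(Y) = E[X|Y]: estimate the two conditional means from the sample
m0 = x[~y].mean()                       # ~ E[X|Y=0] = 0
m1 = x[y].mean()                        # ~ E[X|Y=1] = 2
g = np.where(y, m1, m0)                 # the random variable E[X|Y]

lhs = (g ** 2).mean()                   # E[(E[X|Y])^2]
rhs = m0 ** 2 * (1 - p) + m1 ** 2 * p   # right-hand side of the formula
print(lhs, rhs)                         # agree up to Monte Carlo error
```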
First: we must suppose here that $X$ and $Y$ are defined on the same probability space; call it $(\Omega,\mathcal{F},P)$. Also, I'm going to switch to using function/integral notation for most things, because it makes conditional expectations a little more clear.
Note that $\DeclareMathOperator{\E}{\mathbb{E}}\E[X\mid Y]$ is defined as a $\sigma(Y)$-measurable random variable $\E[X\mid Y]:\Omega\rightarrow\mathbb{R}$ (where $\sigma(Y)\subseteq\mathcal{F}$ is the $\sigma$-algebra generated by $Y$) such that $$\tag{1} \forall A\in\sigma(Y),\qquad\int_A\E[X\mid Y](\omega)\,dP(\omega)=\int_A X(\omega)\,dP(\omega). $$
Now, since $Y$ takes only values $0$ and $1$, the events in $\sigma(Y)$ are pretty boring: it contains only the events $\emptyset$, $A_0:=\{\omega\in\Omega\mid Y(\omega)=0\}$, $A_1:=\{\omega\in\Omega\mid Y(\omega)=1\}$, and $A_0\cup A_1=\{\omega\in\Omega\mid Y(\omega)\in\{0,1\}\}$. (Okay, it may contain other events -- but these are the only ones that can have non-zero probability, $\emptyset$ excluded.)
Now, part of the definition of conditional expectation is that $\E[X\mid Y]$ is defined to be ANY $\sigma(Y)$-measurable function satisfying $(1)$; it turns out that any two such functions agree almost surely, so they are all equivalent after the fact. So, if we can find any such function, we're good.
Define $\E[X\mid Y=0]$ and $\E[X\mid Y=1]$ as you normally would (these are integrals, not random variables/functions). Then the claim is that we can write $$\tag{2} \E[X\mid Y]=\E[X\mid Y=0]\cdot 1_{A_0}+\E[X\mid Y=1]\cdot 1_{A_1}. $$ If you can prove $(2)$, then you are done, as then $$\tag{3} \begin{align*} \E[X\mid Y]^2=&(\E[X\mid Y=0]\cdot 1_{A_0}+\E[X\mid Y=1]\cdot 1_{A_1})^2\\ =&(\E[X\mid Y=0])^2\cdot 1_{A_0}^2+2\E[X\mid Y=0]\E[X\mid Y=1]1_{A_0}1_{A_1}\\ &+(\E[X\mid Y=1])^2\cdot 1_{A_1}^2\\ =&(\E[X\mid Y=0])^2\cdot 1_{A_0}+(\E[X\mid Y=1])^2\cdot 1_{A_1}, \end{align*} $$ where we have used the fact that $A_0$ and $A_1$ are disjoint to claim $1_{A_0}\cdot 1_{A_1}=1_{A_0\cap A_1}=0$, and the fact that $1_A^2=1_A$ for any event $A$. Taking expectations in $(3)$ would prove exactly what you want.
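The indicator algebra used in $(3)$ can be sanity-checked numerically. A small sketch on a toy Bernoulli sample, with arbitrary stand-in constants for the two conditional expectations:

```python
import numpy as np

# Check the indicator identities used in (3):
#   1_{A0} * 1_{A1} = 0  (A0 and A1 are disjoint)  and  1_A^2 = 1_A.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=10_000)    # toy sample of Y in {0, 1}

ind0 = (y == 0).astype(float)          # 1_{A0}
ind1 = (y == 1).astype(float)          # 1_{A1}

assert np.all(ind0 * ind1 == 0)        # disjointness: the cross term vanishes
assert np.all(ind0 ** 2 == ind0)       # idempotence of indicators

# Hence (c0*1_{A0} + c1*1_{A1})^2 = c0^2*1_{A0} + c1^2*1_{A1} pointwise:
c0, c1 = -1.7, 2.4                     # stand-ins for E[X|Y=0], E[X|Y=1]
lhs = (c0 * ind0 + c1 * ind1) ** 2
rhs = c0 ** 2 * ind0 + c1 ** 2 * ind1
print(np.allclose(lhs, rhs))
```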
So, it remains to prove $(2)$. To do so, check that the function on the right side of $(2)$ satisfies property $(1)$. Let me know if you get stuck.
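As a hint toward that check, here is a numerical version of property $(1)$ for the candidate in $(2)$: for each $A\in\{A_0, A_1, A_0\cup A_1\}$, the integrals of the candidate and of $X$ over $A$ coincide. The model is the same assumed toy example as before, not part of the question:

```python
import numpy as np

# Numerically verify that the candidate in (2) satisfies property (1):
#   for A in {A0, A1, A0 ∪ A1},  E[candidate * 1_A] = E[X * 1_A].
# Assumed toy model: Y ~ Bernoulli(0.3), X|Y=y ~ N(2y, 1).
rng = np.random.default_rng(2)
p, n = 0.3, 1_000_000
y = rng.random(n) < p
x = rng.normal(2.0 * y, 1.0)

m0, m1 = x[~y].mean(), x[y].mean()     # E[X|Y=0], E[X|Y=1]
cand = m0 * (~y) + m1 * y              # right-hand side of (2)

# Integrals over A0, A1, and A0 ∪ A1 (as sample means)
pairs = [((cand * A).mean(), (x * A).mean())
         for A in (~y, y, np.ones(n, bool))]
for lhs, rhs in pairs:
    print(lhs, rhs)                    # equal up to floating-point error
```

The agreement over $A_0$ and $A_1$ is exact (up to rounding) by construction of the sample means, which mirrors why $(2)$ holds.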