I'm stuck with this problem: Let $\mathscr G$ and $\mathscr F$ be two sub-$\sigma$-algebras of $\mathscr F_0$ such that $\mathscr G \subset \mathscr F$. If $X$ is an $\mathscr F_0$-measurable random variable with $E(X^2) < \infty$ (square-integrability is needed for the second moments below to be finite), then
$E[(X - E(X|\mathscr G))^2] = E[(X - E(X|\mathscr F))^2] + E[(E(X|\mathscr F) - E(X|\mathscr G))^2]$
I have tried many things, like expanding the squared terms on both sides to apply the tower property, but nothing has worked.
Any help would be really appreciated.
It really is a matter of expanding and using the tower law. I'd recommend giving it another shot; if not, here's how it goes.
The left side, when expanded, is $$E[X^2 - 2X E[X|G] + E[X|G]^2].$$ The right side is $$E[X^2 - 2 X E[X|F] + E[X|F]^2] + E\left[E[X|F]^2 - 2E[X|F]E[X|G] + E[X|G]^2 \right].$$
Cancelling the common terms $E[X^2]$ and $E[E[X|G]^2]$ and dividing by $2$, the equality you're looking for is equivalent to $$E[-XE[X|G]] = E\left[ E[X|F]^2 - X E[X|F] - E[X|F]E[X|G]\right].$$
Note that, since $\mathscr G \subset \mathscr F$ makes $E[X|G]$ an $\mathscr F$-measurable random variable that can be pulled out of the inner conditional expectation, $$E[-XE[X|G]] = E[E[-XE[X|G] | F] ] = E[E[X|G] E[-X| F] ] = -E[E[X|G]E[X|F]].$$
Similarly, by the tower law we have $E[X E[X|F]] = E[E[X|F]^2]$, so the right-hand side above also reduces to $-E[E[X|F]E[X|G]]$. This completes the proof.
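If it helps build intuition, here's a quick numerical spot-check of the identity on a small finite probability space, where conditional expectation given a partition-generated $\sigma$-algebra is just block-wise averaging. (The specific space, partitions, and values of $X$ are arbitrary choices for illustration.)

```python
import random

# A finite probability space: 8 equally likely outcomes.
omega = list(range(8))
random.seed(0)
X = {w: random.uniform(-1, 1) for w in omega}

# G is generated by a coarse partition, F by a finer one, so G ⊂ F.
part_G = [{0, 1, 2, 3}, {4, 5, 6, 7}]
part_F = [{0, 1}, {2, 3}, {4, 5}, {6, 7}]

def cond_exp(Y, partition):
    """Conditional expectation given the sigma-algebra generated by a
    partition: on each block, replace Y by its average over that block
    (uniform measure)."""
    out = {}
    for block in partition:
        avg = sum(Y[w] for w in block) / len(block)
        for w in block:
            out[w] = avg
    return out

def E(Y):
    """Plain expectation under the uniform measure."""
    return sum(Y[w] for w in omega) / len(omega)

EG = cond_exp(X, part_G)  # E[X | G]
EF = cond_exp(X, part_F)  # E[X | F]

lhs = E({w: (X[w] - EG[w]) ** 2 for w in omega})
rhs = E({w: (X[w] - EF[w]) ** 2 for w in omega}) \
    + E({w: (EF[w] - EG[w]) ** 2 for w in omega})

print(abs(lhs - rhs) < 1e-12)  # True
```

The identity here is exactly the Pythagorean theorem in $L^2$: $E[X|\mathscr F] - E[X|\mathscr G]$ is $\mathscr F$-measurable, hence orthogonal to the residual $X - E[X|\mathscr F]$.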