Suppose we have a Markov chain $X - U - Y$, where $X$ and $Y$ are random variables with Gaussian marginal distributions $X\sim N(\mu_X, \sigma_X^2)$ and $Y\sim N(\mu_Y, \sigma_Y^2)$. Define the MMSE estimators $\hat X = E[X\mid U]$ and $\hat Y = E[Y\mid U]$. Is it then possible to decompose the expected squared difference as follows: $$E[(X - Y)^2] = E[(\hat{X} - X)^2] + E[(\hat Y - Y)^2] + E[(\hat X - \hat Y)^2]?$$ I appreciate any help, comments, or suggestions. Thanks.
The first steps that come to my mind are as follows: \begin{equation} \begin{aligned} \text{LHS} &= E[(X - Y)^2] = E[(X - \hat X + \hat X - \hat Y + \hat Y - Y)^2] \\ &= \text{RHS} + 2\big(E[(X - \hat X)(\hat X - \hat Y)] + E[(X - \hat X)(\hat Y - Y)] \\ &\qquad + E[(\hat X - \hat Y)(\hat Y - Y)]\big) \\ &= \text{RHS} + 2\big(E[(X - \hat X)(- \hat Y)] + E[\hat X (\hat Y - Y)]\big) \end{aligned} \end{equation} where the last equality follows from the orthogonality principle of MMSE estimation and from the conditional independence of $X$ and $Y$ given $U$ (the Markov property), which gives $E[(X - \hat X)(Y - \hat Y)\mid U] = 0$. I am unsure how to proceed to prove that the terms in the parentheses are zero.
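As a sanity check, a quick Monte Carlo simulation seems consistent with the identity. The chain below is a hypothetical linear-Gaussian example chosen so that $X$ and $Y$ are conditionally independent given $U$ and the conditional means (the MMSE estimators) are available in closed form; all the coefficients are illustrative assumptions, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical linear-Gaussian Markov chain X - U - Y:
# X and Y are conditionally independent given U.
U = rng.normal(size=n)
X = 1.0 + 0.7 * U + 0.5 * rng.normal(size=n)   # X | U ~ N(1 + 0.7 U, 0.25)
Y = -0.5 + 1.3 * U + 0.8 * rng.normal(size=n)  # Y | U ~ N(-0.5 + 1.3 U, 0.64)

# MMSE estimators are the conditional means, known exactly here.
X_hat = 1.0 + 0.7 * U
Y_hat = -0.5 + 1.3 * U

lhs = np.mean((X - Y) ** 2)
rhs = (np.mean((X_hat - X) ** 2)
       + np.mean((Y_hat - Y) ** 2)
       + np.mean((X_hat - Y_hat) ** 2))
print(lhs, rhs)  # the two values agree up to Monte Carlo error
```

In this example both sides come out close to the exact value $3.5$, which at least suggests the decomposition should hold in general.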