Suppose I have an unknown vector $x$ drawn from the following bivariate normal distribution:
$$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \sim N\left(0, \begin{bmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{bmatrix}\right)$$
Further suppose that I am given a value $s$ such that:
$s = x_2 + \epsilon$,
where $\epsilon \sim N(0, v)$ is independent of $x$.
I need to calculate $E[x_1 - x_2 \mid s]$.
Here is my answer, but I don't know if it's correct:
From the standard Bayesian update for normals, I have that $E[x_2 \mid s] = ws$, where $w = \frac{\sigma_2^2}{\sigma_2^2 + v}$. (1)
Furthermore, I also know that $E[x_1 \mid x_2] = \frac{\rho\sigma_1}{\sigma_2}x_2$. (2)
Do equations (1) and (2) imply that:
$E[x_1 - x_2 \mid s] = w\left(\frac{\rho\sigma_1}{\sigma_2} - 1\right)s$?
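One way to sanity-check the proposed formula is a quick Monte Carlo sketch. The parameter values below are hypothetical, and it assumes $\epsilon$ is independent of $x$; since everything is jointly Gaussian, $E[x_1 - x_2 \mid s]$ is linear in $s$, so the simulated regression slope of $x_1 - x_2$ on $s$ can be compared against $w(\rho\sigma_1/\sigma_2 - 1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
sigma1, sigma2, rho, v = 1.0, 2.0, 0.5, 0.5  # hypothetical parameter values

# Draw x = (x_1, x_2) from the bivariate normal and form s = x_2 + eps.
cov = np.array([[sigma1**2, rho * sigma1 * sigma2],
                [rho * sigma1 * sigma2, sigma2**2]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=n)
eps = rng.normal(0.0, np.sqrt(v), size=n)  # independent noise
s = x[:, 1] + eps

# All variables are zero-mean and jointly Gaussian, so the conditional
# expectation is linear in s; estimate the slope by least squares
# through the origin.
slope_mc = np.sum(s * (x[:, 0] - x[:, 1])) / np.sum(s**2)

# Proposed closed form: w * (rho*sigma1/sigma2 - 1).
w = sigma2**2 / (sigma2**2 + v)
slope_formula = w * (rho * sigma1 / sigma2 - 1)
print(slope_mc, slope_formula)
```

If the two printed slopes agree to within Monte Carlo error, that supports the formula for this parameter choice (though of course it is not a proof).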