Let $\mathbf{x}$ be an $N$-dimensional random vector distributed as $\mathbf{x} \sim \mathcal{N}(\mathbf{0}, \mathbf{\Sigma})$. Let
$$\hat{\mathbf{x}} = \sqrt{1-\eta^{2}} \mathbf{x} + \eta \mathbf{z}$$
be a noisy estimate of $\mathbf{x}$, with $\eta \in [0,1]$ and where the noise vector $\mathbf{z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I})$ is independent of $\mathbf{x}$.
From the above equation, the conditional distribution of $\hat{\mathbf{x}}$ given $\mathbf{x}$ is $\hat{\mathbf{x}} \mid \mathbf{x} \sim \mathcal{N}(\sqrt{1-\eta^{2}}\, \mathbf{x}, \eta^{2} \mathbf{I})$.
But how can I write the a posteriori distribution of the true $\mathbf{x}$ given its noisy estimate $\hat{\mathbf{x}}$?
- Tentative solution: When $\eta^{2}=0$, we should have $\mathbf{x} | \hat{\mathbf{x}} \sim \mathcal{N}(\hat{\mathbf{x}}, \mathbf{0})$, whereas when $\eta^{2}=1$, we should have $\mathbf{x} | \hat{\mathbf{x}} \sim \mathcal{N}(\mathbf{0}, \mathbf{\Sigma})$.
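As a quick numerical sanity check of the setup (a sketch, not from the original question; $\mathbf{\Sigma}$ below is an arbitrary illustrative SPD matrix), one can simulate the forward equation and verify that the marginal covariance of $\hat{\mathbf{x}}$ is $(1-\eta^2)\mathbf{\Sigma} + \eta^2\mathbf{I}$:

```python
# Simulate x_hat = sqrt(1 - eta^2) x + eta z and compare the empirical
# covariance of x_hat with the theoretical (1 - eta^2) Sigma + eta^2 I.
import numpy as np

rng = np.random.default_rng(0)
N, eta, n_samples = 3, 0.5, 200_000

A = rng.standard_normal((N, N))
Sigma = A @ A.T + N * np.eye(N)          # arbitrary SPD covariance (illustrative)

x = rng.multivariate_normal(np.zeros(N), Sigma, size=n_samples)
z = rng.standard_normal((n_samples, N))
x_hat = np.sqrt(1 - eta**2) * x + eta * z

emp_cov = np.cov(x_hat.T)                # empirical covariance of x_hat
theory_cov = (1 - eta**2) * Sigma + eta**2 * np.eye(N)
print(np.max(np.abs(emp_cov - theory_cov)))   # small sampling error
```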
It is straightforward to write out the joint distribution and convince yourself it is indeed multivariate normal:
$$
\begin{bmatrix} \mathbf{z} \\ \mathbf{x} \\ \hat{\mathbf{x}} \end{bmatrix}
\sim
\mathcal{N}\left(\mathbf{0},\,
\begin{bmatrix}
\mathbf{I} & \mathbf{0} & \eta\mathbf{I} \\
\mathbf{0} & \mathbf{\Sigma} & \sqrt{1-\eta^2}\,\mathbf{\Sigma} \\
\eta\mathbf{I} & \sqrt{1-\eta^2}\,\mathbf{\Sigma} & (1-\eta^2)\mathbf{\Sigma} + \eta^2\mathbf{I}
\end{bmatrix}
\right).
$$
From this it is easy to read off the joint distribution of $(\mathbf{x}, \hat{\mathbf{x}})$ after marginalising out $\mathbf{z}$, and then, using the standard results for conditionals of a multivariate normal,
$$
\mathbf{x} \mid \hat{\mathbf{x}} \sim \mathcal{N}\left(\mathbf{\mu}_{x|\hat{x}}, \mathbf{\Sigma}_{x|\hat{x}}\right),
$$
where
$$
\begin{align*}
\mathbf{\mu}_{x|\hat{x}} &= \sqrt{1-\eta^2}\,\mathbf{\Sigma}\left[(1-\eta^2)\mathbf{\Sigma} + \eta^2\mathbf{I}\right]^{-1}\hat{\mathbf{x}} \\
&= \left(\mathbf{\Sigma}^{-1} + \frac{1-\eta^2}{\eta^2}\mathbf{I}\right)^{-1}\frac{\sqrt{1-\eta^2}}{\eta^2}\,\hat{\mathbf{x}}, \qquad \eta \neq 0,
\end{align*}
$$
and
$$
\begin{align*}
\mathbf{\Sigma}_{x|\hat{x}} &= \mathbf{\Sigma} - (1-\eta^2)\,\mathbf{\Sigma}\left[(1-\eta^2)\mathbf{\Sigma} + \eta^2\mathbf{I}\right]^{-1}\mathbf{\Sigma} \\
&= \left(\mathbf{\Sigma}^{-1} + \frac{1-\eta^2}{\eta^2}\mathbf{I}\right)^{-1}, \qquad \eta \neq 0,
\end{align*}
$$
where the second forms follow from the Woodbury matrix identity. In particular, when $\eta = 0$ the first forms give
$$
\begin{align*}
\mathbf{\mu}_{x|\hat{x}} &= \mathbf{\Sigma}\mathbf{\Sigma}^{-1}\hat{\mathbf{x}} = \hat{\mathbf{x}}, \\
\mathbf{\Sigma}_{x|\hat{x}} &= \mathbf{\Sigma} - \mathbf{\Sigma}\mathbf{\Sigma}^{-1}\mathbf{\Sigma} = \mathbf{0},
\end{align*}
$$
as you would hope, and when $\eta = 1$ they reduce to $\mathbf{x} \mid \hat{\mathbf{x}} \sim \mathcal{N}(\mathbf{0}, \mathbf{\Sigma})$, matching the other limiting case anticipated in the question.
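To double-check the algebra, here is a minimal numerical sketch (not part of the original answer; $\mathbf{\Sigma}$ is an arbitrary illustrative SPD matrix) that confirms the Schur-complement forms and the precision-matrix forms of $\mathbf{\mu}_{x|\hat{x}}$ and $\mathbf{\Sigma}_{x|\hat{x}}$ agree for $\eta \neq 0$:

```python
# Compare the two expressions for the posterior mean gain and covariance.
import numpy as np

rng = np.random.default_rng(1)
N, eta = 4, 0.7
A = rng.standard_normal((N, N))
Sigma = A @ A.T + N * np.eye(N)          # arbitrary SPD covariance (illustrative)
I = np.eye(N)
s = np.sqrt(1 - eta**2)

M = np.linalg.inv(s**2 * Sigma + eta**2 * I)   # [(1-eta^2) Sigma + eta^2 I]^{-1}

# Schur-complement forms (valid for all eta)
gain1 = s * Sigma @ M                          # matrix multiplying x_hat in the mean
cov1 = Sigma - s**2 * Sigma @ M @ Sigma

# Precision (information) forms, valid for eta != 0
P = np.linalg.inv(np.linalg.inv(Sigma) + (s / eta)**2 * I)
gain2 = P * (s / eta**2)
cov2 = P

print(np.max(np.abs(gain1 - gain2)), np.max(np.abs(cov1 - cov2)))  # both tiny
```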