Derivation of MMSE from an estimator of two Gaussians


Suppose $X$ and $N$ are independent Gaussian random variables with different variances, and $N$ has zero mean. Let $Y = X + N$. I am trying to find the minimum mean square error (MMSE) estimator of $X$ given $Y$. I set the estimator to be the expected value of $X$ given $Y = y$, and working out the integral I found the estimator to be $p^2 Y$, where $p$ is the ratio of the standard deviation of $X$ to that of $Y$ (not sure if this is even correct). Now I am trying to find its MMSE. The book says the MMSE is the variance of $X$ multiplied by $1 - p^2$, but I couldn't get rid of the expectation of $X$ when I plugged in my estimator. Could someone give a derivation of this answer?


1 Answer


You can use the fact that the linear MMSE estimator of $X$ given $Y$ is $E[X] + cov(X,Y)\, cov(Y,Y)^{-1} (Y - E[Y])$, and that for jointly Gaussian $X, Y$ the linear MMSE estimator and the MMSE estimator coincide (*). In this case $X$ and $Y$ are jointly Gaussian, since any linear combination $aX + bY = (a+b)X + bN$ is a sum of independent Gaussians and hence Gaussian.

In this case it is easy to calculate $E[Y] = E[X] + E[N] = E[X]$ (since $N$ has zero mean), $cov(Y,Y) = var(Y) = var(X+N) = var(X) + var(N)$ by independence, and $cov(X,Y) = cov(X, X+N) = cov(X,X) + cov(X,N) = cov(X,X) + 0 = var(X)$.
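Plugging these moments into the formula above gives (just a worked step):
$$\hat X = E[X \mid Y] = E[X] + \frac{var(X)}{var(X) + var(N)}\,\bigl(Y - E[X]\bigr),$$
so when $E[X] = 0$ this reduces to $\hat X = \frac{var(X)}{var(Y)}\, Y = p^2 Y$ with $p = \sigma_X / \sigma_Y$, matching the estimator in the question.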

To calculate the MSE, you can use the fact (shown in Sec. 3.3 of (*)) that the linear MMSE estimator's MSE is $cov(X,X) - cov(X,Y)\, cov(Y,Y)^{-1}\, cov(Y,X)$. It is easy to calculate $cov(Y,X) = cov(X+N,X) = cov(X,X) + cov(N,X) = cov(X,X) = var(X)$ by independence.
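Substituting the covariances computed above (again just a worked step):
$$\text{MMSE} = var(X) - \frac{var(X)^2}{var(X) + var(N)} = var(X)\left(1 - \frac{var(X)}{var(Y)}\right) = var(X)\,(1 - p^2).$$
This is exactly the book's $var(X)(1-p^2)$. Note that $E[X]$ drops out: the error $X - \hat X = (X - E[X]) - \frac{var(X)}{var(Y)}(Y - E[Y])$ is a zero-mean quantity whose variance does not involve the means, which is why you should not see $E[X]$ in the final answer.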

(*) For a proof, see for example Random Processes for Engineers by B. Hajek, freely available online, Sections 3.3 and 3.4.
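If you want a quick numerical sanity check of both the estimator and the MMSE formula, here is a small Monte Carlo sketch (the specific values $E[X] = 3$, $var(X) = 2$, $var(N) = 0.5$ are just illustrative choices, not from the question or the reference):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (arbitrary choices for the check)
mu_x, var_x, var_n = 3.0, 2.0, 0.5
n = 1_000_000

# Simulate X ~ N(mu_x, var_x), N ~ N(0, var_n), and Y = X + N
x = rng.normal(mu_x, np.sqrt(var_x), n)
noise = rng.normal(0.0, np.sqrt(var_n), n)
y = x + noise

# Linear MMSE estimator: E[X] + cov(X,Y)/var(Y) * (Y - E[Y]), with E[Y] = E[X]
var_y = var_x + var_n
x_hat = mu_x + (var_x / var_y) * (y - mu_x)

# Empirical mean squared error vs. the closed form var(X) * (1 - p^2)
mse_empirical = np.mean((x - x_hat) ** 2)
p2 = var_x / var_y
mse_formula = var_x * (1 - p2)

print(mse_empirical, mse_formula)  # the two values should agree closely
```

With these parameters the closed form gives $2 \cdot (1 - 0.8) = 0.4$, and the empirical MSE should come out very close to that.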