Minimum Mean Square Error (MMSE) and Mutual Information (I)


Consider this setting:

$Y=X+N$

where $N$ is a standard Gaussian random variable and $X$ is another, arbitrarily distributed r.v. You can think of $X$ as a message being transmitted over an AWGN channel, the output of which is the r.v. $Y$. I am wondering if anybody can point me to some good resources on the connection between the MMSE, $\mathrm{MMSE} = E[(X - E[X|Y])^2]$, and the input-output mutual information, namely $I(X;Y) = E\left[\log \frac{p_{X,Y}(X,Y)}{p_X(X)\,p_Y(Y)}\right]$.

BEST ANSWER

Do the names Guo, Shamai and Verdu ring any bells? :)

Check out the following two papers by these authors on the I-MMSE relation. They are comprehensive and are probably just what the doctor ordered.

http://www.princeton.edu/~verdu/reprints/GuoShaVer.Apr2005.pdf

http://www.princeton.edu/~verdu/reprints/GuoWuShaVer2011.pdf
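The central result of the first paper (the I-MMSE relation) says that for $Y = \sqrt{\mathrm{snr}}\,X + N$ with $N \sim \mathcal{N}(0,1)$, the derivative of the mutual information with respect to the SNR equals half the MMSE. A minimal numerical sketch, using the standard Gaussian input $X \sim \mathcal{N}(0,1)$ as a test case (where both sides have well-known closed forms, $I(\mathrm{snr}) = \tfrac{1}{2}\ln(1+\mathrm{snr})$ and $\mathrm{mmse}(\mathrm{snr}) = 1/(1+\mathrm{snr})$); the function names here are just for illustration:

```python
import math

# I-MMSE relation (Guo, Shamai, Verdu 2005): for Y = sqrt(snr)*X + N, N ~ N(0,1),
#   d/d(snr) I(X; Y) = (1/2) * mmse(snr).
# We check it numerically for standard Gaussian X, where both sides are known
# in closed form.

def mutual_info(snr):
    """I(X;Y) in nats for X ~ N(0,1): (1/2) ln(1 + snr)."""
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    """MMSE of estimating X from Y for X ~ N(0,1): 1 / (1 + snr)."""
    return 1.0 / (1.0 + snr)

def check(snr, h=1e-6):
    """Compare a central-difference derivative of I(snr) with mmse(snr)/2."""
    dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
    return dI, 0.5 * mmse(snr)

if __name__ == "__main__":
    for snr in (0.5, 1.0, 4.0):
        dI, half_mmse = check(snr)
        print(f"snr={snr}: dI/dsnr = {dI:.6f}, mmse/2 = {half_mmse:.6f}")
```

For non-Gaussian inputs the two closed forms above no longer hold, but the derivative identity itself still does; that generality is exactly what makes the papers worth reading.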