Let $X$ be a random variable with a normal distribution, and let $Y=X$. What is the mutual information $I(X;Y)$?
I guessed that $h(Y|X)=0$, since once $X$ is known, $Y$ is completely determined, so $$I(X;Y)=h(Y)-h(Y|X)=h(Y)=\frac{1}{2}\log 2\pi e\sigma^2$$ nats.
But I was told I was wrong, and a numerical computation also shows that $$I(X;Y) \neq \frac{1}{2}\log 2\pi e\sigma^2.$$ Where is my mistake? Please help me out of this problem, thanks a lot! (Note that $X$ and $Y$ are both continuous.)
If $(X_1,X_2)$ is a Gaussian vector whose components both have variance $\sigma^2$, with covariance matrix $K$, then $$h(X_1,X_2)=\frac{1}{2}\log((2\pi e)^2|K|)=\frac{1}{2}\log((2\pi e)^2 \sigma^4(1-\rho^2)),$$ where $\rho$ is the correlation coefficient.
Now $$I(X_1;X_2)=h(X_1)+h(X_2)-h(X_1,X_2)=2h(X_1)-h(X_1,X_2)=-\frac{1}{2}\log(1-\rho^2).$$
When $X_1=X_2$, $\rho=1$, and hence $I(X;X)=\infty$. This is where your guess fails: for continuous variables, the conditional differential entropy $h(Y|X)$ is not $0$ when $Y$ is a deterministic function of $X$; it is $-\infty$, since differential entropy, unlike discrete entropy, is not bounded below.
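The divergence is easy to check numerically from the closed form above. A minimal sketch (the helper name `gaussian_mi` is my own) evaluating $-\frac{1}{2}\log(1-\rho^2)$ as $\rho \to 1$:

```python
import numpy as np

def gaussian_mi(rho):
    """Mutual information (in nats) of a bivariate Gaussian
    with correlation coefficient rho: -1/2 * log(1 - rho^2)."""
    return -0.5 * np.log(1.0 - rho**2)

# As rho approaches 1, the mutual information grows without bound.
for rho in [0.9, 0.99, 0.999, 0.9999]:
    print(f"rho = {rho}: I = {gaussian_mi(rho):.4f} nats")
```

Each extra nine in $\rho$ adds roughly a constant amount to $I$, consistent with the logarithmic blow-up as $\rho \to 1$.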