$X \to Y \to Z$ PGM with $X,Y,Z \sim \text{MVN}(0,\Sigma)$. What is the mutual information $I(X;Z)$? (Cover & Thomas 8.9)


Problem 8.9 of Cover and Thomas's *Elements of Information Theory* describes a 3-node network $X \rightarrow Y \rightarrow Z$ with a multivariate Gaussian distribution, and asks: what is the mutual information $I(X;Z)$? Supposing unit marginal variances for $X$, $Y$, and $Z$, the covariance matrix has the form

$$ \Sigma= \begin{pmatrix} 1 & \rho_{xy} & 0\\ \rho_{xy} & 1 & \rho_{yz}\\ 0 & \rho_{yz} & 1\\ \end{pmatrix} $$

We can integrate out $y$ from $f_{XYZ}(x,y,z)$ to obtain $f_{XZ}(x, z)$. From the definition of mutual information, it follows that
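Marginalizing a multivariate Gaussian amounts to deleting the corresponding rows and columns of $\Sigma$, so the $(X,Z)$ marginal covariance can be read off directly. A quick sketch, using the hypothetical values $\rho_{xy}=0.6$, $\rho_{yz}=0.5$ (any pair with $\rho_{xy}^2+\rho_{yz}^2<1$ keeps $\Sigma$ positive definite):

```python
import numpy as np

# Illustrative correlations (not from the problem statement).
rho_xy, rho_yz = 0.6, 0.5

Sigma = np.array([
    [1.0,    rho_xy, 0.0   ],
    [rho_xy, 1.0,    rho_yz],
    [0.0,    rho_yz, 1.0   ],
])

# Marginalizing out Y deletes Y's row and column, so the (X, Z)
# marginal covariance is the submatrix on indices [0, 2].
Sigma_xz = Sigma[np.ix_([0, 2], [0, 2])]
print(Sigma_xz)  # identity matrix: Cov(X, Z) is the (1,3) entry of Sigma, i.e. 0
```

So the off-diagonal entry of $\Sigma_{XZ}$ is exactly the $(1,3)$ entry of $\Sigma$, which was set to $0$.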

$$ I(X;Z)=\int_{-\infty}^{\infty} dx \int_{-\infty}^{\infty} dz ~~f_{XZ}(x,z) \log \frac{f_{XZ}(x, z)}{f_X(x)f_Z(z) } $$
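Rather than evaluating the double integral by hand, one can sanity-check it by Monte Carlo: sample from $\text{MVN}(0,\Sigma)$, estimate $\rho_{xz}$ empirically, and plug it into the Gaussian formula $I = -\tfrac12\log(1-\rho^2)$ (nats). A sketch with the same assumed values $\rho_{xy}=0.6$, $\rho_{yz}=0.5$:

```python
import numpy as np

rng = np.random.default_rng(0)
rho_xy, rho_yz = 0.6, 0.5  # illustrative values, not from the text
Sigma = np.array([[1.0, rho_xy, 0.0],
                  [rho_xy, 1.0, rho_yz],
                  [0.0, rho_yz, 1.0]])

# Draw samples from the trivariate Gaussian and keep the X and Z columns.
samples = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
x, z = samples[:, 0], samples[:, 2]

rho_xz = np.corrcoef(x, z)[0, 1]    # empirical correlation, close to 0
mi = -0.5 * np.log(1 - rho_xz**2)   # Gaussian mutual information in nats
print(rho_xz, mi)                   # both near 0
```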

But $f_{XZ}(x, z)$ is a bivariate normal, and after some algebra (completing the square, etc.) we find zero covariance between $X$ and $Z$. Since the mutual information of a bivariate Gaussian is $-\frac{1}{2}\log(1-\rho^2)$, the result $\rho_{xz}=0$ implies that there is no mutual information between $X$ and $Z$. And yet, if we fix $X=x$, surely $Z$ responds through $Y$, so there must be some way to send information. What is amiss?