Mutual Information Gaussian


Suppose $x$ has a multivariate Gaussian distribution $\mathcal{N}(\mu, \Sigma)$. How do you express the mutual information between two coordinates, that is, $\mathrm{I}(x_i; x_j)$, as a function of $\mu$ and $\Sigma$?


Best answer

Hint: $\begin{bmatrix} X_i \\ X_j \end{bmatrix} \sim \mathcal{N}\left(\begin{bmatrix} \mu_i \\ \mu_j \end{bmatrix}, \begin{bmatrix} \Sigma_{ii} & \Sigma_{ij} \\ \Sigma_{ji} & \Sigma_{jj} \end{bmatrix}\right)$.

You should know how to calculate the distribution of $X_i \mid X_j$ (this is another Gaussian, with a mean and variance you can calculate). Then, using the entropy of a Gaussian, you can find $I(X_i; X_j) = H(X_i) - H(X_i \mid X_j)$.
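Carrying the hint through (a sketch, not part of the original answer): $H(X_i) = \tfrac{1}{2}\ln(2\pi e\,\Sigma_{ii})$ and, since the conditional variance is $\Sigma_{ii} - \Sigma_{ij}^2/\Sigma_{jj}$, $H(X_i \mid X_j) = \tfrac{1}{2}\ln\!\big(2\pi e\,(\Sigma_{ii} - \Sigma_{ij}^2/\Sigma_{jj})\big)$. Subtracting gives $I(X_i; X_j) = -\tfrac{1}{2}\ln(1 - \rho^2)$ with $\rho = \Sigma_{ij}/\sqrt{\Sigma_{ii}\Sigma_{jj}}$; note the result does not depend on $\mu$ at all. A minimal numeric check in Python (the function names and the example $\Sigma$ are illustrative):

```python
import math

def gaussian_mi(Sigma, i, j):
    # Closed form: I(X_i; X_j) = -1/2 * ln(1 - rho^2),
    # where rho = Sigma_ij / sqrt(Sigma_ii * Sigma_jj).
    rho2 = Sigma[i][j] ** 2 / (Sigma[i][i] * Sigma[j][j])
    return -0.5 * math.log(1.0 - rho2)

def gaussian_entropy(var):
    # Differential entropy of a 1-D Gaussian with variance `var`:
    # H = 1/2 * ln(2 * pi * e * var)  (in nats).
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

# Example 2x2 covariance (any mean vector gives the same MI).
Sigma = [[2.0, 0.6],
         [0.6, 1.0]]

# Route from the hint: H(X_i) - H(X_i | X_j), where the conditional
# variance is Sigma_ii - Sigma_ij^2 / Sigma_jj.
cond_var = Sigma[0][0] - Sigma[0][1] ** 2 / Sigma[1][1]
mi_via_entropies = gaussian_entropy(Sigma[0][0]) - gaussian_entropy(cond_var)

print(mi_via_entropies, gaussian_mi(Sigma, 0, 1))  # the two routes agree
```

The entropy-difference route and the closed form are algebraically identical, so they agree to floating-point precision; with $\Sigma_{ij} = 0$ the formula gives $I = 0$, as independence of jointly Gaussian coordinates requires.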