Mutual information of continuous variables


I think I am misunderstanding the notion of mutual information of continuous variables. Could anyone help me clear up the following?

Let $X \sim N(0, \sigma^2)$ and $Y \sim N(0, \sigma^2)$ be Gaussian random variables. If $X$ and $Y$ are jointly Gaussian with correlation coefficient $\rho$, then their mutual information is (reference: https://en.wikipedia.org/wiki/Mutual_information)

\begin{equation} I(X; Y) = -\frac{1}{2} \log (1-\rho^2). \end{equation}
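For context, this formula follows from writing the mutual information as a difference of differential entropies and using the joint entropy of the bivariate Gaussian:

\begin{equation} I(X; Y) = H(X) + H(Y) - H(X, Y) = \frac{1}{2} \log (2 \pi e \sigma^2) + \frac{1}{2} \log (2 \pi e \sigma^2) - \frac{1}{2} \log \left( (2 \pi e)^2 \sigma^4 (1 - \rho^2) \right) = -\frac{1}{2} \log (1 - \rho^2). \end{equation}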

From this formula, $I(X; Y) \rightarrow \infty$ as $\rho \rightarrow 1$ (and for $X = Y$ we have $\rho = 1$). I then looked at the same case in another way.

I considered $Y = X$. In this case I would obtain $I(X; Y) = H(Y) - H(Y \mid X) = H(X)$, since $H(Y) = H(X)$ and (I assumed) $H(Y \mid X) = 0$.

For the Gaussian random variable $X$, the differential entropy is (reference: https://en.wikipedia.org/wiki/Differential_entropy): \begin{equation} H(X) = \frac{1}{2} \log ( 2 \pi e \sigma^2). \end{equation}

Thus, $I(X; Y) = H(X) = \frac{1}{2} \log ( 2 \pi e \sigma^2)$, which is finite.
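For example, with $\sigma^2 = 1$ and natural logarithms, this gives $I(X; Y) = \frac{1}{2} \log (2 \pi e) \approx 1.42$ nats, a finite number.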

Here is my question. I obtained two different results for $I(X; Y)$ when $X = Y$: one infinite, one finite. Where is the mistake in my understanding?

Thank you in advance.

Accepted answer:

Differential entropy can be negative, so the upper bound you derived on the mutual information does not hold. Indeed, if $X$ and $Y$ are the same random variable on a continuous domain, you would expect the mutual information between them to be infinite (and if they are the same Gaussian, that is indeed the case).
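Concretely, for jointly Gaussian $X$ and $Y$ with correlation $\rho$, the conditional distribution of $Y$ given $X$ is Gaussian with variance $\sigma^2 (1 - \rho^2)$, so

\begin{equation} H(Y \mid X) = \frac{1}{2} \log \left( 2 \pi e \sigma^2 (1 - \rho^2) \right), \qquad I(X; Y) = H(Y) - H(Y \mid X) = -\frac{1}{2} \log (1 - \rho^2). \end{equation}

As $\rho \rightarrow 1$, the conditional entropy $H(Y \mid X) \rightarrow -\infty$, and the mutual information diverges to $+\infty$, consistent with the formula you quoted.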

EDIT: I should have been clearer: in the differential-entropy sense, $H(Y \mid X)$ is not $0$; it is $-\infty$ when $X = Y$. A conditional distribution concentrated on a single point has infinitely less uncertainty than any quantized uniform distribution, so its differential entropy is $-\infty$.
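To see the quantization point numerically, here is a minimal sketch (assuming NumPy; the sample size and bin widths are just illustrative): quantizing $X = Y$ with bin width $\Delta$ gives a discrete mutual information $I(X_\Delta; Y_\Delta) = H(X_\Delta) \approx H(X) - \log \Delta$, which grows without bound as $\Delta \rightarrow 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1_000_000)   # X ~ N(0, 1); take Y = X exactly

for delta in (1.0, 0.1, 0.01, 0.001):
    # Quantize X (and hence Y) with bin width delta.
    xq = np.floor(x / delta).astype(np.int64)
    _, counts = np.unique(xq, return_counts=True)
    p = counts / counts.sum()
    h_q = -np.sum(p * np.log(p))            # plug-in estimate of H(X_delta) in nats
    # Since Y = X, I(X_delta; Y_delta) = H(X_delta), which grows without bound as
    # delta -> 0; H(X_delta) + log(delta) should approach 0.5*log(2*pi*e) ~ 1.42.
    print(f"delta={delta:6.3f}   I ~ {h_q:5.2f} nats   I + log(delta) ~ {h_q + np.log(delta):5.2f}")
```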