Mutual information of two random variables


I am wondering how one calculates the mutual information in the following setup. Suppose we have two random variables with supports $$X_0\in\{x_1,x_2,x_3\}$$ $$X_1\in\{x_1,x_2,x_3,x_4\}$$ where $P(X_0=x_1)=P(X_0=x_2)=P(X_0=x_3)=1/3$ and $P(X_1=x_1)=P(X_1=x_2)=P(X_1=x_3)=P(X_1=x_4)=1/4$.


There are two answers below.

Answer 1:

If $X_0$ and $X_1$ are independent, the answer is immediate:

$I(X_0;X_1)=H(X_0)-H(X_0|X_1)=H(X_0)-H(X_0)=0$, where the second equality holds because independence implies $H(X_0|X_1)=H(X_0)$. The mutual information of independent random variables is always zero.
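As a quick numerical check, here is a sketch in Python. The helper `mutual_information` and the product joint distribution are my own construction for illustration, not part of the question; for the independent coupling of the two uniform marginals, the result is zero:

```python
import math

def mutual_information(joint):
    """I(X;Y) in nats, from a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) )
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent joint: P(X0 = i, X1 = j) = (1/3) * (1/4) for all i, j.
joint_indep = {(i, j): (1/3) * (1/4) for i in range(3) for j in range(4)}
print(mutual_information(joint_indep))  # ≈ 0.0
```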

For the case where the variables are not independent, the answer is not straightforward: it depends on the conditional probability distribution $p_{X_0|X_1}(x_0|x_1)$, and in general $H(X_0|X_1)\neq H(X_0)$.

Consequently, no numerical answer can be given unless the joint (equivalently, the conditional) distribution of the two random variables is specified.
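To illustrate that the value depends on the coupling, here is one joint distribution consistent with the stated marginals (my own choice, not given in the question): $X_0=X_1$ whenever $X_1\in\{x_1,x_2,x_3\}$, and $X_0$ is drawn uniformly from $\{x_1,x_2,x_3\}$ when $X_1=x_4$. Under this coupling $H(X_0|X_1)=\tfrac14\log 3$, so $I(X_0;X_1)=\tfrac34\log 3$:

```python
import math

def mutual_information(joint):
    """I(X;Y) in nats, from a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Hypothetical coupling consistent with the marginals:
#   P(X0 = i, X1 = i) = 1/4        for i in {0, 1, 2}
#   P(X0 = i, X1 = 3) = 1/12       for i in {0, 1, 2}
# Marginals: P(X0 = i) = 1/4 + 1/12 = 1/3, P(X1 = j) = 1/4. ✓
joint = {(i, i): 1/4 for i in range(3)}
joint.update({(i, 3): 1/12 for i in range(3)})

print(mutual_information(joint))  # ≈ 0.8240, i.e. (3/4)·log(3)
print(0.75 * math.log(3))         # ≈ 0.8240
```

A different coupling with the same marginals would give a different value, which is exactly the point above.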

Answer 2:

Note that $X_1=\{X_0,x_4\}$. It follows that

$$ \begin{align} I(X_0;X_1) &= H(X_0)-H(X_0 \mid X_1)\\ &=H(X_0)-H(X_0 \mid X_0, x_4)\\ &=H(X_0)\\ &=\log(3). \end{align} $$
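The final step is just the entropy of a uniform distribution on three outcomes. A quick check in Python (using the natural logarithm, so the result is in nats):

```python
import math

# Entropy (in nats) of the uniform distribution on three outcomes.
p = [1/3, 1/3, 1/3]
H = -sum(q * math.log(q) for q in p)
print(H)            # ≈ 1.0986
print(math.log(3))  # ≈ 1.0986
```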

You can also find the same result starting from $I(X_0;X_1) = H(X_1)-H(X_1 \mid X_0)$. I will leave this as an exercise for you.