Mutual information of a die roll and its parity


Consider the following problem:

What is $I(X;Y)$ where $X$ is the outcome of a roll of a fair 6-sided die and $Y$ is whether the outcome of THAT SAME ROLL was even or odd?

Intuitively, I thought that $I(X;Y) = H(X) + H(Y) - H(X,Y)$; that is, when given even or odd, we cut the possibilities for $X$ in half (so we know the number was either 1, 3, 5 or 2, 4, 6). However, I'm slightly confused, as my professor argued that the correct answer was 1.

His idea was that if I'm given the number rolled, then all of the even/odd information is contained within it, so $I(X;Y) = 1$.

Maybe I don't understand mutual information correctly; an explanation would be appreciated.
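For what it's worth, here is a quick Monte Carlo sketch in Python (my own code, just estimating $I(X;Y)$ from empirical frequencies over many fair rolls); it does seem to come out near 1 bit, matching his claim:

```python
import random
from collections import Counter
from math import log2

# Rough Monte Carlo sketch: estimate I(X;Y) from empirical frequencies.
# X = fair die roll in {1,...,6}; Y = parity of that same roll.
N = 1_000_000
samples = [(x, x % 2) for x in (random.randint(1, 6) for _ in range(N))]

p_xy = Counter(samples)                 # empirical joint counts
p_x = Counter(x for x, _ in samples)    # empirical marginal of X
p_y = Counter(y for _, y in samples)    # empirical marginal of Y

# I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
I = sum((n / N) * log2((n / N) / ((p_x[x] / N) * (p_y[y] / N)))
        for (x, y), n in p_xy.items())
print(I)  # ≈ 1.0 bit
```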


1 Answer


Since there are six equiprobable outcomes for $X$ and two equiprobable outcomes for $Y$, if entropy is measured in bits,
$$H(X)=\log_2 6,\qquad H(Y)=1.$$

Since $Y$ is a deterministic function of $X$, the joint random variable $(X,Y)$ has six equiprobable outcomes, just as $X$ does, so $H(X,Y)=\log_2 6$ as well. Therefore,
$$I(X;Y)=H(X)+H(Y)-H(X,Y)=\log_2 6 + 1 - \log_2 6 = 1.$$

In other words, since all the information in $Y$ is already in $X$,
$$H(X,Y)=H(X)\quad\Rightarrow\quad I(X;Y)=H(Y).$$

The idea that knowing $Y$ cuts the number of possibilities for $X$ in half is correct, but translated into entropy it is a statement about the conditional entropy $H(X\mid Y)$, not the mutual information:
$$H(X\mid Y)=H(X,Y)-H(Y)=\log_2 6-1=\log_2 3:$$
once you know $Y$, there are only three equiprobable outcomes for $X$.
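As a sanity check on the arithmetic, here is a minimal Python sketch (the helper functions are my own, not from any library) that computes all of these quantities exactly from the six equiprobable outcomes:

```python
from math import log2

# X is the face rolled; Y = X mod 2 (parity) is a deterministic function of X.
# Each of the six (x, y) pairs has probability 1/6.
outcomes = [(x, x % 2) for x in range(1, 7)]

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def marginal(values):
    """Distribution of a list of equiprobable values."""
    n = len(values)
    return [values.count(v) / n for v in set(values)]

H_X = entropy(marginal([x for x, y in outcomes]))   # log2(6) ≈ 2.585
H_Y = entropy(marginal([y for x, y in outcomes]))   # 1
H_XY = entropy(marginal(outcomes))                  # log2(6), same as H(X)

print(H_X + H_Y - H_XY)  # I(X;Y) = 1.0 bit
print(H_XY - H_Y)        # H(X|Y) = log2(3) ≈ 1.585 bits
```

It prints $1.0$ for $I(X;Y)$ and $\log_2 3 \approx 1.585$ for $H(X\mid Y)$, matching the calculation above.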