Shannon measure of information dice question


You play a game with a friend where she rolls two $6$-sided dice. You first have to guess the total, and then once she confirms that you have the correct answer you have to guess the combination of dice (the order does not matter, i.e., guessing $6$-$1$ or $1$-$6$ is the same).

  1. How much information is in the distribution of totals?

  2. How many possible combinations are there? What is the SMI for the dice combinations?

  3. If your friend confirms to you that the total is 5, how much information has she given you and how much remains?


The first point is incorrect. There are 11 possible values for the sum (2 through 12), but they are not equiprobable: the counts are $1,2,3,4,5,6,5,4,3,2,1$ out of $36$, so the entropy is strictly less than $\log_2 11$.
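As a quick numeric check (a sketch in Python, assuming two fair dice), the entropy of the sum distribution is:

```python
from collections import Counter
from itertools import product
from math import log2

# Count how often each total 2..12 occurs among the 36 ordered outcomes.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
p_sum = {s: c / 36 for s, c in counts.items()}

H_Y = -sum(p * log2(p) for p in p_sum.values())
print(f"H(Y) = {H_Y:.4f} bits")  # ≈ 3.2744, below log2(11) ≈ 3.4594
```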

The second is almost correct, but again the 21 combinations are not equiprobable: a mixed combination such as $(6,5)$ has probability $2/36$, twice that of a double such as $(6,6)$, which has probability $1/36$.
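The corrected SMI for the 21 unordered combinations (6 doubles at $1/36$, 15 mixed pairs at $2/36$) can be sketched the same way:

```python
from math import log2

# 6 doubles with probability 1/36 each, 15 mixed pairs with probability 2/36 each.
probs = [1/36] * 6 + [2/36] * 15
assert abs(sum(probs) - 1) < 1e-12  # sanity check: a valid distribution

H_X = -sum(p * log2(p) for p in probs)
print(f"H(X) = {H_X:.4f} bits")  # ≈ 4.3366, below log2(21) ≈ 4.3923
```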

Even after these corrections, the question might seem slightly ambiguous. Let $X$ be the "full" result (21 possible combinations) and $Y$ be the total sum. Then "how much information the sum gives" can be understood in two ways. First, in itself; then the answer is $H(Y)$. Second, how much information it gives about the full result; then the answer is $H(X)-H(X\mid Y)=I(X;Y)=H(Y)-H(Y\mid X)$. But in this case $H(Y\mid X)=0$ (knowing the combination determines the sum), so both readings give $H(Y)$ and fortunately there is no ambiguity.
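The identity $H(X)-H(X\mid Y)=H(Y)$ can be verified numerically (a sketch using exact fractions for the probabilities):

```python
from fractions import Fraction
from math import log2
from collections import defaultdict
from itertools import combinations_with_replacement

# Distribution of X, the unordered combination: doubles have probability 1/36,
# mixed pairs 2/36.
p_x = {
    (a, b): Fraction(1 if a == b else 2, 36)
    for a, b in combinations_with_replacement(range(1, 7), 2)
}

# Distribution of Y, the sum, obtained by marginalizing X.
p_y = defaultdict(Fraction)
for (a, b), p in p_x.items():
    p_y[a + b] += p

def H(ps):
    return -sum(p * log2(p) for p in ps if p)

H_X, H_Y = H(p_x.values()), H(p_y.values())

# H(X|Y) = sum_y p(y) H(X | Y=y), with conditional probabilities p(x)/p(y).
H_X_given_Y = sum(
    p_y[y] * H(p / p_y[y] for (a, b), p in p_x.items() if a + b == y)
    for y in p_y
)

# Since H(Y|X) = 0, the mutual information I(X;Y) = H(X) - H(X|Y) equals H(Y).
print(abs((H_X - H_X_given_Y) - H_Y) < 1e-9)  # True
```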

For the third, the remaining uncertainty is $H(X \mid Y=5)$, and the information she has given you is the difference $H(X) - H(X \mid Y=5)$.
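As a check on this last step: only the unordered combinations $\{1,4\}$ and $\{2,3\}$ sum to 5, and they have equal prior probability $2/36$, so conditioned on $Y=5$ they are equally likely.

```python
from math import log2

# Unordered combinations summing to 5: {1,4} and {2,3}, each with prior 2/36,
# so conditioned on Y=5 they are equally likely.
H_X_given_5 = -sum(p * log2(p) for p in [1/2, 1/2])

# Entropy of the full 21-combination distribution (6 doubles, 15 mixed pairs).
H_X = -sum(p * log2(p) for p in [1/36] * 6 + [2/36] * 15)

print(f"remaining: {H_X_given_5:.4f} bits")        # 1.0000
print(f"given:     {H_X - H_X_given_5:.4f} bits")  # ≈ 3.3366
```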