Conditional entropy on race outcome


The problem is:

9 guys are racing.

The favorite has a probability of 3/4 to win the race.

Each other competitor has an equal chance to win.

If it becomes known that the favorite did not win the race, what is the uncertainty of the result?

My intuition would be a conditional entropy approach $H(X \mid Y)$, where $X$ denotes the winning competitor and $Y$ the information that the favorite did not win. My trouble is how to model the $P(X \mid Y)$ and $P(X, Y)$ needed to find the entropy.


Two answers follow.

Answer 1:

Given that the winner is one of the 8 equiprobable remaining participants, the entropy of the result is $\log_2 8 = 3$ bits.
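As a quick sanity check, this one-line computation (in bits, i.e. base-2 logarithms) confirms the answer:

```python
import math

# Entropy of a uniform distribution over 8 outcomes: -8 * (1/8) * log2(1/8) = log2(8).
H = -sum((1 / 8) * math.log2(1 / 8) for _ in range(8))
print(H)  # 3.0
```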

Answer 2:

My intuition would be a conditional entropy approach H(X|Y) where X denotes the competitor and Y the information that the champion did not win.

When learning conditional entropy, you need to distinguish between $H(X \mid Y)$ and $H(X \mid Y = y)$. In the first one, the condition is not with respect to an event, but with respect to the distribution of the other variable; that's why $H(X \mid Y)$ is a plain number. Instead, $H(X \mid Y = y)$ conditions on an event (in this case, the value taken by $Y$), hence the result depends on $y$. The two are related by averaging: $H(X \mid Y) = \sum_y P(Y = y)\, H(X \mid Y = y)$.

(In other words, the notation $H(X \mid Y)$ is not analogous to other conditionals such as $E(X \mid Y)$, which is a random variable, i.e. a function of $Y$.)
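To make the distinction concrete, here is a small sketch (variable and function names are mine) that computes both quantities for this race, taking $Y$ to be the indicator that the favorite (racer 1) wins:

```python
import math

def entropy_bits(dist):
    """Shannon entropy in bits of a probability distribution given as a list."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# X = winner; racer 1 is the favorite with P = 3/4,
# and the other 8 racers share the remaining 1/4 equally.
prior = [3 / 4] + [1 / 32] * 8

p_y1 = prior[0]   # P(favorite wins)     = 3/4
p_y0 = 1 - p_y1   # P(favorite loses)    = 1/4

# H(X | Y = 1): the winner is known exactly, so no uncertainty remains.
h_given_y1 = 0.0

# H(X | Y = 0): condition on the favorite losing and renormalize;
# the result is uniform over the 8 others.
cond = [p / p_y0 for p in prior[1:]]
h_given_y0 = entropy_bits(cond)

# H(X | Y): the average of H(X | Y = y) over the distribution of Y.
h_x_given_y = p_y1 * h_given_y1 + p_y0 * h_given_y0
print(h_given_y0, h_x_given_y)  # 3.0 0.75
```

Note that $H(X \mid Y = 0) = 3$ bits answers the question asked, while the averaged quantity $H(X \mid Y) = 0.75$ bits is a different (and here smaller) number.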

In your case you are interested in the latter: you are conditioning on an event, namely that the winner is not player $1$ (the favorite), so you want $H(X \mid X \ne 1)$. The conditional distribution given that event is uniform over eight values, hence the entropy is $\log_2 8 = 3$ bits.
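The conditioning step the question asks about ("how to model $P(X \mid Y)$") is just Bayes' rule on the event $\{X \ne 1\}$: drop the excluded outcome and renormalize. A minimal sketch, with racer labels of my own choosing:

```python
import math

# Prior over winners: racer 1 is the favorite with P = 3/4,
# racers 2..9 each have P = (1/4) / 8 = 1/32.
prior = {i: (3 / 4 if i == 1 else 1 / 32) for i in range(1, 10)}

# Condition on the event X != 1: remove racer 1 and renormalize.
z = sum(p for x, p in prior.items() if x != 1)            # P(X != 1) = 1/4
posterior = {x: p / z for x, p in prior.items() if x != 1}  # each = 1/8

# Entropy of the conditional distribution, in bits.
H = -sum(p * math.log2(p) for p in posterior.values())
print(H)  # 3.0
```

The posterior comes out uniform because the non-favorites were equiprobable to begin with; renormalizing a uniform block keeps it uniform.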