Consider the following game between three parties: Alice, Bob, and a Referee. The game starts with $n$ closed boxes. For some fixed $k<n$ known to all parties, the Referee freely chooses $k$ distinct boxes and places a prize inside each of them. Alice watches the Referee do this, so she knows which boxes contain a prize. Bob now enters the room. He can talk to Alice, then choose one box; if that box contains a prize, he keeps it.
What is the smallest number of bits of information that Alice needs to convey to Bob such that Bob is guaranteed to win a prize?
The naive strategy is for Alice to pick a prize-containing box and send Bob the ID of that box. This costs $\log n$ bits of information, since she picks one box out of $n$. But since there are $k$ prize boxes and any of them would do the job, can Alice and Bob win the game with fewer than $\log n$ bits of communication? If yes, what should their strategy be?
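For concreteness, here is a sketch of the naive strategy in Python (the function name and setup are mine, for illustration): Alice encodes the ID of a prize box in $\lceil \log_2 n \rceil$ bits, and Bob decodes it and opens that box.

```python
import math
import random

def naive_strategy(n, prize_boxes):
    # Alice: pick any prize box and encode its ID in ceil(log2 n) bits.
    box = min(prize_boxes)  # any prize box works; min makes it deterministic
    message = format(box, f"0{math.ceil(math.log2(n))}b")
    # Bob: decode the ID from the bit string and open that box.
    chosen = int(message, 2)
    return chosen, len(message)

n, k = 16, 4
prizes = set(random.sample(range(n), k))
box, bits = naive_strategy(n, prizes)
assert box in prizes and bits == 4  # log2(16) = 4 bits
```

This always succeeds but ignores the slack coming from the $k$ acceptable answers, which is exactly what the question asks about.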
EDIT: I am happy to assume the Referee is not adversarial if that admits a better strategy for Alice and Bob.
An obvious strategy is: Alice picks the lowest prize box ID and sends it. There are $n-k+1$ possible values, hence the information content is at most $\log(n-k+1)$ bits.
Actually, if we assume that the $k$ boxes are chosen uniformly at random, the distribution of the lowest prize box will not be uniform, hence the information content will be less than that.
The pmf of the smallest member of a uniformly random $k$-subset of $n$ boxes, supported on $j=0,\dots,n-k$, is
$$P(j)= \frac{1}{\binom{n}{k}}\binom{n-j}{k}\frac{k}{n-j} = \frac{\binom{n-j-1}{k-1}}{\binom{n}{k}}$$
(the second form counts the selections whose smallest member is $j$: the remaining $k-1$ boxes must come from the $n-j-1$ boxes above $j$).
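As a quick numerical sanity check of the pmf above (a Python sketch; the helper name is mine), we can verify that it sums to $1$ over its support:

```python
from math import comb

def pmf_lowest(n, k, j):
    # P(j) = C(n-j, k) * k / ((n-j) * C(n, k)), for j = 0 .. n-k
    return comb(n - j, k) * k / ((n - j) * comb(n, k))

n, k = 20, 5
probs = [pmf_lowest(n, k, j) for j in range(n - k + 1)]
assert abs(sum(probs) - 1) < 1e-12  # the pmf is normalized

# Equivalent closed form: P(j) = C(n-j-1, k-1) / C(n, k)
for j in range(n - k + 1):
    assert abs(probs[j] - comb(n - j - 1, k - 1) / comb(n, k)) < 1e-15
```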
I'm not sure if the entropy of this distribution can be put in a simple form.
Edit: for large $n$, the entropy (in nats) of the above gives me the following not-too-illuminating expression:
$$H\approx \frac{k}{n} \frac{e^{-k/n}}{1-e^{-k/n}} - \log(1- e^{-k/n})$$
For large $n$ and $k$, with $k/n \to 0$, this behaves as $\log(n)-\log(k)+O(1)$.