Understanding an application of Entropy


I'm struggling with the following exercise on entropy.

Suppose that your friend Alice chooses a number $X$ uniformly at random from $[1,n]$, which she writes down using $\log n$ bits; you can assume that all quantities concerning $\log n$ are integers. Now the following happens: With probability $1/4$, Alice tells you the first $3/4 \log n$ bits of $X$ and with probability $3/4$, she tells you only $1/8 \log n$ bits of $X$. What is your entropy regarding $X$?

The way I understand this is that, without the second part where Alice tells me some random bits, my entropy $H(X)$ would be $$ H(X) = -\sum_x p(x) \log p(x) = - n \cdot \frac{1}{n} \log \frac{1}{n} = \log n.$$
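As a quick numerical sanity check of this formula, here is a short Python sketch (the helper name `entropy` and the choice $n = 256$ are mine, picked so that $\log_2 n$ is an integer as the exercise assumes):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum_x p(x) * log2 p(x), with 0*log 0 := 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 256                      # a power of two, so log2(n) is an integer
uniform = [1 / n] * n        # X uniform on [1, n]
H = entropy(uniform)
print(H)                     # 8.0, i.e. log2(256) bits
```

The printed value agrees with $\log_2 n$, confirming that a uniform choice from $[1,n]$ carries $\log n$ bits of uncertainty.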

Regarding the additional information gained from Alice, my intuition tells me that my entropy should be lowered by the bits weighted with their respective probability of me getting them:

$$ H(X) - \frac{1}{4}\cdot\frac{3}{4} \log n - \frac{3}{4}\cdot\frac{1}{8} \log n = \log n - \frac{9}{32}\log n = \frac{23}{32} \log n $$

Is my intuition correct? If yes, how can I formalize it?


Best answer:

Your intuition is good. Entropy is the expected information content of an observation, that is, how much you expect to learn from the event. If we were looking only at the first part, where Alice picks a uniformly random integer, then, as you have written, we would expect to learn $\log n$ bits of information.

In the second part, with probability $\frac{1}{4}$ we will learn $\frac{3}{4}\log n$ bits of information, and with probability $\frac{3}{4}$ we will learn $\frac{1}{8} \log n$ bits of information. Thus the expected amount of information we will learn is $$\Big(\frac{1}{4}\Big)\Big(\frac{3}{4}\Big)\log n + \Big(\frac{3}{4}\Big)\Big(\frac{1}{8}\Big)\log n = \frac{9}{32}\log n,$$ so the entropy remaining about $X$ is $\log n - \frac{9}{32}\log n = \frac{23}{32}\log n$.
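This expectation can also be checked numerically. In the sketch below (my own illustration, again with $n = 256$ so every bit count is an integer), learning $k$ bits of a uniformly random $\log_2 n$-bit string leaves $\log_2 n - k$ bits of uncertainty, and we average over the two cases:

```python
import math

n = 2 ** 8                    # log2(n) = 8
log_n = math.log2(n)

# Case 1: prob 1/4, Alice reveals (3/4) * log n bits.
# Case 2: prob 3/4, Alice reveals (1/8) * log n bits.
# Remaining uncertainty in each case is log n minus the revealed bits.
expected_remaining = (
    0.25 * (log_n - 0.75 * log_n)
    + 0.75 * (log_n - 0.125 * log_n)
)
print(expected_remaining)      # 5.75 bits
print(23 / 32 * log_n)         # 5.75 bits, matching 23/32 * log n
```

Both lines print the same value, which matches the closed form $\frac{23}{32}\log n$.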