Shannon information for a set of 3 equally probable elements?


How do I calculate entropy as a number of binary choices for a set of three equally probable elements? Shannon's formula gives $\log_2(3)=1.585$. But every interpretation in terms of binary choices gives me $5/3=1.667$.


The correct answer is the first one, $H(X)=\log_2(3)=1.585\cdots$

The number you get by trying a binary choice decision tree corresponds, in essence, to a coding of the form

 A : 0
 B : 10
 C : 11

which gives you a mean code length of $ 1 \times 1/3 + 2 \times 1/3 +2\times 1/3 =5/3 = 1.666\cdots$. This is indeed the optimal coding (or, equivalently, the optimal binary choice procedure). But this is not really the entropy; Shannon's first theorem (the source coding theorem) only asserts that $ H(X) \le L < H(X)+1$, where $L$ is the optimal mean binary code length (or the mean number of choice questions). Only when we take extensions of the source (i.e., we code blocks of occurrences as a whole - or, in the binary choice formulation, we ask questions to discover not isolated symbols, but blocks) do we get $L \to H(X)$.
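As a quick numerical check, here is a short sketch (standard library only; the function name is my own) of how the per-symbol mean code length approaches $\log_2(3)$ as we code longer blocks. It uses the standard fact that an optimal binary code for $m$ equally likely symbols puts all codewords at depths $k$ and $k+1$, where $k=\lfloor\log_2 m\rfloor$: there are $2^{k+1}-m$ codewords of length $k$ and $2(m-2^k)$ of length $k+1$.

```python
from math import log2

def mean_code_length_equiprobable(m):
    """Mean length of an optimal binary code for m equally likely symbols.

    The optimal (Huffman) tree for equiprobable symbols is a complete
    binary tree with leaves at two adjacent depths k and k+1.
    """
    k = int(log2(m))
    if 2 ** k == m:          # m is a power of two: all codewords have length k
        return k
    n_short = 2 ** (k + 1) - m       # codewords of length k
    n_long = 2 * (m - 2 ** k)        # codewords of length k + 1
    return (n_short * k + n_long * (k + 1)) / m

H = log2(3)  # entropy of one ternary equiprobable symbol
for n in (1, 2, 3, 5, 10):
    # Code blocks of n symbols: 3**n equally likely block values,
    # then divide by n to get the mean length per original symbol.
    L = mean_code_length_equiprobable(3 ** n) / n
    print(f"block length {n}: L per symbol = {L:.4f}   (H = {H:.4f})")
```

For block length 1 this reproduces $5/3$, and the per-symbol length decreases toward $\log_2(3)=1.585\cdots$ as the blocks grow, exactly as the source coding theorem promises.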