Differing Calculations? Entropy of ONE Alphabet $= 4.7$ or $0.18$?


If I choose a random alphabet from (a-z), $26$ characters, what is the entropy?

Shannon's formula:

$$H = - \sum p \log_2(p) = - (1/26)\log_2(1/26) = 0.18$$ bits.

However, other formulas on the Internet use $H = \log_2(N^L)$, where $N$ is the alphabet size and $L$ is the message length. With $N = 26$ and $L = 1$:

$$H = \log_2(N^L) = \log_2(26) \approx 4.7 \text{ bits.}$$

Which one is correct?


The internet is correct.

The sum runs over all possible outcomes, not just one. So you should get:

$$H = \sum_{x = \text{'a'}}^{\text{'z'}} -p(x) \log_2(p(x)) = \sum_{x = \text{'a'}}^{\text{'z'}} -\frac{1}{26} \log_2\left(\frac{1}{26}\right) = 26 \cdot \left(-\frac{1}{26} \log_2\left(\frac{1}{26}\right)\right) = -\log_2\left(\frac{1}{26}\right) = \log_2(26) = 4.70043971814\ldots$$
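A quick numerical check makes the point concrete: summing $-p \log_2 p$ over all 26 equally likely letters reproduces $\log_2(26)$, while a single term gives the $0.18$ from the question. Here is a minimal sketch in Python (the function name `shannon_entropy` is just illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p(x) * log2(p(x)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over the 26 letters a-z
uniform = [1 / 26] * 26

print(shannon_entropy(uniform))   # ~4.700439..., equals log2(26)
print(math.log2(26))              # same value
print(-(1 / 26) * math.log2(1 / 26))  # ~0.18: only ONE term of the sum
```

The $0.18$ figure is exactly one of the 26 identical terms; multiplying it by 26 recovers the $4.7$ bits.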