If entropy gives the number of bits a piece of information needs, why is it less than one bit in this case?


If it rains in a city $3/4$ of the time and does not rain $1/4$ of the time, the surprisal $-\log_2(3/4) \approx 0.415$ says we need about $0.415$ bits to say it's raining, and $-\log_2(1/4) = 2$ bits to say it is not, right? How can we make sense of this? How can a message take more than $0$ but less than one bit?
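The two surprisal values quoted above can be checked directly (a quick sketch; the variable names are mine):

```python
import math

# Surprisal (information content) of each outcome, in bits
p_rain, p_dry = 3/4, 1/4
print(-math.log2(p_rain))  # ≈ 0.415 bits to say "rain"
print(-math.log2(p_dry))   # = 2.0 bits to say "no rain"
```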

On BEST ANSWER

Having fractional bits means you can do more with the information than answer that one question in isolation. The average number of bits per answer in your case is the entropy $-\frac 34 \log_2\left(\frac 34\right)-\frac 14 \log_2\left(\frac 14\right)\approx 0.811$. So I should be able to find an encoding scheme that represents the rain/no-rain status of $1000$ days in about $811$ bits.
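One way to see such a scheme concretely is to Huffman-code *blocks* of $k$ days at a time: as $k$ grows, the average cost per day approaches the entropy. Here's a sketch (not from the answer above; `huffman_expected_length` is my own helper) using Python's `heapq`:

```python
import heapq
import itertools
import math

def huffman_expected_length(probs):
    """Expected codeword length (in bits) of a Huffman code for `probs`."""
    counter = itertools.count()  # tiebreaker so the heap never compares dicts
    # Each heap entry: (probability, tiebreak, {symbol: depth so far})
    heap = [(p, next(counter), {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf in them one level deeper
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    _, _, depths = heap[0]
    return sum(probs[s] * d for s, d in depths.items())

p = 3/4                                       # P(rain)
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # ≈ 0.811 bits/day

for k in (1, 2, 4, 8):
    # Probability of each of the 2**k rain/no-rain patterns for a k-day block
    probs = [p ** bin(b).count("1") * (1 - p) ** (k - bin(b).count("1"))
             for b in range(2 ** k)]
    bits_per_day = huffman_expected_length(probs) / k
    print(k, round(bits_per_day, 4))
```

For $k = 1$ the best you can do is $1$ bit/day, but already at $k = 2$ the rate drops below $0.85$, and larger blocks squeeze it toward $0.811$; this is exactly the sense in which a single day's weather "costs" less than one bit.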