I have an event with probability $p=0.96$, and the following task:
How much information do I need to identify the complementary event, which has probability $q=0.04$?
In other words: how many bits do I need to determine that event?
We must use the formula for information content: $I=-\log_2 q=-\log_2 0.04 \approx 4.64$ bits. How should we explain that we need about 5 bits? I'm going to build a tree like the one below: [image: binary decision tree]
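As a quick check of the arithmetic above, here is a minimal sketch that computes the surprisal $-\log_2 q$ and rounds it up to whole bits (the variable names are my own):

```python
import math

q = 0.04
surprisal = -math.log2(q)        # information content of the rare event
whole_bits = math.ceil(surprisal)  # smallest whole number of bits

print(f"I(q) = {surprisal:.2f} bits")  # ≈ 4.64 bits
print(f"whole bits needed: {whole_bits}")  # 5
```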
I presume that 5 bits are necessary because 1 bit determines a symbol with probability $P=0.5$; that is, if two events are equally likely, 1 bit is enough to distinguish them. But in this task the first four bits are "inclined" toward the event with $p=0.96$, and we have to balance the events by adding a fifth bit. Is that right?
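One way to test this halving intuition numerically (this is my own sketch, not part of the original question): each additional bit cuts the remaining probability in half, so after $k$ bits the branch you are following has probability $2^{-k}$. The event with $q=0.04$ is first "covered" when $2^{-k} \le q$:

```python
q = 0.04
# After k yes/no questions, the chosen branch has probability 2**-k.
# Count how many bits it takes before that branch is at least as rare as q.
k = 0
while 2 ** -k > q:
    k += 1

print(k)  # 5, since 2**-4 = 0.0625 > 0.04 but 2**-5 = 0.03125 <= 0.04
```

This agrees with $\lceil -\log_2 0.04 \rceil = 5$: four halvings still leave a branch more likely than the event, so a fifth bit is needed.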