I was studying Shannon's entropy function, and for an event with a 35% chance of occurring, the formula produced about 1.5 bits.
−log2(0.35) ≈ 1.51 bits of information.
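As a quick sanity check, the self-information of a probability-0.35 event can be computed directly (a minimal Python sketch; the variable names are mine):

```python
import math

p = 0.35
# Self-information (surprisal) in bits: I(p) = -log2(p)
info_bits = -math.log2(p)
print(round(info_bits, 4))  # ≈ 1.5146
```

Note the minus sign: log2(0.35) itself is negative (about −1.51), and negating it gives the positive information content.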
I know it's pretty trivial in the context of practical applications, but how can one visualize half a bit of information? A bit can be one or zero, but 0.5? Analog values?!
Thank you to Jyrki Lahtonen in the comments for answering it convincingly.