Conceptual Question regarding Shannon Entropy and bits


It is said that the number of "information bits" contained in a certain piece of information can be roughly translated as the number of yes/no-questions that would have to be answered in order to transmit the information. But isn't this entirely dependent on the knowledge of the receiver (what questions they ask) and if so, how could one ever talk about the "objective" number of bits in a certain piece of information?

In theory, it seems to me that any amount of information could be transmitted in a single bit using the right (possibly very long) yes/no-question. Am I missing something fundamental here?

Thank you in advance!

Accepted answer:

But isn't this entirely dependent on the knowledge of the receiver

Of course. If the receiver already has the full knowledge, then she needs to ask zero (not even one!) questions. If the receiver knows that the information has one of two possible outcomes A/B, each equally probable, then she has to ask just one question. In these two examples, the amount of information she gains upon discovering the actual value is respectively zero and one bit. Which is precisely the entropy.
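As a quick sketch, both examples follow directly from Shannon's entropy formula $H = -\sum_i p_i \log_2 p_i$ (the function name here is my own, just for illustration):

```python
import math

def entropy_bits(probs):
    # Shannon entropy in bits: the expected number of yes/no
    # questions needed, given the a priori distribution over outcomes.
    # Terms with p = 0 contribute nothing and are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

entropy_bits([1.0])       # receiver already knows the value: 0 bits
entropy_bits([0.5, 0.5])  # fair A/B choice: exactly 1 bit
```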

This number of bits/questions is "objective" in the sense that, if $n$ persons agree on the a priori knowledge they have about the unknown value (which is modelled by a probability distribution), then they must agree on the optimal (on average) number of questions they must ask to discover the value.
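To see that this optimum is pinned down by the distribution alone, one can build an optimal question strategy explicitly. A Huffman tree is one such strategy: each internal node is a yes/no question, and for dyadic probabilities the average number of questions equals the entropy exactly. A minimal sketch (the function name is mine, for illustration):

```python
import heapq
from itertools import count

def question_counts(probs):
    # Build a Huffman tree over the outcomes; return, for each outcome,
    # the number of yes/no questions needed to pin it down.
    tiebreak = count()  # keeps heap comparisons well-defined on ties
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    depths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, group1 = heapq.heappop(heap)
        p2, _, group2 = heapq.heappop(heap)
        # One extra question separates the two merged groups.
        for i in group1 + group2:
            depths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), group1 + group2))
    return depths

probs = [0.5, 0.25, 0.25]
depths = question_counts(probs)
avg_questions = sum(p * d for p, d in zip(probs, depths))
# avg_questions matches the entropy H = 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
```

Anyone starting from the same distribution arrives at the same average, which is the sense in which the bit count is objective.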