Similarity between entropy of an information source & expectation of a random variable


While watching a lecture on information theory, I learned that the entropy of an information source is the average amount of information it provides, in bits (or nats, decits, or whatever). This is actually a weighted average of the information contained in each symbol the source emits, weighted by the probabilities of the individual symbols. I noticed a striking similarity between this and the concept of the expectation of a random variable, which has a similar explanation. Am I right about this? That is, is there an intuitive connection between the two concepts? An explanation of both concepts in light of their similarity (if there is one) would also be welcome. Thank you.
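To make the "weighted average" reading concrete, here is a minimal sketch for a hypothetical three-symbol source (the symbols and probabilities are made up for illustration): each symbol carries $-\log_2 p$ bits of information, and the entropy is the probability-weighted average of those amounts.

```python
import math

# Hypothetical source emitting symbols a, b, c with these probabilities.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}

# Information content (in bits) of each symbol: -log2(p).
info = {s: -math.log2(p) for s, p in probs.items()}

# Entropy = average information, weighted by the symbol probabilities.
entropy = sum(p * info[s] for s, p in probs.items())
print(entropy)  # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
```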

Best Answer

The entropy is defined as an expectation. If $X$ is a random variable with probability mass function $P$, then its entropy is $$H(X) = \mathbb{E}(-\log P(X)).$$ Another connection comes through your definition above: for every $n$, one can find a uniquely decodable code $C_n$ that encodes $n$-tuples of values of $X$ in binary while minimizing the expected codeword length $L_n$. The entropy is the limit of $L_n/n$.
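The identity $H(X) = \mathbb{E}(-\log P(X))$ can be checked numerically: sample from $X$ and average $-\log_2 P(X)$ over the samples, which by the law of large numbers converges to the entropy. A small sketch, using a made-up three-symbol distribution:

```python
import math
import random

random.seed(0)

# Hypothetical distribution for X (same toy example as above).
probs = {"a": 0.5, "b": 0.25, "c": 0.25}
symbols, weights = zip(*probs.items())

# Exact entropy from the expectation formula H(X) = E[-log2 P(X)].
exact = sum(-p * math.log2(p) for p in weights)

# Monte Carlo estimate: draw samples of X, average -log2 P(X).
n = 100_000
samples = random.choices(symbols, weights=weights, k=n)
estimate = sum(-math.log2(probs[s]) for s in samples) / n

print(exact)     # 1.5
print(estimate)  # close to 1.5
```

The sample mean of $-\log_2 P(X)$ approaches $H(X)$ as the number of samples grows, which is exactly the sense in which entropy is the expected information content.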