Interpreting Entropy (Comprehension Question)

Consider a fair coin toss $X$. There are two outcomes, each with probability $1/2$. The entropy of this random variable is

$H(X) = -\Big(\frac{1}{2} \log_2\big(\frac{1}{2}\big) + \frac{1}{2}\log_2\big(\frac{1}{2}\big)\Big) = 1$

I'm a bit confused about how to interpret $1$ as the value of the entropy. What can I deduce from that?

I guess the question is: how should I interpret these values if the value were lower, say $0.25$?

Best answer:

The entropy of a finite probability distribution ${\bf p}=(p_1,p_2,\ldots,p_n)$ is a measure of the uncertainty about the outcome of the corresponding experiment. If $p_1=0.97$ and $p_2=0.03$ this uncertainty is small, but if $p_1=p_2={1\over2}$ it is large. Thinking about the problem in depth shows that the quantity $$H({\bf p}):=\sum_{k=1}^n p_k\log_2{1\over p_k}\tag{1}$$ has the properties one would wish for in such a measure, e.g. when more complicated situations are at stake. (I'm sure this is explained in your textbook.)

Of course we need a "unit of uncertainty". This unit is called $1$ bit, and it is the uncertainty involved in the simplest experiment: the toss of a fair coin. Taking the $\log$ to base $2$ in $(1)$ ensures that $H\bigl({1\over2},{1\over2}\bigr)=1$.
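To get a feel for how the value of $(1)$ tracks uncertainty, here is a minimal sketch in Python (the function name `entropy` is just an illustrative choice) comparing a fair coin, a heavily biased coin, and a uniform four-outcome experiment:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a finite distribution p (terms with p_k = 0 contribute 0)."""
    return sum(pk * math.log2(1 / pk) for pk in p if pk > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit, the unit of uncertainty
print(entropy([0.97, 0.03]))  # biased coin: roughly 0.19 bits, little uncertainty
print(entropy([0.25] * 4))    # four equally likely outcomes: 2.0 bits
```

A value like $0.25$ would therefore signal an experiment whose outcome is much more predictable than a fair coin toss, while values above $1$ arise once there are more than two equally likely outcomes.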