How can $H(p, 1-p)$ be plotted?


I understand, of course, that I can plot the function $- x \log_2(x)$ directly, which gives:

[Plot of $-x \log_2(x)$ on $[0, 1]$]

This is because we say that $p = (p_1, \ldots, p_n)$ and we have

\begin{align*} H(p) = H(p_1, \ldots, p_n) &= - \sum_{k=1}^n p_k \log_2 p_k \end{align*}
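This sum can be computed directly. A minimal sketch in Python (the function name `entropy` and the convention $0 \log_2 0 = 0$ are my own choices, not from the example I'm reading):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_k p_k * log2(p_k).

    Terms with p_k = 0 are skipped, following the standard
    convention 0 * log2(0) = 0 (the limit as p_k -> 0).
    """
    return -sum(pk * math.log2(pk) for pk in p if pk > 0)

# A uniform distribution over 4 outcomes needs log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```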

Now I have an example that defines the map $p_0 \mapsto H(p_0, 1-p_0)$ (continuous on $[0,1]$), highlights the point $p_0 = \frac{1}{2} = 1 - p_0$, and right below shows this plot:

[Plot of $H(p_0, 1-p_0)$ on $[0, 1]$]

In this example, $p$ is never really defined, which is why I don't understand what $H(p, 1-p)$ actually is, nor how it can be plotted at all, since the only value given is $p_0 = \frac{1}{2}$ and its complement.

Best answer:

In general, they're discussing the function $f(p) = H(p, 1-p) = -p \log_2(p) - (1-p)\log_2(1-p)$. If you plot that function of the single variable $p$, you get exactly the plot shown above. It quantifies the entropy of a two-state system in which one state has probability $p$ and the other has probability $1-p$. When $p = 0$ or $p = 1$ we get $f(p) = 0$, because the system is in a definite state and carries no uncertainty. The value $p_0 = 1/2$ is emphasized because it is the symmetric case (the two states are equally likely) and the point of maximum entropy (the least prior information). In particular, $f(p_0) = f(1/2) = 1$, meaning exactly one bit of information is needed to describe the system. The example you were reading was likely unclear because it used $p$ and $p_0$ interchangeably.
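To make this concrete, here is a minimal sketch of $f(p)$ sampled over $[0,1]$ (the name `binary_entropy` and the 101-point grid are my own choices; with `matplotlib` available, `plt.plot(ps, hs)` would reproduce the plot above):

```python
import math

def binary_entropy(p):
    """f(p) = H(p, 1-p), the entropy of a two-state system."""
    if p in (0.0, 1.0):
        return 0.0  # definite state: zero entropy, by the 0*log2(0) = 0 convention
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sample the curve on a grid over [0, 1].
ps = [i / 100 for i in range(101)]
hs = [binary_entropy(p) for p in ps]

print(binary_entropy(0.5))  # → 1.0, the maximum: one full bit of uncertainty
```

Note the symmetry $f(p) = f(1-p)$: swapping the labels of the two states cannot change the entropy, which is why the curve peaks at the symmetric point $p = 1/2$.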