Shannon entropy of a set of probabilities


Let $P = \{p_1, \ldots, p_N\}$ be a set of probabilities (i.e., $0 \leq p_i \leq 1$). I can compute the Shannon entropy as follows: $$ H(P) = -\sum_{i=1}^N p_i \log_2 p_i $$

Now, suppose I perform the following operations:

  • I select some $p_i \in P$, and create the set $P_{\text{sub}} \subseteq P$
  • Some of the probabilities $p_i \in P_{\text{sub}}$ are decreased somehow
  • The probabilities in $P_{\text{sub}}$ are normalized so that $\sum_i p_i = 1$

Is it possible that the entropy of the set $P_{\text{sub}}$ is greater than the entropy of $P$, i.e., $H(P) < H(P_{\text{sub}})$?

Best answer:

Yes. Take, for example, $P = [0.9,\; 0.05,\; 0.05]$, and choose $P_{\text{sub}}$ to be the last two probabilities, $[0.05,\; 0.05]$, which normalize to $[0.5,\; 0.5]$. Then $H(P) \approx 0.569$ bits, while $H(P_{\text{sub}}) = 1$ bit, so the entropy has increased.
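A quick numerical check of this kind of example (a minimal sketch; the helper function `shannon_entropy` is just an illustration, not from the original post):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits, with the convention 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A distribution that sums to 1.
P = [0.9, 0.05, 0.05]

# Select the last two probabilities and renormalize them.
P_sub = [0.05, 0.05]
total = sum(P_sub)
P_sub_norm = [p / total for p in P_sub]  # [0.5, 0.5]

print(shannon_entropy(P))           # ≈ 0.569 bits
print(shannon_entropy(P_sub_norm))  # = 1.0 bit
```

The subset's entropy exceeds the original's because normalizing the two equal survivors yields the uniform distribution, which maximizes entropy for a given support size.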