Entropy of Disjoint vs. Overlapping Mixture


This question extends this Shannon Entropy Inequality question. Fix the shapes, but not the locations, of the pmfs of the discrete distributions $P$ and $Q$, and let $R = \lambda P + (1-\lambda) Q$ for some $\lambda \in (0,1)$. In general, will the entropy of $R$ be larger if $P$ and $Q$ overlap or if they have disjoint support?


Writing $P \setminus Q$ for the set of points where $P$ is positive but $Q$ is not (and similarly for the other set expressions), we have, if $P$ and $Q$ overlap, \begin{alignat*}{3} H(R) &= &&-[\sum_{k \in P \setminus Q} \lambda P(k) \log(\lambda P(k)) + \sum_{k \in Q \setminus P} (1-\lambda) Q(k) \log((1-\lambda) Q(k)) \\ &&&+ \sum_{k \in Q \cap P} [\lambda P(k) + (1-\lambda) Q(k)] \log(\lambda P(k) + (1-\lambda) Q(k))] \\ &= &&-[\sum_{k \in P \setminus Q} \lambda P(k) \log(\lambda P(k)) + \sum_{k \in Q \setminus P} (1-\lambda) Q(k) \log((1-\lambda) Q(k)) \\ &&&+ \sum_{k \in Q \cap P} \lambda P(k) \log(\lambda P(k) + (1-\lambda) Q(k)) \\ &&&+ \sum_{k \in Q \cap P} (1-\lambda) Q(k) \log(\lambda P(k) + (1-\lambda) Q(k))] \\ &< &&-[\sum_{k \in P \setminus Q} \lambda P(k) \log(\lambda P(k)) + \sum_{k \in Q \setminus P} (1-\lambda) Q(k) \log((1-\lambda) Q(k)) \\ &&&+ \sum_{k \in Q \cap P} \lambda P(k) \log(\lambda P(k)) + \sum_{k \in Q \cap P} (1-\lambda) Q(k) \log((1-\lambda) Q(k))] \\ &= &&-[\sum_{k \in P} \lambda P(k) \log(\lambda P(k)) + \sum_{k \in Q} (1-\lambda) Q(k) \log((1-\lambda) Q(k))]. \end{alignat*} The strict inequality holds because $\log$ is increasing: on the overlap, $\lambda P(k) + (1-\lambda) Q(k) > \lambda P(k)$, so $\log(\lambda P(k) + (1-\lambda) Q(k)) > \log(\lambda P(k))$, and negating reverses the inequality (similarly for the $Q$ term). The last line is exactly the entropy of $R$ when $P$ and $Q$ have disjoint support, so the mixture has strictly larger entropy in the disjoint case.
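A quick numerical check of this conclusion (a minimal Python sketch; the particular pmfs, the value of $\lambda$, and the natural-log convention are illustrative choices, not taken from the question):

```python
import math

def entropy(pmf):
    """Shannon entropy in nats, skipping zero-probability outcomes."""
    return -sum(p * math.log(p) for p in pmf.values() if p > 0)

def mixture(P, Q, lam):
    """R = lam*P + (1-lam)*Q as a dict over the union of supports."""
    support = set(P) | set(Q)
    return {k: lam * P.get(k, 0.0) + (1 - lam) * Q.get(k, 0.0)
            for k in support}

# Same pmf "shape" (two equally likely outcomes), two placements of Q.
lam = 0.3
P = {0: 0.5, 1: 0.5}
Q_overlap = {0: 0.5, 1: 0.5}   # supports coincide
Q_disjoint = {2: 0.5, 3: 0.5}  # supports are disjoint

H_overlap = entropy(mixture(P, Q_overlap, lam))
H_disjoint = entropy(mixture(P, Q_disjoint, lam))

print(H_overlap, H_disjoint)  # disjoint case is strictly larger
```

In the disjoint case the result also matches the identity $H(R) = H(\lambda) + \lambda H(P) + (1-\lambda) H(Q)$, where $H(\lambda)$ is the binary entropy of the mixture weight.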