Entropy of the Uniform Mixture of Discrete Probability Distributions


Consider the following inequality: \begin{equation} H\left(\frac{1}{3}p_{1} + \frac{1}{3}p_{2} + \frac{1}{3}p_{3}\right) \geq H\left(\frac{1}{2}p_{1} + \frac{1}{2}p_{2}\right) \end{equation} where $H(\cdot)$ denotes the Shannon entropy of the probability distribution in its argument. Does this hold for discrete probability distributions $p_{1}, p_{2}$ and $p_{3}$ over the same alphabet that are permutations of each other (so that, in particular, $H(p_{1}) = H(p_{2}) = H(p_{3})$)? Furthermore, can this be generalised: does taking a larger uniform mixture of permutations of a discrete probability distribution always increase (or at least maintain) the entropy?
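As a numerical sanity check, here is a small sketch (the distributions $p_1, p_2, p_3$ below are an illustrative choice of cyclic permutations, not from the question) comparing the two sides of the inequality:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete distribution given as a list."""
    return -sum(x * math.log(x) for x in p if x > 0)

def mixture(dists, weights):
    """Pointwise weighted mixture of equal-length distributions."""
    return [sum(w * d[i] for w, d in zip(weights, dists))
            for i in range(len(dists[0]))]

# Three cyclic permutations of one distribution (illustrative choice).
p1 = [0.7, 0.2, 0.1]
p2 = [0.1, 0.7, 0.2]
p3 = [0.2, 0.1, 0.7]

lhs = shannon_entropy(mixture([p1, p2, p3], [1/3, 1/3, 1/3]))
rhs = shannon_entropy(mixture([p1, p2], [1/2, 1/2]))
print(lhs, rhs)
```

For this particular triple the uniform mixture of all three cyclic shifts is the uniform distribution, which has maximal entropy $\log 3$, so the inequality holds here; this does not settle the general question.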

Best answer:

Consider the domain $\{0,1\}$ and the point masses $p_1,p_2,p_3$ given by $p_1(0)=p_3(0)=1$ and $p_2(1)=1$; these are permutations of each other. Then $$ H\left(\frac{p_1+p_2}{2}\right) = h_2(1/2) =\log 2 $$ but, since $p_1 = p_3$, $$ H\left(\frac{p_1+p_2+p_3}{3}\right) = H\left(\frac{2}{3}p_1+\frac{1}{3}p_2\right) = h_2(1/3) < \log 2, $$ where $h_2$ denotes the binary entropy function. So your inequality does not hold, even for a domain of size $2$.
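The counterexample above can be checked numerically; a minimal sketch:

```python
import math

def h(p):
    """Shannon entropy (in nats) of a discrete distribution given as a list."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Point masses on {0,1} from the answer: p1 = p3 = (1,0), p2 = (0,1).
p1 = p3 = [1.0, 0.0]
p2 = [0.0, 1.0]

two_mix = [(a + b) / 2 for a, b in zip(p1, p2)]               # (1/2, 1/2)
three_mix = [(a + b + c) / 3 for a, b, c in zip(p1, p2, p3)]  # (2/3, 1/3)

print(h(two_mix))    # h2(1/2) = log 2
print(h(three_mix))  # h2(1/3) < log 2
```

Adding $p_3$ to the mixture pulls it away from the uniform distribution $(1/2, 1/2)$, which is why the entropy strictly decreases.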