Show that entropy$(p_1,\dots,p_i,\dots,p_j,\dots,p_m) <$ entropy$(p_1,\dots,(p_i+p_j)/2,\dots,(p_i+p_j)/2,\dots,p_m)$


Show that the entropy of the probability distribution $(p_1,\dots,p_i,\dots,p_j,\dots,p_m)$ is less than the entropy of the distribution $(p_1,\dots,(p_i+p_j)/2,\dots,(p_i+p_j)/2,\dots,p_m)$.

I don't understand what is meant by this. I don't see the pattern.

Let's say I have $(p_1,p_2,p_3,p_4)$ for the first distribution. Is the second one $(p_1,(p_1+p_3)/2,(p_2+p_4)/2,p_4)$? Do I always take the average of the previous and next term?

Thank you!

1 Answer (accepted)

Nope: the second probability distribution keeps all but two components unchanged and sets both of those components equal to their average, $(p_i+p_j)/2$. Since all the other terms of the entropy sum are identical, what you are asked to show reduces to $$-p\ln p-q\ln q\leq-(p+q)\ln\frac{p+q}2,$$ with equality iff $p=q$. This is Jensen's inequality for the strictly concave function $f(x)=-x\ln x$: $$f(p)+f(q)\leq 2f\!\left(\frac{p+q}2\right).$$ Equivalently, the entropy $$S(\pi)=-\sum_{i=1}^n\pi_i\ln\pi_i$$ is a concave function of $\pi$, which follows directly from the definition.
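As a quick sanity check, here is a small Python sketch (the distribution `p` and the indices `i, j` are arbitrary choices for illustration) that computes the entropy before and after replacing two components by their average:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a probability distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

# An arbitrary example distribution; average components i=1 and j=3.
p = [0.1, 0.2, 0.3, 0.4]
i, j = 1, 3
q = list(p)
q[i] = q[j] = (p[i] + p[j]) / 2  # q = [0.1, 0.3, 0.3, 0.3]

print(entropy(p))  # entropy of the original distribution
print(entropy(q))  # strictly larger, since p[i] != p[j]
```

Running this shows `entropy(q) > entropy(p)`, matching the inequality above; the two agree exactly when $p_i = p_j$, since then averaging changes nothing.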