Lower bound of the negative entropy over probability distributions


I am trying to show that:

$$\inf_{z: 1^Tz=1} \sum_{i=1}^m {z_i \log z_i} = -\log m$$

I thought about Jensen's inequality or induction, but neither approach got me anywhere.


Best answer:

Jensen's inequality works directly. Since each term involves $\log z_i$, we may assume $z_i>0$. Consider the function $f(x)=x\log x$ for $x>0$. Since $f''(x)=\frac 1{x}>0$, the function $f$ is convex, so by Jensen's inequality

$$\frac{\sum f(z_i)}m\ge f \left(\frac{\sum z_i}m\right)=f\left(\frac 1m\right)=\frac 1m\log\frac 1m=-\frac{\log m}{m}.$$

Multiplying both sides by $m$ yields $\sum_{i=1}^m z_i\log z_i\ge -\log m$. Equality holds when each $z_i=\frac 1m$, so the infimum equals $-\log m$ and is attained at the uniform distribution.
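As a quick numerical sanity check (illustrative only, not part of the proof), one can sample random probability vectors and confirm that $\sum_i z_i\log z_i$ never falls below $-\log m$, with the bound attained at the uniform vector. The sampling scheme below (normalizing positive random draws) is just one convenient way to generate points on the simplex:

```python
import math
import random

# Draw random probability vectors z with sum(z) = 1, z > 0, and confirm
# that sum_i z_i * log(z_i) never falls below -log(m); the bound is
# attained at the uniform vector z_i = 1/m.

m = 5
bound = -math.log(m)

def neg_entropy(z):
    # sum_i z_i log z_i, the objective being minimized
    return sum(zi * math.log(zi) for zi in z)

random.seed(0)
for _ in range(10_000):
    raw = [random.random() + 1e-12 for _ in range(m)]  # strictly positive
    total = sum(raw)
    z = [r / total for r in raw]  # normalize onto the simplex
    assert neg_entropy(z) >= bound - 1e-12

# The bound is attained exactly at the uniform distribution.
uniform = [1.0 / m] * m
print(abs(neg_entropy(uniform) - bound) < 1e-12)  # True
```

The small floating-point tolerances account for rounding; mathematically the inequality is strict except at the uniform point.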