An upper bound on the binary entropy


The binary entropy is given by $$H_{\mathrm b}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p), \hspace{6 mm} 0 < p \le \frac{1}{2}.$$ How can I prove that $$H_{\mathrm b}(p) \le 2 \sqrt{p(1-p)}\,?$$


3 answers

Accepted answer (score 8)

Given $I=(0,1)$, for any $x\in I$ let $g(x)=-x\log x$ and $f(x)=g(x)+g(1-x)$.

Obviously $f\in C^{\infty}(I)$, $\,f(x)=f(1-x)$, $\,f>0$ and $$ \lim_{x\to 0^+}f(x)=\lim_{x\to 1^-}f(x) = 0. $$ We have $f'(x)=\log(1-x)-\log(x)$, hence:

$$\begin{eqnarray*} \frac{d}{dx}\frac{f(x)^2}{x(1-x)}&=&f(x)\cdot\frac{-x\log x+(1-x)\log(1-x)}{x^2(1-x)^2}\\&=&\frac{g(x)^2-g(1-x)^2}{x^2(1-x)^2}.\tag{1}\end{eqnarray*}$$ Since $g(x)>g(1-x)$ over $J=\left(0,\frac{1}{2}\right)$ (the proof of this fact comes later), the function $\frac{f(x)^2}{x(1-x)}$ is increasing over $J$, hence:

$$ f(x)\leq 2\cdot f\left(\frac{1}{2}\right)\cdot\sqrt{x(1-x)}=\color{blue}{\left(2\log 2\right)}\sqrt{x(1-x)} \tag{2}$$

follows for $x\in J$, and by the symmetry $f(x)=f(1-x)$ it extends to all of $I$. Back to the missing part: obviously $g(x)-g(1-x)$ vanishes at $0,\frac{1}{2},1$. In order to prove that these are the only zeros (so that $g(x)>g(1-x)$ on $J$), it is enough to check that $h(x)=g(x)-g(1-x)$ is concave over $J$. Pretty easy:

$$ \frac{d^2}{dx^2}h(x) = \frac{2x-1}{x(1-x)},\tag{3} $$ which is negative on $J$. The proof is finished by noticing that $f(p)=\log(2)\cdot H_{\mathrm b}(p)$, so $(2)$ is exactly $H_{\mathrm b}(p)\le 2\sqrt{p(1-p)}$.
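As a sanity check, the final inequality (with equality at $p=\frac{1}{2}$) can be verified numerically; a small Python sketch (the helper names `hb` and `bound` are mine, not from the proof):

```python
import math

def hb(p):
    """Binary entropy in bits, with the convention hb(0) = hb(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bound(p):
    """The claimed upper bound 2*sqrt(p(1-p))."""
    return 2 * math.sqrt(p * (1 - p))

# hb(p) <= 2*sqrt(p(1-p)) on a fine grid, with equality at p = 1/2
assert all(hb(i / 1000) <= bound(i / 1000) + 1e-12 for i in range(1001))
assert hb(0.5) == bound(0.5) == 1.0
```

The `1e-12` slack only guards against floating-point rounding; the inequality itself is strict away from $p=\frac{1}{2}$.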


A very tight approximation for the binary entropy function is given by:

$$ H_{\mathrm{b}}(p) \approx \left(4p(1-p)\right)^{\frac{3}{4}}.\tag{4}$$

It does not hold as an upper bound or a lower bound, but the absolute value of the difference between the RHS and the LHS of $(4)$ is always less than $8\cdot 10^{-3}$.
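The claimed error size is easy to probe numerically; a quick sketch (the grid and the names `hb` and `approx` are mine):

```python
import math

def hb(p):
    """Binary entropy in bits, 0 < p < 1."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def approx(p):
    """The approximation (4p(1-p))^(3/4) from (4)."""
    return (4 * p * (1 - p)) ** 0.75

# largest observed gap between the two sides on a fine grid over (0, 1);
# it comes out on the order of 8e-3, consistent with the claim above
max_gap = max(abs(hb(i / 10000) - approx(i / 10000)) for i in range(1, 10000))
print(max_gap)
```

The sign of the difference changes: the approximation overshoots for very small (and very large) $p$ and undershoots in between, so it is neither an upper nor a lower bound.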

Answer (score 8)

This inequality is actually not trivial. Here is a proof sketch. We know that there are three equality cases ($p=0$, $p=\frac{1}{2}$, $p=1$), so to get rid of the ones at the endpoints we consider the ratio $\frac{H(p)}{2\sqrt{p(1-p)}}$. By differentiating with respect to $p$ and carefully analyzing the result (elementary algebra plus the inequality $x-\frac{1}{2}x^2 \le \ln(1+x) \le x$ for $x \ge 0$) it is possible to show that the ratio is decreasing for $p > \frac{1}{2}$, which by symmetry means that it attains its maximum at $p = \frac{1}{2}$, where it equals $1$. It is not that hard, but not elegant.
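The monotonicity claim can be probed numerically before committing to the algebra; a short sketch (the grid resolution and the name `ratio` are mine):

```python
import math

def ratio(p):
    """H_b(p) / (2*sqrt(p(1-p))) for 0 < p < 1, entropy in bits."""
    hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return hb / (2 * math.sqrt(p * (1 - p)))

# the ratio equals 1 at p = 1/2 and decreases as p moves toward 1
vals = [ratio(0.5 + i / 1000) for i in range(499)]
assert vals[0] == 1.0
assert all(a > b for a, b in zip(vals, vals[1:]))
```

By the symmetry $p \mapsto 1-p$ the same decrease holds moving from $\frac{1}{2}$ toward $0$, so the grid over $(\frac{1}{2}, 1)$ suffices.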

Answer (score 0)

You may use the inequality $\ln(x)\le \sqrt{x-1}$ for any $x\ge 1$ (the function $\sqrt{x-1}-\ln x$ vanishes at $x=1$ and its derivative is $\frac{(\sqrt{x-1}-1)^2}{2x\sqrt{x-1}}\ge 0$). More precisely, note that \begin{equation} \begin{aligned} H(p)&=p\ln\frac{1}{p} +(1-p)\ln\frac{1}{1-p} \\ &\le p\sqrt{\frac{1}{p}-1} +(1-p)\sqrt{\frac{1}{1-p}-1}\\ &= 2\sqrt{p(1-p)} \end{aligned} \end{equation} Here the logarithm is natural (base $e$), so this establishes the bound for the entropy in nats, $\ln 2\cdot H_{\mathrm b}(p)$, which is weaker than the stated base-2 inequality.
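Both steps of the chain can be checked numerically; a quick sketch (the grids are mine), with the entropy taken in nats as in the argument above:

```python
import math

# step 1: ln(x) <= sqrt(x - 1) for x >= 1
assert all(math.log(1 + i / 100) <= math.sqrt(i / 100) + 1e-12
           for i in range(10001))

# step 2: applying it termwise bounds the natural-log entropy
for i in range(1, 1000):
    p = i / 1000
    h_nats = -p * math.log(p) - (1 - p) * math.log(1 - p)
    assert h_nats <= 2 * math.sqrt(p * (1 - p)) + 1e-12
```

The algebra behind the last equality is $p\sqrt{\frac{1}{p}-1} = p\sqrt{\frac{1-p}{p}} = \sqrt{p(1-p)}$, and likewise for the second term.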