Entropy of $f(x)=1$


Let $f(x)$ be the probability density function $f(x) = 1$ on $x \in [0,1]$, and define entropy as $$H(p(x)) = -\int p(x) \log_2(p(x)) \, dx$$ where $p(x)$ is a pdf. Unless I've made an arithmetic error, the entropy is $H(f(x)) = 0$: $$H(f(x)) = -\int_0^1 1 \cdot \log_2(1) \,dx = 0.$$ Given that uniform distributions maximize entropy, this seems counter-intuitive. Other than the mathematical definition, is there an intuitive explanation for why this is true? (Assuming it is true.)
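As a quick sanity check on the arithmetic, the integral can be approximated numerically. A minimal sketch (the grid size is an arbitrary choice, not from the question):

```python
import numpy as np

# Approximate H(f) = -∫_0^1 f(x) log2(f(x)) dx for the uniform
# density f(x) = 1, using a midpoint Riemann sum.
n = 100_000
x = (np.arange(n) + 0.5) / n     # midpoints of a uniform grid on [0, 1]
f = np.ones_like(x)              # f(x) = 1 on [0, 1]
integrand = -f * np.log2(f)      # log2(1) = 0, so the integrand vanishes
H = np.sum(integrand) / n
print(H)  # 0.0
```

The integrand is identically zero, so the sum is exactly $0$ regardless of the grid.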

BEST ANSWER

Differential entropy can actually be negative, which is one of its drawbacks. It just so happens that on $[0,1]$ every continuous distribution has nonpositive entropy, with equality exactly for the uniform distribution. Let $h(x)$ be any continuous density on $[0,1]$ and let $u(x)=1$ be the uniform density. Here's the proof using KL divergence (the argument works for any base of the logarithm):

$$0\leq D_{KL}(h(x)\,\|\,u(x))=\int_0^1 h(x) \log\left(\frac{h(x)}{u(x)}\right)dx=-H(h(x))-\int_0^1 h(x)\log(u(x))\,dx=-H(h(x)),$$

since $u(x)=1$. So

$$H(h(x))\leq 0.$$
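The inequality can be illustrated numerically. A sketch, assuming $h(x) = 2x$ (the Beta(2,1) density) as an arbitrary non-uniform example on $[0,1]$; in nats, its entropy has the closed form $\tfrac12 - \ln 2 \approx -0.193$:

```python
import numpy as np

# Entropy of h(x) = 2x on [0, 1] (Beta(2,1) density, chosen as an
# illustrative non-uniform example) via a midpoint Riemann sum.
# H(h) = -∫_0^1 h(x) log h(x) dx, natural log, so the result is in nats.
n = 1_000_000
x = (np.arange(n) + 0.5) / n   # midpoints avoid the endpoint x = 0
h = 2.0 * x
H = -np.sum(h * np.log(h)) / n
print(H)  # ≈ 0.5 - ln 2 ≈ -0.1931, strictly negative
```

Any non-uniform choice of $h$ on $[0,1]$ gives a strictly negative value, matching the bound above.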

By the way, nonnegativity of KL divergence is a consequence of Jensen's inequality applied to the convex function $-\log$:

$$D_{KL}(f\|g)=\int\log \left(\frac{f(x)}{g(x)}\right)f(x)\,dx=\int-\log\left(\frac{g(x)}{f(x)}\right) f(x)\,dx\geq -\log\left(\int \frac{g(x)}{f(x)}\,f(x)\,dx\right)= -\log\left(\int g(x)\,dx\right)=-\log(1)=0.$$
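To see the nonnegativity concretely, here is a sketch assuming $f$ is the Beta(2,2) density $6x(1-x)$ and $g$ the uniform density on $[0,1]$ (both choices are illustrative, not from the answer):

```python
import numpy as np

# D_KL(f || g) = ∫ f(x) log(f(x)/g(x)) dx via a midpoint Riemann sum,
# with f = Beta(2,2) density and g = uniform density (example choices).
n = 1_000_000
x = (np.arange(n) + 0.5) / n   # midpoints avoid f = 0 at the endpoints
f = 6.0 * x * (1.0 - x)        # Beta(2, 2) pdf on [0, 1]
g = np.ones_like(x)            # uniform pdf on [0, 1]
dkl = np.sum(f * np.log(f / g)) / n
print(dkl)  # ≈ 0.125, nonnegative as Jensen's inequality guarantees
```

Since $g=1$ here, this value also equals $-H(f)$, consistent with the bound $H(h)\leq 0$ proved above.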