How is it that the Kullback-Leibler divergence is always non-negative, while differential entropy can be either positive or negative?


According to Wikipedia, we always have $$ D_{KL}(f || g) \geq 0, $$ but if $f$ is the pdf of a random variable $X$ and $g$ is the density of the un-normalized Lebesgue measure, i.e. the constant function $1$, then $$ D_{KL}(f || g) = \int f \log(f/g) \, dx = \int f \log f \, dx = -h(X), $$ the negative of the differential entropy. Many distributions, however, have positive differential entropy (as shown here), so the left-hand side can take negative values. What's going on here?
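The apparent paradox is easy to check numerically. The sketch below (my own illustration, not from the question) takes $f$ to be the pdf of a standard normal and $g \equiv 1$, and evaluates $\int f \log f \, dx$ by a simple Riemann sum; the result matches $-h(X) = -\tfrac{1}{2}\log(2\pi e) \approx -1.419$, a negative "divergence":

```python
import numpy as np

# Numerical check: f = pdf of N(0, 1), g = 1 (the "density" of
# Lebesgue measure).  Then D_KL(f || g) = \int f log f dx = -h(X).
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)

# f * log(f) -> 0 as f -> 0, so guard against log(0) at the tails.
integrand = np.where(f > 0, f * np.log(np.where(f > 0, f, 1.0)), 0.0)
kl = np.sum(integrand) * dx

# Closed form: -h(X) = -(1/2) log(2*pi*e) for a standard normal.
analytic = -0.5 * np.log(2.0 * np.pi * np.e)
print(kl, analytic)
```

Both values come out near $-1.419$, confirming that this "KL divergence" against an unnormalized reference is negative.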


Best answer:

It is not necessarily true that $KL[f || g] \geq 0$ when $f$ and $g$ are not both probability densities. In your case $g$ does not integrate to one, so the non-negativity result fails, as your counterexample demonstrates.

For more intuition about what breaks down, let's look at a standard proof of the non-negativity of the KL divergence for probability densities $f$ and $g$. The proof uses Jensen's inequality applied to the convex function $-\log$:

$$KL[f || g] = - \int \log\!\left(\frac{g}{f}\right) f \, dx \;\geq\; -\log \int \frac{g}{f} \, f \, dx \;=\; -\log \int g \, dx = 0.$$

Note that the last step relies on the assumption that $\int g \, dx = 1$. In your scenario (with $g$ the density of Lebesgue measure, presumably on all of $\mathbb{R}$), this integral is infinite, so Jensen's inequality only yields the vacuous bound $KL[f || g] \geq -\log \infty = -\infty$.
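Conversely, once $g$ is a genuine pdf the bound does hold. As a sanity check (my own example, with an assumed choice of Gaussians), the sketch below computes $KL[f || g]$ for $f = \mathcal{N}(0,1)$ and $g = \mathcal{N}(1, 2^2)$ numerically and compares it with the known closed form for Gaussians, $\log(\sigma_1/\sigma_0) + \frac{\sigma_0^2 + (\mu_0-\mu_1)^2}{2\sigma_1^2} - \frac{1}{2}$; both are non-negative, as Jensen guarantees:

```python
import numpy as np

# f = pdf of N(0, 1), g = pdf of N(1, 2^2): both integrate to one,
# so the Jensen argument applies and KL[f || g] >= 0.
x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
g = np.exp(-(x - 1.0)**2 / 8.0) / np.sqrt(8.0 * np.pi)

kl = np.sum(f * np.log(f / g)) * dx

# Closed form: log(s1/s0) + (s0^2 + (m0 - m1)^2) / (2 s1^2) - 1/2
closed = np.log(2.0) + (1.0 + 1.0) / 8.0 - 0.5
print(kl, closed)
```

Both values agree (about $0.443$), and the divergence is non-negative because $\int g \, dx = 1$ makes the final equality in the Jensen argument valid.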