Is there a counterexample that shows that the KL divergence does not satisfy the triangle inequality?


The KL divergence, or relative entropy, of two probability distributions $p,q$ on $\Omega$ is defined as:

$$ H(p\mid q) = \int_{\Omega} p(\omega) \log \frac{p(\omega)}{q(\omega)} \, d\omega $$

(for discrete $\Omega$, the integral is replaced by a sum).

This is called a divergence rather than a distance because it is not symmetric and does not satisfy the triangle inequality. I have seen counterexamples for symmetry, but I was wondering if anyone has a simple counterexample showing that the triangle inequality fails.

Best answer:

Take $\Omega=\{0,1\}$; $p(0)=1/2$, $q(0)=1/4$, $r(0)=1/10$:

$$ H(p\mid q)+H(q\mid r)\approx 0.144+0.092=0.236 \;<\; H(p\mid r)\approx 0.511 $$

(all logarithms taken with base $e$).
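The counterexample is easy to check numerically. Below is a minimal sketch in Python (the helper name `kl` is my own choice, not from the answer) that evaluates the discrete KL divergence with natural logarithms for the three distributions above:

```python
import math

def kl(p, q):
    # Discrete KL divergence H(p|q) = sum_i p_i * log(p_i / q_i), natural log.
    # Assumes all entries of p and q are strictly positive.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Distributions on Omega = {0, 1}, given by (prob of 0, prob of 1).
p = [1/2, 1/2]
q = [1/4, 3/4]
r = [1/10, 9/10]

lhs = kl(p, q) + kl(q, r)  # ≈ 0.236
rhs = kl(p, r)             # ≈ 0.511
print(lhs, rhs, lhs < rhs)  # lhs < rhs, so the triangle inequality fails
```

Since $H(p\mid q)+H(q\mid r) < H(p\mid r)$, the triangle inequality $H(p\mid r) \le H(p\mid q)+H(q\mid r)$ is violated.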