An inequality relating the Kullback-Leibler divergence of two discrete distributions with constant reference distribution


Suppose that $D_{KL}(\textbf{p}_1||\textbf{q})<1$ and $D_{KL}(\textbf{p}_2||\textbf{q})<1$. I'm trying to show that either $D_{KL}(\textbf{p}_1||\textbf{p}_2)$ or $D_{KL}(\textbf{p}_2||\textbf{p}_1)$ is bounded above by something close to $1$, provided $\textbf{q}$ is fixed. It seems intuitive that if $\textbf{p}_1$ and $\textbf{q}$ are similar enough, and if $\textbf{p}_2$ and $\textbf{q}$ are also similar enough, then $\textbf{p}_2$ can reasonably approximate $\textbf{p}_1$, or vice versa. Is this actually true?

Best answer:

This is false because the KL divergence does not satisfy the triangle inequality.

Consider $\mathcal{X} = \{0,1\}$, $p_1(0) = 0.9$, $p_1(1) = 0.1$, $p_2(0) = 0.1$, $p_2(1) = 0.9$, $q(0) = q(1) = 0.5$. Then

$$D_{KL}(p_1 || q) = D_{KL}(p_2 || q) \approx 0.3681 \text{ nats},$$

but

$$D_{KL}(p_1 || p_2) = D_{KL}(p_2 || p_1) \approx 1.7578 \text{ nats}.$$

The discrepancy becomes even more pronounced as $p_1(0)$ and $p_2(1)$ approach $1$: in that limit, $D_{KL}(p_1 || p_2) = D_{KL}(p_2 || p_1)$ tends to infinity. The intuition is that $p_1$ and $p_2$ approach distributions with disjoint supports, which forces the KL divergence between them to infinity. Meanwhile, with $q$ uniform as it is here, $$D_{KL}(p_1 || q) = D_{KL}(p_2 || q) \leq \log(2),$$ so both divergences from $q$ stay bounded below $1$ nat, and the hypothesis remains satisfied. Therefore the conjecture is false.
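The numbers above can be checked with a short script. This is a minimal sketch, not part of the original answer; the `kl` helper is an illustrative implementation of the discrete KL divergence in nats (terms with $p_i = 0$ are skipped, following the convention $0 \log 0 = 0$):

```python
import math

def kl(p, q):
    """Discrete Kullback-Leibler divergence D_KL(p || q), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p1 = [0.9, 0.1]
p2 = [0.1, 0.9]
q  = [0.5, 0.5]

print(round(kl(p1, q), 4))   # 0.3681
print(round(kl(p2, q), 4))   # 0.3681
print(round(kl(p1, p2), 4))  # 1.7578
print(round(kl(p2, p1), 4))  # 1.7578
```

Both divergences to $q$ come out well under $\log(2) \approx 0.6931$, while the divergences between $p_1$ and $p_2$ exceed $1$, confirming the counterexample.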