Bound on the variance of a distribution based on a divergence measure.

Given two absolutely continuous distributions $p(x)$ and $q(x)$ on $\mathbb{R}^d$ with the same first moment and finite second moments, is it possible to bound the second moment of $q$ given that $KL(p\,\|\,q) \le \epsilon$ and that the second moment of $p$ equals some constant $C$? Here $KL(p\,\|\,q)$ denotes the Kullback–Leibler divergence.


No.

For $n \ge 2$, let
$$P_n = \Big(1-\frac{1}{n^2}\Big)\delta_0 + \frac{1}{2n^2}\,\delta_n + \frac{1}{2n^2}\,\delta_{-n}, \qquad Q_n = \Big(1-\frac{1+n}{n^2}\Big)\delta_0 + \frac{1+n}{2n^2}\,\delta_n + \frac{1+n}{2n^2}\,\delta_{-n}$$
(the condition $n \ge 2$ keeps the mass of $Q_n$ at $0$ nonnegative). Then $P_n$ and $Q_n$ both have mean zero, $P_n$ has variance $1$, $Q_n$ has variance $n+1$, and, since the two atoms at $\pm n$ each contribute $\frac{1}{2n^2}\log\frac{1}{1+n}$,
$$KL(P_n\,\|\,Q_n) = \Big(1-\frac{1}{n^2}\Big)\log\frac{1-\frac{1}{n^2}}{1-\frac{1+n}{n^2}} - \frac{1}{n^2}\log(1+n) \to 0 \quad \text{as } n\to\infty.$$
(These measures are discrete rather than absolutely continuous, but convolving both with a narrow Gaussian yields absolutely continuous examples with the same means, essentially the same variances, and, by the data-processing inequality, no larger KL divergence.)

(It follows from a similar construction that $Q$ can even have infinite variance while $KL(P\,\|\,Q)$ is arbitrarily small.)
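A quick numerical sanity check of the counterexample (a Python sketch; the helper names `kl`, `variance`, `P`, and `Q` are mine, not from the original post) shows the variance of $Q_n$ growing linearly while the KL divergence shrinks toward zero:

```python
import math

def kl(p, q):
    """KL divergence between two discrete distributions given as {atom: prob} dicts."""
    return sum(px * math.log(px / q[x]) for x, px in p.items())

def variance(d):
    """Variance of a discrete distribution given as an {atom: prob} dict."""
    mean = sum(x * p for x, p in d.items())
    return sum(p * (x - mean) ** 2 for x, p in d.items())

def P(n):
    # P_n = (1 - 1/n^2) delta_0 + 1/(2n^2) delta_n + 1/(2n^2) delta_{-n}
    return {0: 1 - 1 / n**2, n: 1 / (2 * n**2), -n: 1 / (2 * n**2)}

def Q(n):
    # Q_n = (1 - (1+n)/n^2) delta_0 + (1+n)/(2n^2) delta_{+-n};
    # requires n >= 2 so that the mass at 0 is nonnegative.
    return {0: 1 - (1 + n) / n**2,
            n: (1 + n) / (2 * n**2),
            -n: (1 + n) / (2 * n**2)}

for n in (10, 100, 1000):
    print(f"n={n}: Var P_n = {variance(P(n)):.3f}, "
          f"Var Q_n = {variance(Q(n)):.3f}, KL = {kl(P(n), Q(n)):.2e}")
```

As expected, $\mathrm{Var}\,P_n = 1$ and $\mathrm{Var}\,Q_n = n+1$ exactly, while the KL divergence decays on the order of $\log(n)/n$ from the dominant first term.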