Gaussian approximation to arbitrary distribution in Kullback–Leibler divergence


The Kullback–Leibler divergence of two densities $p_1,p_2$ over $\mathbb R^d$ is$\def\KL{\mathrm{KL}}$ $$ D_\KL(p_1,p_2)=\int_{\mathbb R^d} p_1(x) \log\frac{p_1 (x)} { p_2(x)}\,\mathrm d x.$$ I know from Gibbs' inequality that $D_{\KL}$ is non-negative. Now I want to prove the following: let $p_2$ be a fixed density and $p_1$ the density of a Gaussian distribution; then $D_{\KL}(p_1,p_2)$ is minimized exactly when $p_1$ has the same mean and the same covariance matrix as $p_2$.

I guess this might require a very technical proof.


Actually your conjecture is false. Let $p_2$ be a centered Laplace density with unit variance, i.e. $p_2(x)=\frac{1}{2b}e^{-|x|/b}$ with $b=1/\sqrt 2$.

If $p_1$ is the density of a Gaussian with mean zero and variance $s$, then, using $\mathbb E|X|=\sqrt{2s/\pi}$ for $X\sim\mathcal N(0,s)$, $$D_{\mathrm{KL}}(p_1,p_2)=-\tfrac12\log(2\pi e s)+\log(2b)+\frac1b\sqrt{\frac{2s}{\pi}}.$$ Setting the derivative in $s$ to zero gives $\sqrt s=b\sqrt{\pi/2}$, i.e. $s=b^2\pi/2=\pi/4\approx 0.785$. So $D_{\mathrm{KL}}(p_1,p_2)$ is not minimized at $s=1$.
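A quick numerical check (a sketch, assuming $p_2$ is the Laplace density with scale $b=1/\sqrt 2$, i.e. unit variance) confirms that the minimizing variance sits near $\pi/4\approx 0.785$ rather than at $s=1$:

```python
import numpy as np

b = 1 / np.sqrt(2)  # Laplace scale giving unit variance

def kl_gauss_laplace(s):
    """Closed form of D_KL(N(0, s), Laplace(0, b)):
    -0.5*log(2*pi*e*s) + log(2b) + E|X|/b, with E|X| = sqrt(2s/pi)."""
    return (-0.5 * np.log(2 * np.pi * np.e * s)
            + np.log(2 * b)
            + np.sqrt(2 * s / np.pi) / b)

s_grid = np.linspace(0.1, 2.0, 19001)          # grid search over candidate variances
s_min = s_grid[np.argmin(kl_gauss_laplace(s_grid))]
print(s_min, np.pi / 4)  # the minimizer agrees with pi/4, not 1
```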
