KL Divergence between the sums of random variables.


The relative entropy or Kullback–Leibler distance between two probability density functions $g(x)$ and $f(x)$ is defined as $$D(g\|f) = \int_{x} g(x)\log\frac{g(x)}{f(x)} dx .$$ We have two random variables $V$ and $W$, \begin{equation*} \begin{split} &V=X_1+X_2, \text{where}\ X_1\sim g(x), X_2\sim f(x)\ \text{are independent},\\ &W=X_3+X_4, \text{where}\ X_3, X_4\sim f(x)\ \text{are independent}. \end{split} \end{equation*} It is easy to show that \begin{equation*} \begin{split} &V\sim G(x)=(g\ast f)(x),\\ &W\sim F(x)=(f\ast f)(x), \end{split} \end{equation*} where $(g\ast f)(x) = \int g(\tau)f(x-\tau)d\tau$ is the convolution of $g$ and $f$.

The questions are:

  1. Is it true that $D(g\|f)> D(G\|F)?$

  2. Is it true that $\frac{1}{2}D(g\|f)> D(G\|F)?$

If we can prove 2, then 1 follows immediately, since $\frac{1}{2}D(g\|f) \le D(g\|f)$. Both inequalities hold for Poisson and Gaussian distributions; however, I cannot prove them in the general case.
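For the Gaussian case, the claim can be checked numerically using the closed-form KL divergence between two normal densities; the sketch below uses arbitrary example parameters (not taken from the question) and relies on the fact that convolving independent normals adds their variances:

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL divergence D(N(m1, s1^2) || N(m2, s2^2))."""
    return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# Example parameters: g = N(0, 1), f = N(0, 4) (standard deviations 1 and 2).
# Convolution of independent normals adds variances:
#   G = g * f = N(0, 5),  F = f * f = N(0, 8).
d_gf = kl_gauss(0.0, 1.0, 0.0, 2.0)                      # D(g || f)
d_GF = kl_gauss(0.0, math.sqrt(5), 0.0, math.sqrt(8))    # D(G || F)

print(d_gf, d_GF)
# Both proposed inequalities hold for these parameters:
assert d_GF < 0.5 * d_gf < d_gf
```

One caveat worth noting: for equal variances, e.g. $g = N(\mu, \sigma^2)$ and $f = N(0, \sigma^2)$, the closed form gives $D(G\|F) = \frac{\mu^2}{4\sigma^2} = \frac{1}{2}D(g\|f)$ exactly, so inequality 2 holds with equality rather than strictly in that case.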

1 Answer


The first inequality is a direct consequence of the chain rule for KL divergence (equivalently, the data-processing inequality) applied to the additive-noise "channel" $Y = X + Z$, where the noise $Z$ (here $X_2$ or $X_4$) is drawn from $f$ independently of the input.
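In more detail, the chain-rule argument can be sketched as follows: expand the divergence between the joint laws of $(X_1, V)$ and $(X_3, W)$ in both orders. Since both channels add independent noise with density $f$, the conditional densities $p_{V\mid X_1}(v\mid x) = f(v-x)$ and $p_{W\mid X_3}(w\mid x) = f(w-x)$ coincide, so the conditional divergence given the input vanishes:

```latex
\begin{align*}
D\bigl(p_{X_1,V}\,\big\|\,p_{X_3,W}\bigr)
  &= D(g\|f)
   + \underbrace{D\bigl(p_{V\mid X_1}\,\big\|\,p_{W\mid X_3}\,\big|\, g\bigr)}_{=\,0}
   = D(g\|f),\\
D\bigl(p_{X_1,V}\,\big\|\,p_{X_3,W}\bigr)
  &= D(G\|F)
   + \underbrace{D\bigl(p_{X_1\mid V}\,\big\|\,p_{X_3\mid W}\,\big|\, G\bigr)}_{\ge\, 0}
   \ge D(G\|F).
\end{align*}
```

Combining the two expansions gives $D(G\|F) \le D(g\|f)$. Note this argument only yields the non-strict inequality; strictness requires the second expansion's conditional term to be positive, which calls for additional conditions on $g$ and $f$.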