Relative entropy (KL divergence) of sum of random variables


Suppose we have two independent random variables $X$ and $Y$ with different probability distributions. What is the relative entropy between the pdf of $X$ and the pdf of $X+Y$, i.e. $$D(P_X \parallel P_{X+Y}),$$ assuming all support conditions are met? I know that in general the pdf of $X+Y$ is the convolution of the pdfs of $X$ and $Y$, but is there an easier way to calculate the relative entropy, or at least to simplify it?


There are two answers below.

Answer 1

Let $f(t)$ be the PDF of $X$ and $g(t)$ be the PDF of $Y$. $$D_{KL}(P_X\parallel P_{X+Y}) = \int_{-\infty}^{+\infty}f(x)\log\frac{f(x)}{(f*g)(x)}\,dx$$ does not admit any obvious simplification, but the term

$$\log\frac{f(x)}{(f*g)(x)}=\log\frac{\int_{-\infty}^{+\infty} f(t)\,\delta(x-t)\,dt}{\int_{-\infty}^{+\infty} f(t)\,g(x-t)\,dt} $$ can be effectively controlled if some information about the concentration/decay of $g(t)$ is known.

Is this the case?
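The integral above can also be sanity-checked numerically by discretizing the convolution. A minimal sketch, assuming (my choice, not from the question) that $X, Y \sim \mathcal N(0,1)$: then $X+Y \sim \mathcal N(0,2)$, and the Gaussian closed form $D(\mathcal N(0,1)\parallel \mathcal N(0,2)) = \tfrac12\ln 2 - \tfrac14 \approx 0.0966$ provides a reference value.

```python
import numpy as np

# Numerical sanity check of D(P_X || P_{X+Y}) via a discretized convolution.
# Assumed example (not from the question): X, Y ~ N(0, 1), so X + Y ~ N(0, 2),
# and the closed form gives D = 0.5*ln(2) - 0.25 ~= 0.0966.
dx = 0.01
x = np.arange(-10, 10, dx)
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # pdf of X
g = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # pdf of Y
fg = np.convolve(f, g, mode="same") * dx     # pdf of X + Y on the same grid
kl = np.sum(f * np.log(f / fg)) * dx         # Riemann sum for the KL integral
print(round(kl, 4))  # ~= 0.0966
```

The grid sum matches the closed form to about three decimal places; `np.convolve(..., mode="same")` introduces a one-bin shift of the convolved density, which is negligible at this resolution.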

Answer 2

Here's an attempt.

$$D(P_X\parallel P_{X+Y}) = \mathbb{E}\left[\log \frac{f(X)}{(f*g)(X)}\right],$$ where $f$ and $g$ are the pdfs of $X$ and $Y$ and $f*g$ is the pdf of $X+Y$. Note that the second density must be evaluated at the same argument $X$, not at $X+Y$: by independence, $\mathbb{E}\left[\log \frac{f(X)}{(f*g)(X+Y)}\right] = h(X+Y)-h(X) = I(X+Y;Y)$, which is a different quantity.

The expectation is $+\infty$ whenever $f>0$ on a set of positive measure where $f*g=0$, i.e. whenever the support of $X$ is not contained in the support of $X+Y$. This happens, for example, when $g$ is uniform on $[1,2]$ and $f$ is uniform on $[0,1]$: then $X+Y$ is supported on $[1,3]$, so $(f*g)(x)=0$ for $x\in[0,1)$ while $f(x)>0$ there.
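The infinite-divergence case can be made concrete numerically. A short sketch with a hypothetical choice of distributions (uniform on shifted intervals, so the supports of $X$ and $X+Y$ are disjoint):

```python
import numpy as np

# Illustration of the infinite-divergence case (hypothetical choice of
# distributions): X ~ Uniform[0,1), Y ~ Uniform[1,2), so X + Y lives on [1,3]
# and its density (f*g)(x) vanishes on [0,1), exactly where f(x) > 0.
dx = 0.01
x = np.arange(-5, 5, dx)
f = ((x >= 0) & (x < 1)).astype(float)    # pdf of X: Uniform[0,1)
g = ((x >= 1) & (x < 2)).astype(float)    # pdf of Y: Uniform[1,2)
fg = np.convolve(f, g, mode="same") * dx  # pdf of X + Y: triangle on [1,3]
# Total mass of X + Y's density on the support of X:
overlap = float(np.sum(fg[f > 0]) * dx)
print(overlap)  # 0.0 -> the integrand log(f / (f*g)) blows up, D = +infinity
```

Since the density of $X+Y$ carries no mass anywhere $X$ does, the KL integrand is $+\infty$ on all of $\operatorname{supp} X$ and the divergence diverges.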