Relative entropy (divergence) of sum of two Gaussian random variables


Let $X$ and $Y$ be two Gaussian random variables with $X \sim N(m_1, s_1)$ and $Y \sim N(m_2, s_2)$, with density functions $f_1(x)$ and $f_2(x)$ respectively. Let $f(x)$ be the density function of the sum $Z = X + Y$.

I want to derive $D(f\mathrel{\|}f_2)$. I used the definition $$D(p\mathrel{\|}q) = -\int_{-\infty}^{\infty} p(x) \ln\left( \frac{q(x)}{p(x)} \right) \,\mathrm dx$$ with $p = f$ and $q = f_2$, but I could not write down the density function of the sum of the two Gaussian random variables.

BEST ANSWER

If $X$ and $Y$ are independent, then $f$ is the density of $N(m_1+m_2,\, s_1+s_2)$ (assuming $s_1$ and $s_2$ denote variances): the sum of independent Gaussians is Gaussian, with means and variances adding.
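This is easy to check numerically by Monte Carlo. A minimal sketch (the parameter values are chosen purely for illustration; note `numpy` parameterizes by standard deviation, not variance):

```python
import numpy as np

# Illustrative parameters: m1, m2 are means; s1, s2 are variances.
m1, s1 = 1.0, 2.0
m2, s2 = -0.5, 3.0

rng = np.random.default_rng(0)
n = 1_000_000
# Sample Z = X + Y; normal() takes the standard deviation, hence sqrt.
z = rng.normal(m1, np.sqrt(s1), n) + rng.normal(m2, np.sqrt(s2), n)

print(z.mean())  # should be close to m1 + m2 = 0.5
print(z.var())   # should be close to s1 + s2 = 5.0
```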

From here you have explicit expressions for $f$ and $f_2$, so you can compute $D(f \| f_2)$ directly from the formula.
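For reference, the integral can be evaluated in closed form. The KL divergence between two univariate Gaussians is (writing variances as $\sigma^2$)
$$D\bigl(N(\mu_0,\sigma_0^2)\,\|\,N(\mu_1,\sigma_1^2)\bigr) = \ln\frac{\sigma_1}{\sigma_0} + \frac{\sigma_0^2 + (\mu_0-\mu_1)^2}{2\sigma_1^2} - \frac12.$$
Substituting $\mu_0 = m_1+m_2$, $\sigma_0^2 = s_1+s_2$ (the sum) and $\mu_1 = m_2$, $\sigma_1^2 = s_2$, so that $\mu_0 - \mu_1 = m_1$, gives
$$D(f\mathrel{\|}f_2) = \frac12\ln\frac{s_2}{s_1+s_2} + \frac{s_1+s_2+m_1^2}{2s_2} - \frac12.$$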