Kullback-Leibler divergence from mixture distribution to its components


Suppose $f$ is the density of a mixture distribution with half the weight on a standard normal distribution and half the weight on a logistic distribution rescaled to have standard deviation $1$.

Now let $g$ be the density of the standard normal distribution and $h$ the density of the rescaled logistic distribution. Then $$g(x) = \frac{1}{\sqrt{2\pi}}e^{-x^2/2}$$ $$h(x) = \frac{\pi}{\sqrt{3}} \frac{e^{-\pi x/\sqrt{3}}}{(1+e^{-\pi x/\sqrt{3}})^2}$$ $$f(x) = \frac{1}{2}g(x) + \frac{1}{2}h(x).$$

I am wondering about the Kullback-Leibler divergences $KL(f, g)$ and $KL(f, h)$. In particular, I'd like to know which one is smaller.

I have simulated a time series from this mixture distribution and performed Bayesian model averaging on it, with Gaussian and logistic models among the candidates. In theory, the posterior weight of the model with the smaller divergence should converge to $1$, but the posterior weights do not appear to converge, even after $7500$ observations. This made me want to attack the problem analytically. Could it be that the divergences are equal, or at least very close?
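One observation that may be relevant: since $f$ is itself an equal-weight mixture, joint convexity of the KL divergence gives
$$KL(f, g) = KL\left(\tfrac{1}{2}g + \tfrac{1}{2}h,\; \tfrac{1}{2}g + \tfrac{1}{2}g\right) \le \tfrac{1}{2}KL(g, g) + \tfrac{1}{2}KL(h, g) = \tfrac{1}{2}KL(h, g),$$
and symmetrically $KL(f, h) \le \tfrac{1}{2}KL(g, h)$. Since the standardized logistic is already quite close to the standard normal, both divergences must be tiny, which would at least be consistent with the posterior weights separating very slowly.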

I am struggling to calculate the divergences myself, since I either end up with a very nasty integral or a very nasty expectation. Could someone help me figure this out?
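In the meantime, the nasty integrals can at least be evaluated numerically. Below is a sketch using `scipy.integrate.quad`; the truncation of the integration range to $[-30, 30]$ is my own assumption, justified by the (at worst exponentially decaying) tails of $f$, and chosen so that $g$ does not underflow inside the range.

```python
import numpy as np
from scipy import integrate

s = np.sqrt(3) / np.pi  # logistic scale parameter giving standard deviation 1

def g(x):
    # standard normal density
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def h(x):
    # logistic density rescaled to standard deviation 1
    return np.exp(-x / s) / (s * (1 + np.exp(-x / s))**2)

def f(x):
    # equal-weight mixture of g and h
    return 0.5 * g(x) + 0.5 * h(x)

def kl(p, q, lo=-30.0, hi=30.0):
    # KL(p || q) = ∫ p log(p/q); the integrand vanishes in the tails
    # because p does, so truncating at ±30 is harmless here
    integrand = lambda x: p(x) * np.log(p(x) / q(x))
    val, _ = integrate.quad(integrand, lo, hi, limit=200)
    return val

kl_fg = kl(f, g)
kl_fh = kl(f, h)
print(f"KL(f, g) = {kl_fg:.6f}")
print(f"KL(f, h) = {kl_fh:.6f}")
```

By convexity both values are bounded by half the divergence between the pure components, so they should come out very small (well below $0.05$), meaning the quadrature tolerance matters more than usual here.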