Bound KL divergence between two distributions by KL divergence of two Gaussian mixture models


I'm trying to bound the KL divergence between two continuous random variables by the KL divergence between two Gaussian mixture approximations. This is motivated by the fact that Gaussian mixture models are universal approximators of densities; see https://stats.stackexchange.com/questions/365155/a-gaussian-mixture-model-is-a-universal-approximator-of-densities/365158#365158

In the end, I would like to plug in the closed-form expression for the KL divergence between Gaussians.
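For reference, while the KL divergence between two Gaussian mixtures has no closed form in general, the KL divergence between two individual multivariate Gaussians does. A minimal sketch (function name and variable names are my own):

```python
import numpy as np

def kl_gaussians(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) for multivariate Gaussians.

    KL = 0.5 * ( tr(S1^{-1} S0) + (mu1-mu0)^T S1^{-1} (mu1-mu0)
                 - k + ln(det S1 / det S0) )
    where k is the dimension.
    """
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + d @ S1_inv @ d
                  - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))
```

For example, two identical Gaussians give KL = 0, and for univariate N(0, 1) versus N(1, 1) the formula yields 0.5.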

So my question: are there any results on choosing the parameters of a Gaussian mixture model that approximates a given density, and on how the KL divergence between a density and its Gaussian mixture approximation behaves?