Kullback–Leibler divergence between a quasi-arithmetic mean of Normal distributions and a standard Normal distribution


Is there a way to compute, in closed form, the Kullback–Leibler divergence between a quasi-arithmetic mean of Normal distributions and a standard Normal distribution?

$$D_{KL} = D_{KL}\left(f^{-1}\left(\sum _{i=1} ^N\frac{f(\mathcal{N}(\mu _i, \sigma_i^2))}{N}\right) \ \middle|\middle| \ \mathcal{N}(0, I)\right)$$
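To make the object concrete, here is a grid-based numerical sketch of the quantity above for the particular choice $f = \log$ (the geometric mean), which is an assumption for illustration only. Note that the pointwise quasi-arithmetic mean of densities need not integrate to $1$, so the sketch renormalizes numerically before estimating the divergence:

```python
import numpy as np

def quasi_arithmetic_density(x, mus, sigma2s, f=np.log, f_inv=np.exp):
    """Pointwise quasi-arithmetic mean of 1-D normal densities on a grid x.

    Defaults to f = log, i.e. the geometric mean of the densities (an
    illustrative choice). The result is renormalized numerically, since
    f_inv(mean of f(densities)) need not integrate to 1 in general.
    """
    mus = np.asarray(mus, float)
    s = np.sqrt(np.asarray(sigma2s, float))
    dens = np.exp(-0.5 * ((x[:, None] - mus) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    g = f_inv(np.mean(f(dens), axis=1))
    dx = x[1] - x[0]
    return g / (np.sum(g) * dx)  # renormalize on the grid

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
q = quasi_arithmetic_density(x, [0.5, -1.0], [0.8, 1.5])
p = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)  # standard normal density

# Numerical KL( q || N(0,1) ) by Riemann sum
kl = np.sum(q * np.log(q / p)) * dx
print(kl)
```

For $f = \log$ the mean of two Gaussian log-densities is again quadratic, so $q$ is itself Gaussian after renormalization; for a general $f$ no such closed form is available, which is exactly the difficulty in the question.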

Or, since this is probably not possible, is there an upper bound $D_{KL}^{\prime} \ge D_{KL}$ that can be computed in closed form, such that minimizing $D_{KL}^{\prime}$ also minimizes $D_{KL}$?

For example, for the arithmetic mean (where $f(z) = az + b$), the joint convexity of the Kullback–Leibler divergence gives, via Jensen's inequality: $$D_{KL} \left( \sum _{i=1} ^N \frac{\mathcal{N}(\mu _i, \sigma_i^2)}{N} \ \middle|\middle| \ \mathcal{N}(0, I) \right) \leq \sum _{i=1} ^N \frac{1}{N} D_{KL} \left( \mathcal{N}(\mu _i, \sigma_i^2)\ \middle|\middle|\ \mathcal{N}(0, I) \right)$$ where the right-hand side can be computed in closed form.
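As a sanity check of the arithmetic-mean case, the sketch below computes the closed-form right-hand side, using $D_{KL}(\mathcal{N}(\mu,\sigma^2)\,\|\,\mathcal{N}(0,1)) = \tfrac{1}{2}(\mu^2 + \sigma^2 - 1 - \ln \sigma^2)$, and compares it with a Monte Carlo estimate of the left-hand side (the mixture parameters are arbitrary illustrative values):

```python
import numpy as np

def kl_normal_std(mu, sigma2):
    # Closed-form KL( N(mu, sigma2) || N(0, 1) )
    return 0.5 * (mu**2 + sigma2 - 1.0 - np.log(sigma2))

def kl_bound_arithmetic(mus, sigma2s):
    # Jensen bound: KL( mixture || N(0,1) ) <= average of component KLs
    return float(np.mean([kl_normal_std(m, s2) for m, s2 in zip(mus, sigma2s)]))

def kl_mixture_mc(mus, sigma2s, n=200_000, seed=0):
    # Monte Carlo estimate of KL( (1/N) sum_i N(mu_i, sigma_i^2) || N(0,1) )
    rng = np.random.default_rng(seed)
    mus = np.asarray(mus, float)
    s = np.sqrt(np.asarray(sigma2s, float))
    N = len(mus)
    idx = rng.integers(N, size=n)          # sample mixture components
    x = rng.normal(mus[idx], s[idx])       # sample from the mixture
    # log of the mixture density at the samples
    comp = -0.5 * ((x[:, None] - mus) / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)
    log_q = np.logaddexp.reduce(comp, axis=1) - np.log(N)
    log_p = -0.5 * x**2 - 0.5 * np.log(2 * np.pi)  # standard normal log-density
    return float(np.mean(log_q - log_p))

mus, sigma2s = [0.5, -1.0], [0.8, 1.5]
print(kl_bound_arithmetic(mus, sigma2s))   # closed-form upper bound
print(kl_mixture_mc(mus, sigma2s))         # <= bound, up to Monte Carlo noise
```

The Monte Carlo estimate is only there to verify the inequality numerically; in an optimization loop one would minimize the closed-form bound directly.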

I know that $f$ can be assumed to be strictly increasing without loss of generality (Handbook of Means and Their Inequalities, P. S. Bullen, p. 271). Also, the Kullback–Leibler divergence is nonnegative and jointly convex in its arguments.

However, I can't see how to use these properties to derive an upper bound on $D_{KL}$ for a general $f$.