In Bayesian inference, it is sometimes necessary to divide a Gaussian mixture (GM) posterior distribution by a GM prior. Suppose the posterior GM is given by $$p_{1}(x) = \sum_{i=1}^{n} \alpha_{i} \mathcal{N} (x; \mu_{i}, \sigma_{i}), $$ where $\mathcal{N} (\cdot; \mu, \sigma)$ is a Gaussian density with mean $\mu$ and standard deviation $\sigma$, and $\alpha_{i} \in [0, 1]$ with $\sum_{i=1}^{n} \alpha_{i} = 1$, and the prior is given by $$ p_{2}(x) = \sum_{i=1}^{m} \beta_{i} \mathcal{N}(x; \nu_{i}, \lambda_{i}), $$ where $\beta_{i} \in [0, 1]$ with $\sum_{i=1}^{m} \beta_{i} = 1$. How do I approximate $$p_{3} (x) = \frac{p_{1} (x)}{p_{2} (x)} $$ as a GM?
I have attempted to represent both $p_1$ and $p_2$ by their power series and then carry out the division using formal power series division. My questions are:
- Is power series division a reasonable approach to this problem? I've plotted the power series of $p_{3}$ and compared it to a pointwise estimate of $p_{1} / p_{2}$; the two agree on certain intervals, but the power series diverges elsewhere.
- Is it possible to determine the radius and interval of convergence of $p_3$'s power series? I believe $e^{x} = \sum_{n=0}^{\infty} x^{n} / n!$ converges for all $x \in \mathbb{R}$, so the power series of $p_{1}$ and $p_{2}$ should also converge for all $x$; it is not clear to me what this implies for the quotient series.
- Is it possible to determine, or approximate, a Gaussian's central moments using its power series representation?
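To make the attempt concrete, here is a minimal sketch of what I am doing (all function names are my own). It builds the Taylor coefficients of each mixture about a point $x_0$ using the identity $\varphi^{(n)}(z) = (-1)^{n} \mathrm{He}_{n}(z)\,\varphi(z)$, where $\mathrm{He}_n$ is the probabilist's Hermite polynomial and $\varphi$ the standard normal density, then divides the two formal series term by term:

```python
import math

import numpy as np


def gm_pdf(weights, means, sigmas, x):
    """Pointwise density of a Gaussian mixture."""
    return sum(
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        for w, m, s in zip(weights, means, sigmas)
    )


def gm_taylor_coeffs(weights, means, sigmas, x0, order):
    """Taylor coefficients a_n = f^(n)(x0)/n! of a GM density f about x0.

    For one component N(x; mu, sigma), with z = (x0 - mu)/sigma,
    f^(n)(x0) = (-1)^n sigma^-(n+1) He_n(z) phi(z),
    where He_n is the probabilist's Hermite polynomial.
    """
    coeffs = np.zeros(order + 1)
    for w, mu, s in zip(weights, means, sigmas):
        z = (x0 - mu) / s
        phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
        he = np.zeros(order + 1)
        he[0] = 1.0
        if order >= 1:
            he[1] = z
        for n in range(1, order):
            # Three-term recurrence: He_{n+1}(z) = z He_n(z) - n He_{n-1}(z)
            he[n + 1] = z * he[n] - n * he[n - 1]
        for n in range(order + 1):
            deriv = w * (-1) ** n * s ** -(n + 1) * he[n] * phi
            coeffs[n] += deriv / math.factorial(n)
    return coeffs


def series_divide(a, b):
    """Coefficients of the formal power series a(t)/b(t); requires b[0] != 0.

    Standard recursion: c_0 = a_0/b_0,
    c_n = (a_n - sum_{k=1}^{n} b_k c_{n-k}) / b_0.
    """
    c = np.zeros_like(a)
    c[0] = a[0] / b[0]
    for n in range(1, len(a)):
        c[n] = (a[n] - np.dot(b[1:n + 1], c[n - 1::-1])) / b[0]
    return c
```

Evaluating `np.polyval(c[::-1], x - x0)` on the quotient coefficients then gives the truncated series for $p_3$, which I compare against the pointwise ratio `gm_pdf(...) / gm_pdf(...)`; this is where I observe agreement near $x_0$ and divergence farther away.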