How to learn a Gaussian mixture with an inequality constraint on the component variances


Let $f_1(x),\ldots,f_n(x)$ be Gaussian density functions with different parameters, where $\mu_i$ and $\sigma_i$ are the mean and variance of component $i$, and let $w_1,\ldots,w_n$ be nonnegative real numbers that sum to one. Then $g(x)=\sum_i w_i f_i(x)$ is also a density function, which I call a Gaussian mixture density.

I am trying to learn a Gaussian mixture with two components, $g(x)=w_1f_1(x)+w_2f_2(x)$, subject to one inequality constraint: $\sigma_1 < \sigma_2$.

For the unconstrained Gaussian mixture, I use the EM algorithm to learn the parameters. For this constrained mixture, I don't know how to proceed. One workaround I considered is to run EM many times with random initializations and keep only the solutions that satisfy the constraint, but I am not sure this approach is correct. Another approach is, during the M step, to reset one of the parameters whenever the constraint is violated so that it holds again. Again, this is just a hack with no theoretical guarantee of convergence.
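To make the first workaround concrete, here is a minimal sketch of it in Python/NumPy: run standard (unconstrained) EM for a two-component 1-D mixture from several random initializations, then keep the highest-likelihood solution among the runs that ended with $\sigma_1 < \sigma_2$. The function names (`em_gmm2`, `fit_constrained`) and all numerical settings are my own illustrative choices, not an established API.

```python
import numpy as np

def em_gmm2(x, n_iter=200, rng=None):
    """One EM run for a two-component 1-D Gaussian mixture, random init."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    mu = rng.choice(x, size=2, replace=False)   # random means from the data
    var = np.full(2, np.var(x))                 # start both variances at the overall variance
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E step: responsibilities, computed in log space for stability
        d = x[:, None] - mu[None, :]
        log_p = -0.5 * (np.log(2 * np.pi * var) + d**2 / var) + np.log(w)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M step: standard unconstrained updates
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
        var = np.maximum(var, 1e-8)             # guard against degenerate components
    return w, mu, var

def loglik(x, w, mu, var):
    """Mixture log-likelihood of the data under the fitted parameters."""
    d = x[:, None] - mu[None, :]
    p = w * np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(p.sum(axis=1)).sum()

def fit_constrained(x, n_restarts=20, seed=0):
    """Random-restart EM; keep the best run satisfying var[0] < var[1].

    Returns (loglik, w, mu, var), or None if no run satisfied the constraint.
    """
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_restarts):
        w, mu, var = em_gmm2(x, rng=rng)
        if var[0] < var[1]:                     # the inequality constraint
            ll = loglik(x, w, mu, var)
            if best is None or ll > best[0]:
                best = (ll, w, mu, var)
    return best
```

Note that with only two exchangeable components, a run ending with $\sigma_1 > \sigma_2$ can simply be relabeled, so in this simple setting the filter mostly amounts to sorting components by variance; the filtering formulation matters when the labels carry additional meaning.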

Please kindly advise. Thanks.