One step in the derivation of the expectation maximization algorithm.


Although there are answers to related problems, I've found no direct solution to this specific one.

I'm reading this in-depth introduction to the expectation maximization algorithm and got stuck on a step in its derivation.

On page 6, step 4 states

$$ \geq \Bigg[\sum_z P(z|X,\theta_n)\cdot\ln\bigg(\frac{P(X|z,\theta)P(z|\theta)}{P(z|X,\theta_n)}\bigg)\Bigg] - \ln\big(P(X|\theta_n)\big)$$

which in the next line results in

$$ = \sum_z P(z|X,\theta_n)\cdot\ln\bigg(\frac{P(X|z,\theta)P(z|\theta)}{P(z|X,\theta_n)P(X|\theta_n)}\bigg)$$

What happened in between those lines?

Best answer:

Note that $\sum_z P(z|X, \theta_n)=1$, so the constant $\ln P(X|\theta_n)$ can be multiplied by this sum and pulled inside:
\begin{align*}
\bigg[\sum_z P(z|X, \theta_n)\cdot &\ln \left(\frac{P(X|z, \theta)P(z|\theta)}{P(z|X, \theta_n)}\right)\bigg]-\ln P(X|\theta_n) \\
&= \bigg[\sum_z P(z|X, \theta_n)\cdot \ln \left(\frac{P(X|z, \theta)P(z|\theta)}{P(z|X, \theta_n)}\right)\bigg]-\ln P(X|\theta_n)\sum_z P(z|X, \theta_n)\\
&= \bigg[\sum_z P(z|X, \theta_n)\cdot \ln \left(\frac{P(X|z, \theta)P(z|\theta)}{P(z|X, \theta_n)}\right)\bigg]-\sum_z P(z|X, \theta_n)\ln P(X|\theta_n) \\
&= \sum_z P(z|X, \theta_n)\cdot \left( \ln \left(\frac{P(X|z, \theta)P(z|\theta)}{P(z|X, \theta_n)}\right)-\ln P(X|\theta_n)\right)\\
&= \sum_z P(z|X, \theta_n)\cdot \ln \left(\frac{P(X|z, \theta)P(z|\theta)}{P(z|X, \theta_n)P(X|\theta_n)}\right),
\end{align*}
where the last equality is the usual log property $\ln(a)-\ln(b)=\ln(a/b)$.
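As a quick numerical sanity check (not part of the original answer), the identity can be verified on arbitrary positive values: `q` plays the role of the posterior $P(z|X,\theta_n)$ (which must sum to 1), `a` stands for the products $P(X|z,\theta)P(z|\theta)$, and `c` for $P(X|\theta_n)$; all concrete numbers below are made up for illustration.

```python
import math

q = [0.2, 0.5, 0.3]     # P(z|X, theta_n) for each z; sums to 1
a = [0.06, 0.10, 0.04]  # P(X|z, theta) * P(z|theta); arbitrary positives
c = 0.2                 # P(X|theta_n); arbitrary positive constant

# Left side: sum_z q(z) * ln(a(z)/q(z))  -  ln(c)
lhs = sum(qz * math.log(az / qz) for qz, az in zip(q, a)) - math.log(c)

# Right side: ln(c) pulled inside the sum via sum_z q(z) = 1
rhs = sum(qz * math.log(az / (qz * c)) for qz, az in zip(q, a))

assert abs(lhs - rhs) < 1e-12  # both expressions agree
```

Because the rearrangement only uses $\sum_z q(z)=1$ and $\ln(a)-\ln(b)=\ln(a/b)$, the two sides agree for any choice of positive `a`, `c`, and normalized `q`.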