variational lower bound


I am reading https://xyang35.github.io/2017/04/14/variational-lower-bound/ second derivation for KL divergence.

If you check the equation, you will see that at the end it arrives at: $$ = -L + \log P(X) $$ But I cannot understand how $\log P(X)$ was isolated, since it is multiplied by a summation over $Q(z)$ in the line before.

I looked into this, and the explanation is that the sum of $Q(z)$ is just 1. OK, but why didn't we use 1 in the first term as well? Then we would get: $$ = \log \frac{P(x,z)}{Q(z)} + \log P(x) $$ How could $Q(z)$ just disappear from the second term only?

I will assume that what you are wondering about is the step:

$$-\int_Z q(Z)\log\left(\frac{P(X,Z)}{q(Z)}\right)dZ + \log p(X)\int_Z q(Z)\,dZ = -L+\log(p(X))$$

It is because if we look at the second term:

$$\log p(X)\cdot\underset{=1}{\underbrace{\int_Z q(Z)\,dZ}} = \log p(X) \cdot 1 = \log p(X)$$ Note that $\log p(X)$ does not depend on $Z$, so it can be pulled out of the integral, and what remains is the total probability mass of $q$. This is because of the unity of the probability measure: any probability measure must have its density function $f(t)$ fulfill

$$\int_{\Omega} f(t)dt=1$$

where $\Omega$ is its event space. The first term cannot be simplified the same way, because there the integrand $q(Z)\log\left(\frac{P(X,Z)}{q(Z)}\right)$ genuinely depends on $Z$; only a $Z$-independent factor can be pulled outside, leaving $\int_Z q(Z)\,dZ = 1$ behind.
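As a quick numerical sanity check, here is a sketch with a discrete toy model (the joint probabilities and the choice of $q$ are made-up numbers, not from the linked post). It verifies the identity $\mathrm{KL}(q(Z)\,\|\,p(Z|X)) = -L + \log p(X)$, where $L$ is the variational lower bound:

```python
import numpy as np

# Hypothetical joint p(X=x, Z=z) for a fixed observation x and 3 latent states z.
p_xz = np.array([0.10, 0.05, 0.15])
# Any normalized variational distribution q(z); sum(q_z) == 1.
q_z = np.array([0.5, 0.2, 0.3])

log_px = np.log(p_xz.sum())        # log p(x) = log sum_z p(x, z)
p_z_given_x = p_xz / p_xz.sum()    # posterior p(z | x) = p(x, z) / p(x)

# Variational lower bound: L = sum_z q(z) log(p(x, z) / q(z))
L = np.sum(q_z * np.log(p_xz / q_z))

# KL(q(Z) || p(Z|X)) = sum_z q(z) log(q(z) / p(z|x))
kl = np.sum(q_z * np.log(q_z / p_z_given_x))

# The identity from the derivation: KL = -L + log p(x)
print(kl, -L + log_px)
assert np.isclose(kl, -L + log_px)
```

Here the "disappearing" $\sum_z q(z)$ is exactly the factor `q_z.sum() == 1` multiplying the $Z$-independent term $\log p(x)$; the first term keeps $q(z)$ because it sits inside the sum.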