How to estimate distribution via transition probability in MCMC


In Chapter 11 of PRML, equation (11.73) uses samples obtained from a Markov chain to define the importance-sampling distribution:

$$ P_G(z) = \frac{1}{Z_G}\exp(-G(z)) = \frac{1}{L}\sum^L_{l=1}T(z^{(l)},z), $$ where $T(z,z')$ is the transition probability of the Markov chain and the sample set is $z^{(1)},\dots,z^{(L)}$.

For a Markov chain with transition probability $T$, a distribution $p$ is said to be invariant if $$ p(x) = \sum_{x'} T(x',x)p(x'). $$
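For concreteness, the invariance condition can be checked numerically. The sketch below uses a hypothetical 2-state chain (not from PRML); `T[i, j]` stands for $T(x'=i,\, x=j)$:

```python
import numpy as np

# Hypothetical 2-state chain, rows sum to 1: T[x', x]
T = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Candidate invariant distribution p(x')
p = np.array([0.75, 0.25])

# Invariance: p(x) = sum_{x'} T(x', x) p(x'), i.e. p @ T == p
print(p @ T)  # -> [0.75 0.25], so p is invariant under T
```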

But I don't understand why an average of transition probabilities can be used to estimate the proposal distribution. Is there a proof or a relevant theorem?

1 answer below


In the invariance condition $$p(x) = \sum_{x'} T(x', x)p(x'),$$ the right-hand side is the expectation of $T(x', x)$ with respect to the distribution $p(x')$. A Monte Carlo estimate of this expectation, using samples $x'^{(1)},\dots,x'^{(L)}$ drawn from $p$, is $$p(x) = \sum_{x'} T(x', x)p(x') \approx \frac{1}{L}\sum_{l=1}^{L}T(x'^{(l)},x) \,.$$ Since the states visited by the Markov chain (after burn-in) are approximately samples from its invariant distribution, setting $p(x) = P_G(x)$ gives the desired result.