Gibbs sampling is a well-established and popular technique for drawing samples from Bayesian networks (BNs). In PyMC3, Metropolis sampling is another popular approximate inference technique for sampling BNs, but in my opinion a less intuitive one. Because of the transition probability of the Markov chain and the (symmetric) proposal distribution $g$, I have difficulty working out the example below of Metropolis sampling for a small BN.
Suppose we are working with the Student BN below and its corresponding conditional probability tables (CPTs). For the topological ordering $(D, I, G, S, L)$ I consider the initial state $$ x^{(0)} = (D = d^1, I = i^0, G = g^1, S = s^0, L = l^0). $$ My question: if we take the proposal distribution $g$ to be a random walk, which states $x'$ can we visit?
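To make the setup concrete, here is roughly how I would encode the network and the initial state in Python. The CPT entries are placeholders (except the $0.4/0.6$ for $D$, which I take from my acceptance-ratio attempt further down), and I am treating every variable as binary (coded 0/1) for simplicity, so this is only a sketch, not the actual tables from the figure:

```python
# Sketch of the Student BN; CPT numbers are placeholders, not the real tables.
# Each variable is coded 0/1, e.g. D=0 means d^0, D=1 means d^1.

variables = ["D", "I", "G", "S", "L"]                      # topological ordering
parents = {"D": (), "I": (), "G": ("D", "I"), "S": ("I",), "L": ("G",)}

cpts = {
    "D": {(): [0.4, 0.6]},                                 # P(d^0), P(d^1) -- from my ratio below
    "I": {(): [0.5, 0.5]},                                 # placeholder
    "G": {(d, i): [0.5, 0.5] for d in (0, 1) for i in (0, 1)},  # placeholder
    "S": {(i,): [0.5, 0.5] for i in (0, 1)},               # placeholder
    "L": {(g,): [0.5, 0.5] for g in (0, 1)},               # placeholder
}

def joint_prob(x):
    """P(x) as the product of one CPT entry per variable, given its parents."""
    p = 1.0
    for v in variables:
        pa = tuple(x[u] for u in parents[v])
        p *= cpts[v][pa][x[v]]
    return p

# Initial state x^(0) = (d^1, i^0, g^1, s^0, l^0)
x0 = {"D": 1, "I": 0, "G": 1, "S": 0, "L": 0}
```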
My attempt:
If we change one variable at a time according to the fixed ordering (as in Gibbs sampling), we could obtain $x' = (D = d^0, I = i^0, G = g^1, S = s^0, L = l^0)$. Is this approach correct? (See the proposal sketch below for what I mean in code.)
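In code, I imagine the proposal roughly like this, flipping one binary variable at a time; whether the variable should be picked uniformly at random or cycled through in the fixed ordering is exactly part of my confusion:

```python
import random

def propose(x, variables=("D", "I", "G", "S", "L")):
    """My guess at a 'random walk' proposal: flip one variable of the state.

    Picking the variable uniformly at random and flipping a binary value
    should make the proposal symmetric, g(x' | x) = g(x | x'), so it would
    cancel in the acceptance ratio. (Alternatively one could walk through
    the fixed ordering, which is what I tried above.)
    """
    x_new = dict(x)
    v = random.choice(variables)   # variable to change
    x_new[v] = 1 - x_new[v]        # flip the 0/1 coded value
    return x_new
```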
But how do we determine the acceptance ratio $\alpha$? Is it based on the CPTs? For example, \begin{align*} \alpha = \frac{P(x')}{P(x^{(0)})} = \frac{0.4 \cdot \ldots}{0.6 \cdot \ldots} = z. \end{align*} How do we decide whether to accept or reject $x'$? Do we draw a random number $u$, such that we accept $x'$ as $x^{(1)}$ if $u > z$?
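For reference, this is how I would code one Metropolis step based on the generic pseudocode I have seen elsewhere (with a uniform draw $u$ and the rule "accept if $u < \alpha$"), but I am not sure this is correct for a BN:

```python
import random

def metropolis_step(x, joint_prob, propose):
    """One Metropolis step as I understand it (please correct me if wrong)."""
    x_prop = propose(x)
    # With a symmetric proposal the ratio is just P(x') / P(x),
    # and each P is the product of CPT entries for that full assignment.
    alpha = min(1.0, joint_prob(x_prop) / joint_prob(x))
    u = random.random()                 # uniform draw on [0, 1), not an integer
    return x_prop if u < alpha else x   # accept with probability alpha

# e.g. x1 = metropolis_step(x0, joint_prob, propose)  # using the sketches above
```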
Can somebody help me to work out this (simple) example?