I have a question about a formula in a machine learning paper. The paper is:
https://arxiv.org/pdf/1906.02691.pdf
On page 9, I fully agree with formula (1.6), since it is the well-known factorization from Prof. Koller's book "Probabilistic Graphical Models". The formula is:
$p_\theta(x_1,...,x_M) = \prod\limits_{j=1}^Mp_\theta(x_j|Pa(x_j))$ (1.6)
However, I cannot agree with formula (2.2) on page 16, which conditions on $x$ and reads:
$q_\phi(z|x) = q_\phi(z_1,...,z_M|x) = \prod\limits_{j=1}^Mq_\phi(z_j|Pa(z_j),x)$ (2.2)
I think $q_\phi(z_j|Pa(z_j),x)$ should be changed to $q_\phi(z_j|x)$, since $Pa(z_j)$ equals $x$. This refers to the case where the graphical model is $x \to z$.
Even if the graphical model is not $x \to z$ but $x \to v \to z$, which implies $Pa(z_j) = v$ after introducing an intermediate variable $v$, I think formula (2.2) still has a problem. In this case, I think the original formula (2.2) should be changed to the following.
$q_\phi(z|v,x) = q_\phi(z_1,...,z_M|v,x) = \prod\limits_{j=1}^Mq_\phi(z_j|v,x)$
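To make my reading of the two factorizations concrete, here is a minimal toy sketch in Python. It is not from the paper: the chain structure $z_1 \to z_2$ (so that $Pa(z_2) = \{z_1\}$ inside $z$), the Gaussian conditionals, and all coefficients are my own invented stand-ins for an inference network. It contrasts formula (2.2), where each $z_j$ may have parents among the other latents, with the fully factorized $\prod_j q_\phi(z_j|x)$ that I claim it reduces to when $Pa(z_j) = x$.

```python
import math

def log_normal(z, mean, std):
    """Log density of a univariate Gaussian N(mean, std^2) at z."""
    return -0.5 * math.log(2 * math.pi) - math.log(std) - 0.5 * ((z - mean) / std) ** 2

def log_q_with_latent_parents(z, x):
    """log q(z1, z2 | x) = log q(z1 | x) + log q(z2 | z1, x),
    i.e. formula (2.2) under an assumed chain z1 -> z2.
    The linear means (0.5*x, 0.3*z1 + 0.2*x) are arbitrary toy
    parameters standing in for a neural network's output."""
    z1, z2 = z
    lp = log_normal(z1, mean=0.5 * x, std=1.0)               # Pa(z1) is empty
    lp += log_normal(z2, mean=0.3 * z1 + 0.2 * x, std=1.0)   # Pa(z2) = {z1}
    return lp

def log_q_fully_factorized(z, x):
    """log q(z1 | x) + log q(z2 | x): the form I believe (2.2)
    collapses to when no z_j has parents among the latents."""
    z1, z2 = z
    return (log_normal(z1, mean=0.5 * x, std=1.0)
            + log_normal(z2, mean=0.2 * x, std=1.0))

z, x = (0.4, -1.2), 0.7
print(log_q_with_latent_parents(z, x))
print(log_q_fully_factorized(z, x))
```

The two log densities coincide only when the dependence of $z_2$ on $z_1$ vanishes, which is exactly the distinction my question is about.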
If someone can derive formula (2.2), please help me. Thank you.