Suppose we have the Markov chain $X \to Y \to Z$, or equivalently $$I(X;Z \mid Y)=0, \tag{1}$$ where $I(\cdot\,;\cdot)$ denotes mutual information. Does the Markov chain $X \to (Y,W) \to Z$ also hold, i.e. is $$I(X;Z \mid Y,W)=0~? \tag{2}$$ Intuitively, I think (2) is true, but how can it be proved? Thanks in advance.
Another simple question, this time about probabilities: $$P(Z=z \mid X=x,Y=y)\cdot P(Y=y)=P(Z=z,Y=y \mid X=x) \tag{3}$$ Is (3) correct? I am confused. Thanks again.
Regarding the first: (2) does *not* follow from (1) in general, because conditioning on an extra variable $W$ can create dependence between $X$ and $Z$. A standard counterexample: let $X$ and $W$ be independent fair bits, let $Z = X \oplus W$, and let $Y$ be a constant. Then $I(X;Z \mid Y) = I(X;Z) = 0$, since $Z$ is uniform and independent of $X$; but $I(X;Z \mid Y,W) = I(X;Z \mid W) = H(X) = 1$ bit, because once $W$ is known, $Z$ determines $X$. Note that the tempting shortcut of defining $S=(Y,W)$ does not help: the chain $X \to Y \to Z$ says nothing about the chain $X \to S \to Z$. (The implication does hold in special cases, e.g. when $W$ is independent of $(X,Y,Z)$, since then $I(X;Z\mid Y,W)=I(X;Z\mid Y)$.)
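The counterexample can be checked numerically. The sketch below (variable names and coordinate layout are my own choices) computes conditional mutual information by brute force over the joint distribution of $(X, W, Z)$ with $X, W$ independent fair bits and $Z = X \oplus W$; since $Y$ is a constant, conditioning on $Y$ alone is vacuous and is omitted.

```python
from itertools import product
from math import log2

# Counterexample distribution: X, W independent fair bits, Z = X XOR W.
# Y is constant, so conditioning on Y alone changes nothing.
# Outcome coordinates: X = 0, W = 1, Z = 2.
p = {(x, w, x ^ w): 0.25 for x, w in product((0, 1), repeat=2)}

def marg(coords):
    """Marginal distribution over the given coordinate positions."""
    out = {}
    for outcome, pr in p.items():
        key = tuple(outcome[i] for i in coords)
        out[key] = out.get(key, 0.0) + pr
    return out

def mi_xz_given(cond):
    """I(X; Z | coordinates listed in `cond`), in bits."""
    pj, px, pz, pc = (marg((0, 2) + cond), marg((0,) + cond),
                      marg((2,) + cond), marg(cond))
    total = 0.0
    for (x, z, *c), pr in pj.items():
        c = tuple(c)
        total += pr * log2(pr * pc[c] / (px[(x,) + c] * pz[(z,) + c]))
    return total

print(mi_xz_given(()))    # I(X;Z|Y) = I(X;Z) = 0.0
print(mi_xz_given((1,)))  # I(X;Z|Y,W) = I(X;Z|W) = 1.0
```

So (1) holds while (2) fails by a full bit, confirming the counterexample.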
Regarding the second, simplifying/abusing notation:
$$P(Z | X Y ) P(Y) = \frac{P(Z X Y)}{P(XY)} P(Y)$$
and $$ P(Z Y |X)=\frac{P(Z X Y)}{P(X)}$$
These are equal iff $\frac{P(Y)}{P(XY)}=\frac{1}{P(X)}$, i.e. $P(XY)=P(X)P(Y)$ — that is, iff $X$ and $Y$ are independent. So (3) does not hold in general.
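A quick numeric sanity check of this conclusion, using two illustrative toy distributions over outcomes $(x,y,z)$ — one with $X,Y$ independent (the two sides of (3) agree) and one with $Y=X$ (they differ):

```python
from itertools import product

def prob(p, pred):
    """Total probability of outcomes (x, y, z) satisfying `pred`."""
    return sum(pr for outcome, pr in p.items() if pred(*outcome))

def sides(p, x, y, z):
    """Return (LHS, RHS) of (3): P(Z=z|X=x,Y=y)P(Y=y) vs P(Z=z,Y=y|X=x)."""
    lhs = (prob(p, lambda X, Y, Z: (X, Y, Z) == (x, y, z))
           / prob(p, lambda X, Y, Z: (X, Y) == (x, y))
           * prob(p, lambda X, Y, Z: Y == y))
    rhs = (prob(p, lambda X, Y, Z: (X, Y, Z) == (x, y, z))
           / prob(p, lambda X, Y, Z: X == x))
    return lhs, rhs

# X, Y independent fair bits, Z = X AND Y: the two sides agree.
p_indep = {(x, y, x & y): 0.25 for x, y in product((0, 1), repeat=2)}
print(sides(p_indep, 1, 1, 1))   # (0.5, 0.5)

# Y = X (fully dependent), Z = X: the two sides differ.
p_dep = {(x, x, x): 0.5 for x in (0, 1)}
print(sides(p_dep, 1, 1, 1))     # (0.5, 1.0)
```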