I have a very simple problem, but I am having difficulty proving it rigorously.
Suppose random variables $X, Y$ and $Z$ form the following Markov chain: $X\leftrightarrow Y\leftrightarrow Z$. My intuition says that if $W$ is another random variable independent of $X$ and $Z$, then the following $$X\leftrightarrow (Y, W)\leftrightarrow Z$$ also form a Markov chain.
Any simple proof or disproof for this statement?
I am thinking more in terms of information theory. In the information-theoretic context, the Markov chain $X\leftrightarrow Y\leftrightarrow Z$ means that $Z$ carries no "information" about $X$ once $Y$ is revealed; in other words, all the information about $X$ contained in $Z$ passes through $Y$. So, since $W$ is independent of $X$ and $Z$, my intuition is that once $Y$ and $W$ are revealed, $Z$ should still carry no information about $X$.
It depends on $W$. If $W=f(Y)$, then you are right. However, if $W$ is independent of $Y$, conditioning on $W$ can *increase* the conditional mutual information $I(X;Z\mid Y,W)$. For example, let $X$ and $Z$ be i.i.d. uniform bits and let $Y$ be independent of $(X,Z)$, so that $X\leftrightarrow Y\leftrightarrow Z$ holds trivially. Take $W = X\oplus Z$, the mod-$2$ sum, which is independent of $X$ and independent of $Z$ (though not of the pair). Then $I(X;Z\mid Y)=0$ but $I(X;Z\mid Y,W)=1$, which violates the desired Markov chain.
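The counterexample is easy to check numerically. Below is a small sketch that enumerates the joint distribution (with $Y$ taken as a degenerate constant, which is one valid choice of a $Y$ independent of $(X,Z)$) and computes the two conditional mutual informations directly from the definition; the helper `cmi` is my own, not from any library.

```python
from itertools import product
from math import log2
from collections import defaultdict

# Joint pmf for the counterexample: X, Z i.i.d. uniform bits,
# Y a constant (so X <-> Y <-> Z holds trivially), W = X XOR Z.
p = defaultdict(float)
for x, z in product([0, 1], repeat=2):
    y = 0          # degenerate Y, independent of (X, Z)
    w = x ^ z      # mod-2 sum: independent of X and of Z individually
    p[(x, y, z, w)] += 0.25

def cmi(p, i, j, cond):
    """I(A;B|C) in bits for a joint pmf over tuples, with A at index i,
    B at index j, and C the tuple of conditioning indices."""
    def marg(indices):
        m = defaultdict(float)
        for k, v in p.items():
            m[tuple(k[t] for t in indices)] += v
        return m
    pabc = marg((i, j) + cond)
    pac = marg((i,) + cond)
    pbc = marg((j,) + cond)
    pc = marg(cond)
    total = 0.0
    for k, v in pabc.items():
        a, b, c = k[0], k[1], k[2:]
        total += v * log2(v * pc[c] / (pac[(a,) + c] * pbc[(b,) + c]))
    return total

# variable indices: 0 = X, 1 = Y, 2 = Z, 3 = W
print(cmi(p, 0, 2, (1,)))     # I(X;Z|Y)   -> 0.0
print(cmi(p, 0, 2, (1, 3)))   # I(X;Z|Y,W) -> 1.0
```

Conditioning on $W$ couples $X$ and $Z$ completely: given $W$, each bit determines the other, so the conditional mutual information jumps from $0$ to $1$ bit.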