Extracting a smaller Markov chain from a larger Markov chain


I am not very familiar with Markov chains, hence the probably ill-titled question.

If we have five random variables $X, Y, Z, W, P$ and they form a Markov chain such that

$$X \rightarrow Y \rightarrow Z \rightarrow W \rightarrow P$$

Is it also the case that

$$X \rightarrow Y \rightarrow P$$ or $$X \rightarrow Y \rightarrow W\,?$$

Intuitively I would assume it is true, because no information about $X$ can be gained as we move along the chain, so it should be fine to remove a link.

Secondly, if I recall correctly, if $Z = f(Y)$ then $X \rightarrow Y \rightarrow Z$ is a Markov chain. So in my example $f$ would just be a composition of such functions.
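As a small numerical sanity check of that claim (a sketch with NumPy; the distributions and the function `f` below are random stand-ins, not anything from the question): when $Z = f(Y)$, the joint $p(x,y,z)$ automatically has the Markov factorization $p(x)\,p(y|x)\,p(z|y)$ with $p(z|y) = \mathbf{1}\{z = f(y)\}$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3  # each variable takes values in {0, 1, 2}

pX = rng.random(n); pX /= pX.sum()                      # p(x)
pY_X = rng.random((n, n))
pY_X /= pY_X.sum(axis=1, keepdims=True)                 # p(y | x), rows sum to 1

f = np.array([1, 0, 1])  # a hypothetical function: Z = f(Y)

# Joint p(x, y, z) induced by drawing X, then Y | X, then setting Z = f(Y)
joint = np.zeros((n, n, n))
for x in range(n):
    for y in range(n):
        joint[x, y, f[y]] = pX[x] * pY_X[x, y]

# Markov factorization p(x) p(y|x) p(z|y), with p(z|y) an indicator on z == f(y)
pZ_Y = np.zeros((n, n))
pZ_Y[np.arange(n), f] = 1.0
factorized = np.einsum('x,xy,yz->xyz', pX, pY_X, pZ_Y)

assert np.allclose(joint, factorized)  # X -> Y -> Z holds
```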

This is obviously a weak understanding at best, so any help is appreciated.

Edit:
My understanding is that three random variables $X, Y, Z$ form a Markov chain if $X$ and $Z$ are conditionally independent given $Y$. So $$X \rightarrow Y \rightarrow Z \iff p(x, y, z)=p(x)\, p(y | x)\, p(z | y)$$


Best answer:

It suffices to show you can remove one link (then you can recursively remove as many as you like). Suppose: $$ X \rightarrow Y \rightarrow Z \rightarrow W$$ You want to show $$ X \rightarrow Y \rightarrow W$$

For simplicity assume all random variables are discrete and let $S_X, S_Y, S_Z, S_W$ be the finite or countably infinite sets of all possible values these variables can take (with positive probability).

Then for all $w \in S_W, x \in S_X, y \in S_Y$ we have \begin{align} &P[X=x, Y=y,W=w] \\ &\overset{(a)}{=} \sum_{z \in S_Z} P[X=x, Y=y, Z=z, W=w]\\ &\overset{(b)}{=}\sum_{z \in S_Z} P[X=x]P[Y=y|X=x]P[Z=z|Y=y]P[W=w|Z=z]\\ &= P[X=x]P[Y=y|X=x]\sum_{z \in S_Z} P[W=w|Z=z]P[Z=z|Y=y]\\ &\overset{(c)}{=}P[X=x]P[Y=y|X=x]\sum_{z \in S_Z} P[W=w|Z=z, Y=y]P[Z=z|Y=y]\\ &=P[X=x]P[Y=y|X=x]P[W=w|Y=y] \end{align} where (a) is from the law of total probability; (b) and (c) use the Markov chain assumption $X\rightarrow Y \rightarrow Z \rightarrow W$.
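This derivation can be checked numerically (a sketch with NumPy; the conditional distributions are random stand-ins): build the joint from the Markov factorization, marginalize out $Z$ as in step (a), and confirm the result factors as $p(x)\,p(y|x)\,p(w|y)$ with $p(w|y) = \sum_z p(w|z)\,p(z|y)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(rows, cols):
    """Random row-stochastic matrix: each row is a conditional distribution."""
    m = rng.random((rows, cols))
    return m / m.sum(axis=1, keepdims=True)

n = 3  # each variable takes values in {0, 1, 2}
pX = rng.random(n); pX /= pX.sum()   # p(x)
pY_X = random_stochastic(n, n)       # p(y | x)
pZ_Y = random_stochastic(n, n)       # p(z | y)
pW_Z = random_stochastic(n, n)       # p(w | z)

# Joint p(x, y, z, w) from the Markov factorization of X -> Y -> Z -> W
joint = np.einsum('x,xy,yz,zw->xyzw', pX, pY_X, pZ_Y, pW_Z)

# Step (a): marginalize out Z
pXYW = joint.sum(axis=2)

# The derivation's conclusion: p(x, y, w) = p(x) p(y|x) p(w|y),
# where p(w|y) = sum_z p(w|z) p(z|y)
pW_Y = pZ_Y @ pW_Z
predicted = np.einsum('x,xy,yw->xyw', pX, pY_X, pW_Y)

assert np.allclose(pXYW, predicted)  # X -> Y -> W holds
```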

Another answer:

The question is not very clear. I am assuming that you have a Markov chain $ \{ X_n : n \geq 0 \} $ and are interested in whether the chain $ \{ X_{2n} : n \geq 0 \} $ is also a Markov chain.

If that is the case, then the answer is yes. In fact $ \{ X_{kn} : n \geq 0 \} $ is a Markov chain for any integer $ k \geq 1 $, and its transition matrix is $ \mathbf{P}^k $, where $ \mathbf{P}$ is the transition matrix of the original chain.
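The claim about $\mathbf{P}^k$ can be illustrated with a quick check (a sketch with NumPy; the 3-state matrix is a random stand-in): the $k$-step transition probability $P[X_k = j \mid X_0 = i]$ is obtained by summing over all intermediate states, which is exactly matrix multiplication.

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical 3-state transition matrix P (rows sum to 1)
P = rng.random((3, 3))
P /= P.sum(axis=1, keepdims=True)

k = 2
Pk = np.linalg.matrix_power(P, k)  # transition matrix of the subsampled chain {X_{kn}}

# Two-step probability by summing over the intermediate state m:
# P[X_2 = j | X_0 = i] = sum_m P[i, m] * P[m, j]
two_step = np.einsum('im,mj->ij', P, P)

assert np.allclose(Pk, two_step)
assert np.allclose(Pk.sum(axis=1), 1.0)  # P^k is still row-stochastic
```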