Is a subsequence (non-contiguous) of a Markov chain also a Markov chain?


Let us assume $A \rightarrow B \rightarrow C \rightarrow D$ is a Markov chain. Can we then state that $A \rightarrow C \rightarrow D$ is also a Markov chain? It intuitively feels right. Can anyone give a formal proof of it, or cite a resource?
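As a sanity check of the intuition (not a proof), here is a small numerical experiment with a hypothetical 2-state chain $X_1 \to X_2 \to X_3 \to X_4$ playing the role of $A \to B \to C \to D$. The initial distribution and transition matrix below are made up; the script marginalizes out $B$ and checks that $P(D \mid C, A)$ does not depend on $A$:

```python
import itertools
import numpy as np

# Hypothetical 2-state chain A -> B -> C -> D; the initial law and
# transition matrix are arbitrary choices for this sanity check.
init = np.array([0.3, 0.7])
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Joint law p(a, b, c, d) = init[a] * P[a,b] * P[b,c] * P[c,d]
joint = {}
for a, b, c, d in itertools.product(range(2), repeat=4):
    joint[(a, b, c, d)] = init[a] * P[a, b] * P[b, c] * P[c, d]

# Marginalize out B to get the law of (A, C, D)
pacd = np.zeros((2, 2, 2))
for (a, b, c, d), p in joint.items():
    pacd[a, c, d] += p

# Markov property for A -> C -> D:
# P(D = d | C = c, A = a) should not depend on a.
for c, d in itertools.product(range(2), repeat=2):
    conds = [pacd[a, c, d] / pacd[a, c, :].sum() for a in range(2)]
    assert np.allclose(conds[0], conds[1])
print("P(D | C, A) = P(D | C) for every state")
```

The assertion passes because $P(D = d \mid C = c, A = a)$ reduces to the one-step transition probability $P_{cd}$ regardless of $a$, which is the calculation a formal proof would carry out in general.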


I don't think your question makes sense as stated. A Markov chain has a set of states and transition probabilities dictating the probability of moving from one state to another. If you had states $A, B, C, D$ in a Markov chain, you could talk about the path $A \to B \to C \to D$, but this path is not itself a Markov chain.

If you are instead asking whether restricting a Markov chain to a proper subset of its states yields a Markov chain, that also doesn't work directly: the transition probabilities out of each remaining state no longer sum to $1$.

Feel free to edit your question or respond if you want further clarification.