Question about Infinite Markov chains

Do there exist two Markov chains $\left\{X_n\right\}_{n=0}^{\infty}$ and $\left\{Y_n\right\}_{n=0}^{\infty}$ with all of the following properties, such that the probability that $X_n = Y_n$ holds for infinitely many values of $n$ is 0? Is 1? Is 0.5?

  • same set of states
  • same transition matrix
  • different starting states

I was thinking that for the probability to be 1, all the entries of the transition matrix would have to be equal.

For the probability to be 0, all the entries of the transition matrix would have to be different. But since we're dealing with infinitely many values of $n$, how can I show this? Or am I going about this the wrong way?
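Not a proof, but a quick simulation can build intuition for the probability-1 case. The sketch below (my own illustrative setup, not from the question) runs two independent chains on two states with the same uniform transition matrix and different starting states, and counts the fraction of steps at which they coincide; the transition matrix `P` and all parameters are hypothetical choices.

```python
import random

# Hypothetical example: two independent chains on states {0, 1}
# with the same transition matrix, started in different states.
P = [[0.5, 0.5],
     [0.5, 0.5]]  # uniform transitions (illustrative choice)

def step(state):
    """Advance one step according to the transition matrix P."""
    return random.choices(range(len(P)), weights=P[state])[0]

random.seed(0)
x, y = 0, 1          # different starting states
meetings = 0
steps = 100_000
for _ in range(steps):
    x, y = step(x), step(y)
    meetings += (x == y)

# For this P, each step has X_n = Y_n with probability 1/2, so the
# observed fraction hovers near 0.5 -- consistent with (though not
# proving) the event {X_n = Y_n} occurring infinitely often a.s.
print(meetings / steps)
```

A simulation like this can only suggest an answer; showing that the event $\{X_n = Y_n \text{ infinitely often}\}$ has probability 0 or 1 requires an argument such as Borel–Cantelli, since that event is a tail event.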