Do time-homogeneous transition probabilities imply a Markov chain?


Let $X_i$, $i \geq 0$, be a stochastic process with discrete time and discrete state space.

Suppose that $P(X_n = j \mid X_{n-1} = i)$ does not depend on $n$; that is, $$P(X_n = j \mid X_{n-1} = i) = P(X_1 = j \mid X_0 = i)$$ for all $n \geq 1$ and all states $i, j$.

I want to know whether $X_n$ must be a Markov chain. I suspect that it need not be Markov.

By the definition of conditional probability, $$P(X_2 = j \mid X_1 = i, X_0 = x_0) = \frac{P(X_2 = j, X_1 = i \mid X_0 = x_0)}{P(X_1 = i \mid X_0 = x_0)}.$$ If $X$ were a Markov chain, the left-hand side would equal $P(X_2 = j \mid X_1 = i)$, which is equivalent to the factorization $$P(X_2 = j, X_1 = i \mid X_0 = x_0) = P(X_2 = j \mid X_1 = i)\,P(X_1 = i \mid X_0 = x_0).$$ But the assumption only constrains the one-step conditionals $P(X_n = j \mid X_{n-1} = i)$; it says nothing about this factorization, so it should not force the Markov property.
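To make this concrete, here is a sketch of a candidate counterexample (my own construction, not from the question): take $Y_{-1}, Y_0, Y_1, Y_2$ i.i.d. fair coin flips and set $X_n = Y_n + Y_{n-1}$. The process is stationary, so $P(X_n = j \mid X_{n-1} = i)$ does not depend on $n$, yet conditioning on $X_0$ changes the distribution of $X_2$ given $X_1$. The check below enumerates all $2^4$ equally likely outcomes and computes the conditional probabilities exactly:

```python
from itertools import product
from fractions import Fraction

def cond_prob(event, given):
    """Exact P(event | given) over the 16 equally likely outcomes of
    (Y_-1, Y_0, Y_1, Y_2), each a fair coin flip."""
    num = den = 0
    for y in product((0, 1), repeat=4):
        # X_n = Y_n + Y_{n-1}, giving the triple (X_0, X_1, X_2)
        x = (y[0] + y[1], y[1] + y[2], y[2] + y[3])
        if given(x):
            den += 1
            if event(x):
                num += 1
    return Fraction(num, den)

# Time-homogeneity: the one-step conditionals agree for steps 0->1 and 1->2.
for i in range(3):
    for j in range(3):
        p01 = cond_prob(lambda x: x[1] == j, lambda x: x[0] == i)
        p12 = cond_prob(lambda x: x[2] == j, lambda x: x[1] == i)
        assert p01 == p12

# Markov property fails: conditioning additionally on X_0 = 0 changes
# the conditional law of X_2 given X_1 = 1.
p_markov = cond_prob(lambda x: x[2] == 2, lambda x: x[1] == 1)
p_full = cond_prob(lambda x: x[2] == 2, lambda x: x[1] == 1 and x[0] == 0)
print(p_markov, p_full)  # -> 1/4 1/2
```

Since $P(X_2 = 2 \mid X_1 = 1) = 1/4$ but $P(X_2 = 2 \mid X_1 = 1, X_0 = 0) = 1/2$, the process satisfies the time-homogeneity assumption without being Markov.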