My professor told us that the Markov property means the following:
Let $\{X_n\}_n$ be a Markovian sequence on a countable state space $X$. Then the sequence $\{Y_n\}_n$ defined by $Y_n = X_{n+k}$ is also a Markovian sequence with the same probability transition matrix $P$ as $\{X_n\}_n$, the same initial distribution $\pi_0$, and it is also independent of $X_k, X_{k-1}, \dots, X_0$.
I found that definition a little bit abstract and I am trying to turn it into a formula.
Is it right to say that the phrase "is also a Markovian sequence with the same probability transition matrix $P$ as $\{X_n\}_n$" can be expressed as
$$P[Y_n = b \mid Y_{n-1} = a] = P[X_{n+k} = b \mid X_{n+k-1} = a] = P[X_1 = b \mid X_0 = a] = p(a,b),$$
where $a, b \in X$ are two states?
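To convince myself, I wrote a small Monte Carlo sketch (the 2-state chain and the transition matrix below are my own toy example, not from the lecture): it simulates the chain many times and estimates $P[X_{n+k} = b \mid X_{n+k-1} = a]$, which should come out close to the entry $p(a,b)$ of the transition matrix.

```python
import random

# Hypothetical 2-state chain (states 0 and 1), chosen only for illustration.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(state):
    """Take one step of the chain from `state` using transition matrix P."""
    return 0 if random.random() < P[state][0] else 1

def estimate_transition(n, k, a, b, trials=200_000):
    """Monte Carlo estimate of P[X_{n+k} = b | X_{n+k-1} = a]."""
    hits = total = 0
    for _ in range(trials):
        x = 0  # initial state; the estimate should not depend on this choice
        for _ in range(n + k - 1):  # run the chain up to time n+k-1
            x = step(x)
        if x == a:                  # condition on the event X_{n+k-1} = a
            total += 1
            hits += (step(x) == b)  # did the next step land in b?
    return hits / total

random.seed(0)
est = estimate_transition(n=3, k=2, a=0, b=1)
print(est)  # should be close to p(0,1) = P[0][1] = 0.3
```

The estimate does not depend on the particular choice of $n$ and $k$, which is exactly the time-homogeneity expressed by $P[X_{n+k} = b \mid X_{n+k-1} = a] = P[X_1 = b \mid X_0 = a]$.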