Suppose I have a discrete-time Markov chain $(X_n)_{n \geq 0}$ on a state space $\mathcal{S}$. It has the Markov property, meaning that for any $N \geq 1$ and any states $x_0, x_1, \ldots, x_N \in \mathcal{S}$:
$$\mathbb{P}(X_N = x_N \; | \; X_{N-1} = x_{N-1}, \; ..., \; X_1 = x_1, \; X_0 = x_0) = \mathbb{P}(X_N = x_N \; | \; X_{N-1} = x_{N-1})$$
But is this still true if I restrict how far back in time I go? That is, for any $0 \leq M < N$, is it still true that:
$$\mathbb{P}(X_N = x_N \; | \; X_{N-1} = x_{N-1}, \; ..., \; X_{M+1} = x_{M+1}, \; X_M = x_M) = \mathbb{P}(X_N = x_N \; | \; X_{N-1} = x_{N-1})$$
For example, is it still true that $\mathbb{P}(X_N = x_N \; | \; X_{N-1} = x_{N-1}, \; X_{N-2} = x_{N-2}) = \mathbb{P}(X_N = x_N \; | \; X_{N-1} = x_{N-1})$?
Yes, this is true, and in fact it holds even when the conditioning skips some intermediate times. For example, to show that $$P(X_4=1 \mid X_3=0, X_1=2)=P(X_4=1 \mid X_3=0)$$ (note that $X_2$ is omitted here), write the left-hand side, using the law of total probability over the missing coordinates $X_2$ and $X_0$, as $$\sum_{i,j} P(X_4=1 \mid X_3=0, X_2=i, X_1=2, X_0=j)\, P(X_2=i, X_0=j \mid X_3=0, X_1=2).$$ The first factor conditions on the full history, so the Markov property applies and it equals $P(X_4=1 \mid X_3=0)$. Pulling it out of the sum gives $$P(X_4=1 \mid X_3=0) \sum_{i,j} P(X_2=i, X_0=j \mid X_3=0, X_1=2) = P(X_4=1 \mid X_3=0),$$ since the remaining conditional probabilities sum to $1$. The same argument works for any $0 \leq M < N$: condition on all the omitted coordinates, apply the Markov property, and sum them back out.
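As a sanity check, here is a small numerical sketch of the $P(X_4=1 \mid X_3=0, X_1=2) = P(X_4=1 \mid X_3=0)$ example. The transition matrix and initial distribution below are arbitrary choices (any row-stochastic matrix would do); both conditionals are computed exactly by enumerating all length-5 paths:

```python
import itertools
import numpy as np

# A hypothetical 3-state chain (states 0, 1, 2): the matrix and
# initial distribution are arbitrary, chosen only for illustration.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.6, 0.3, 0.1]])
pi0 = np.array([0.3, 0.3, 0.4])

def path_prob(path):
    """Joint probability of a full path (x0, ..., x4) by the chain rule."""
    p = pi0[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a, b]
    return p

# Accumulate the joint probabilities needed for both conditionals.
num = den = 0.0      # for P(X4=1 | X3=0, X1=2)
num2 = den2 = 0.0    # for P(X4=1 | X3=0)
for path in itertools.product(range(3), repeat=5):
    p = path_prob(path)
    if path[3] == 0:
        den2 += p
        if path[4] == 1:
            num2 += p
        if path[1] == 2:
            den += p
            if path[4] == 1:
                num += p

lhs = num / den      # P(X4=1 | X3=0, X1=2)
rhs = num2 / den2    # P(X4=1 | X3=0)
print(lhs, rhs)      # both equal P[0, 1], as the derivation predicts
```

Both conditionals come out equal to the one-step transition probability `P[0, 1]`, matching the calculation above.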