Let $A,B$ be measurable sets and let $(X_n)_n$ be a Markov chain. Does either of the following properties hold?
$$P(X_2 \in B | X_1=x_1,X_0 \in A) = P(X_2 \in B|X_1=x_1)$$ or $$P(X_2 \in B|X_1 \in A,X_0=x_0) = P(X_2 \in B|X_1 \in A)?$$
If anything is unclear, please let me know.
The first identity holds, but the second fails in general.

For the first, decompose the event $\{X_0 \in A\}$ over the values of $X_0$ (written in the discrete case for clarity):
$$P(X_2 \in B \mid X_1 = x_1, X_0 \in A) = \frac{\sum_{x_0 \in A} P(X_2 \in B \mid X_1 = x_1, X_0 = x_0)\, P(X_1 = x_1, X_0 = x_0)}{\sum_{x_0 \in A} P(X_1 = x_1, X_0 = x_0)}.$$
By the Markov property every factor $P(X_2 \in B \mid X_1 = x_1, X_0 = x_0)$ equals $P(X_2 \in B \mid X_1 = x_1)$, so it pulls out of the sum and the ratio collapses to $P(X_2 \in B \mid X_1 = x_1)$.

The second is different: conditioning on the event $\{X_1 \in A\}$ does not pin down the value of $X_1$, so knowing $X_0 = x_0$ still carries information about *where in $A$* the chain sits at time $1$, and hence about $X_2$. For a counterexample, take states $\{s_0, s_1, a_1, a_2, b\}$ with $A = \{a_1, a_2\}$ and $B = \{b\}$, where $s_0$ moves mostly to $a_1$, $s_1$ moves mostly to $a_2$, and $a_1$ is far more likely than $a_2$ to reach $b$ in one step. Then $P(X_2 \in B \mid X_1 \in A, X_0 = s_0) > P(X_2 \in B \mid X_1 \in A, X_0 = s_1)$, so the left-hand side genuinely depends on $x_0$ and cannot equal the right-hand side for every $x_0$.
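Here is a quick numerical check of both identities on a concrete 5-state chain (the state names, transition matrix, initial distribution, and the sets $A$, $B$ are all invented for illustration; the conditional probabilities are computed exactly from the joint law, not simulated):

```python
import numpy as np

# Hypothetical chain with states 0=s0, 1=s1, 2=a1, 3=a2, 4=b.
# Take A = {a1, a2} and B = {b}.
P = np.array([
    [0.0, 0.0, 0.7, 0.3, 0.0],  # s0: mostly to a1
    [0.0, 0.0, 0.2, 0.8, 0.0],  # s1: mostly to a2
    [0.0, 0.0, 0.1, 0.0, 0.9],  # a1: usually reaches b in one step
    [0.0, 0.0, 0.0, 0.9, 0.1],  # a2: rarely reaches b in one step
    [0.0, 0.0, 0.0, 0.0, 1.0],  # b: absorbing
])
pi = np.array([0.5, 0.5, 0.0, 0.0, 0.0])  # X_0 uniform on {s0, s1}
S = list(range(5))
A, B = [2, 3], [4]

# Joint law P(X_0=i, X_1=j, X_2=k) = pi[i] * P[i,j] * P[j,k]
joint = pi[:, None, None] * P[:, :, None] * P[None, :, :]

def cond(x0, x1, x2):
    """P(X_2 in x2 | X_1 in x1, X_0 in x0), computed from the joint law."""
    return joint[np.ix_(x0, x1, x2)].sum() / joint[np.ix_(x0, x1, S)].sum()

# First identity: P(X2 in B | X1 = a1, X0 in {s0,s1}) = P(X2 in B | X1 = a1)
print(cond([0, 1], [2], B), cond(S, [2], B))  # equal (both ~ 0.9)

# Second identity: P(X2 in B | X1 in A, X0 = s0) vs P(X2 in B | X1 in A)
print(cond([0], A, B), cond(S, A, B))  # differ (~ 0.66 vs ~ 0.46)
```

The first pair of numbers agrees, as the derivation predicts; the second pair does not, confirming that fixing $X_0$ changes the conditional law of $X_2$ even after conditioning on $\{X_1 \in A\}$.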