Let $(X_n)_{n \ge 0}$ be a Markov chain and ${A}_{0}, \, {A}_{1}, \dots, {A}_{n}$ subsets of its state space, where $A_n$ is not a singleton. Give an example verifying that the following statement is false:
$\mathbb{P}(X_{n+1}=j|X_{n} \in {A}_{n}) = \mathbb{P}(X_{n+1}=j|X_{0} \in {A}_{0}, X_{1} \in {A}_{1}, \dots X_{n} \in {A}_{n})$
Assume that $j>1$ (so that the states $0$, $\pm(j-1)$, $\pm(j+1)$, and $j$ appearing below are all distinct) and $A_2=\{j-1,j+1\}$, and that the transition probability from $j-1$ to $j$ is $1$ while the transition probability from $j+1$ to $j$ is $0$.
Assume, furthermore, that the chain starts at $0$ and transitions to $-(j-1)$ with probability $\frac{1}{2}$ and $-(j+1)$ with probability $\frac{1}{2}$. Assume that we then transition from $-(j+1)$ to $j+1$ with probability $1$ and transition from $-(j-1)$ to $j-1$ with probability $1$.
Then, letting $A_0=\{0\}$ and $A_1=\{-(j-1)\}$, we have $$ \mathbb{P}(X_3=j \mid X_2\in A_2)=\mathbb{P}(X_1=-(j-1))=\frac{1}{2}, $$ since $X_2 \in A_2$ holds on every path and $X_3=j$ exactly when $X_1=-(j-1)$, whereas $$ \mathbb{P}(X_3=j \mid X_0=0,\, X_1\in A_1,\, X_2\in A_2)=1, $$ because once $X_1=-(j-1)$ is fixed, the continuation $X_2=j-1$, $X_3=j$ is deterministic.
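The two conditional probabilities can be checked by exact enumeration for a concrete choice of $j$. The sketch below takes $j=2$ (an assumed value; the transition $3 \to 4$ is an arbitrary stand-in for "anywhere other than $j$") and lists the chain's only two possible paths with their weights:

```python
from fractions import Fraction

# Counterexample with j = 2, so A_0 = {0}, A_1 = {-1}, A_2 = {1, 3},
# and the target state is j = 2. The chain has exactly two paths:
# 0 -> -1 -> 1 -> 2  (prob 1/2), since -1 -> 1 and 1 -> 2 are certain;
# 0 -> -3 -> 3 -> 4  (prob 1/2), since 3 -> 2 has probability 0.
paths = [
    ((0, -1, 1, 2), Fraction(1, 2)),
    ((0, -3, 3, 4), Fraction(1, 2)),
]

def cond_prob(target, condition):
    """P(X_3 = target | condition), enumerating the two weighted paths."""
    num = sum(w for p, w in paths if condition(p) and p[3] == target)
    den = sum(w for p, w in paths if condition(p))
    return num / den

# P(X_3 = 2 | X_2 in A_2): both paths satisfy X_2 in {1, 3}.
lhs = cond_prob(2, lambda p: p[2] in {1, 3})

# P(X_3 = 2 | X_0 = 0, X_1 in A_1, X_2 in A_2) with A_1 = {-1}.
rhs = cond_prob(2, lambda p: p[0] == 0 and p[1] == -1 and p[2] in {1, 3})

print(lhs, rhs)  # 1/2 versus 1
```

The extra conditioning on $X_1 \in A_1$ discards the lower path, which is exactly the "non-trivial information" that makes the two sides disagree.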
The intuition is that the earlier sets $A_0, \dots, A_{n-1}$ may carry non-trivial information about which state of $A_n$ the chain occupies at time $n$, and this affects the conditional probability of the next step. Of course, if $A_n$ is a singleton, there is no such extra information, and the statement reduces to the Markov property itself.