I would like some clarification about the Markov process; there is one point I am curious about. Here is part of a definition of the Markov process, quoted from here.
The Markov property is the dependence structure among random variables. The simplest dependence structure for $X_0, X_1, \dots$ is no dependence at all, that is, independence.
The Markov property could be said to capture the next simplest sort of dependence: in generating the process $X_0, X_1, \dots$ sequentially, the “next” state $X_{n+1}$ depends only on the “current” value $X_n$, and not on the “past” values
$$X_0, \dots, X_{n-1}.$$
While the statement itself seems clear, I am curious about one thing: we say that the next state depends only on the current state, but on the other hand the current state is itself the next state of the process one step earlier, right?
$X_2$ depends on $X_1$, not on $X_0$, but $X_1$ depends on $X_0$, and so on. So how is this dependence effect cancelled as we move forward? How is the dependence between $X_0$ and $X_2$ lost?
The Markov chain property does not say that $X_0$ and $X_2$ are independent, only that the dependence of $X_2$ on $(X_0,X_1)$ is equivalent to the (a priori simpler) dependence of $X_2$ on $X_1$ only.
Consider for example some i.i.d. standard normal random variables $Z_k$ and define $$ (X_0,X_1,X_2)=(Z_0,\ Z_0+Z_1,\ Z_0+Z_1+Z_2). $$ Then $(X_k)_{0\leqslant k\leqslant 2}$ is a Markov chain because $X_2=X_1+Z$, where $Z=Z_2$ is independent of $(X_0,X_1)$ and standard normal. Thus the conditional distribution of $X_2$ given $(X_0,X_1)$ is normal with mean $X_1$ and variance $1$; in particular, this conditional distribution depends on $(X_0,X_1)$ through $X_1$ only, as the Markov property demands.
And $X_2=X_0+Y$, where $Y=Z_1+Z_2$ is independent of $X_0$ and centered normal with variance $2$; hence the conditional distribution of $X_2$ given $X_0$ alone is normal with mean $X_0$ and variance $2$. In particular, this conditional distribution does depend on $X_0$, so $X_0$ and $X_2$ are not independent.
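The example above is easy to check numerically. A minimal simulation sketch (assuming Python with NumPy; the variable names `X0`, `X1`, `X2` mirror the notation above): the residual $X_2-X_1=Z_2$ should have variance $\approx 1$ and be uncorrelated with $X_0$ and $X_1$, while the residual $X_2-X_0=Z_1+Z_2$ should have variance $\approx 2$, and $X_0$, $X_2$ should be visibly correlated (theoretically $\operatorname{corr}(X_0,X_2)=1/\sqrt{3}\approx 0.577$, since $\operatorname{Cov}(X_0,X_2)=1$, $\operatorname{Var}(X_0)=1$, $\operatorname{Var}(X_2)=3$).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# i.i.d. standard normal increments Z_0, Z_1, Z_2 (one row per sample path)
Z = rng.standard_normal((n, 3))
X0 = Z[:, 0]
X1 = Z[:, 0] + Z[:, 1]
X2 = Z[:, 0] + Z[:, 1] + Z[:, 2]

# Conditional distribution of X2 given (X0, X1): mean X1, variance 1.
# The residual X2 - X1 = Z_2 is standard normal and uncorrelated with (X0, X1).
resid = X2 - X1
print("Var(X2 - X1):", np.var(resid))                 # ~1
print("corr(X2 - X1, X0):", np.corrcoef(resid, X0)[0, 1])  # ~0
print("corr(X2 - X1, X1):", np.corrcoef(resid, X1)[0, 1])  # ~0

# Conditional distribution of X2 given X0 alone: mean X0, variance 2.
resid0 = X2 - X0
print("Var(X2 - X0):", np.var(resid0))                # ~2

# X0 and X2 are NOT independent: corr(X0, X2) = 1/sqrt(3) ~ 0.577
print("corr(X0, X2):", np.corrcoef(X0, X2)[0, 1])
```

So the Markov property holds (the innovation $X_2-X_1$ carries no information about the past), yet $X_2$ remains correlated with $X_0$ through the chain $X_0\to X_1\to X_2$.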