A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present states) depends only upon the present state, not on the sequence of events that preceded it. A process with this property is called a Markov process.
That is Google's definition.

My book defines it as:
$\Pr(X_{i+1} < x \mid X_i = x_i, X_{i-1} = x_{i-1}, \dots, X_0 = x_0) = \Pr(X_{i+1} < x \mid X_i = x_i)$
Does this mean that:
$X_{i+1}$ is independent of $X_{i-1}$, $X_{i-2}$, etc., or
does it mean that $X_i$ contains all the information of $X_{i-1}$, $X_{i-2}$, etc.?
The book's definition states that the probability that the $(i+1)$-th random variable takes a value less than some $x$, given all the previous values, is equal to the probability given only the single most recent value.
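To make my question concrete, here is a minimal simulation sketch. The two-state chain and its transition probabilities are a made-up example of my own, not from the book; it empirically compares $\Pr(X_{i+1}=1 \mid X_i=0)$ with $\Pr(X_{i+1}=1 \mid X_i=0, X_{i-1}=1)$.

```python
import random

random.seed(0)

# Hypothetical two-state Markov chain (transition probabilities are
# assumptions for illustration): from state 0 go to 1 w.p. 0.7,
# from state 1 go to 0 w.p. 0.4. Note step() only sees the current
# state -- the past is simply not an input.
def step(current):
    if current == 0:
        return 1 if random.random() < 0.7 else 0
    return 0 if random.random() < 0.4 else 1

# Simulate a long trajectory.
X = [0]
for _ in range(200_000):
    X.append(step(X[-1]))

# Empirical P(X_{i+1} = 1 | X_i = 0), optionally also conditioning on
# X_{i-1}. The Markov property says the extra conditioning changes nothing.
def cond_prob(history_value=None):
    hits = total = 0
    for i in range(1, len(X) - 1):
        if X[i] != 0:
            continue
        if history_value is not None and X[i - 1] != history_value:
            continue
        total += 1
        hits += (X[i + 1] == 1)
    return hits / total

print(cond_prob())                  # ~0.7
print(cond_prob(history_value=1))   # also ~0.7: the past adds no information
```

If I run this, both estimates agree, which suggests the second reading: $X_{i+1}$ is not unconditionally independent of $X_{i-1}$, but conditioning on $X_i$ makes $X_{i-1}$ uninformative.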