I have just started taking a Stochastic Processes class, and we have begun discussing Markov chains. In one of the examples we had to determine whether the following two constructions ALWAYS define a Markov chain on $\mathbb{Z}:$
- $X_0 := Y_0, X_n := Y_1$ for all $n \geq 1$
- $X_0 := 0, X_n := X_{n-1} + Y_n +n$ for all $n \geq 1$
Here $(Y_n)$ is an i.i.d. sequence of integer-valued random variables.
I was sure that both of them are indeed Markov chains, since in case 2) the next state depends only on the previous one, and in case 1) all the states from $X_1$ onward are identical. But my professor claims that they are not, because:
- in case 1), $X_0$ and $X_1$ are independent but $X_2 = X_1$, which implies that the transition probabilities change over time;
- in case 2), the distribution of $X_{n+1}$ given $X_n$ changes over time.
I am a bit confused: how does the change of the transition probabilities or of the conditional distribution over time affect the Markov property? I would be extremely grateful if someone could help me understand this in more detail.
In both cases, your professor is likely assuming an additional property: that the Markov chain is time-homogeneous. That is, that $\Pr[X_n = i \mid X_{n-1} = j]$ depends only on the states $i$ and $j$, and not on the time $n$.
It is very common, though not universal, to assume that Markov chains are time-homogeneous. The two chains fail to satisfy this because:
- In the first chain, $\Pr[X_1 = i \mid X_0 = j] = \Pr[Y_1 = i]$ does not depend on $j$ at all, while for $n \geq 2$ we have $\Pr[X_n = i \mid X_{n-1} = j] = 1$ if $i = j$ and $0$ otherwise. So the transition probabilities at the first step differ from those at every later step (unless $Y_1$ is constant).
- In the second chain, $\Pr[X_n = j \mid X_{n-1} = i] = \Pr[Y_n = j - i - n] = \Pr[Y_1 = j - i - n]$, which depends on $n$ and not only on the states $i$ and $j$.
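To see the first chain's failure of homogeneity empirically, here is a minimal simulation sketch; taking $Y_n$ uniform on $\{-1, 0, 1\}$ is my own choice, purely for illustration:

```python
import random

random.seed(0)

def chain1_prefix():
    # Y_n uniform on {-1, 0, 1}: an arbitrary choice, just for illustration
    y0 = random.choice([-1, 0, 1])
    y1 = random.choice([-1, 0, 1])
    return [y0, y1, y1]  # X_0 = Y_0, X_1 = Y_1, X_2 = Y_1

trials = [chain1_prefix() for _ in range(10_000)]

# Step 0 -> 1: X_1 is drawn independently of X_0.
# Step 1 -> 2: X_2 always equals X_1.
frac_equal_01 = sum(x[0] == x[1] for x in trials) / len(trials)
frac_equal_12 = sum(x[1] == x[2] for x in trials) / len(trials)

print(frac_equal_01)  # about 1/3: X_1 is independent of X_0
print(frac_equal_12)  # exactly 1.0: after step 1 the chain never moves
```

The two empirical "stay put" frequencies differ, which is exactly the time-inhomogeneity: the one-step behavior at time $0$ is not the same as at time $1$.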
(Additionally, though I wouldn't add this if I hadn't seen the other answer, some people assume that a Markov chain has a finite state space, and the second chain definitely does not. This is less common; Markov chains with infinite state spaces are just too useful.)
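To see the time-dependence of the second chain concretely, here is a minimal simulation sketch that estimates the distribution of the increment $X_n - X_{n-1} = Y_n + n$ at two different times; taking $Y_n$ uniform on $\{-1, 0, 1\}$ is an illustrative assumption, not part of the problem:

```python
import random

random.seed(0)

def increment_distribution(n, trials=10_000):
    """Empirical distribution of X_n - X_{n-1} = Y_n + n for the second chain."""
    counts = {}
    for _ in range(trials):
        y = random.choice([-1, 0, 1])  # Y_n: an arbitrary illustrative choice
        inc = y + n                    # the increment does not depend on the state
        counts[inc] = counts.get(inc, 0) + 1
    return {k: v / trials for k, v in sorted(counts.items())}

# The conditional law of X_n given X_{n-1} = i is this increment law shifted
# by i -- and its support visibly depends on n, not just on the states:
print(increment_distribution(1))  # support {0, 1, 2}
print(increment_distribution(5))  # support {4, 5, 6}
```

Given $X_{n-1}$, the next state still depends only on the current state (so the Markov property in the general sense holds), but the transition kernel is different at every time step, so the chain is not time-homogeneous.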