I have a question about finding a counterexample for Markov chains. The question is as follows:
Suppose $X_0, X_1, \dots$ is a Markov chain with state space $\mathbb{Z}$. By the Markov property, we know that $$ \mathbb{P}(X_n = i_n \mid X_0 = i_0, X_1 = i_1, \dots, X_{n-1} = i_{n-1}) = \mathbb{P}(X_n = i_n \mid X_{n-1} = i_{n-1}). $$ Now we want to find an example for which the following property does not hold:
$$ \mathbb{P}(X_n \geq 0 \mid X_0 \geq 0, X_1 \geq 0, \dots, X_{n-1} \geq 0) = \mathbb{P}(X_n \geq 0 \mid X_{n-1} \geq 0). $$
My thought is to start from a state space with only three states, $1$, $2$, and $3$; the equality can fail when the event $\{X_{n-1} \geq 0\}$ does not pin down the exact state, so the earlier history still carries information about $X_n$. But I don't think my example is good enough. Can you come up with a better one? Thanks!
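In case it helps others check candidates: here is a short brute-force script that enumerates all length-3 trajectories of a small chain and compares the two conditional probabilities for $n = 2$. The specific chain hard-coded below (states $-1, 0, 1$, deterministic transitions, uniform start on $\{-1, 1\}$) is just my own guess at a candidate, not something given in the question; for it, the two sides do come out different.

```python
from itertools import product

# Candidate chain on states {-1, 0, 1} (an illustrative guess):
# start uniformly in {-1, 1}; transitions -1 -> 0, 0 -> -1, 1 -> 1.
init = {-1: 0.5, 1: 0.5}
P = {-1: {0: 1.0}, 0: {-1: 1.0}, 1: {1: 1.0}}

def path_prob(path):
    """Probability of the exact trajectory (x0, x1, x2)."""
    p = init.get(path[0], 0.0)
    for a, b in zip(path, path[1:]):
        p *= P[a].get(b, 0.0)
    return p

states = [-1, 0, 1]
paths = [(x0, x1, x2, path_prob((x0, x1, x2)))
         for x0, x1, x2 in product(states, repeat=3)]

# P(X2 >= 0 | X0 >= 0, X1 >= 0): condition on the full nonnegative history
num = sum(p for x0, x1, x2, p in paths if x0 >= 0 and x1 >= 0 and x2 >= 0)
den = sum(p for x0, x1, x2, p in paths if x0 >= 0 and x1 >= 0)
lhs = num / den

# P(X2 >= 0 | X1 >= 0): condition only on the last step being nonnegative
num = sum(p for x0, x1, x2, p in paths if x1 >= 0 and x2 >= 0)
den = sum(p for x0, x1, x2, p in paths if x1 >= 0)
rhs = num / den

print(lhs, rhs)  # prints 1.0 0.5 -- the two conditional probabilities differ
```

The intuition behind this candidate is exactly the point above: given only $X_1 \geq 0$, the chain may be in state $0$ or state $1$, and knowing additionally that $X_0 \geq 0$ reveals which, which changes the conditional law of $X_2$.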