So in my probability studies I just encountered this:
For a Markov chain $ X_n $ on the finite state space $ \{1,2,\dots,k\} $, each transition either keeps the state the same or moves it up by one index, and from $ k $ the only transition is back to $ k $. We are asked whether $ Y_n = \min\{2, X_n\} $ is a Markov chain.
I can neither prove it nor find a counterexample, so I would really appreciate some help with this.
Hint: $X_n$ is a Markov chain whose state either stays the same or increases by $1$, up to the absorbing state $k$. That means $\min(2, X_n) = X_n$ as long as $X_n \le 2$. From state $1$, the only possible transitions are $1 \to 1$ and $1 \to 2$ (why?). What happens with $Y_n$ once $X_n$ reaches $2$? Can you draw $Y_n$ as a Markov chain?
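To get a feel for the hint, here is a minimal simulation sketch. The concrete values ($k = 4$, a single stay-probability `p_stay` for every state, and the function name) are illustrative assumptions, not part of the problem; the point is just that $Y_n$ only takes the values $1$ and $2$, and once it hits $2$ it never leaves:

```python
import random

def simulate_xy(k=4, p_stay=0.5, steps=30, seed=0):
    """Simulate X_n on {1, ..., k}: from state i < k, move to i + 1
    with probability 1 - p_stay, otherwise stay; state k is absorbing.
    Returns the paths of X_n and Y_n = min(2, X_n)."""
    rng = random.Random(seed)
    x = 1
    xs, ys = [x], [min(2, x)]
    for _ in range(steps):
        if x < k and rng.random() > p_stay:
            x += 1
        xs.append(x)
        ys.append(min(2, x))
    return xs, ys

xs, ys = simulate_xy()
# Y_n only ever takes the values 1 and 2
assert set(ys) <= {1, 2}
# Once Y_n reaches 2, it stays at 2: state 2 is absorbing for Y_n
assert all(b == 2 for a, b in zip(ys, ys[1:]) if a == 2)
```

So $Y_n$ behaves like a two-state chain where $2$ is absorbing, which is exactly the picture the hint asks you to draw.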