Let $(X_n)$ be a Markov chain with state space $\{0,1,2\}$ and transition matrix $$P = \begin{pmatrix} 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} & 0 \\ 1 & 0 & 0 \end{pmatrix}.$$ Let $Y_n=f(X_n)$, where $f(0)=0$, $f(1)=1$, $f(2)=1$. Is $Y_n$ a Markov chain?
My intuition and solution: it is not a Markov chain. On the one hand, $P(Y_n=1\mid Y_{n-1}=1,Y_{n-2}=0)=\frac{1}{4}$. On the other hand, $P(Y_n=1\mid Y_{n-1}=1)=\frac{1}{2}\,p(1)$, where $p(1)$ is the probability that the chain is in state 1, given that it is in state 1 or state 2. Surely this probability is not equal to one half, which is obvious from the transition matrix, but how do I calculate it?
My guess: let's compute the stationary distribution, which is $\left(\frac{2}{5},\frac{2}{5},\frac{1}{5}\right)$, so on average the chain is in state 1 twice as often as in state 2. Does that mean the $p(1)$ I am looking for equals $\frac{2}{3}$?
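A quick numerical sanity check of this guess (a sketch assuming NumPy and a stationary chain; the variable names are mine): solve $\pi P = \pi$, $\sum_i \pi_i = 1$ for the stationary distribution, then condition on being in $\{1,2\}$.

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [1.0, 0.0, 0.0]])

# Stationary distribution: solve pi @ P = pi together with sum(pi) = 1,
# stacked as an overdetermined (but consistent) linear system.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)   # ~ [0.4, 0.4, 0.2], i.e. (2/5, 2/5, 1/5)

# p(1): probability of being in state 1, given the chain is in {1, 2}
p1 = pi[1] / (pi[1] + pi[2])
print(p1)   # ~ 0.6667, i.e. 2/3
```

This confirms the stationary distribution $\left(\frac{2}{5},\frac{2}{5},\frac{1}{5}\right)$ and gives $p(1)=\frac{2/5}{2/5+1/5}=\frac{2}{3}$ under the stationarity assumption.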
EDIT: I found a way to avoid computing this probability: it is enough to calculate $P(Y_n=1\mid Y_{n-1}=1,Y_{n-2}=1,Y_{n-3}=0)$. Still, I would like to know whether my earlier reasoning was correct.
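The two conditional probabilities can also be estimated by simulation, which makes the failure of the Markov property visible directly (a Monte Carlo sketch assuming NumPy; the names and sample size are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [1.0, 0.0, 0.0]])

# Simulate a long trajectory of X_n.
N = 200_000
X = np.empty(N, dtype=int)
X[0] = 0
for n in range(1, N):
    X[n] = rng.choice(3, p=P[X[n - 1]])

Y = (X > 0).astype(int)  # f(0)=0, f(1)=f(2)=1

# Estimate P(Y_n=1 | Y_{n-1}=1, Y_{n-2}=0): theory says 1/4.
mask1 = (Y[:-2] == 0) & (Y[1:-1] == 1)
print(Y[2:][mask1].mean())   # ~ 0.25

# Estimate P(Y_n=1 | Y_{n-1}=1, Y_{n-2}=1, Y_{n-3}=0): theory says 1/2.
mask2 = (Y[:-3] == 0) & (Y[1:-2] == 1) & (Y[2:-1] == 1)
print(Y[3:][mask2].mean())   # ~ 0.5
```

The two estimates disagree, so the distribution of $Y_n$ given $Y_{n-1}=1$ depends on the earlier history, and $Y_n$ is not Markov.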
The concept you're looking for here is called lumpability. If you aggregate the states of a Markov chain and the chain is "lumpable" with respect to that aggregation, then the aggregate process you obtain is again a Markov chain.
Lumpability criterion (see Theorem 6.3.2 in Finite Markov Chains, by Kemeny and Snell): a discrete-time Markov chain $\{X_{i}\}$ is lumpable with respect to the partition $T=\{t_1,\ldots,t_M\}$ if and only if, for any subsets $t_i$ and $t_j$ in the partition, and for any states $n,n'$ in subset $t_i$, \begin{align} \sum _{m\in t_{j}}p(n,m)=\sum _{m\in t_{j}}p(n',m). \end{align} In your case, it is easy to see that the chain is not lumpable with respect to the partition $T=\{\{0\}, \{1,2\}\}$, since $p(1,0)=\frac{1}{2}\neq 1=p(2,0)$.
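The criterion above is mechanical to check. A small sketch assuming NumPy (the function name is mine): for every pair of blocks, compare the row sums over the target block across all states of the source block.

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [1.0, 0.0, 0.0]])

def is_lumpable(P, partition):
    """Kemeny-Snell condition: for every pair of blocks (t_i, t_j),
    the total transition probability into t_j must be the same
    from every state of t_i."""
    for ti in partition:
        for tj in partition:
            sums = [P[n, tj].sum() for n in ti]
            if not np.allclose(sums, sums[0]):
                return False
    return True

print(is_lumpable(P, [[0], [1, 2]]))   # False: p(1,0)=1/2 but p(2,0)=1
```

Here the check fails exactly on the pair $t_i=\{1,2\}$, $t_j=\{0\}$, matching the argument above.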