Suppose we have a random walk $X_0, X_1, \dots$ on the graph below with vertices $1, 2, 3$, and let $Y_n := \mathbb{1}_{\{1,2\}}(X_n)$.
First of all, I don't really understand the meaning of $Y_n$: what is the indicator variable supposed to do?
Using $Y_n$, I have to determine $\mathbb{P}(Y_3=0 \mid Y_0=1, Y_1=0, Y_2=1)$ and whether this process is a Markov chain.
A Markov chain has a set of states, here $S=\{1,2,3\}$, and a stochastic matrix $P$, here $$ \begin{pmatrix} 0 & 1 & 0 \\ 0.5 & 0 & 0.5 \\ 0 & 0 & 1 \\ \end{pmatrix} $$
So $Y_n$ will be a Markov chain.

I think you have a typo in your transition matrix $P$: the $(3, 2)$ entry should be $1$ and the $(3, 3)$ entry should be $0$, so that the third row reads $(0, 1, 0)$.
The indicator function takes the value $1$ if $X_n$ is either $1$ or $2$, and $0$ if $X_n = 3$.
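
In code form, $Y_n$ is just a membership test; here is a minimal Python sketch (the function name `Y` is my own, purely for illustration):

```python
def Y(x):
    # Indicator of the event {X_n in {1, 2}}: 1 on states 1 and 2, 0 on state 3.
    return 1 if x in (1, 2) else 0

print([Y(x) for x in (1, 2, 3)])  # [1, 1, 0]
```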
Consider the case where $X_0 = 3$, $X_1 = 2$, $X_2 = 1$, so that $Y_0 = 0$, $Y_1 = 1$, $Y_2 = 1$. In this case, the only value $Y_3$ can take is $1$, since the corrected transition matrix gives $X_3 = 2$ for sure. However, if you were only told that $Y_2 = 1$, there is some probability that $X_2 = 2$, in which case $Y_3$ could take the value $0$. Here, knowing the history beyond the current state is informative, and thus $Y_n$ is not Markov.

I think what you meant to say in your justification is that $(X_n, Y_n)$ is a Markov chain, since $X_n$ is Markov and $Y_n$ is a deterministic function of $X_n$; $Y_n$ by itself is not.
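
If it helps to see this numerically, below is a small Monte Carlo sketch. It assumes the corrected transition matrix and, since the post does not specify one, a uniform initial distribution for $X_0$; the names `step`, `sample_path`, `indicator` and the sample size are my own choices, not anything from the problem.

```python
import random

# Transition probabilities with the suggested fix: from state 3 the
# walk moves to state 2 with probability 1 (states are 1, 2, 3).
P = {1: [(2, 1.0)],
     2: [(1, 0.5), (3, 0.5)],
     3: [(2, 1.0)]}

def step(x):
    # Sample the next state from row x of the transition matrix.
    r, acc = random.random(), 0.0
    for nxt, p in P[x]:
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against floating-point round-off

def sample_path(n):
    # Assumption: X_0 is uniform on {1, 2, 3}; the post leaves the
    # initial distribution unspecified.
    x = random.choice([1, 2, 3])
    path = [x]
    for _ in range(n):
        x = step(x)
        path.append(x)
    return path

def indicator(path):
    # Y_n = 1 if X_n is in {1, 2}, else 0.
    return [1 if x in (1, 2) else 0 for x in path]

random.seed(0)
ys = [indicator(sample_path(3)) for _ in range(200_000)]

# Full history (0, 1, 1) pins down X_2 = 1, so Y_3 = 1 always.
full = [y[3] for y in ys if y[:3] == [0, 1, 1]]
# Conditioning on Y_2 = 1 alone mixes X_2 = 1 and X_2 = 2.
short = [y[3] for y in ys if y[2] == 1]
# The conditioning event asked about in the question.
asked = [y[3] for y in ys if y[:3] == [1, 0, 1]]

print(sum(full) / len(full))        # ~1.0
print(sum(short) / len(short))      # strictly below 1
print(1 - sum(asked) / len(asked))  # ~0.5
```

The first two printed estimates differ, which is exactly the failure of the Markov property: the full history $(Y_0, Y_1, Y_2) = (0, 1, 1)$ forces $X_2 = 1$, while $Y_2 = 1$ alone does not. The last estimate matches the probability asked in the question: under the corrected matrix, $Y_1 = 0$ forces $X_1 = 3$ and hence $X_2 = 2$, so $\mathbb{P}(Y_3=0 \mid Y_0=1, Y_1=0, Y_2=1) = 1/2$.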