Let $X_0,X_1,...$ be a Markov Chain with transition matrix
$$P=\begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ p & 1-p & 0 \end{pmatrix} $$ for $0<p<1$. Let $g$ be the function defined by $$ g(x)=\begin{cases} 0, & \text{if } x=1,\\ 1, & \text{if } x=2,3. \end{cases} $$
Let $Y_n=g(X_n)$, for $n\geq 0$. Show that $Y_0,Y_1,...$ is not a Markov chain.
My attempt:
So I want to show that
\begin{align} \mathbb{P}(Y_n=j|Y_0=y_0,...,Y_{n-1}=i)=\mathbb{P}(Y_n=j|Y_{n-1}=i) \end{align}
does not hold in general. Substituting $g(X_i)$ for $Y_i$, I get
\begin{align} \mathbb{P}(Y_n=j|Y_0=y_0,...,Y_{n-1}=i)&=\mathbb{P}(g(X_n)=j|g(X_0)=y_0,...,g(X_{n-1})=i)\\ &=...? \end{align}
How do I know which states to substitute my $X_i$'s for? I'm pretty sure I should use $P$ to do this but I have no idea how.
Consider $\mathbb{P}(Y_{2}=1|Y_{1}=1,Y_{0}=0)$. Given that $Y_{0}=0$, we must have $X_{0}=1$, from which the only possibility is $X_{1}=2$ and $X_{2}=3$. Therefore $\mathbb{P}(Y_{2}=1|Y_{1}=1,Y_{0}=0) = 1$.
Now consider $\mathbb{P}(Y_{2}=1|Y_{1}=1)$. If $X_{1}=3$, then $Y_{1}=1$, but $\mathbb{P}(X_{2}=1|X_{1}=3)=p>0$, and $X_{2}=1$ gives $Y_{2}=0$. So as long as $\mathbb{P}(X_{1}=3)>0$, there is a nonzero probability that $Y_{2}=0$, and this conditional probability is strictly less than $1$.
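To make this explicit, suppose (an illustrative assumption, since the problem does not fix an initial distribution) that $X_0$ is uniform on $\{1,2,3\}$. Then $$\mathbb{P}(X_1=2)=\tfrac{1}{3}\cdot 1+\tfrac{1}{3}(1-p)=\tfrac{2-p}{3},\qquad \mathbb{P}(X_1=3)=\tfrac{1}{3}\cdot 1=\tfrac{1}{3},$$ and since $X_1=2$ forces $X_2=3$ (so $Y_2=1$) while $X_1=3$ gives $Y_2=1$ only when $X_2=2$, $$\mathbb{P}(Y_2=1|Y_1=1)=\frac{\mathbb{P}(X_1=2)\cdot 1+\mathbb{P}(X_1=3)(1-p)}{\mathbb{P}(X_1=2)+\mathbb{P}(X_1=3)}=\frac{(2-p)+(1-p)}{(2-p)+1}=\frac{3-2p}{3-p}<1.$$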
Since $\mathbb{P}(Y_{2}=1|Y_{1}=1,Y_{0}=0)\neq\mathbb{P}(Y_{2}=1|Y_{1}=1)$, the chain is not Markov.
Intuitively, $Y_{n}$ fails to be a Markov chain because probabilities of its future values depend on knowledge of multiple prior states (here, knowing $Y_0$ in addition to $Y_1$ changes the distribution of $Y_2$), whereas the Markov property requires that the previous state alone suffices.
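As a sanity check, the two conditional probabilities can be estimated by simulation. This sketch assumes an illustrative value $p=0.5$ and a uniform initial distribution on $\{1,2,3\}$ (neither is fixed by the problem); the exact choice does not matter as long as $\mathbb{P}(X_1=3)>0$.

```python
import random

# Transition matrix for states 1, 2, 3; p = 0.5 is an arbitrary illustrative choice.
p = 0.5
P = {1: [(2, 1.0)],
     2: [(3, 1.0)],
     3: [(1, p), (2, 1 - p)]}

def step(x):
    """Sample the next state from row x of the transition matrix."""
    r = random.random()
    acc = 0.0
    for state, prob in P[x]:
        acc += prob
        if r < acc:
            return state
    return P[x][-1][0]

g = {1: 0, 2: 1, 3: 1}

random.seed(0)
n_runs = 200_000
count_y1 = count_y1_y2 = 0        # for estimating P(Y2=1 | Y1=1)
count_y0y1 = count_y0y1_y2 = 0    # for estimating P(Y2=1 | Y1=1, Y0=0)

for _ in range(n_runs):
    x0 = random.choice([1, 2, 3])  # uniform initial distribution (an assumption)
    x1 = step(x0)
    x2 = step(x1)
    y0, y1, y2 = g[x0], g[x1], g[x2]
    if y1 == 1:
        count_y1 += 1
        count_y1_y2 += (y2 == 1)
        if y0 == 0:
            count_y0y1 += 1
            count_y0y1_y2 += (y2 == 1)

print(count_y0y1_y2 / count_y0y1)  # P(Y2=1 | Y1=1, Y0=0): exactly 1
print(count_y1_y2 / count_y1)      # P(Y2=1 | Y1=1): strictly less than 1
```

The first estimate is exactly $1$ (conditioning on $Y_0=0$ forces $X_0=1$, hence $X_2=3$), while the second is visibly smaller, matching the argument above.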