Consider a Markov chain $X_n$, for $ n=0, 1, 2, \ldots$, with states 0, 1, 2, whose transition probability matrix is
$$P = \begin{bmatrix} 0 & 1/2 & 1/2\\ 1/2 & 1/2 & 0\\ 1 & 0 & 0 \end{bmatrix}$$
Let $$f(0)=0, f(1)= f(2)=1.$$
If $$Y_n= f(X_n),$$ is $Y_n,$ for $n=1, 2, \ldots$ a Markov chain?
My attempt at a solution was to draw the transition diagram of $Y_n$ (which, from its definition, has the two states 0 and 1), calculate the transition probabilities, and check whether they sum to 1:
$$p_{0,0} = P[Y_n=0 \mid Y_{n-1}=0] = P[X_n=0 \mid X_{n-1}=0] = 0$$ (from matrix $P$ we know that state 0 must go to either state 1 or 2).
$$p_{0,1} = P[Y_n=1 \mid Y_{n-1}=0] = P[X_n=1 \text{ or } X_n=2 \mid X_{n-1}=0] = P[X_n=1 \mid X_{n-1}=0] + P[X_n=2 \mid X_{n-1}=0] = 1/2 + 1/2 = 1.$$
Then we sum: $$p_{0,0} + p_{0,1} = 0 + 1 = 1.$$ Hence $Y_n$ is a Markov chain.
I know there's a mistake in my reasoning, because $Y_n$ isn't a Markov chain, but I can't figure out what I did wrong. Could someone show me the right way to solve this problem?
Thanks!
For a sequence to be a Markov chain, the current state must contain all the information needed to determine the distribution of the next state. This is the Markov property: the past is conditionally independent of the future, given the present.
$$P(X_n = x_n | X_{n-1}=x_{n-1}, X_{n-2}=x_{n-2})= P(X_n = x_n | X_{n-1}=x_{n-1})$$
After you replace $X_n$ by $f(X_n)$, the Markov property is no longer satisfied: if you only know that $f(X_{n-1})=1$, i.e. $X_{n-1} \in \{1, 2\}$, you cannot determine the distribution of the next state, because states 1 and 2 have different transition rows:
$$P(X_n = x_n | X_{n-1}=1) \neq P(X_n = x_n | X_{n-1}=2)$$
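As a quick sanity check, the probability that the next $f$-value equals 1 can be computed from each of those two rows (a small Python sketch; the matrix `P` and the map `f` are taken directly from the question):

```python
# Transition matrix of X over states 0, 1, 2 (from the question)
P = [[0.0, 0.5, 0.5],
     [0.5, 0.5, 0.0],
     [1.0, 0.0, 0.0]]

# f(0) = 0, f(1) = f(2) = 1
f = [0, 1, 1]

# P(f(X_n) = 1 | X_{n-1} = s): sum the row entries over states j with f(j) = 1
p_from_1 = sum(P[1][j] for j in range(3) if f[j] == 1)  # from state 1
p_from_2 = sum(P[2][j] for j in range(3) if f[j] == 1)  # from state 2

print(p_from_1, p_from_2)  # 0.5 0.0
```

Since $1/2 \neq 0$, knowing only $f(X_{n-1}) = 1$ is not enough to give the next step's distribution.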
Note that,
$$P(f(X_n) = 1 | f(X_{n-1})=1, f(X_{n-2})=1) \neq P(f(X_n) = 1 | f(X_{n-1})=1, f(X_{n-2})=0)$$
Indeed, the left hand side equals $1/2$ whereas the right hand side equals $1/4$.
Left-hand side: if $f(X_{n-1})=f(X_{n-2})=1$ then $X_{n-2}, X_{n-1} \in \{1, 2\}$; since state 2 moves to state 0 with probability 1, $X_{n-2}=2$ would force $X_{n-1}=0$, so in fact $X_{n-2}=X_{n-1}=1$. From state 1, $f(X_n)=1$ (i.e. $X_n=1$) with probability $1/2$.
Right-hand side: if $f(X_{n-2})=0$ and $f(X_{n-1})=1$ then $X_{n-2}=0$, so $X_{n-1}=1$ or $X_{n-1}=2$, each with probability $1/2$. From state 1 the chain stays in $\{1,2\}$ with probability $1/2$; from state 2 it moves to state 0 with probability 1. Hence the right-hand side equals $\tfrac12 \cdot \tfrac12 + \tfrac12 \cdot 0 = 1/4$.
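Both conditional probabilities can also be verified by exhaustively enumerating three-step paths (a Python sketch; the uniform initial distribution `init` is an arbitrary illustrative choice — the conditioning pins down the relevant states, so the answer does not depend on it):

```python
from itertools import product

# Transition matrix of X over states 0, 1, 2 (from the question)
P = [[0.0, 0.5, 0.5],
     [0.5, 0.5, 0.0],
     [1.0, 0.0, 0.0]]

# f(0) = 0, f(1) = f(2) = 1
f = [0, 1, 1]

def cond_prob(y2, y1, y_target, init):
    """P(f(X_n) = y_target | f(X_{n-1}) = y1, f(X_{n-2}) = y2),
    with X_{n-2} drawn from the distribution `init`."""
    num = den = 0.0
    for a, b, c in product(range(3), repeat=3):  # paths a -> b -> c
        p = init[a] * P[a][b] * P[b][c]
        if f[a] == y2 and f[b] == y1:
            den += p
            if f[c] == y_target:
                num += p
    return num / den

init = [1/3, 1/3, 1/3]          # arbitrary; any distribution works here
lhs = cond_prob(1, 1, 1, init)  # condition on f = 1 at both previous steps
rhs = cond_prob(0, 1, 1, init)  # condition on f = 0, then f = 1

print(lhs, rhs)  # 0.5 0.25
```

The two conditional probabilities differ, so $f(X_n)$ cannot be a Markov chain.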
By the way: in your question you show how to determine $p_{0,0}$ and $p_{0,1}$, but not $p_{1,0}$. The fact that $f(X_n)$ is not a Markov chain becomes evident precisely when you try to compute $p_{1,0}$, which is exactly what is missing from your attempted solution.