Let $\{X_n\}_{n=0}^\infty$ be a sequence of independent, identically distributed discrete random variables, and define $Y_n=X_{n-1}X_n$ for all $n\ge 1$.
Is $\{Y_n\}_{n=1}^\infty$ a Markov chain?
If this were a Markov chain, I would need to verify the Markov property, namely that the transition probability depends on the past only through the previous state:
$P[Y_n=j\mid Y_{n-1}=i_{n-1},\dots,Y_1=i_1]=P[Y_n=j\mid Y_{n-1}=i_{n-1}]$,
that is, $P[X_{n-1}X_n=j\mid X_{n-2}X_{n-1}=i_{n-1},\dots]=P[X_{n-1}X_n=j\mid X_{n-2}X_{n-1}=i_{n-1}]$. But I don't know how to compute these probabilities, because $Y_n$ and $Y_{n-1}$ are not independent: they share the common factor $X_{n-1}$.
I would really appreciate your help with this problem. Any hint or suggestion is welcome.
Notice that we never observe $X_{n-1}$ explicitly, only the product $X_{n-2}X_{n-1}$ — some sort of "incomplete information" about the most recent factor $X_{n-1}$. You can construct a counterexample with $X_k\in\{-1,1\}$ and $P(X_k=1)=p\neq\tfrac12$ (the bias is essential: for $p=\tfrac12$ the $Y_n$ turn out to be i.i.d., hence trivially Markov). The idea: knowing that $X_{n-1}=1$ gives a different probability for $Y_n=1$ than knowing that $X_{n-1}=-1$, yet both cases are compatible with $Y_{n-1}=1$. Since the earlier value $Y_{n-2}$ carries extra information about $X_{n-1}$, it changes the transition probability, which is not allowable: if $\{Y_n\}$ were a Markov chain, its transition probabilities would have to be fully determined by the previous state $Y_{n-1}$ alone.
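If it helps, here is a quick sanity check of this counterexample by exact enumeration in Python, using the (arbitrarily chosen) bias $P(X_k=1)=3/4$. It computes $P[Y_3=1\mid Y_2=1, Y_1=1]$ and $P[Y_3=1\mid Y_2=1, Y_1=-1]$ exactly and shows they differ:

```python
from fractions import Fraction
from itertools import product

p = Fraction(3, 4)  # P(X_k = 1); any bias p != 1/2 works here

def prob(x):
    return p if x == 1 else 1 - p

# Joint distribution of (Y_1, Y_2, Y_3), by enumerating (X_0, ..., X_3)
joint = {}
for xs in product((-1, 1), repeat=4):
    ys = tuple(xs[k - 1] * xs[k] for k in range(1, 4))
    w = Fraction(1)
    for x in xs:
        w *= prob(x)
    joint[ys] = joint.get(ys, Fraction(0)) + w

def cond(y3, y2, y1):
    # P(Y_3 = y3 | Y_2 = y2, Y_1 = y1), computed exactly
    num = joint.get((y1, y2, y3), Fraction(0))
    den = sum(v for k, v in joint.items() if k[0] == y1 and k[1] == y2)
    return num / den

a = cond(1, 1, 1)   # condition additionally on Y_1 = 1
b = cond(1, 1, -1)  # condition additionally on Y_1 = -1
print(a, b)  # 41/56 vs 5/8 -- they differ, so {Y_n} is not Markov
```

Conditioning on $(Y_1, Y_2)=(1,1)$ forces $X_0=X_1=X_2$, making $X_2=1$ very likely ($27/28$), while $(Y_1,Y_2)=(-1,1)$ leaves $P(X_2=1)=3/4$; that is exactly the "incomplete information" about $X_{n-1}$ leaking through the earlier state.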