I am working on the following exercise:
Let $(X_n)_{n \ge 1}$ be a sequence of i.i.d. binary random variables with $P[X_n = 0] = P[X_n = 1] = 1/2$ for each $n$. Let $Y_n = X_n + X_{n+1}$, for $n \ge 1$.
a) Compute $P[Y_n = 1 \mid Y_{n−1} = 1, Y_{n−2} = 2]$ and $P[Y_n = 1 \mid Y_{n−1} = 1]$, $n \ge 3$.
b) Is $(Y_n)_{n \ge 1}$ a Markov Chain?
For a) I get both probabilities to be 1/2. The reasoning is as follows:
- For $P[Y_n = 1 \mid Y_{n−1} = 1, Y_{n−2} = 2]$ we note that $Y_{n-2} = X_{n-2} + X_{n-1} = 2$ implies that $X_{n-1} = 1$. Combined with $Y_{n-1} = X_{n-1} + X_n = 1$, this forces $X_n = 0$, which in turn implies that $Y_n = X_{n+1}$, so the conditional probability is $P[X_{n+1} = 1] = 1/2$.
- For $P[Y_n = 1 \mid Y_{n−1} = 1]$ we note that $Y_{n−1} = 1$ alone only implies that exactly one of $X_{n-1}$, $X_n$ equals $1$. Both cases are equally likely, so conditionally $X_n$ is still uniform on $\{0, 1\}$. Since $X_{n+1}$ is independent of the conditioning, in either case $Y_n = X_n + X_{n+1} = 1$ holds with probability $1/2$, so this probability is also $1/2$.
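As a quick sanity check of these two values, here is a short Monte Carlo sketch in Python (the variable names are my own):

```python
import random

random.seed(0)

N = 200_000  # number of sampled windows of four coin flips
cond_ab = hit_ab = 0  # condition: Y_{n-1} = 1 and Y_{n-2} = 2; event: Y_n = 1
cond_b = hit_b = 0    # condition: Y_{n-1} = 1;                 event: Y_n = 1

for _ in range(N):
    # Four fair coin flips: X_{n-2}, X_{n-1}, X_n, X_{n+1}
    x = [random.randint(0, 1) for _ in range(4)]
    y_nm2 = x[0] + x[1]  # Y_{n-2} = X_{n-2} + X_{n-1}
    y_nm1 = x[1] + x[2]  # Y_{n-1} = X_{n-1} + X_n
    y_n = x[2] + x[3]    # Y_n     = X_n     + X_{n+1}
    if y_nm1 == 1:
        cond_b += 1
        hit_b += (y_n == 1)
        if y_nm2 == 2:
            cond_ab += 1
            hit_ab += (y_n == 1)

print(hit_ab / cond_ab)  # should be close to 0.5
print(hit_b / cond_b)    # should be close to 0.5
```

Both estimates should come out near $1/2$, matching the calculation above.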
For b) the reasoning above suggests that $(Y_n)_{n \ge 1}$ is indeed a Markov Chain, but I do not see how to turn this into a proof.
Could you help me?
Your computations in a) are correct, but one matching pair of conditional probabilities is not enough to establish the Markov property. Here is a counter-example:
Consider $P(Y_n = 0 \mid Y_{n-1} = 1, Y_{n-2} = 0)$. Using the same approach as above, $Y_{n-2} = 0$ forces $X_{n-2} = X_{n-1} = 0$, and then $Y_{n-1} = 1$ forces $X_n = 1$, i.e. $(X_{n-2}, X_{n-1}, X_n) = (0, 0, 1)$. Hence $Y_n = X_n + X_{n+1} = 1 + X_{n+1} \ge 1$, and so $P(Y_n = 0 \mid Y_{n-1} = 1, Y_{n-2} = 0) = 0$.
Now consider $P(Y_n = 0 \mid Y_{n-1} = 1)$. In this case there are two equally likely feasible outcomes, $(X_{n-1}, X_n) = (1, 0)$ or $(X_{n-1}, X_n) = (0, 1)$, so we only know that $X_n \neq X_{n-1}$. By independence of $X_n$ and $X_{n-1}$, this tells us nothing about $X_n$ itself: conditionally it is still $0$ or $1$ with probability $1/2$ each. Since $Y_n = 0$ requires both $X_n = 0$ and $X_{n+1} = 0$, and $X_{n+1}$ is independent of the conditioning, $P(Y_n = 0 \mid Y_{n-1} = 1) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \neq 0$.
Hence the Markov property fails and $\{Y_n\}_n$ is NOT a Markov chain.
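The counter-example can also be checked numerically. A minimal simulation sketch in Python (names are my own) estimates both conditional probabilities:

```python
import random

random.seed(1)

N = 200_000  # number of sampled windows of four coin flips
cond_full = hit_full = 0  # condition: Y_{n-1} = 1 and Y_{n-2} = 0; event: Y_n = 0
cond_one = hit_one = 0    # condition: Y_{n-1} = 1;                 event: Y_n = 0

for _ in range(N):
    # Four fair coin flips: X_{n-2}, X_{n-1}, X_n, X_{n+1}
    x = [random.randint(0, 1) for _ in range(4)]
    y_nm2, y_nm1, y_n = x[0] + x[1], x[1] + x[2], x[2] + x[3]
    if y_nm1 == 1:
        cond_one += 1
        hit_one += (y_n == 0)
        if y_nm2 == 0:
            cond_full += 1
            hit_full += (y_n == 0)

print(hit_full / cond_full)  # exactly 0: Y_{n-2}=0, Y_{n-1}=1 force X_n=1
print(hit_one / cond_one)    # should be close to 0.25
```

The first estimate is exactly $0$ (the event is impossible given the conditioning), while the second hovers around $1/4$, so the two conditional probabilities genuinely disagree.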