I have a sequence of random variables: Y1, Y2, Y3, .... where Y1 and Y2 are both i.i.d. Bernoulli(0.5),
and for all j >= 3, the following holds:
if min(Y_{j-1}, Y_{j-2}) = 1, then Yj is Bernoulli(2/3), and
if min(Y_{j-1}, Y_{j-2}) = 0, then Yj is Bernoulli(1/3).
Now, I have been trying to work out why Y1, Y2, Y3, .... is not a Markov chain.
I tried to check this using the Markov property. That is, if I can show that
P(Y3 = 1 | Y2 = 1, Y1 = 0) is not equal to P(Y3 = 1 | Y2 = 1),
then the sequence cannot be a Markov chain.
As an example, I have taken Y1 = 0, Y2 = 1, Y3 = 1 to show that it is not a Markov chain. However, I am getting stuck on the interpretation of the model, and therefore on the calculations.
I get P(Y3 = 1 | Y2 = 1, Y1 = 0) = P(Y3=1, Y2=1, Y1=0) / P(Y2=1, Y1=0), and
P(Y3 = 1 | Y2 = 1) = P(Y3=1, Y2=1) / P(Y2=1)
Now, to calculate both the conditional probabilities above, I need the joint probabilities. How should I calculate the joint pmf's P(Y3=1, Y2=1, Y1=0) and P(Y3=1, Y2=1) ?
Is there anything I am missing, or anywhere I am going wrong with my approach? Thanks a lot for any help or advice.
Let's analyze the given information and the problem. The terms $Y_1$ and $Y_2$ are i.i.d. Bernoulli$(0.5)$, and for $j \geq 3$:
$$P(Y_j = 1 | Y_{j-1}, Y_{j-2}) = \begin{cases} \frac{2}{3} & \text{if } \min(Y_{j-1}, Y_{j-2}) = 1 \\ \frac{1}{3} & \text{if } \min(Y_{j-1}, Y_{j-2}) = 0 \end{cases}$$
To prove that $Y_1, Y_2, Y_3, \ldots$ is not a Markov chain, it is sufficient to find one instance where the Markov property does not hold. The Markov property states that:
$$P(Y_{n+1} = y | Y_1 = y_1, Y_2 = y_2, \ldots, Y_n = y_n) = P(Y_{n+1} = y | Y_n = y_n)$$
for all $n$ and for all $y, y_1, y_2, \ldots, y_n$.
Now, let's calculate $P(Y_3 = 1 | Y_2 = 1, Y_1 = 0)$ and $P(Y_3 = 1 | Y_2 = 1)$.
**Calculation of $P(Y_3 = 1 | Y_2 = 1, Y_1 = 0)$**
Given $Y_2 = 1$ and $Y_1 = 0$, $Y_3$ is Bernoulli$\left(\frac{1}{3}\right)$ because $\min(Y_1, Y_2) = 0$. Therefore:
$$P(Y_3 = 1 | Y_2 = 1, Y_1 = 0) = \frac{1}{3}$$
**Calculation of $P(Y_3 = 1 | Y_2 = 1)$**
Now, $P(Y_3 = 1 | Y_2 = 1)$ has two cases, depending on the value of $Y_1$:

- If $Y_1 = 0$, then $\min(Y_1, Y_2) = 0$, so $P(Y_3 = 1 | Y_2 = 1, Y_1 = 0) = \frac{1}{3}$.
- If $Y_1 = 1$, then $\min(Y_1, Y_2) = 1$, so $P(Y_3 = 1 | Y_2 = 1, Y_1 = 1) = \frac{2}{3}$.

Since $Y_1$ and $Y_2$ are independent, $P(Y_1 = 0 | Y_2 = 1) = P(Y_1 = 1 | Y_2 = 1) = \frac{1}{2}$. By the law of total probability:
$$P(Y_3 = 1 | Y_2 = 1) = \frac{1}{2} \left(\frac{1}{3}\right) + \frac{1}{2} \left(\frac{2}{3}\right) = \frac{1}{2}$$
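This also answers the question about the joint pmf's: since $Y_1, Y_2$ are i.i.d. Bernoulli$(0.5)$ and $Y_3$'s distribution is determined by $(Y_1, Y_2)$, the chain rule gives $P(Y_1, Y_2, Y_3) = P(Y_1)\,P(Y_2)\,P(Y_3 | Y_1, Y_2)$. Here is a short Python sketch (the helper names `joint` and `p_y3_given_prev` are my own, just for illustration) that builds the joint pmf this way and recovers both conditional probabilities exactly:

```python
def p_y3_given_prev(y1, y2):
    """P(Y3 = 1 | Y1 = y1, Y2 = y2): 2/3 if min(y1, y2) = 1, else 1/3."""
    return 2/3 if min(y1, y2) == 1 else 1/3

def joint(y1, y2, y3):
    """Joint pmf P(Y1=y1, Y2=y2, Y3=y3) via the chain rule:
    P(Y1) * P(Y2) * P(Y3 | Y1, Y2), with Y1, Y2 i.i.d. Bernoulli(1/2)."""
    p3 = p_y3_given_prev(y1, y2)
    return 0.5 * 0.5 * (p3 if y3 == 1 else 1 - p3)

# P(Y3=1 | Y2=1, Y1=0) = P(Y1=0, Y2=1, Y3=1) / P(Y1=0, Y2=1)
lhs = joint(0, 1, 1) / (joint(0, 1, 0) + joint(0, 1, 1))

# P(Y3=1 | Y2=1) = P(Y2=1, Y3=1) / P(Y2=1), marginalizing over Y1
num = sum(joint(y1, 1, 1) for y1 in (0, 1))
den = sum(joint(y1, 1, y3) for y1 in (0, 1) for y3 in (0, 1))
rhs = num / den

print(lhs)  # ≈ 0.3333 (= 1/3)
print(rhs)  # ≈ 0.5
```

The denominators are exactly the joint probabilities asked about in the question, just obtained by summing the chain-rule factorization over the unconditioned variables.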
So, $P(Y_3 = 1 | Y_2 = 1, Y_1 = 0) = \frac{1}{3} \neq \frac{1}{2} = P(Y_3 = 1 | Y_2 = 1)$, which confirms that the sequence $Y_1, Y_2, Y_3, \ldots$ is not a Markov chain.
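If you also want an empirical sanity check, a quick Monte Carlo simulation of the sequence (a sketch of mine; the sample size `N` and the seed are arbitrary choices) reproduces the same gap between the two conditional probabilities:

```python
import random

random.seed(0)
N = 200_000

# Counters: conditioning on (Y2=1) and on (Y1=0, Y2=1)
n_y2_1 = n_y2_1_y3_1 = 0
n_y1_0_y2_1 = n_y1_0_y2_1_y3_1 = 0

for _ in range(N):
    y1 = random.random() < 0.5          # Y1 ~ Bernoulli(1/2)
    y2 = random.random() < 0.5          # Y2 ~ Bernoulli(1/2)
    p3 = 2/3 if min(y1, y2) == 1 else 1/3
    y3 = random.random() < p3           # Y3 | Y1, Y2 per the model
    if y2:
        n_y2_1 += 1
        n_y2_1_y3_1 += y3
        if not y1:
            n_y1_0_y2_1 += 1
            n_y1_0_y2_1_y3_1 += y3

print(n_y1_0_y2_1_y3_1 / n_y1_0_y2_1)  # ≈ 1/3
print(n_y2_1_y3_1 / n_y2_1)            # ≈ 1/2
```

The two empirical frequencies settle near $\frac{1}{3}$ and $\frac{1}{2}$ respectively, matching the exact calculation above.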