On the Markov chain defined by $X_n=U_nU_{n+1}$, where $(U_n)$ is i.i.d. symmetric Bernoulli


I came across this problem in homework:

$U_n$ are i.i.d. random variables with $P[U_n=1]=P[U_n=-1]=0.5$.

a) Show that $X_n=U_nU_{n+1}$ is a Markov Chain.

b) Show that $X_n=(U_n+U_{n+1})/2$ is not a Markov Chain.

I don't know how to start the proof. Once I see how part (a) is done, I should be able to handle the rest on my own.

Thank you; I'd appreciate any help!

Best answer:

Hint:

Part (a)

Let $x_0, x_1, x_2 \in \left\{ -1,1\right\}$

\begin{align*}
Pr\left(X_2=x_2\mid X_1=x_1, X_0=x_0\right) &= Pr\left(U_2U_3=x_2\mid U_1U_2=x_1, U_0U_1=x_0\right) \\
&= Pr\left(U_3=x_2U_2\mid U_2=x_1U_1, U_1=x_0U_0\right) \\
&= \frac{1}{2}Pr\left(U_3=x_2U_2\mid U_2=x_1U_1, U_1=x_0U_0, U_0=1\right) \\
&\quad + \frac{1}{2}Pr\left(U_3=x_2U_2\mid U_2=x_1U_1, U_1=x_0U_0, U_0=-1\right) \\
&= \frac{1}{2}Pr\left(U_3=x_2x_1x_0\right)+\frac{1}{2}Pr\left(U_3=-x_2x_1x_0\right) \\
&= \frac{1}{2}
\end{align*}

Can you do a similar computation for $Pr(X_2=x_2\mid X_1=x_1)$, perhaps by conditioning on $U_1$? You should again get $\frac{1}{2}$, so the two conditional probabilities agree.

Then repeat the argument for general $n$ rather than just the first three terms.
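As a sanity check (not part of the original hint), here is a short Python sketch that verifies the Part (a) claim exactly by enumerating all 16 equally likely sign patterns of $(U_0,U_1,U_2,U_3)$: for every choice of $(x_0,x_1,x_2)$, the two conditional probabilities agree and equal $\frac{1}{2}$.

```python
from fractions import Fraction
from itertools import product

# All 16 equally likely sign patterns for (U0, U1, U2, U3)
outcomes = list(product((-1, 1), repeat=4))

def cond_prob(event, given):
    """Exact P(event | given) over the equally likely outcomes."""
    matched = [u for u in outcomes if given(u)]
    return Fraction(sum(1 for u in matched if event(u)), len(matched))

def X(u, n):
    """X_n = U_n * U_{n+1}."""
    return u[n] * u[n + 1]

results = {}
for x0, x1, x2 in product((-1, 1), repeat=3):
    # P(X2 = x2 | X1 = x1, X0 = x0)
    p_full = cond_prob(lambda u: X(u, 2) == x2,
                       lambda u: X(u, 0) == x0 and X(u, 1) == x1)
    # P(X2 = x2 | X1 = x1)
    p_markov = cond_prob(lambda u: X(u, 2) == x2,
                         lambda u: X(u, 1) == x1)
    results[(x0, x1, x2)] = (p_full, p_markov)

# Every pair agrees and equals 1/2, matching the calculation above
```

Using `Fraction` keeps the probabilities exact, so the check is a genuine verification rather than a floating-point approximation.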

Part (b)

Try $x_0=-1$, $x_1=0$, $x_2=1$. Here $X_0=-1$ forces $U_0=U_1=-1$, and then $X_1=0$ forces $U_2=1$, so $Pr(X_2=1\mid X_1=0, X_0=-1)=Pr(U_3=1)=\frac{1}{2}$. Without the information from $X_0$, however, $Pr(X_2=1\mid X_1=0)=\frac{1}{4}$. The extra conditioning changes the probability, so $(X_n)$ is not a Markov chain.
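The Part (b) counterexample can be checked by the same exhaustive enumeration; this sketch (my addition, with the corrected values $x_0=-1$, $x_1=0$, $x_2=1$) computes both conditional probabilities exactly.

```python
from fractions import Fraction
from itertools import product

# All 16 equally likely sign patterns for (U0, U1, U2, U3)
outcomes = list(product((-1, 1), repeat=4))

def cond_prob(event, given):
    """Exact P(event | given) over the equally likely outcomes."""
    matched = [u for u in outcomes if given(u)]
    return Fraction(sum(1 for u in matched if event(u)), len(matched))

def Y(u, n):
    """X_n = (U_n + U_{n+1}) / 2, taking values in {-1, 0, 1}."""
    return Fraction(u[n] + u[n + 1], 2)

# P(X2 = 1 | X1 = 0, X0 = -1): the past pins down U2 = 1, leaving only U3
p_full = cond_prob(lambda u: Y(u, 2) == 1,
                   lambda u: Y(u, 1) == 0 and Y(u, 0) == -1)

# P(X2 = 1 | X1 = 0): U2 could be either sign, so the probability drops
p_pair = cond_prob(lambda u: Y(u, 2) == 1,
                   lambda u: Y(u, 1) == 0)

# p_full = 1/2 but p_pair = 1/4, so the chain is not Markov
```

Since $\frac{1}{2}\neq\frac{1}{4}$, knowing $X_0$ changes the conditional law of $X_2$ given $X_1$, which is exactly the failure of the Markov property.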