Show Markov property


Let $(U_n)_{n \geq 0}$ be iid random variables with $P(U_n=1)=P(U_n=-1)=\frac{1}{2}$. Show that $X_n=U_n\cdot U_{n+1}$ is a Markov chain, i.e.

$P(X_{n+1}=x_{n+1}|X_n=x_n,\dots,X_0=x_0)=P(X_{n+1}=x_{n+1}|X_n=x_n)$. I was trying to prove it with an induction hypothesis. Any ideas?


By the definition of $X_i$ and $U_i$, you have that for all $n \geq 0$, $X_{n+1} = \frac{X_n}{U_n} U_{n+2}$ (note that $\frac{1}{U_n} = U_n$, since $U_n = \pm 1$). Then, when you condition on the event $\{X_n = x_n, \ldots, X_0=x_0\}$, you have that:

  • $X_{n+1} = x_n$ iff $U_{n+2} = U_{n}$, which has conditional probability $\frac{1}{2}$, since $U_{n+2}$ is independent of $(U_0, \ldots, U_{n+1})$ and the conditioning event is determined by those variables
  • $X_{n+1} = -x_n$ iff $U_{n+2} \neq U_{n}$, also with conditional probability $\frac{1}{2}$

Hence the conditional distribution of $X_{n+1}$ depends only on the value $x_n$ taken by $X_n$, which is exactly the Markov property.
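As a sanity check (not a proof), the argument can be verified empirically: if we tally the conditional frequency of $X_{n+1} = 1$ keyed by the pair $(x_{n-1}, x_n)$, every key should give roughly $\frac{1}{2}$, showing the extra history $x_{n-1}$ carries no information. This is a minimal sketch; the function name `simulate` and the sample size are my own choices.

```python
import random
from collections import Counter

random.seed(0)

def simulate(num_steps):
    """Generate one path of X_n = U_n * U_{n+1} from iid +/-1 variables U_n."""
    u = [random.choice([-1, 1]) for _ in range(num_steps + 1)]
    return [u[n] * u[n + 1] for n in range(num_steps)]

# Tally P(X_2 = 1 | X_1 = x_1, X_0 = x_0) for each pair (x_0, x_1).
hits = Counter()
totals = Counter()
for _ in range(20000):
    x = simulate(3)          # x[0], x[1], x[2]
    key = (x[0], x[1])
    totals[key] += 1
    if x[2] == 1:
        hits[key] += 1

for key in sorted(totals):
    print(key, round(hits[key] / totals[key], 2))  # each ratio is approximately 0.5
```

All four conditional frequencies come out near $0.5$ regardless of $x_0$, consistent with the claim that only $x_n$ matters.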