Let $(U_n)_{n \geq 0}$ be iid random variables with $P(U_n=1)=P(U_n=-1)=\frac{1}{2}$. Show that $X_n=U_n\cdot U_{n+1}$ is a Markov chain, i.e. that
$P(X_{n+1}=x_{n+1}|X_n=x_n,\dots,X_0=x_0)=P(X_{n+1}=x_{n+1}|X_n=x_n)$. I was trying to do it with an induction hypothesis. Any ideas?
By the definition of $X_i$ and $U_i$, you have that for all $n \geq 0$, $X_{n+1} = \frac{X_n}{U_n} U_{n+2}$ (dividing by $U_n$ is legitimate since $U_n \in \{-1,1\}$). Then, when you condition on the event $\{X_n = x_n, \ldots, X_0=x_0\}$, you have that
$$P(X_{n+1}=x_{n+1}|X_n=x_n,\dots,X_0=x_0)=P\!\left(\frac{x_n}{U_n}\,U_{n+2}=x_{n+1}\,\Big|\,X_n=x_n,\dots,X_0=x_0\right)=\frac{1}{2},$$
because $U_{n+2}$ is independent of $(U_0,\dots,U_{n+1})$, hence of the conditioning event, and $\frac{x_n}{U_n}\,U_{n+2}$ is uniform on $\{-1,1\}$ whichever sign $\frac{x_n}{U_n}$ takes.
So the conditional distribution of $X_{n+1}$ depends at most on the value $x_n$ that $X_n$ has taken (in fact, it is the uniform distribution on $\{-1,1\}$ regardless of the whole history), and the same computation gives $P(X_{n+1}=x_{n+1}|X_n=x_n)=\frac{1}{2}$ as well. Hence the two conditional probabilities coincide and $(X_n)$ is a Markov chain.
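If you want to convince yourself numerically before writing up the proof, here is a minimal simulation sketch in Python (not part of the argument; the names `sample_chain` and `conditional_freq` are just illustrative) that estimates the conditional law of $X_m$ given a chosen history by rejection sampling. Both printed estimates should be close to $0.5$, matching the computation above.

```python
import random

def sample_chain(n_steps, rng=random):
    """Sample U_0, ..., U_{n_steps+1} and return X_k = U_k * U_{k+1}."""
    u = [rng.choice((-1, 1)) for _ in range(n_steps + 2)]
    return [u[k] * u[k + 1] for k in range(n_steps + 1)]

def conditional_freq(history, n_trials=200_000):
    """Estimate P(X_m = 1 | (X_0, ..., X_{m-1}) = history) by rejection sampling."""
    m = len(history)
    hits = total = 0
    for _ in range(n_trials):
        x = sample_chain(m)
        if x[:m] == list(history):  # keep only runs matching the conditioning event
            total += 1
            hits += x[m] == 1
    return hits / total if total else float("nan")

if __name__ == "__main__":
    # Different histories, same conditional distribution: ~0.5 in both cases.
    print(conditional_freq((1, 1)))    # condition on X_0 = 1, X_1 = 1
    print(conditional_freq((1, -1)))   # condition on X_0 = 1, X_1 = -1
```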