I found this problem in a test but I don't know how to solve it:
Let $X_1,X_2,\ldots$ be independent random variables with common distribution function $F$. Define $R_1=1$, and $R_k=\min\{n> R_{k-1}:X_n\geq \max\{X_1,\ldots,X_{n-1}\} \}$. Show that $(R_k)_{k\geq 1}$ is a Markov chain.
Any ideas?
Assume $F$ is continuous, so ties occur with probability zero. Define the record indicators $I_1=1$ and $I_n=\mathbf 1\{X_n> \max(X_1,\ldots,X_{n-1})\}$ for $n\geq 2$. A classical result of Rényi states that $I_1,I_2,\ldots$ are independent with $\mathbb P(I_n=1)=1/n$; this follows from the exchangeability of $(X_1,\ldots,X_n)$, since every relative ordering of the first $n$ observations is equally likely. Now fix integers $j>i\geq 1$. The event $\{R_{k+1}=j\}\cap\{R_k=i\}$ says that no record occurs at times $i+1,\ldots,j-1$ and a record occurs at time $j$, i.e. $$ \{R_{k+1}=j\}\cap\{R_k=i\} = \{R_k=i\}\cap\{I_{i+1}=\cdots=I_{j-1}=0,\ I_j=1\}, $$ and the second event on the right is $\sigma(I_{i+1},\ldots,I_j)$-measurable, while $\{R_k=i,R_{k-1}=i_{k-1},\ldots,R_1=1\}$ is $\sigma(I_1,\ldots,I_i)$-measurable. By the independence of the $I_n$, $$ \mathbb P(R_{k+1} = j \mid R_k = i, R_{k-1}, \ldots, R_1) = \mathbb P(I_{i+1}=\cdots=I_{j-1}=0,\ I_j=1) = \mathbb P(R_{k+1}=j\mid R_k=i). $$ It follows that $(R_k)_{k\geq 1}$ is a Markov chain.
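Not part of the proof, but here is a quick Monte Carlo sanity check. With continuous $F$ the independence of the record indicators gives the explicit transition probability $\mathbb P(R_{k+1}=j\mid R_k=i)=\prod_{m=i+1}^{j-1}\frac{m-1}{m}\cdot\frac1j=\frac{i}{j(j-1)}$ for $j>i$; in particular $\mathbb P(R_2=3)=\frac1{6}$. The helper `record_times` below is just an illustrative name, and I take the $X_n$ uniform on $[0,1]$ (any continuous $F$ gives the same record structure):

```python
import random

def record_times(xs):
    """Return the record times R_1 = 1, R_2, ... of the sequence xs."""
    times, best = [], float("-inf")
    for n, x in enumerate(xs, start=1):
        if x > best:          # strict record (ties a.s. impossible for continuous F)
            times.append(n)
            best = x
    return times

random.seed(0)
trials = 200_000
count = 0
for _ in range(trials):
    xs = [random.random() for _ in range(20)]
    r = record_times(xs)
    # event {R_2 = 3}: the first record after time 1 occurs at time 3
    if len(r) >= 2 and r[1] == 3:
        count += 1

# Theory: P(R_2 = j) = 1/(j(j-1)), so P(R_2 = 3) = 1/6 ≈ 0.1667
print(count / trials)
```

The empirical frequency should land within Monte Carlo error of $1/6$, matching the formula $\frac{i}{j(j-1)}$ with $i=1$, $j=3$.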