Resnick states: If $\{X_n\}$ is Markov with stationary distribution $\pi$, show that $\{(X_n,X_{n+1})\}$, $n\geq 0$, is Markov. Give its stationary distribution.
I've already proved that $\{(X_n, Y_n)\}$, where $X_n$ and $Y_n$ are two independent Markov chains, is Markov. However, here $X_n$ and $X_{n+1}$ are dependent on each other, so I don't see why $(X_{n}, X_{n+1})$ is Markov.
Attempt: \begin{align*} &P\big((X_{n+1}=i_{n+1}, X_{n+2}=i_{n+2}) \mid (X_{0}=i_0, X_{1}=i_{1}),\ldots,(X_n=i_n, X_{n+1}=i_{n+1})\big)\\ &\quad\mbox{(I can't multiply anything here, since } X_{n+1} \mbox{ is not independent of } X_{n+2}\mbox{)}\\ &=P\big((X_{n+1}=i_{n+1}, X_{n+2}=i_{n+2}) \mid (X_{n}=i_n, X_{n+1}=i_{n+1})\big) \end{align*} Can I justify the last step by saying that $X_{n+1}$ depends only on $X_n$, and $X_{n+2}$ depends only on $X_{n+1}$, by the Markov property?
I leave the precise details to you -- doing these questions yourself is by far the best way to learn -- but let me give you the ideas behind the proof of the Markov property.
Consider $(X_n, X_{n+1})$ given the whole history $X_0, \ldots, X_n$. (Note that knowing $(X_0, X_1), (X_1, X_2), \ldots, (X_{n-1}, X_n)$ is equivalent to knowing $X_0, \ldots, X_n$.) The first element of the pair $(X_n, X_{n+1})$ is determined by $X_n$ alone, and the conditional distribution of the second element, by the Markov property, depends only on $X_n$. So the conditional distribution of the pair $(X_n, X_{n+1})$ depends on the history only through $X_n$, and in particular only through $(X_{n-1}, X_n)$.
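To sketch the key computation (writing $p_{jk}$ for the one-step transition probabilities of $\{X_n\}$): for pairs that are consistent, i.e. the second coordinate of one pair equals the first coordinate of the next,
$$P\big((X_{n+1},X_{n+2})=(j,k)\mid (X_n,X_{n+1})=(i,j),\ \text{earlier pairs}\big)=P(X_{n+2}=k\mid X_{n+1}=j)=p_{jk},$$
and the transition probability is $0$ for inconsistent pairs. For the stationary distribution, one natural candidate is $\mu(i,j)=\pi(i)\,p_{ij}$, and indeed
$$\sum_{i}\mu(i,j)\,p_{jk}=p_{jk}\sum_{i}\pi(i)\,p_{ij}=\pi(j)\,p_{jk}=\mu(j,k),$$
using stationarity of $\pi$ in the middle step.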
Hopefully this helps you to make a rigorous proof!
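If you want a quick numerical sanity check of the claimed stationary distribution $\mu(i,j)=\pi(i)\,p_{ij}$ for the pair chain, here is a small sketch (not part of Resnick's exercise; the chain, and the names `p`, `pi`, `mu`, `q`, are made up for illustration):

```python
# Sanity check: mu(i, j) = pi(i) * p(i, j) is stationary
# for the pair chain (X_n, X_{n+1}), on an arbitrary 2-state example.

# Transition probabilities p(i, j) of the original chain (example values)
p = {(0, 0): 0.7, (0, 1): 0.3,
     (1, 0): 0.4, (1, 1): 0.6}

# Its stationary distribution pi solves pi = pi p; here pi = (4/7, 3/7)
pi = {0: 4 / 7, 1: 3 / 7}

states = [0, 1]

# Candidate stationary distribution of the pair chain
mu = {(i, j): pi[i] * p[(i, j)] for i in states for j in states}

def q(pair_from, pair_to):
    """Transition kernel of the pair chain:
    q((i, j), (j2, k)) = p(j, k) if j2 == j, else 0."""
    i, j = pair_from
    j2, k = pair_to
    return p[(j, k)] if j2 == j else 0.0

# Check mu q = mu, entry by entry
for target in mu:
    total = sum(mu[src] * q(src, target) for src in mu)
    assert abs(total - mu[target]) < 1e-12, (target, total)
print("mu(i,j) = pi(i) p(i,j) is stationary for the pair chain")
```

Running this confirms numerically what the hint above suggests proving in general.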