I have found an exercise in an old exam without a solution.
Let $X_{0},X_{1},...$ be a Markov chain with state space $S$ and $N$ a stopping time. Let $\mathbb{E}_{x}(\cdot)$ denote the conditional expectation if I start in $x \in S$ and $\mathcal{A}_{N}$ the stopping time sigma algeba. Let $f$ be a non-negative function on $S^{k}$. If $\mathbb{P}(N < \infty) = 1$ then
$$ \mathbb{E}\left( f(X_{N},X_{N+1},\ldots,X_{N+k-1}) \mid \mathcal{A}_{N} \right) = \mathbb{E}_{X_{N}}\left( f(X_{0},X_{1},\ldots,X_{k-1})\right) $$ for all $k \geq 1$.
Does anyone have an idea how to prove this?
Since $X_N$ is $\mathcal{A}_N$-measurable, the right-hand side of your equation is $\mathcal{A}_N$-measurable, and therefore the assertion follows if we can show that
$$\forall B \in \mathcal{A}_N: \quad \mathbb{E}(f(X_N,\ldots,X_{N+k-1}) 1_B) = \mathbb{E} \big[ 1_B \mathbb{E}_{X_N}(f(X_0,\ldots,X_{k-1})) \big]. \tag{1}$$
Fix $B \in \mathcal{A}_N$. As $N<\infty$ almost surely we have
$$\begin{align*} \mathbb{E}(f(X_N,\ldots,X_{N+k-1}) 1_B) &= \sum_{n =0}^{\infty} \mathbb{E}(f(X_N,\ldots,X_{N+k-1}) 1_B 1_{\{N=n\}}) \\ &= \sum_{n =0}^{\infty} \mathbb{E}(f(X_n,\ldots,X_{n+k-1}) 1_B 1_{\{N=n\}}). \end{align*} \tag{2}$$
Since $B \cap \{N=n\} \in \mathcal{F}_n$, it follows from the tower property that $$\begin{align*} \mathbb{E}(f(X_n,\ldots,X_{n+k-1}) 1_B 1_{\{N=n\}}) &= \mathbb{E} \big[ \mathbb{E}(f(X_n,\ldots,X_{n+k-1}) 1_B 1_{\{N=n\}} \mid \mathcal{F}_n) \big] \\ &= \mathbb{E} \big[ 1_B 1_{\{N=n\}} \mathbb{E}(f(X_n,\ldots,X_{n+k-1}) \mid \mathcal{F}_n) \big]. \end{align*}$$
Using the Markov property of the chain, and then the fact that $X_n = X_N$ on $\{N=n\}$, we get
$$\begin{align*} \mathbb{E}(f(X_n,\ldots,X_{n+k-1}) 1_B 1_{\{N=n\}}) &= \mathbb{E} \big[ 1_B 1_{\{N=n\}} \mathbb{E}_{X_n}(f(X_0,\ldots,X_{k-1})) \big] \\ &= \mathbb{E} \big[ 1_B 1_{\{N=n\}} \mathbb{E}_{X_N}(f(X_0,\ldots,X_{k-1})) \big]. \end{align*}$$
Plugging this into $(2)$, summing over $n$ (justified by the monotone convergence theorem, since $f \geq 0$), and using that $\sum_{n=0}^{\infty} 1_{\{N=n\}} = 1$ almost surely, we obtain $(1)$.
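If it helps to see the statement in action, here is a small numerical sanity check (a hypothetical example, not part of the proof): for a 3-state chain with transition matrix `P`, take $N$ to be the first hitting time of state $0$ and $f(x_0, x_1) = x_1$. The identity then says that $X_{N+1}$ has the same distribution as one step of a fresh chain started in $X_N = 0$.

```python
import random

# Transition probabilities of a hypothetical 3-state chain.
P = {0: [0.5, 0.3, 0.2],
     1: [0.2, 0.5, 0.3],
     2: [0.3, 0.3, 0.4]}

def step(x, rng):
    """Draw one transition of the chain from state x."""
    return rng.choices([0, 1, 2], weights=P[x])[0]

def x_after_hitting(start, target, rng, max_steps=10_000):
    """Run the chain until N = first hitting time of `target`; return X_{N+1}."""
    x = start
    for _ in range(max_steps):
        if x == target:          # N has occurred, X_N = target
            return step(x, rng)  # one more transition gives X_{N+1}
        x = step(x, rng)
    raise RuntimeError("target not hit within max_steps")

rng = random.Random(42)
n = 200_000

# Left-hand side: X_{N+1} observed after the stopping time (chain started in 2).
lhs = [x_after_hitting(2, 0, rng) for _ in range(n)]
# Right-hand side: one step of a fresh chain started in X_N = 0.
rhs = [step(0, rng) for _ in range(n)]

def freq(xs, s):
    return sum(x == s for x in xs) / len(xs)

for s in range(3):
    assert abs(freq(lhs, s) - freq(rhs, s)) < 0.01
print("empirical distributions agree")
```

The tolerance `0.01` is comfortably above the Monte Carlo error at this sample size; the two empirical distributions of $X_{N+1}$ should match, which is exactly what the proved identity predicts for this choice of $f$ and $N$.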