In J. R. Norris's book *Markov Chains*, the strong Markov property for discrete-time Markov chains is stated and proved as follows:
Let $(X_n)_{n \geqslant 0}$ be a Markov chain with transition matrix $P$ and $T$ be a stopping time. Then, conditional on $T < \infty$ and $X_T=i$, $(X_{T+n})_{n \geqslant 0}$ is a Markov chain with transition matrix $P$ and independent of $X_0, X_1,~\dots~, X_T$.
Proof. If $B$ is an event determined by $X_0, X_1,~\dots~, X_T$, then $B~\cap \{T=m\}$ is determined by $X_0, X_1,~\dots~, X_m$, so, by the Markov property at time $m$, $$\mathbb{P}(\{ X_T=j_0,~\dots~, X_{T+n}=j_n \}~\cap B~\cap \{T=m\}~\cap \{X_T=i\}) \\ =\mathbb{P}_i(X_0=j_0,~\dots~, X_n=j_n)\, \mathbb{P}(B~\cap \{T=m\}~\cap \{X_T=i\}),$$ where we have used the condition $T=m$ to replace $T$ by $m$, and $\mathbb{P}_i=\mathbb{P}(\cdot \mid X_0=i)$. Now sum over $m = 0,1,2,\dots$ and divide by $\mathbb{P}(T < \infty, X_T=i)$ to obtain $$\mathbb{P}(\{ X_T=j_0,~\dots~, X_{T+n}=j_n \}~\cap B \mid T < \infty,\, X_T=i) \\ =\mathbb{P}_i(X_0=j_0,~\dots~, X_n=j_n)\, \mathbb{P}(B \mid T < \infty,\, X_T=i).$$ Q.E.D.
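To convince myself of the statement numerically, here is a small simulation sketch (my own, not from the book): for a 3-state chain with a hypothetical transition matrix $P$, take $T$ to be the first hitting time of state $i=0$; the theorem predicts that, conditional on $T<\infty$ and $X_T=0$, the empirical distribution of $X_{T+1}$ should be close to row $0$ of $P$.

```python
import numpy as np

# Hypothetical 3-state transition matrix (chosen for illustration only).
P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.4, 0.4, 0.2]])
i = 0                       # condition on X_T = i, with T the hitting time of i
rng = np.random.default_rng(0)
n_paths, horizon = 20_000, 30

counts = np.zeros(3)        # tallies for the state X_{T+1}
hits = 0
for _ in range(n_paths):
    x = 1                   # start away from i, so T >= 1
    path = [x]
    T = None
    for t in range(horizon):
        x = rng.choice(3, p=P[x])   # one Markov step
        path.append(x)
        if T is None and x == i:
            T = t + 1               # first index with path[T] == i
    if T is not None and T + 1 < len(path):
        counts[path[T + 1]] += 1    # record X_{T+1}
        hits += 1

empirical = counts / hits
print(np.round(empirical, 3))       # should be close to P[0] = [0.1, 0.6, 0.3]
```

The agreement with $P[0]$ is exactly what the theorem asserts: after the (random) time $T$, the chain restarts from state $i$ with the same transition matrix, independently of the path up to $T$.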
Now I have some questions about this:
1. What is $X_{T+n}$ as a function on the probability space? Is it $\omega \mapsto X_{T(\omega)+n}(\omega)$?
2. What does the author mean by "$B$ is an event determined by $X_0, X_1,~\dots~, X_T$"? Since this family of random variables is itself random (the index $T$ depends on $\omega$), how can we talk about the $\sigma$-algebra generated by them?