I am having trouble proving that the random variable defined as the time of the $k$-th return is a stopping time. My motivation for proving this comes from a textbook/notes that I'm currently reading: https://services.math.duke.edu/~rtd/EOSP/EOSP2E.pdf. In the book it is taken as fact that the time of the $k$-th return is a stopping time, but I don't find it that easy to prove. I have tried a proof by induction, but I can't seem to finish it. The definitions I am working with come from the book (they can be accessed via the link above), but I will also paste them here:
For a Markov chain $X_{0}, X_{1}, \dots$, a random variable $T$ with values in $\mathbb{N}_0 \cup \{\infty\}$ is called a stopping time if for any $n$ the event $\{T = n\}$ depends only on the random variables $X_0, X_1, \dots, X_n$.
The random variable $T^k_s = \min\{n > T^{k-1}_s : X_n = s\}$ (with the convention $T^0_s = 0$), where $s \in S$ is the initial state ($X_0 = s$), is called the time of the $k$-th return, that is, the time when the Markov chain returns to state $s$ for the $k$-th time.
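To convince myself the definition is well behaved, I wrote a small sketch that computes $T^k_s$ from an observed trajectory (the function name and the example trajectory are mine, not from the book). The point is that it only ever inspects $X_0, \dots, X_n$ before reporting time $n$, which is exactly the stopping-time property:

```python
def kth_return_time(path, s, k):
    """Time of the k-th return to state s.

    `path` is a list of observed states X_0, X_1, ..., X_N.
    The loop only looks at X_1, ..., X_n before deciding that the
    k-th return happened at time n -- it never peeks at the future.
    Returns None if fewer than k returns occur in the observed path.
    """
    returns = 0
    for n in range(1, len(path)):
        if path[n] == s:
            returns += 1
            if returns == k:
                return n
    return None


# A made-up trajectory started at s = 0, purely for illustration:
path = [0, 1, 0, 2, 0]
print(kth_return_time(path, s=0, k=1))  # → 2
print(kth_return_time(path, s=0, k=2))  # → 4
```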
My attempt:
If $k = 1$ then the time of the first return can be written as $T^1_s = \min\{n \geq 1 : X_n = s\}$, which is a stopping time: it takes values in $\mathbb{N}_0 \cup \{\infty\}$, and the event $\{T^1_s = n\} = \{X_1 \neq s, \dots, X_{n-1} \neq s, X_n = s\}$ depends only on $X_0, X_1, \dots, X_n$.
Now assume this holds for $k$. We want to show that $T_s^{k+1}$ is a stopping time. Since $T_s^k$ is a stopping time, I would argue that by the strong Markov property this holds for $k + 1$: after the state $X_{T_s^{k}} = s$ is reached, we are left with the same Markov chain starting in the same state $s$, so assuming that $T_s^{k} < \infty$, $T_s^{k+1}$ can be written as $T_s^k + T_s$, where $T_s$ is the first return time of the shifted chain.
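To make my induction step explicit (the shift operator $\theta$ is my notation, not the book's), I believe the decomposition and the resulting event can be written as:

```latex
% Decomposition of the (k+1)-st return time on the event that
% the k-th return happens:
\[
  T^{k+1}_s = T^k_s + T_s \circ \theta_{T^k_s}
  \qquad \text{on } \{T^k_s < \infty\},
\]
% and the stopping-time property can then be checked directly
% by splitting on the value of T^k_s (note T^k_s >= k):
\[
  \{T^{k+1}_s = n\}
  = \bigcup_{m=k}^{n-1} \{T^k_s = m\}
    \cap \{X_{m+1} \neq s, \dots, X_{n-1} \neq s,\ X_n = s\},
\]
% which, by the inductive hypothesis on {T^k_s = m},
% depends only on X_0, ..., X_n.
```

If this is right, it seems the direct event decomposition already gives the result without invoking the strong Markov property, but I am not sure my use of it above is sound.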
Is my reasoning correct?
Note: There is a similar question that already exists, but I don't quite understand the answer.