I'm reading about recurrence and transience in Markov chains. The author first defines the relevant quantities ($V_i$, the total number of visits to state $i$, counting the visit at time $0$; $T_i$, the first return time to $i$; and $f_i = \mathbb P_i(T_i<\infty)$, the return probability) and then presents a lemma:
Lemma 5.1. For all $k \geq 0, \mathbb P_{i}\left(V_{i} \geq k+1\right)=\left(f_{i}\right)^{k}$.
and proof:
Proof: This is true for $k=0$, since $\mathbb P_i(V_i \geq 1) = 1$ (the chain starts at $i$). Assume it is true for $k-1$. The time of the $k$th visit to $i$ is a stopping time, so by the strong Markov property $$ \begin{aligned} \mathbb P_{i}\left(V_{i} \geq k+1\right) &=\mathbb P_{i}\left(V_{i} \geq k+1 \mid V_{i} \geq k\right) \mathbb P_{i}\left(V_{i} \geq k\right) \\ &=\mathbb P_{i}\left(T_{i}<\infty\right)\left(f_{i}\right)^{k-1} \\ &=\left(f_{i}\right)^{k} \end{aligned} $$ Hence the lemma holds by induction.
Could you please explain how to get $\mathbb P_{i}\left(V_{i} \geq k+1 | V_{i} \geq k\right) = \mathbb P_{i}\left(T_{i}<\infty\right)$?
Thank you for your clarification!

The conditional probability $\mathbb P_{i}\left(V_{i} \geq k+1 \mid V_{i} \geq k\right)$ equals $\mathbb P_{i}(V_{i} > 1)$ by the strong Markov property and time homogeneity: given that the chain has already visited $i$ at least $k$ times, it restarts afresh from $i$ at the time of the $k$th visit, so the probability of at least one further visit is the same as the probability of at least one return when starting from $i$. And $\mathbb P_{i}(V_{i} > 1)$, the probability that the chain returns to $i$ at least once, is exactly $\mathbb P_{i}\left(T_{i}<\infty\right)$, the probability that the first return happens in finite time.
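If it helps to see the lemma numerically, here is a minimal simulation sketch. It assumes a toy two-state chain (not from the original text): from state $0$ the chain returns to $0$ with probability $f_0 = 0.5$ and otherwise jumps to an absorbing state, so $\mathbb P_0(T_0 < \infty) = 0.5$ by construction. The lemma predicts $\mathbb P_0(V_0 \geq k+1) = (0.5)^k$, which the empirical frequencies should approximate:

```python
import random

def run_chain(p_return, rng):
    """Simulate a two-state chain started at 0: from state 0 it
    returns to 0 with probability p_return, else it jumps to an
    absorbing state and never comes back. Returns V_0, the total
    number of visits to 0 (counting the start at time 0)."""
    visits = 1  # the start at time 0 counts as the first visit
    while rng.random() < p_return:  # each return occurs with prob f_0
        visits += 1
    return visits

rng = random.Random(0)
f = 0.5          # return probability f_0, fixed by the toy chain above
n_runs = 200_000
counts = [run_chain(f, rng) for _ in range(n_runs)]

for k in range(4):
    empirical = sum(v >= k + 1 for v in counts) / n_runs
    print(f"P(V_0 >= {k + 1}): empirical {empirical:.4f}  vs  f^{k} = {f ** k:.4f}")
```

So $V_0$ is geometric with parameter $f_0$, which is exactly the content of Lemma 5.1: each return is an independent "success" with probability $f_i$, by the restart argument above.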