Let $T$ be a discrete-time random variable describing the first return time of a Markov chain to its initial state $i$, where the state space is $S=\{0,1\}$.
Let $P\left[T \geq n\right]$ denote the tail probability of $T$ (the complementary cumulative distribution function).
By definition, this is $P\left[T \geq n\right] = \sum_{k \geq n} P\left[T = k\right] = 1 - \left(P\left[T=1\right] + \cdots + P\left[T=n-1\right]\right)$.
But how is the above equal to
$P\left[X_{1}=1, \dots, X_{n-1}=1 \mid X_{0}=0\right]$?
Is there a theorem or definition that I have missed?
If the first return time to state $0$ is $\geq n$, then the chain cannot have revisited state $0$ at any of the times $1, \dots, n-1$. Since the only other state is $1$, it must stay in state $1$ from time $1$ up to time $n-1$. So, given $X_0 = 0$, the events $\{T \geq n\}$ and $\{X_1 = 1, \dots, X_{n-1} = 1\}$ are the same event, and their probabilities coincide.
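As a sanity check, here is a minimal Monte Carlo sketch of this identity on a two-state chain. The transition probabilities `p01` and `p11` are arbitrary values chosen for illustration (they are not from the original post); since the two events are literally identical on $S=\{0,1\}$, the two empirical frequencies agree exactly.

```python
import random

random.seed(0)
p01 = 0.7   # assumed: P(next = 1 | current = 0)
p11 = 0.6   # assumed: P(next = 1 | current = 1)

def step(state):
    """One transition of the two-state chain."""
    if state == 0:
        return 1 if random.random() < p01 else 0
    return 1 if random.random() < p11 else 0

def simulate(n, trials=100_000):
    """Estimate P[T >= n | X_0 = 0] and P[X_1 = ... = X_{n-1} = 1 | X_0 = 0]."""
    count_T_ge_n = 0      # first return to 0 occurs at time n or later
    count_stay_in_1 = 0   # chain sits in state 1 at times 1, ..., n-1
    for _ in range(trials):
        path = [0]
        for _ in range(n - 1):
            path.append(step(path[-1]))
        # T >= n  iff  state 0 is never revisited at times 1, ..., n-1
        if 0 not in path[1:]:
            count_T_ge_n += 1
        if all(s == 1 for s in path[1:]):
            count_stay_in_1 += 1
    return count_T_ge_n / trials, count_stay_in_1 / trials

lhs, rhs = simulate(n=5)
print(lhs, rhs)  # identical: on {0,1} the two events coincide
```

The key point the code makes concrete: "never back at $0$ before time $n$" and "at $1$ for all of times $1, \dots, n-1$" are the same condition when there are only two states.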