concept clarification


While studying discrete-time Markov chains, I came across an equivalence I cannot understand:

Let $N(y)$ denote the number of times $n \geq 1$ that the chain is in state $y$, and let $T_y$ be the hitting time, $T_y = \min\{n > 0 : X_n = y\}$ (with $T_y = \infty$ if the chain never reaches $y$). A statement in my textbook says that the event $\{N(y) \geq 1\}$ is the same as the event $\{T_y < \infty\}$. Why is that?
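To make the two quantities concrete, here is a minimal simulation sketch (the two-state chain, the state names, and all function names are my own hypothetical choices, not from any textbook). For each finite sample path it computes $N(y)$, the number of times $n \geq 1$ the chain is in $y$, and $T_y$, the first $n > 0$ with $X_n = y$; on a finite path, "never hit within the horizon" stands in for $T_y = \infty$, so the check below only illustrates the equivalence pathwise, it does not prove it.

```python
import random

def simulate_chain(P, x0, steps, rng):
    """Simulate a discrete-time Markov chain with transition
    probabilities P (dict of dicts), starting at x0.
    Returns the path X_0, X_1, ..., X_steps."""
    path = [x0]
    for _ in range(steps):
        states, probs = zip(*P[path[-1]].items())
        path.append(rng.choices(states, weights=probs)[0])
    return path

def N(path, y):
    """Number of times n >= 1 with X_n = y on this path."""
    return sum(1 for x in path[1:] if x == y)

def T(path, y):
    """First n > 0 with X_n = y, or None if y is not hit
    within this (finite) path -- our stand-in for T_y = infinity."""
    for n, x in enumerate(path[1:], start=1):
        if x == y:
            return n
    return None

rng = random.Random(0)
# Hypothetical chain: from 'a' move to 'b' w.p. 0.3, stay w.p. 0.7; 'b' absorbing.
P = {'a': {'a': 0.7, 'b': 0.3}, 'b': {'b': 1.0}}

for _ in range(100):
    path = simulate_chain(P, 'a', 50, rng)
    # On every sample path: y is visited at least once
    # exactly when it is hit at some finite time.
    assert (N(path, 'b') >= 1) == (T(path, 'b') is not None)
```

The assertion never fires because both sides describe the same set of paths: a path visits $y$ at least once at some time $n \geq 1$ precisely when there is a first such time.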