Definitions of recurrence / transience (Discrete-time Markov Chains)


We have that the definition of state $i$ being transient is $\mathbb{P}(\text{the number of visits to $i$ is finite})=1$. Is this equivalent to saying $\mathbb{P}(\text{the first return time to $i$ is } \infty)>0$?
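For reference, here is a sketch of the standard argument relating the two conditions, via the strong Markov property (the notation $T_i$, $f_i$, $N_i$ is introduced here, not taken from the post):

```latex
\[
T_i = \inf\{n \ge 1 : X_n = i\}, \qquad
f_i = \mathbb{P}_i(T_i < \infty), \qquad
N_i = \#\{n \ge 1 : X_n = i\}.
\]
By the strong Markov property, each return to $i$ restarts the chain, so
\[
\mathbb{P}_i(N_i \ge k) = f_i^{\,k}, \qquad k = 0, 1, 2, \dots
\]
If $f_i < 1$, then $N_i$ is geometric and hence finite almost surely;
if $f_i = 1$, then $N_i = \infty$ almost surely. Therefore
\[
\mathbb{P}_i(N_i < \infty) = 1
\;\Longleftrightarrow\;
f_i < 1
\;\Longleftrightarrow\;
\mathbb{P}_i(T_i = \infty) > 0.
\]
```

So the two formulations of transience are indeed equivalent, and the dichotomy is exhaustive: $f_i = 1$ gives recurrence, $f_i < 1$ gives transience.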

Also, is it true that if $\mathbb{P}(\text{the first return time to $i$ is finite})=1$, then state $i$ is recurrent?
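The dichotomy can also be seen numerically. Below is a minimal sketch (my own illustration, not from the post) using the simple random walk on the integers: it is recurrent when the step distribution is symmetric ($p = 1/2$) and transient otherwise, where the exact return probability is $2\min(p, 1-p)$. The function name `estimate_return_prob` and the truncation horizon are assumptions of this sketch; the horizon only approximates the infinite-time event.

```python
import random

def estimate_return_prob(p, n_trials=5000, horizon=1000, seed=0):
    """Estimate f = P(walk started at 0 revisits 0) for a walk on the
    integers stepping +1 w.p. p and -1 w.p. 1-p, truncated at `horizon`
    steps as a proxy for the infinite-time return event."""
    rng = random.Random(seed)
    returned = 0
    for _ in range(n_trials):
        x = 0
        for _ in range(horizon):
            x += 1 if rng.random() < p else -1
            if x == 0:  # first return to the starting state
                returned += 1
                break
    return returned / n_trials

# Symmetric walk: recurrent, f = 1 (the estimate creeps toward 1 as the
# horizon grows, since some returns take a very long time).
print(estimate_return_prob(0.5))

# Biased walk: transient; the exact return probability is 2*min(p, 1-p),
# i.e. 0.6 for p = 0.7, so P(never returning) = 0.4 > 0.
print(estimate_return_prob(0.7))
```

In the transient case the estimate stabilizes near $0.6$ regardless of the horizon, while in the recurrent case it keeps climbing toward $1$, matching the equivalence $f_i = 1 \iff$ recurrent.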