We say that a state $i\in S$ (where $S$ is the state space of a Markov Chain) is recurrent iff $P_i[X_n=i \space\text{i.o.}]=1$ and transient iff $P_i[X_n=i \space\text{i.o.}]=0$.
My question is: can't we have anything in between? Why, or why not?
I know that recurrence happens if and only if $\sum_n p^{(n)}(i,i)=\infty$, where $p^{(n)}(i,i)$ is the $n$-step return probability. Fine! But why should it be that a state is either recurrent or transient? Can't it happen that the state, with positive probability, never returns, but also with positive probability, returns infinitely often?
The Markov property implies that the process has no memory beyond its current state. In particular, by the strong Markov property, successive excursions of the chain away from a state $i$ and back are independent, so the probability of returning from $i$ to itself at least $n$ times is the $n$-th power of the single-return probability.
Consider the probability $R(i)$ that, starting from state $i$, the process returns to $i$ at least once, i.e., the probability that the return time $T_i$ is finite.
If this probability $R(i)=P[T_i<\infty]$ equals $1$, then $i$ is recurrent (with itself as initial state), because the event of returning infinitely often is a countable intersection of events of probability $1$, and therefore itself has probability $1$.
If $R(i)$ is strictly less than $1$, then $i$ is transient (from whatever initial state), because the probabilities of coming back at least twice, three times, etc. form a convergent geometric sequence; the probability of coming back infinitely often is bounded by every term of that sequence, which converges to $0.$
More explicitly, the probability of coming back at least once to the starting state $i$ is $R(i).$
$$P[T_i<\infty]=R(i)$$
The probability of coming back at least twice is $R(i)$ times the probability, conditional on returning once, that the process comes back at least once after that; but that conditional probability is also $R(i)$, by the strong Markov property. So coming back at least twice has probability $R(i)^2.$
$$\eqalign{ P[T_i\circ\sigma_{T_i}<\infty]&=P[T_i<\infty]P[T_i\circ\sigma_{T_i}<\infty|T_i<\infty]\\ &=P[T_i<\infty]^2=R(i)^2\\ }$$
A similar reasoning leads to the conclusion that the probability of the event "$X_n$ comes back at least $k$ times" is $R(i)^k.$ The event "$X_n$ returns to $i$ infinitely often" is the intersection of those individual events for all positive integers $k.$
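Since these events are nested and decreasing in $k$, continuity from above gives the probability of the intersection as the limit, which settles the dichotomy:

$$P_i[X_n=i \ \text{i.o.}]=\lim_{k\to\infty}R(i)^k= \begin{cases} 1, & R(i)=1,\\ 0, & R(i)<1. \end{cases}$$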
The only way to get a probability strictly between $0$ and $1$ is to start somewhere other than $i$: the state $i$ may be recurrent from itself, while the probability of ever reaching $i$ from the initial state is some $p$ with $0<p<1$; the probability of visiting $i$ infinitely often is then $p$. As an example, consider a three-state process $X_n\in\{0,1,2\}$ with initial state $0$ and with the following transition probabilities: $p(0,1)=0.5;\ p(0,2)=0.5;\ p(1,1)=1;\ p(2,2)=1.$
$$\left(p(i,j)\right)_{i,j}=\left( \begin{matrix} 0&0.5&0.5\\ 0&1&0\\ 0&0&1 \end{matrix} \right)$$
If the initial state is $0$, then there is a probability of $50$ percent that the process visits state $1$ infinitely often (and likewise for state $2$). The transition matrix is idempotent, i.e., it equals its own $n$-th power for all $n\geq1.$
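Both claims (idempotence, and the $50$ percent figure) are easy to check numerically; here is a small sketch using NumPy (the variable names are mine, not from the text):

```python
import numpy as np

# Transition matrix of the three-state example:
# state 0 moves to 1 or 2 with probability 1/2 each;
# states 1 and 2 are absorbing.
P = np.array([
    [0.0, 0.5, 0.5],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

# Idempotence: P @ P == P, hence P^n == P for all n >= 1.
assert np.allclose(P @ P, P)

# Starting from 0, the probability of ever reaching state 1
# (and hence of visiting it infinitely often, since 1 is
# absorbing) is the (0, 1) entry of any power of P.
print(np.linalg.matrix_power(P, 10)[0, 1])  # 0.5
```

Since the entries are exact binary fractions, the check holds exactly in floating point, not just up to rounding.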
When discussing recurrent and transient states and processes, either such cases are excluded by imposing restrictions on the process (typically irreducibility), or the definition of recurrence/transience takes the state itself as the initial state (so that states $1$ and $2$ in the above example would be considered recurrent).