I'm looking through my notes and I've come across the following line:
If $\sum_{i \in I}\pi(i) = \infty$ then we (usually) say that the Markov chain doesn't have an invariant distribution.
My problem is with the "(usually)" part of this sentence. For a probability distribution we require $\sum_{i \in I}\pi(i) = 1$, so the "(usually)" seems rather misleading.
Can I get confirmation that I copied this down in error?
An invariant measure for a Markov chain satisfies
\begin{equation} \pi P = \pi, \end{equation}
where $\pi = [ \pi_x ]_{x \in I}$ is a row vector and $P = [p_{x,y}]_{x,y \in I}$ is the transition probability matrix.
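As a quick numerical illustration (using a toy 3-state chain of my own, not anything from the notes above), $\pi$ can be found as a left eigenvector of $P$ for eigenvalue $1$:

```python
import numpy as np

# Toy 3-state transition matrix (each row sums to 1); an arbitrary example.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# pi P = pi means pi^T is an eigenvector of P^T with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, k])

# The eigenvector is only determined up to a scalar; normalizing so the
# entries sum to 1 turns the invariant measure into a distribution.
pi /= pi.sum()

print(pi)                        # proportional to (1, 2, 1)
print(np.allclose(pi @ P, pi))   # check invariance: pi P = pi
```

For this finite chain the normalization is always possible, since the row vector has finitely many nonnegative entries; the subtlety in the question only arises for chains on infinite state spaces.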
If the Markov chain is irreducible and recurrent, then the invariant measure $\pi$ is unique up to scalar multiples. If the chain is moreover positive recurrent, $\pi$ has finite total mass, so you can normalize it to sum to $1$. (Aperiodicity is not needed here; it matters for convergence to $\pi$, not for existence or uniqueness.) So, an invariant distribution satisfies
\begin{equation} \pi P = \pi, \quad \pi \mathbf{1} = 1 \end{equation}
with $\mathbf{1}$ a column vector of ones of appropriate size. If every invariant measure has $\pi \mathbf{1} = \infty$ (as happens for an irreducible null recurrent chain), then you cannot normalize it to obtain an invariant distribution. That is what your note's "(usually)" is getting at: such a chain still has invariant measures, but one usually says it has no invariant distribution.
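A standard concrete example is the simple symmetric random walk on $\mathbb{Z}$, with $p_{x,x+1} = p_{x,x-1} = 1/2$. The constant measure $\pi_x = 1$ for all $x$ is invariant:
\begin{equation} (\pi P)_x = \tfrac{1}{2}\,\pi_{x-1} + \tfrac{1}{2}\,\pi_{x+1} = 1 = \pi_x, \end{equation}
but $\pi \mathbf{1} = \sum_{x \in \mathbb{Z}} 1 = \infty$, and the same holds for every scalar multiple. This chain is null recurrent: it has an invariant measure, but no invariant distribution.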