I had a long discussion with a friend of mine about whether this simple graph is ergodic. I think it is, because every state can be reached in a finite number of steps. My friend argues that it is not, because there are no state transitions from each node to itself. Is that required? If somebody has a few examples for each case, it would perhaps help me to understand it better --- other than the pure math...
A <----> B
Here is the transition matrix:
      A    B
A   0.0  1.0
B   1.0  0.0
Actually, your friend is right, the chain is not ergodic. By definition, a Markov chain with transition matrix $P$ is ergodic if there exists $n \geq 1$ such that
$$P^n > 0. \tag{1}$$
A direct calculation shows that, for any $n \geq 1$, $P^n$ equals either $P$ (if $n$ is odd) or the identity matrix
$$\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
(if $n$ is even). In either case some entry equals $0$, so $(1)$ fails for every $n \geq 1$.
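If you want to see this without doing the algebra by hand, here is a small sketch (using numpy, which is not part of the original discussion) that computes $P^n$ for the first few $n$ and checks whether every entry is strictly positive:

```python
import numpy as np

# Transition matrix of the two-state chain A <--> B.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Check the ergodicity condition (1): is P^n > 0 entrywise for some n?
# For this chain, P^n alternates between P (n odd) and the identity
# matrix (n even), so no power is strictly positive.
for n in range(1, 11):
    Pn = np.linalg.matrix_power(P, n)
    print(n, (Pn > 0).all())
```

Every line printed ends in `False`, matching the calculation above: the chain cycles with period 2 and condition $(1)$ never holds.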
Intuition: Ergodic means that we can find $n \geq 1$ such that $\mathbb{P}^i(X_n = j)>0$ for any two states $i,j$, i.e. if the process starts at $X_0=i$ it reaches $j$ in $n$ steps with probability $>0$. In the given example, this is not satisfied. For example, for $i=A$, we have either $\mathbb{P}^A(X_n = A)=0$ (if $n$ is odd) or $\mathbb{P}^A(X_n=B)=0$ (if $n$ is even).
Remark: A weaker notion is the so-called "irreducibility" of a Markov chain; there, it is only required that $\mathbb{P}^i(X_n = j)>0$ for some $n$ which may depend on $i$ and $j$. It is not difficult to see that the given Markov chain is irreducible (because, as you say, every state can be reached in a finite number of steps).
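Since you asked for examples of each case: one standard way to turn this chain into an ergodic one is to add self-loops, i.e. make the chain "lazy" so it stays put with some probability. This is an illustrative sketch (the lazy chain and the probability $1/2$ are my choice, not part of your original example):

```python
import numpy as np

# Original periodic chain A <--> B.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Lazy version: with probability 1/2 stay in the current state,
# otherwise move according to P. The self-loops break the
# period-2 cycling.
P_lazy = 0.5 * np.eye(2) + 0.5 * P

# Already n = 1 gives a strictly positive matrix, so condition (1)
# holds and the lazy chain is ergodic.
print((P_lazy > 0).all())
```

So the self-transitions are exactly what your friend was pointing at: they remove the periodicity while keeping the chain irreducible.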