Let $(X_n)$ be a Markov chain with state space $S$. For $i\in S$ my text defines $$N_i:=\sum_{n=0}^\infty \mathbf 1_{\{ X_n=i \}}$$ and then, as part of a larger proof, claims that $$\mathbb E_i(N_i)=\sum_{n=0}^\infty \mathbb P_i(X_n=i)$$ where $\mathbb E_i$ is the expectation with respect to $\mathbb P_i:=\mathbb P(\cdot\mid X_0=i)$.
It must be very trivial but how do we get it? $$\mathbb E_i(N_i) = \int N_i \ \text{d}\mathbb P_i = \ ? $$
Perhaps this will help:
Claim: If $E$ is any (measurable) event and $\Bbb{P}$ is a probability, then $\Bbb{E}[{\bf 1}_E] = \Bbb{P}(E)$ where $\Bbb{E}$ is the expectation with respect to the probability $\Bbb{P}$.
Proof: Since ${\bf 1}_E$ takes on only the values $1$ and $0$ (it is a Bernoulli random variable), we have $$ \Bbb{E}[{\bf 1}_E] = 0 \cdot \Bbb{P}({\bf 1}_E =0) + 1 \cdot \Bbb{P}({\bf 1}_E = 1) = \Bbb{P}({\bf 1}_E = 1) = \Bbb{P}(E). $$
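Applying the claim termwise, and using Tonelli's theorem (the summands are nonnegative, so expectation and infinite sum may be interchanged), gives exactly the identity in the question:

```latex
\mathbb E_i(N_i)
  = \mathbb E_i\!\Big[\sum_{n=0}^\infty \mathbf 1_{\{X_n=i\}}\Big]
  = \sum_{n=0}^\infty \mathbb E_i\big[\mathbf 1_{\{X_n=i\}}\big]
  = \sum_{n=0}^\infty \mathbb P_i(X_n=i).
```

Note that $N_i$ may be infinite (e.g. when $i$ is recurrent); Tonelli still applies, with both sides equal to $+\infty$ in that case.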
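As a sanity check, here is a small numerical experiment (not from the original post; the chain, the parameter `p`, and all names are illustrative). On a two-state chain where state $0$ is held with probability $p$ and state $1$ is absorbing, $N_0$ started from $0$ is geometric, and both sides of the identity equal $1/(1-p)$:

```python
import random

# Toy chain on {0, 1}: from state 0 stay with probability p,
# jump to the absorbing state 1 with probability 1 - p.
p = 0.3

def visits_to_0(rng):
    """Simulate the chain started at 0; return N_0 = total visits to 0."""
    count, state = 0, 0
    while state == 0:
        count += 1
        state = 0 if rng.random() < p else 1
    return count

rng = random.Random(0)
trials = 200_000

# Left-hand side: Monte Carlo estimate of E_0[N_0].
lhs = sum(visits_to_0(rng) for _ in range(trials)) / trials

# Right-hand side: sum_n P_0(X_n = 0) = sum_n p^n, truncated
# (the tail beyond n = 200 is negligible for p = 0.3).
rhs = sum(p**n for n in range(200))

print(lhs, rhs)  # both should be close to 1/(1 - p) = 1/0.7
```

The agreement of the two numbers illustrates the interchange of expectation and sum on this example; it is of course no substitute for the Tonelli argument.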