I was reading a proof involving Markov chains on a finite state space $E$. Denote $p_{ij}(n) = P(X_n = j \mid X_0 = i)$.
Since the state space is finite, the probability of being somewhere in the state space after $n$ steps is $1$, that is: $$\sum_{j\in E} p_{ij}(n) = 1 \qquad \forall n \in \mathbb N.$$ The next line uses this to conclude: $$\lim_{n\to\infty} \sum_{j\in E} p_{ij}(n) = 1,$$ which seems logical, but from a strict mathematical point of view, why is this true?
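To see the row-sum fact numerically, here is a minimal sketch (the transition matrix `P` is a made-up $3$-state example, not from the proof; $p_{ij}(n)$ is the $(i,j)$ entry of $P^n$ by the Chapman–Kolmogorov equations):

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# p_{ij}(n) is the (i, j) entry of P^n, so each row of P^n
# should sum to 1 for every n.
for n in (1, 5, 50):
    Pn = np.linalg.matrix_power(P, n)
    assert np.allclose(Pn.sum(axis=1), 1.0)
```

Of course this only checks finitely many $n$; the question is about passing to the limit.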
One could say that the sequence is constant, so its limit equals that constant value. Consider $f(n) = 1$ for all $n$; then clearly the limit is $1$ itself.
On the other hand, consider the following function:
$$f(n) = I\bigg(\frac{1}{n} \ne 0\bigg)$$
where $I$ is the indicator function. Clearly $f(n) = 1$ for all non-zero integers, so it's constant, but since $\frac{1}{n} \to 0$, it would seem that $\lim_{n\to\infty} f(n) = I(0 \ne 0) = 0$.
So although $f(n)$ is constant, its limit doesn't equal that constant value...
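As a quick sanity check on this example, here is a minimal sketch (the helper `f` below is just my encoding of the indicator; note that in exact arithmetic $\frac{1}{n} \ne 0$ for every positive integer $n$, and this also holds in floating point for the moderate values tested here):

```python
# Indicator-function example: f(n) = I(1/n != 0).
def f(n: int) -> float:
    return 1.0 if 1 / n != 0 else 0.0

# For each tested positive integer n, 1/n is a nonzero value,
# so f(n) evaluates to 1.
values = [f(n) for n in (1, 10, 10**6, 10**15)]
# → [1.0, 1.0, 1.0, 1.0]
```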
My question is: how does one prove that the first limit is $1$? Also, is there a problem with my example using the indicator function?