In the passage below, I was wondering what $\mathbf{1}$ signifies:
How much time does a Markov chain spend in state $i$, in the long term? That is, what is the long term fraction of time that $X_n=i$? We can write this long term fraction of time as $\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=0}^{n-1} \mathbf{1}\{X_m=i\}$
The text goes on to say that $\frac{1}{n} \sum_{m=0}^{n-1} \mathbf{1}\{X_m=i\}$ counts the number of steps among $\{0,1,\ldots,n-1\}$ where $X_m = i$; however, not knowing what the notation $\mathbf{1}$ means, I have been unable to see this.
I would greatly appreciate an explanation of the meaning of $\mathbf{1}$ and of how it makes this expression count the number of steps in that set where $X_m = i$.
$\mathbf 1\{X_m = i\}$ is the indicator function or characteristic function of the event $\{X_m = i\}$, defined by $$\mathbf 1\{X_m = i\} = \begin{cases} 1 & X_m = i, \\ 0 & X_m \ne i.\end{cases}$$ So $\sum\limits_{m=0}^{n-1} \mathbf 1\{X_m = i\}$ counts the number of visits by the chain to state $i$ up to time $n-1$. Thus $\frac 1n \sum\limits_{m=0}^{n-1} \mathbf 1\{X_m = i\}$ is the proportion of time before $n$ spent in state $i$.
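To make the counting concrete, here is a small sketch (the sample path and state labels are made up for illustration) showing that summing indicators over a path counts visits to $i$, and dividing by $n$ gives the proportion of time:

```python
# A hypothetical realization X_0, ..., X_{n-1} of a chain on states {0, 1, 2}.
path = [0, 1, 1, 2, 1, 0, 1, 1]
i = 1
n = len(path)

# 1{X_m = i} is 1 when X_m == i and 0 otherwise.
indicators = [1 if x == i else 0 for x in path]

visits = sum(indicators)   # number of m in {0, ..., n-1} with X_m = i
fraction = visits / n      # proportion of time before n spent in state i

print(indicators)          # [0, 1, 1, 0, 1, 0, 1, 1]
print(visits, fraction)    # 5 0.625
```

Each term of the sum contributes $1$ exactly when the chain sits in state $i$ at that step, which is why the sum counts visits.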
In the context of Markov chains, given some irreducible chain with invariant distribution $\pi$, the ergodic theorem states that $$\frac 1n \sum_{m=0}^{n-1} \mathbf 1\{X_m = i\}\to \pi(i) \quad \text{as } n \to \infty $$ with probability $1$. That is, the long-term proportion of time spent in each state is described precisely by the invariant distribution.