Say $S=\{1,2,\dots, n\}$, with generic elements $i$ and $j$ (states of a Markov chain, if you wish). I am trying to compute the limit of a random variable $x_{i,t}$ as $t\to\infty$, and I end up with the following relationship:
\begin{equation} \displaystyle \lim_{t\to\infty} x_{i,t} = k_{1} + k_{2} \lim_{t\to\infty}\sum_{j=1}^{n}W_{ij} \, \frac{1}{t} \sum_{k=0}^{t-1} \mathbb{1}\left\{x_{j,t-k-1} < \frac{1}{2}\right\}, \end{equation}
where $k_{1}$ and $k_{2}$ are two constants, $W$ is a square row-stochastic matrix (the transition probability matrix of a Markov chain, if you wish), and $\mathbb{1}\{\cdot\}$ is the standard indicator function. The initial values $x_{i,0}$ are given for all $i$.
As you can see, the indicator on the right-hand side is evaluated along the same process that appears on the left-hand side. Moreover, it is possible to see that if $x_{i,t}$ converges to some fixed $\bar{x}$ for all $i$, then the time average of the indicators converges to either $0$ or $1$ (namely to $\mathbb{1}\{\bar{x}<\frac{1}{2}\}$, provided $\bar{x}\neq\frac{1}{2}$), so the limit above is either $k_{1}$ or $k_{1}+k_{2}$, depending on the initial values $x_{i,0}$ (and potentially on some conditions on $W$?). It sounds like some type of Kolmogorov 0-1 law.
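To make the dichotomy concrete, here is a quick numerical sanity check under an *assumed* explicit recursion: I take the displayed relation literally as an update rule, $x_{i,t} = k_{1} + k_{2}\sum_{j}W_{ij}\,\frac{1}{t}\sum_{k=0}^{t-1}\mathbb{1}\{x_{j,t-k-1}<\frac{1}{2}\}$. The actual dynamics generating $x_{i,t}$ are not specified above, so this recursion, the constants $k_1=0.6$, $k_2=-0.3$, and the randomly drawn $W$ are all illustrative choices of mine; I picked $k_2<0$ precisely so that both candidate limits are self-consistent ($k_1\ge\frac12$ makes indicator frequency $0$ consistent, $k_1+k_2<\frac12$ makes frequency $1$ consistent).

```python
import numpy as np

# Illustrative constants (NOT given in the question), chosen with k2 < 0 so
# that both candidate limits are self-consistent:
#   k1      = 0.6 >= 1/2  -> limiting indicator frequency 0 is consistent,
#   k1 + k2 = 0.3 <  1/2  -> limiting indicator frequency 1 is consistent.
k1, k2 = 0.6, -0.3
n = 4

rng = np.random.default_rng(0)
W = rng.random((n, n))
W /= W.sum(axis=1, keepdims=True)   # square row-stochastic matrix

def run(x0, T=2000):
    """Iterate x_{i,t} = k1 + k2 * sum_j W_ij * (running freq. of x_{j,s} < 1/2)."""
    x = np.array(x0, dtype=float)
    counts = np.zeros(n)            # counts[j] = #{ s < t : x_{j,s} < 1/2 }
    for t in range(1, T + 1):
        counts += (x < 0.5)
        x = k1 + k2 * (W @ (counts / t))
    return x

print(run([0.7] * n))   # all start above 1/2 -> trajectory stays at k1 = 0.6
print(run([0.2] * n))   # all start below 1/2 -> trajectory stays at k1 + k2 = 0.3
```

With these constants the trajectory is locked on one side of $\frac12$ by the initial profile, so the limit is $k_1$ or $k_1+k_2$ exactly as described. For other choices (e.g. $k_2>0$ with $k_1<\frac12\le k_1+k_2$) the indicator frequency need not settle at $0$ or $1$ at all, which is why the convergence hypothesis on $x_{i,t}$ matters.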
What is the proper formal/rigorous way of proving this? I am not a mathematician, and any help would be greatly appreciated. :)