Let $S$ be a set of states for a Markov chain and let $S^C$ be the remaining states. Explain the identity $$\sum_{i\in S}\sum_{j\in S^C}\pi_iP_{ij}=\sum_{i\in S^C}\sum_{j\in S}\pi_iP_{ij}$$
I know that the LHS is the long-run proportion of transitions going from a state in $S$ to a state in $S^C$, and vice versa for the RHS, but how do I explain why they are equal? A qualitative answer would be sufficient.
Two facts. First, $\pi$ is the stationary (long-run) distribution, so $$\sum_i\pi_iP_{ij}=\pi_j$$
Second, each row of $P$ sums to 1: $$\sum_jP_{ij}=1$$
So \begin{align*} \sum_{j\in S^c}\sum_{i\in S}\pi_iP_{ij}&=\sum_{j\in S^c}\Big(\pi_j-\sum_{i\in S^c}\pi_i P_{ij}\Big)\\ &=\sum_{j\in S^c}\pi_j-\sum_{i\in S^c}\pi_i\sum_{j\in S^c}P_{ij}\\ &=\sum_{j\in S^c}\pi_j-\sum_{i\in S^c}\pi_i\Big(1-\sum_{j\in S}P_{ij}\Big)\\ &=\sum_{j\in S^c}\pi_j-\sum_{i\in S^c}\pi_i+\sum_{i\in S^c}\sum_{j\in S}\pi_i P_{ij}\\ &=\sum_{i\in S^c}\sum_{j\in S}\pi_i P_{ij},\end{align*} where the last step uses the fact that $\sum_{j\in S^c}\pi_j$ and $\sum_{i\in S^c}\pi_i$ are both the total stationary probability of $S^c$, so they cancel.
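The identity is also easy to verify numerically. Here is a quick sketch on a randomly generated 4-state chain; the specific chain and the cut $S=\{0,1\}$ are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)   # make each row sum to 1

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# i.e. the dominant eigenvector of P transposed, normalized to sum to 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

S = [0, 1]    # an arbitrary subset of states
Sc = [2, 3]   # its complement

flow_out = sum(pi[i] * P[i, j] for i in S for j in Sc)   # S -> S^c
flow_in = sum(pi[i] * P[i, j] for i in Sc for j in S)    # S^c -> S
print(np.isclose(flow_out, flow_in))  # prints True
```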
For an intuitive answer, consider the two-state chain $P=\begin{pmatrix}1-p&p\\q&1-q\end{pmatrix}$ (rows sum to 1, matching the convention above), with $S=\{1\}$ and $S^c=\{2\}$, so $p$ is the probability of going $S\to S^c$ and $q$ of going $S^c\to S$. Its stationary distribution is $(q,p)/(p+q)$, so the long-run probability of a transition from $S$ to $S^c$ is $\frac{q}{p+q}\cdot p=\frac{pq}{p+q}$, the same as going back. Even if $p$ is large and $q$ is small, the chain then spends most of its time in $S^c$, and the two effects balance out exactly.
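The two-state example can be checked directly; the values of $p$ and $q$ below are arbitrary, chosen to make $p$ large and $q$ small:

```python
import numpy as np

p, q = 0.8, 0.1                    # large p (S -> S^c), small q (S^c -> S)
P = np.array([[1 - p, p],
              [q, 1 - q]])         # row-stochastic: rows sum to 1
pi = np.array([q, p]) / (p + q)    # claimed stationary distribution

assert np.allclose(pi @ P, pi)     # it is indeed stationary

flow_12 = pi[0] * P[0, 1]          # long-run rate of S -> S^c transitions
flow_21 = pi[1] * P[1, 0]          # long-run rate of S^c -> S transitions
assert np.isclose(flow_12, flow_21)            # the flows balance...
assert np.isclose(flow_12, p * q / (p + q))    # ...and both equal pq/(p+q)
```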