I was thinking about a problem in combinatorics and arrived at the idea of estimating the joint probability distribution of several quantities attached to a Markov chain. Let $(X_n)_{1 \leq n \leq N}$ be a Markov chain with finite state space $S$. The quantities I am interested in have the form $$\sum_{1 \leq n_1 < n_2 < \cdots < n_k \leq N} f(X_{n_1}, X_{n_2}, \cdots, X_{n_k})$$ for some function $f : S^k \rightarrow \{ +1, 0, -1 \}$.
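For concreteness, here is a minimal brute-force sketch of such a statistic; the chain, the function $f$, and the value of $k$ below are placeholder choices of mine, not part of the question:

```python
import itertools
import random

def chain_statistic(xs, f, k):
    # Brute-force sum of f(X_{n_1}, ..., X_{n_k}) over all index tuples
    # n_1 < n_2 < ... < n_k.  Cost is C(N, k) evaluations, so small N only.
    # itertools.combinations emits value tuples in increasing index order.
    return sum(f(*tup) for tup in itertools.combinations(xs, k))

# Toy instance: state space S = {+1, -1}, k = 2, f(a, b) = a * b.
random.seed(0)
xs = [random.choice([+1, -1]) for _ in range(12)]
value = chain_statistic(xs, lambda a, b: a * b, k=2)
```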
These are very complicated sums, so I would like to start with a simplified model. Let $(X_n)_{1 \leq n \leq N}$ be i.i.d. random variables taking the values $\pm 1$ with probability $1/2$ each, and consider the two quantities $$S_N = \sum_{1 \leq n \leq N} X_n \quad \text{and} \quad T_N = \sum_{1 \leq n \leq N-1} X_n X_{n+1}.$$ Of course, they are not independent: $S_N = \pm N$ forces $T_N = N-1$, and vice versa. But for moderate values of $S_N$ and $T_N$ (say, of magnitude $O(\sqrt N)$), I expect them to be nearly independent, in the sense that $$\mathbb{P}(S_N = s, T_N = t) \approx \mathbb{P}(S_N = s) \cdot \mathbb{P}(T_N = t) \quad (s, t = O(\sqrt N)),$$ where $s \equiv N$ and $t \equiv N - 1 \pmod 2$, so that both probabilities are positive. (Note that $S_N$ and $T_N$ are at least uncorrelated: every term $\mathbb{E}[X_n X_m X_{m+1}]$ in the expansion of $\mathbb{E}[S_N T_N]$ vanishes.)

How close to independent are $S_N$ and $T_N$ in the limit $N \rightarrow \infty$? Would it be possible to bound $$\left| \frac{\mathbb{P}(S_N = s, T_N = t)}{\mathbb{P}(S_N = s) \cdot \mathbb{P}(T_N = t)} - 1 \right|,$$ or the mutual information of $S_N$ and $T_N$? (Or any other measure of dependence?)
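Not an answer, but an exact computation for small $N$ lets one inspect the ratio directly; $N = 12$ and the test point $(s, t) = (0, 1)$ below are arbitrary choices of mine:

```python
from itertools import product
from collections import Counter

# Exact enumeration over all 2^N sign sequences: compare the joint law
# of (S_N, T_N) with the product of its marginals.
N = 12
joint = Counter()
for xs in product([+1, -1], repeat=N):
    s = sum(xs)
    t = sum(xs[i] * xs[i + 1] for i in range(N - 1))
    joint[(s, t)] += 1

total = 2 ** N
marg_s = Counter()
marg_t = Counter()
for (s, t), c in joint.items():
    marg_s[s] += c
    marg_t[t] += c

# Ratio P(S=s, T=t) / (P(S=s) P(T=t)) at one "typical" point, respecting
# the parity constraints s ≡ N and t ≡ N-1 (mod 2).
s, t = 0, 1
ratio = (joint[(s, t)] / total) / ((marg_s[s] / total) * (marg_t[t] / total))
```

Both marginals are binomial: $\mathbb{P}(S_{12} = 0) = \binom{12}{6}/2^{12}$, and since $Y_n = X_n X_{n+1}$ are themselves i.i.d. uniform signs, $\mathbb{P}(T_{12} = 1) = \binom{11}{6}/2^{11}$. For me the ratio at $(0, 1)$ already comes out within a few percent of $1$.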