Say I have a sequence of random (real-valued) variables $(X_i)_{i \in \mathbb{N}}$ such that $E[X_i] = \mu$ and $Var(X_i)=\sigma^2$ for all $i$. Furthermore, say there is some $k \in \mathbb{N}$ such that $\rho(X_i,X_j) = 0$ for $|i-j|>k$, i.e. the random variables are uncorrelated as soon as they are far enough apart in the sequence (more than $k$ apart, to be precise). Now I want to show that the sequence $(X_i)_{i \in \mathbb{N}}$ obeys the weak law of large numbers, i.e. $$\lim_{n \to \infty} P \left( \left|\frac{1}{n}\sum_{i=1}^{n}(X_i-E[X_i]) \right| \geq \epsilon \right)= 0.$$
My guess is that I have to tweak the standard proof somehow. There are proofs available here on Stack Exchange for similar hypotheses, but none quite like the one I'm looking for.
$\newcommand{\P}{\mathbb{P}} \newcommand{\E}{\mathbb{E}} \newcommand{\Var}{\text{Var}} \newcommand{\Cov}{\text{Cov}}$ So, you want to tweak the standard proof somehow? Okay. Recall Chebyshev's inequality first: \begin{align} \P(|X|\geq a)\leq \frac{\E[X^2]}{a^2} \end{align} for $a>0$. Also recall that: \begin{align} \Var\left( \sum_{i=1}^nX_i\right) = \sum_{i=1}^n\sum_{j=1}^n \Cov(X_i,X_j). \end{align} We now have every tool we need. By Chebyshev's inequality, \begin{align} \P\left(\bigg|\frac{1}{n}\sum_{i=1}^n(X_i-\mu)\bigg|\geq \epsilon\right)\leq \frac{\E\bigg[\frac{1}{n^2}\left(\sum_{i=1}^n(X_i-\mu)\right)^2\bigg]}{\epsilon^2}. \end{align} The RHS can be written as: \begin{align} \frac{\E\bigg[\frac{1}{n^2}\left(\sum_{i=1}^n(X_i-\mu)\right)^2\bigg]}{\epsilon^2} = \frac{1}{\epsilon^2 n^2}\Var\left( \sum_{i=1}^nX_i\right) = \frac{1}{\epsilon^2 n^2}\sum_{i=1}^n\sum_{j=1}^n \Cov(X_i,X_j). \end{align} Now recall from probability theory that $\rho(X,Y)=\frac{\Cov(X,Y)}{\sqrt{\Var(X)\Var(Y)}}$. Hence for random variables with nonzero variance we have $\rho=0$ iff $\Cov=0$. So only the terms with $|i-j|\leq k$ survive: \begin{align} \sum_{i=1}^n\sum_{j=1}^n \Cov(X_i,X_j) = \sum_{i=1}^n\ \sum_{j=\max\{i-k,\,1\}}^{\min\{i+k,\,n\}} \Cov(X_i,X_j). \end{align} We also have $\Cov(X,Y)\leq \sqrt{\Var(X)\Var(Y)}$ by Cauchy-Schwarz, so each covariance is at most $\sigma^2$, and each inner sum has at most $2k+1$ terms. So: \begin{align} \sum_{i=1}^n\ \sum_{j=\max\{i-k,\,1\}}^{\min\{i+k,\,n\}} \Cov(X_i,X_j) \leq (2k+1)n\sigma^2. \end{align} Putting everything together: \begin{align} \P\left(\bigg|\frac{1}{n}\sum_{i=1}^n(X_i-\mu)\bigg|\geq \epsilon\right)\leq \frac{(2k+1)\sigma^2}{n\epsilon^2}. \end{align} Taking the limit as $n\to\infty$ gives the answer.
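As a numerical sanity check (not part of the proof), here is a quick simulation in Python. The moving-average construction below, and all the parameter values, are my own illustrative choices: taking $X_i = Z_i + \dots + Z_{i+k}$ with iid $Z_m \sim N(0, 1/(k+1))$ gives $E[X_i]=0$, $\Var(X_i)=1$, and $\Cov(X_i,X_j)=0$ whenever $|i-j|>k$, exactly the hypothesis above.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, eps, trials = 3, 2000, 0.1, 2000
mu, sigma2 = 0.0, 1.0

# Moving-average construction: X_i = Z_i + ... + Z_{i+k} with iid
# Z_m ~ N(0, 1/(k+1)), so Var(X_i) = 1 and X_i, X_j are uncorrelated
# (here even independent) as soon as |i - j| > k.
Z = rng.normal(0.0, np.sqrt(1.0 / (k + 1)), size=(trials, n + k))
X = np.stack([Z[:, m:m + n] for m in range(k + 1)], axis=0).sum(axis=0)

# Empirical P(|sample mean - mu| >= eps) over many trials, versus
# the bound (2k+1) * sigma^2 / (n * eps^2) derived above.
empirical = np.mean(np.abs(X.mean(axis=1) - mu) >= eps)
bound = (2 * k + 1) * sigma2 / (n * eps**2)
print(empirical, bound)
```

With these values the bound is $0.35$, and the empirical probability comes out well below it; increasing $n$ drives both toward zero, as the WLLN predicts.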