Let $(X_n)$ be i.i.d. random variables in $L^2$ with $E(X_1)=0$, and let $c$ be a fixed positive integer. Then $\frac 1n \sum_{k=c+1}^n X_k X_{k-c}$ converges almost surely to $0$.
I need this statement to prove a result about time series. It looks very much like a consequence of the law of large numbers, but the products $(X_n X_{n-c})$ are identically distributed without being independent: if $X_{n}(\omega)=0$ then both $(X_n X_{n-c})(\omega)= 0$ and $(X_{n+c} X_{n})(\omega)= 0$, so products whose indices differ by $c$ are dependent, which makes things difficult. I'm quite clueless, so any hint is welcome.
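A quick numerical sanity check (not part of any argument; the standard normal distribution and the value $c=3$ below are arbitrary choices, not part of the statement) does suggest the average goes to $0$:

```python
# Simulate (1/n) * sum_{k=c+1}^n X_k X_{k-c} for centered i.i.d. X_k
# and watch the average shrink as n grows.
import numpy as np

rng = np.random.default_rng(0)
c = 3
for n in (10**3, 10**4, 10**5, 10**6):
    x = rng.standard_normal(n)        # i.i.d., mean 0, finite variance
    avg = np.dot(x[c:], x[:-c]) / n   # (1/n) * sum_{k=c+1}^{n} X_k X_{k-c}
    print(n, avg)
```

The printed values shrink roughly like $n^{-1/2}$, consistent with the claim.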
Write $$ \sum_{k=c+1}^n X_k X_{k-c}=\sum_{a=0}^{2c-1}\ \sum_{i\,:\,c+1\leqslant 2ci+a\leqslant n}X_{2ci+a}X_{2ci+a-c}, $$ and note that for any fixed $a$, the sequence $\left(X_{2ci+a}X_{2ci+a-c}\right)_{i\geqslant 1}$ is i.i.d. and centered: the index pairs $\{2ci+a-c,\,2ci+a\}$ are pairwise disjoint as $i$ varies (consecutive pairs are $2c$ apart, while the gap inside a pair is only $c$), and $E\left(X_{2ci+a}X_{2ci+a-c}\right)=E\left(X_{2ci+a}\right)E\left(X_{2ci+a-c}\right)=0$ by independence.
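For completeness, here is how the strong law of large numbers finishes the argument from this hint (the counting notation $N_a(n)$ is introduced here for clarity, it is not in the answer above). Each product $X_{2ci+a}X_{2ci+a-c}$ is integrable by Cauchy–Schwarz since $X_1\in L^2$. Writing $N_a(n)$ for the number of indices $i$ with $c+1\leqslant 2ci+a\leqslant n$, the strong law applied to each of the $2c$ i.i.d. centered subsequences gives $$ \frac1{N_a(n)}\sum_{i\,:\,c+1\leqslant 2ci+a\leqslant n}X_{2ci+a}X_{2ci+a-c}\xrightarrow[n\to\infty]{\text{a.s.}}0, $$ and since $N_a(n)/n\to\frac1{2c}$, $$ \frac1n\sum_{k=c+1}^{n}X_kX_{k-c}=\sum_{a=0}^{2c-1}\frac{N_a(n)}{n}\cdot\frac1{N_a(n)}\sum_{i\,:\,c+1\leqslant 2ci+a\leqslant n}X_{2ci+a}X_{2ci+a-c}\xrightarrow[n\to\infty]{\text{a.s.}}0. $$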