Mean square convergence of a time series with given acvf


Let $(X_t)_{t\in \mathbb{Z}}$ be a stationary time series with $E[X_t]=0$ and autocovariance function (acvf)

$$\gamma(h)=\mathbf{1}_{\{h=0\}}(h)+\rho\,\mathbf{1}_{\{h\neq 0\}}(h), \qquad \rho \in (0,1),$$

i.e. $\gamma(0)=1$ and $\gamma(h)=\rho$ for all $h\neq 0$.

Show that $\frac{1}{n}\sum_{j=1}^nX_{-j}$ converges in $L^2$.

I tried to compute the $L^2$ limit directly, but got $0$ as a result, which cannot be right: the limit is supposed to be a random variable with expectation $0$ and variance $\rho$. Indeed, $\operatorname{Var}\!\left(\frac{1}{n}\sum_{j=1}^n X_{-j}\right) = \frac{1}{n^2}\left[n\gamma(0) + n(n-1)\rho\right] = \frac{1}{n} + \frac{n-1}{n}\rho \to \rho$, so the variance does not vanish.
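As a numerical sanity check, here is a minimal simulation. It assumes one concrete process with exactly this acvf (a construction I chose for illustration, not given in the problem): $X_t = \sqrt{\rho}\,Z + \sqrt{1-\rho}\,\varepsilon_t$ with $Z, \varepsilon_t$ iid standard normal, which gives $\gamma(0)=\rho+(1-\rho)=1$ and $\gamma(h)=\rho$ for $h\neq 0$. By stationarity, simulating indices $1,\dots,n$ is equivalent to $-1,\dots,-n$.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n, reps = 0.4, 200, 20000

# Hypothetical process with the stated acvf:
# X_t = sqrt(rho)*Z + sqrt(1-rho)*eps_t, Z and eps_t iid N(0,1),
# so gamma(0) = 1 and gamma(h) = rho for h != 0.
Z = rng.standard_normal((reps, 1))        # one shared Z per replication
eps = rng.standard_normal((reps, n))      # idiosyncratic noise
X = np.sqrt(rho) * Z + np.sqrt(1 - rho) * eps

# Sample mean (1/n) * sum_j X_j for each of the `reps` replications.
means = X.mean(axis=1)

# Theory: Var = 1/n + ((n-1)/n)*rho, which tends to rho as n grows.
print(means.var(), 1 / n + (n - 1) / n * rho)
```

For $n=200$ and $\rho=0.4$ the theoretical variance is $0.005 + 0.398 = 0.403$, and the empirical variance of the sample means matches it closely, consistent with a nondegenerate $L^2$ limit of variance $\rho$.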