I'm having trouble understanding a step in the derivation of the mean squared error of an estimator.
Let $\{X_t\}$ be a stationary time series with mean $\mu$, and consider the sample-mean estimator $$\overline{X}_n=\frac{1}{n}(X_1+X_2+\dots +X_n).$$ The mean squared error of this estimator is $$E[\overline{X}_n-\mu]^2=Var(\overline{X}_n),$$ since $\overline{X}_n$ is an unbiased estimator of $\mu$. Then
$$Var(\overline{X}_n)=\frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n Cov(X_i,X_j)\qquad (1)$$
$$=\frac{1}{n^2}\sum_{i-j=-n}^n (n-|i-j|)\gamma(i-j)\qquad (2)$$
$$=\frac{1}{n}\sum_{h=-n}^n \Big(1-\frac{|h|}{n}\Big)\gamma (h)$$
where $\gamma (i-j)=Cov(X_{t+(i-j)},X_t)$ and $\gamma(h)=Cov(X_{t+h},X_t)$ (autocovariance function).
I can't figure out how they get from (1) to (2).
I checked the case $n=1$ and it works, but I don't understand the step in general: in (1) the sum has $n^2$ terms, while in (2) there are only $2n+1$ terms.
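For what it's worth, the two expressions do agree numerically for larger $n$ as well. Here is a quick check (the autocovariance $\gamma(h)=0.5^{|h|}$ is just an arbitrary example I picked, not from the book):

```python
# Compare (1), the full double sum over Cov(X_i, X_j) = gamma(i - j),
# with (2), the sum grouped by lag h = i - j, for a toy autocovariance.
def gamma(h):
    return 0.5 ** abs(h)  # arbitrary example autocovariance

n = 7
# (1): double sum with n^2 terms
lhs = sum(gamma(i - j) for i in range(1, n + 1)
                       for j in range(1, n + 1)) / n**2
# (2): 2n + 1 terms, each lag h weighted by n - |h|
rhs = sum((n - abs(h)) * gamma(h) for h in range(-n, n + 1)) / n**2
print(lhs, rhs)  # the two values coincide
```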
As you mentioned, $Cov(X_i,X_j)=\gamma(i-j)$. Count how many times $\gamma(k)$ appears in the total sum $\sum_{i=1}^n\sum_{j=1}^n Cov(X_i,X_j)$: this is the number of solutions of $i-j=k$ with $1\leq i,j\leq n$.
Of course $i=j+k$, and since $1\leq j \leq n$, we get $k+1\leq i \leq n+k$, which means that $i\in\{1,\dots,n\}\cap\{k+1,\dots,n+k\}$. It is not difficult to see that there are $n-|k|$ possible values of $i$, each of which determines $j$ uniquely, so $i-j=k$ has $n-|k|$ solutions. Therefore: $$ \frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n Cov(X_i,X_j)=\frac{1}{n^2}\sum_{k=-n}^n (n-|k|)\gamma(k). $$ The number of terms collected on the right side is indeed $\sum_{k=-n}^n (n-|k|)=n^2$.
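The counting argument above can be verified by brute force (a small sketch; $n=6$ is an arbitrary choice):

```python
# For each lag k, count the pairs (i, j) with 1 <= i, j <= n and i - j = k.
# The claim is that this count equals n - |k|, and that the counts sum to n^2.
n = 6
counts = {k: 0 for k in range(-n, n + 1)}
for i in range(1, n + 1):
    for j in range(1, n + 1):
        counts[i - j] += 1

assert all(counts[k] == n - abs(k) for k in range(-n, n + 1))
assert sum(n - abs(k) for k in range(-n, n + 1)) == n**2
print("each lag k occurs n - |k| times; total terms:", n**2)
```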