Consistently estimate the covariance matrix with weakly correlated observations


Suppose there are $T$ $k$-dimensional observations following the generating process $Y_t = \mu + \epsilon_t$, where $\mu$ is the mean and $\epsilon_t$ is a weakly stationary error with zero mean and time-invariant covariance matrix $\Sigma = E(\epsilon_t\epsilon_t')$ for all $t$.

The goal is to estimate $\Sigma$ by the empirical covariance $\hat{\Sigma} = \frac{1}{T}\sum_{t=1}^{T}(Y_t - \mu_Y)(Y_t - \mu_Y)'$, where $\mu_Y$ is the empirical mean of $Y$. However, we allow some form of time dependence in the error term, and the question is: under what kind of weak-correlation condition is this estimator consistent for $\Sigma$?
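To make the setup concrete, here is a small simulation sketch (the AR(1) structure, the dimension $k$, and all numeric values are my own illustrative choices, not part of the question). The errors follow $\epsilon_t = \rho\,\epsilon_{t-1} + \sqrt{1-\rho^2}\,u_t$ with $u_t \sim N(0, \Sigma)$, which keeps the marginal covariance of $\epsilon_t$ equal to $\Sigma$ at every $t$ while the autocorrelation decays like $\rho^{|t|}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: k-dimensional errors with AR(1) time dependence,
#   eps_t = rho * eps_{t-1} + sqrt(1 - rho^2) * u_t,   u_t ~ N(0, Sigma),
# so the marginal covariance of eps_t stays equal to Sigma for all t.
k, T, rho = 3, 200_000, 0.5
A = rng.normal(size=(k, k))
Sigma = A @ A.T / k                 # true time-invariant covariance
L = np.linalg.cholesky(Sigma)

u = rng.normal(size=(T, k)) @ L.T   # i.i.d. N(0, Sigma) innovations
eps = np.zeros((T, k))
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + np.sqrt(1 - rho**2) * u[t]

Y = 1.0 + eps                       # mu set to the vector of ones
mu_Y = Y.mean(axis=0)               # empirical mean
Sigma_hat = (Y - mu_Y).T @ (Y - mu_Y) / T   # empirical covariance

# For large T the estimator should be close to Sigma entrywise.
print(np.max(np.abs(Sigma_hat - Sigma)))
```

For this geometrically mixing example the entrywise error shrinks as $T$ grows, which is the behavior the question asks to characterize in general.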

I know there is a classic result for mean estimation with weakly correlated random variables: if $X_1, X_2, \dots, X_n$ have a common mean $E(X)$ and uniformly bounded variances, and $\operatorname{corr}(X_m, X_{m+t}) \to 0$ as $t \to \infty$ uniformly in $m$, then the empirical mean $\frac{1}{n}(X_1 + X_2 + \dots + X_n) \to E(X)$ in probability.
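A quick numerical check of this scalar result (the AR(1) process and the values of $\rho$ and $\mu$ below are my own illustrative choices): for a stationary AR(1) with $\operatorname{corr}(X_m, X_{m+t}) = \rho^{|t|} \to 0$, the empirical mean gets closer to $E(X)$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar illustration: stationary AR(1) with mean mu and unit variance,
#   X_t = mu + rho * (X_{t-1} - mu) + sqrt(1 - rho^2) * Z_t,
# whose autocorrelation rho^|t| vanishes as the lag t grows.
rho, mu = 0.8, 2.0
for n in (1_000, 100_000):
    x = np.empty(n)
    x[0] = mu
    shocks = rng.normal(size=n) * np.sqrt(1 - rho**2)
    for t in range(1, n):
        x[t] = mu + rho * (x[t - 1] - mu) + shocks[t]
    # |sample mean - mu| should shrink roughly like 1/sqrt(n)
    print(n, abs(x.mean() - mu))
```

The dependence inflates the variance of the sample mean by a constant factor (here $\frac{1+\rho}{1-\rho}$) relative to the i.i.d. case, but does not break consistency.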

But in the vector case, and for second-moment estimation, I find it hard to formulate a good condition. Perhaps uniform convergence to zero of all entries of the autocorrelation matrices is enough? This seems like a basic problem in time series analysis or general statistics. I'm wondering whether anyone has seen this problem before; references or pointers in the right direction are appreciated.