Standard Error of Sample Variance


I have a time-series of values $X_1, X_2, \ldots, X_t$, for which I compute sample variance:

$$\hat{\sigma}^2 = \operatorname{var}(X_1, \ldots, X_t)$$

(the unbiased estimator using $\frac{1}{t-1}$).

In a subsequent calculation, I would like to shrink this variance estimate in proportion to its precision. How can I empirically estimate the standard error of this variance estimate?
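One fully empirical route is a bootstrap: resample the observations with replacement, recompute the sample variance on each resample, and take the standard deviation of those replicates as the standard error. A minimal sketch (this assumes the observations are i.i.d.; for a dependent time series a block bootstrap would be more appropriate):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=2.0, size=500)  # example data; true variance = 4

# Bootstrap standard error of the sample variance.
# NOTE: plain resampling assumes i.i.d. observations; a time series with
# serial dependence would call for a block bootstrap instead.
n_boot = 2000
boot_vars = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(x, size=x.size, replace=True)
    boot_vars[b] = resample.var(ddof=1)  # unbiased 1/(t-1) estimator

se_boot = boot_vars.std(ddof=1)
```

For this example, `se_boot` should land near the theoretical value $\sqrt{2/(t-1)}\,\sigma^2 \approx 0.25$.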

In theory, the variance of sample variance (for normal distribution) is:

$$\operatorname{var}(\hat{\sigma}^2) = \frac{2}{t - 1} (\sigma^2)^2 $$

where $\sigma^2$ is the true variance.

For the purpose of this calculation, can I safely assume that $\hat{\sigma}^2 = \sigma^2$, and thus let:

$$\operatorname{var}(\hat{\sigma}^2) = \frac{2}{t - 1} (\hat{\sigma}^2)^2 $$

be an estimate of the variance of the sample variance?
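If that plug-in approximation is acceptable, the computation is a one-liner. A minimal sketch (the helper name `var_with_se` is my own, not from any library):

```python
import numpy as np

def var_with_se(x):
    """Return the sample variance and its plug-in standard error,
    SE(sigma2_hat) = sqrt(2 / (t - 1)) * sigma2_hat,
    which assumes normally distributed data."""
    t = len(x)
    sigma2_hat = np.var(x, ddof=1)            # unbiased 1/(t-1) estimator
    se = np.sqrt(2.0 / (t - 1)) * sigma2_hat  # plug-in: sigma^2 replaced by its estimate
    return sigma2_hat, se

rng = np.random.default_rng(1)
x = rng.normal(scale=2.0, size=500)  # true variance = 4
sigma2_hat, se = var_with_se(x)
```

Note the plug-in step introduces extra noise of its own: since $\hat{\sigma}^2$ is only an estimate, the resulting standard error inherits its sampling error, but for large $t$ this is usually a second-order effect.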

Any help would be appreciated! Thanks!