Consistency of an estimator


I am trying to prove that $\widehat{\sigma} = \frac{1}{n} \sum_{i=1}^n \widehat{u}_i^2$ is a consistent estimator of $\sigma= \operatorname E[u_i^2\mid X_i]$, assuming $\operatorname E[u_i\mid X_i]=0$, where $Y_i=\beta_0 + \beta_1 X_i + u_i$ and $\widehat{u}_i = Y_i-(\widehat{\beta}_0 + \widehat{\beta}_1 X_i)$.


Best answer:

It would be important to know what you have already tried or what knowledge you have, in order to give an answer that is appropriate to your level.

Anyway, remember that $\hat{\sigma}$ is a consistent estimator (in weak sense) of $\sigma$ if and only if $$\hat{\sigma} \xrightarrow[n\to \infty]{\:\mathcal{P}\:} \sigma.$$

Two possible paths to prove this are:

  • showing that $\operatorname{MSE}_\sigma(\hat\sigma)\xrightarrow[n\to\infty]{}0$, or equivalently that $$\operatorname E_\sigma(\hat\sigma)\xrightarrow[n\to\infty]{}\sigma$$ (that is, $\hat\sigma$ is asymptotically unbiased) and that $$\operatorname{Var}_\sigma(\hat\sigma)\xrightarrow[n\to\infty]{}0;$$ or
  • seeing whether you can apply the Law of Large Numbers (LLN): this might work, for instance, if your estimator is the mean of certain $Z_i$ variables (here $\hat\sigma$ is the mean of the $\hat{u}_i^2$) satisfying the hypotheses of that theorem.
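As a quick numerical sanity check (my own illustration, not part of the answer), the simulation below generates the model, fits OLS by hand, and shows $\hat\sigma$ settling near the true error variance as $n$ grows, which is exactly the behavior the LLN route in the second bullet predicts. The specific parameter values are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 1.0, 2.0, 4.0  # here sigma is the error *variance* E[u_i^2]

def sigma_hat(n):
    """Draw a sample of size n, fit OLS, return (1/n) * sum of squared residuals."""
    x = rng.normal(size=n)
    u = rng.normal(scale=np.sqrt(sigma), size=n)
    y = beta0 + beta1 * x + u
    # OLS slope and intercept (sample covariance / sample variance form)
    b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    b0 = y.mean() - b1 * x.mean()
    resid = y - (b0 + b1 * x)
    return np.mean(resid ** 2)

for n in (10, 100, 10_000, 1_000_000):
    print(n, sigma_hat(n))  # estimates drift toward sigma = 4.0 as n grows
```

This only illustrates convergence for one data-generating process; the proof itself still has to go through one of the two routes above.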

For this case, it may help you to remember that $\hat{\beta}_k$, $k=0,1$, are unbiased estimators of $\beta_k$, $k=0,1$. Use that property to prove that $E(\hat{u}_i|X_i)=0$. Then try to compute the expectation and variance of your estimator, or try to apply the LLN as suggested above (I am not sure this will work directly, since $E(\hat u_i^2)\neq \sigma$; in fact (spoiler alert) $E(\hat \sigma)=\tfrac{n-2}n \sigma$).
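For reference, here is a sketch (my addition, assuming homoskedasticity, i.e. $E[u_i^2\mid X_i]=\sigma$ constant) of where that $\tfrac{n-2}{n}$ factor comes from. Writing the model in matrix form, the residual vector is $\hat u = Mu$ with $M = I - X(X^\top X)^{-1}X^\top$ symmetric and idempotent, so $$E\Big[\sum_{i=1}^n \hat u_i^2\Big] = E[u^\top M u] = \sigma\,\operatorname{tr}(M) = (n-2)\,\sigma,$$ since $\operatorname{tr}(M)=n-2$ when the design matrix has the two columns $(1, X_i)$. Dividing by $n$ gives $E(\hat\sigma)=\tfrac{n-2}{n}\sigma\to\sigma$, so the estimator is asymptotically unbiased even though it is biased for every finite $n$.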

It may also be useful to rewrite $$\hat{u}_i=Y_i-\hat\beta_0-\hat\beta_1 X_i=(Y_i-\beta_0-\beta_1 X_i)+(\beta_0-\hat\beta_0)+(\beta_1-\hat\beta_1) X_i=u_i+(\beta_0-\hat\beta_0)+(\beta_1-\hat\beta_1) X_i,$$ among other possibly useful ideas or hints.
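One way to finish along these lines (my addition, assuming the $\hat\beta_k$ are consistent): squaring the decomposition and averaging gives $$\hat\sigma=\frac1n\sum_{i=1}^n u_i^2+\frac2n\sum_{i=1}^n u_i\big[(\beta_0-\hat\beta_0)+(\beta_1-\hat\beta_1)X_i\big]+\frac1n\sum_{i=1}^n\big[(\beta_0-\hat\beta_0)+(\beta_1-\hat\beta_1)X_i\big]^2.$$ The first average converges in probability to $E[u_i^2]=\sigma$ by the LLN, while the cross term and the last term converge in probability to $0$ because $\hat\beta_k-\beta_k\xrightarrow{\mathcal P}0$ and the remaining averages ($\frac1n\sum u_i$, $\frac1n\sum u_iX_i$, $\frac1n\sum X_i$, $\frac1n\sum X_i^2$) are stochastically bounded under the usual moment conditions.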