Inconsistency of linear regression model estimator


Suppose we have a sequence of discrete random variables

$y_i = \beta i + u_i$,

where $\beta$ is the model parameter and the $u_i$ are independent, identically distributed random variables taking the values $-1$ and $1$ with equal probability.

The least squares method for the parameter $\beta$ gives the estimator

$ \hat{\beta} = \frac{\sum_{i=1}^n iy_i}{\sum_{i=1}^n i^2} $
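As a sanity check, here is a minimal Python simulation of this estimator (the choice $\beta = 2$ and the seed are arbitrary assumptions for illustration):

```python
import random

def simulate_beta_hat(beta, n, seed=0):
    """Simulate y_i = beta*i + u_i with u_i uniform on {-1, 1},
    then return the least squares estimator sum(i*y_i) / sum(i^2)."""
    rng = random.Random(seed)
    num = 0.0  # accumulates sum of i * y_i
    den = 0.0  # accumulates sum of i^2
    for i in range(1, n + 1):
        u = rng.choice((-1, 1))
        y = beta * i + u
        num += i * y
        den += i * i
    return num / den

for n in (10, 100, 10_000):
    print(n, simulate_beta_hat(beta=2.0, n=n))
```

In every run the estimate sits very close to the true $\beta$ already for moderate $n$, which is exactly what the proof below formalizes.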

We need to show that the estimator is inconsistent. The problem is that it is easy to show the opposite: consistency of the estimator.

PROOF

$ \hat{\beta} = \frac{\sum_{i=1}^n i(\beta i + u_i)}{\sum_{i=1}^n i^2} = \beta + \frac{\sum_{i=1}^n iu_i}{\sum_{i=1}^n i^2} $

therefore

$ P\left(\left|\hat\beta - \beta\right| > \varepsilon\right) = P\left(\left|\frac{\sum_{i=1}^n iu_i}{\sum_{i=1}^n i^2} \right| > \varepsilon\right) \leqslant P\left(\frac{C}{n} > \varepsilon\right) \rightarrow 0 $

because, with some constant $C$, we have the deterministic estimate

$ \left|\frac{\sum_{i=1}^n iu_i}{\sum_{i=1}^n i^2} \right| \leqslant \frac{\sum_{i=1}^n i}{\sum_{i=1}^n i^2} \leqslant C\frac{n^2}{n^3} = \frac{C}{n} $
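This estimate can also be checked exactly: since $\sum_{i=1}^n i = \frac{n(n+1)}{2}$ and $\sum_{i=1}^n i^2 = \frac{n(n+1)(2n+1)}{6}$, the ratio equals $\frac{3}{2n+1}$, so the bound holds with $C = \frac{3}{2}$. A quick sketch verifying this with exact rational arithmetic:

```python
from fractions import Fraction

def ratio_bound(n):
    """Worst case of |sum(i*u_i)| / sum(i^2), attained when every u_i = 1."""
    s1 = sum(range(1, n + 1))                 # n(n+1)/2
    s2 = sum(i * i for i in range(1, n + 1))  # n(n+1)(2n+1)/6
    return Fraction(s1, s2)

C = Fraction(3, 2)
for n in (10, 100, 1000):
    r = ratio_bound(n)
    assert r == Fraction(3, 2 * n + 1)  # closed form of the ratio
    assert r <= C / n                   # the C/n bound used in the proof
```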

Where was I mistaken?