Linear regression model and the error vector


In the linear regression model, the true error vector $U = Y - X\beta$ is based on the true value of the unknown coefficient vector $\beta$, while the LS residual vector $U^* = Y - X\beta^*$ uses the LS estimator $\beta^*$ of $\beta$ (here, $U^*$ and $\beta^*$ are the estimators of $U$ and $\beta$). Is it true that $U'U < U^{*\prime}U^*$?


Best Answer

I think you should formulate this a little more clearly. The "fact" that $Y = X\beta + \epsilon$ is an assumption; under it you view your data as generated by $Y = X\beta + \epsilon$, while the OLS estimator of $\beta$, by definition, minimizes the residual sum of squares $\sum_{i=1}^n e_i^2$. As such, the question of whether $\sum_{i=1}^n \epsilon_i^2 < \sum_{i=1}^n e_i^2$ is problematic as stated: the former is a random variable, while the latter is either a constant for a given fitted model or can itself be viewed as a random variable (because of the variation of $\hat{\beta}$). Hence you can at best ask for the probability $P\left(\sum_{i=1}^n \epsilon_i^2 < \sum_{i=1}^n e_i^2\right)$.

Alternatively, if you mean whether $\sum_{i=1}^n \eta_i^2 < \sum_{i=1}^n e_i^2$, where the $\eta_i$ are the residuals from a non-OLS estimate of the same linear model, then obviously $\sum_{i=1}^n \eta_i^2 \ge \sum_{i=1}^n e_i^2$ (for a given model), because the OLS estimator minimizes $\sum_{i=1}^n e_i^2$.
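A quick simulation illustrates the minimization point. Since the OLS coefficients minimize the sum of squared residuals over all candidate coefficient vectors, and the true $\beta$ is one such candidate, the OLS residual sum of squares can never exceed the sum of squared true errors on any given sample. This sketch (with hypothetical parameter values, not taken from the question) checks that numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])   # hypothetical true coefficients
eps = rng.normal(size=n)            # true error vector U
Y = X @ beta + eps

# OLS estimate beta* via least squares
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
e = Y - X @ beta_hat                # OLS residual vector U*

sse_true = eps @ eps                # U'U, sum of squared true errors
sse_ols = e @ e                     # U*'U*, OLS residual sum of squares

# The OLS fit minimizes the residual sum of squares over all
# coefficient vectors; the true beta is one candidate, so:
assert sse_ols <= sse_true
```

In the continuous case the inequality is strict with probability one (equality would require $\hat{\beta} = \beta$ exactly), which is why the answer to the question as posed is "no" almost surely, in the opposite direction.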