Are $\beta_0$ and $\beta_1$ unbiased estimators of $\hat\beta_0$ and $\hat\beta_1$?


When we are discussing simple linear regression with: $$Y_i = \beta_0 + X_i\beta_1 +u_i$$

$\hat\beta_0$ and $\hat\beta_1$ are the OLS estimates of the parameters of this model.

With a simple proof we get $E(\hat\beta_0) = \beta_0$ and $E(\hat\beta_1) = \beta_1$, thus proving that $\hat\beta_0$ and $\hat\beta_1$ are unbiased estimators of $\beta_0$ and $\beta_1$.

My question is whether this is true the other way around: Are $\beta_0$ and $\beta_1$ unbiased estimators of $\hat\beta_0$ and $\hat\beta_1$?


In the context of simple linear regression, we are typically interested in estimating the parameters $\beta_0$ and $\beta_1$, which are by assumption fixed real numbers. The Ordinary Least Squares estimates are then obtained by applying the usual formulae to the data points in our sample. The key distinction is that an estimator - such as $\hat{\beta}_0$ or $\hat{\beta}_1$ - is a function of the sample, and hence a random variable, whereas a parameter is some (potentially unknown) fixed quantity which we want to approximate by means of some computation (such as an OLS estimate). So the answer is no: the reverse question is not well posed. $\beta_0$ and $\beta_1$ are constants, not estimators, and the trivial identity $E(\beta_0) = \beta_0$ tells us nothing about $\hat{\beta}_0$, which varies from sample to sample.
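The distinction can be seen in a quick simulation: the true parameters below are fixed numbers we choose (the values $2.0$ and $0.5$ are made up for illustration), while the OLS estimates come out different on every simulated sample; averaging them over many samples recovers the true parameters, which is exactly what unbiasedness claims. A minimal stdlib-only sketch:

```python
import random

random.seed(0)
beta0, beta1 = 2.0, 0.5   # true parameters: fixed, not random
n, reps = 100, 2000       # sample size, number of simulated samples

sum_b0 = sum_b1 = 0.0
for _ in range(reps):
    # draw a fresh sample from Y_i = beta0 + beta1*X_i + u_i
    x = [random.uniform(0, 10) for _ in range(n)]
    y = [beta0 + beta1 * xi + random.gauss(0, 1) for xi in x]

    # OLS formulae for simple regression
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
          / sum((xi - xbar) ** 2 for xi in x))
    b0 = ybar - b1 * xbar   # estimates change with each sample

    sum_b0 += b0
    sum_b1 += b1

# averages of the random estimates approximate the fixed parameters
print(sum_b0 / reps, sum_b1 / reps)  # both close to (2.0, 0.5)
```

Each individual `b0`, `b1` misses the truth, but their average across samples does not; nothing analogous can be said of $\beta_0$ "estimating" $\hat\beta_0$, since $\beta_0$ never varies at all.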