Show if the estimator is unbiased


Suppose that the random variables $Y_1, \dots, Y_n$ satisfy $Y_i = \beta x_i + \epsilon_i$, $i = 1, \dots, n$, where the $x_i$ are fixed constants and the $\epsilon_i$ are i.i.d. normally distributed random variables with mean zero and variance $\sigma^2$.

I need to show whether the estimator $$\beta_a = \frac{\sum_{i=1}^n x_iY_i}{\sum_{i=1}^n x_i^2}$$ is an unbiased estimator of $\beta$. I understand that I have to show $E(\beta_a) = \beta$ for $\beta_a$ to be an unbiased estimator of $\beta$. I first simplified $\beta_a$ by replacing $Y_i$ with $\beta x_i + \epsilon_i$, used some basic summation properties, took $E(\beta_a)$, and arrived at the following equation: $$E(\beta_a) = \beta + \frac{E\left(\sum_{i=1}^n x_i\epsilon_i\right)}{\sum_{i=1}^n x_i^2}$$

Not sure how to proceed after this. I also need to find the variance of the estimator $\beta_a$.


3 Answers


By linearity of expectation,

$$ E\left(\sum_{i=1}^nx_i\epsilon_i\right)=\sum_{i=1}^nx_iE\left(\epsilon_i\right)=0\;. $$
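Plugging this into the expression already derived in the question finishes the argument:

```latex
E(\beta_a)
  = \beta + \frac{E\left(\sum_{i=1}^n x_i\epsilon_i\right)}{\sum_{i=1}^n x_i^2}
  = \beta + \frac{0}{\sum_{i=1}^n x_i^2}
  = \beta ,
```

so $\beta_a$ is an unbiased estimator of $\beta$.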


Since $E(\beta_a) = \beta$ from the first part, I have managed to find the variance. I used the equation $$Var(\beta_a) = E(\beta_a^2) - (E(\beta_a))^2 = E(\beta_a^2) - \beta^2$$ and, using summation and expectation rules, ended up with the solution: $Var(\beta_a) = \dfrac{\sigma^2}{\sum_{i=1}^n x_i^2}$
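A direct computation confirms what the variance must be, without needing $E(\beta_a^2)$: the $x_i$ are constants and the $Y_i$ are independent with $Var(Y_i) = Var(\epsilon_i) = \sigma^2$, so

```latex
Var(\beta_a)
  = Var\left(\frac{\sum_{i=1}^n x_i Y_i}{\sum_{i=1}^n x_i^2}\right)
  = \frac{\sum_{i=1}^n x_i^2 \, Var(Y_i)}{\left(\sum_{i=1}^n x_i^2\right)^2}
  = \frac{\sigma^2}{\sum_{i=1}^n x_i^2}.
```

Note that the variance shrinks as $\sum_i x_i^2$ grows, i.e. more (or more spread out) design points give a more precise estimator.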

Please correct me if wrong. Thanks


Use the properties of expectations: you can write the numerator as $E\left[\sum_{i=1}^{n}x_i\epsilon_i\right]=\sum_{i=1}^{n}x_iE[\epsilon_i]$. Since the noise $\epsilon_i$ has mean $0$, i.e. $E[\epsilon_i]=0$, the numerator is $0$.

So we get $E[\beta_a]=\beta$, which shows that the estimator is unbiased. Let us now try to find the variance.

Note that the variance is given by $E[(\beta-\beta_a)(\beta-\beta_a)^T]$, so we can write this as $E[\beta^T\beta-\beta^T\beta_a-\beta_a^T\beta+\beta_a^T\beta_a]$.

Now, since this is a problem of linear regression, we substitute $\beta_a = X^{+}Y = (X^TX)^{-1}X^TY$ and $Y = X\beta + \epsilon$; then we have $E[\beta^T\beta-\beta^T(X^TX)^{-1}X^T(X\beta+\epsilon)-((X^TX)^{-1}X^T(X\beta+\epsilon))^T\beta+((X^TX)^{-1}X^T(X\beta+\epsilon))^T((X^TX)^{-1}X^T(X\beta+\epsilon))]$.

This gives us the variance as $E[(X^TX)^{-1}X^T\epsilon \epsilon^T X(X^TX)^{-1}]$, and using the fact that $E[\epsilon\epsilon^T]=\sigma^2 I$, we get $Var(\beta_a)=(X^TX)^{-1}\sigma^2$. In this problem $X$ is just the column vector $(x_1,\dots,x_n)^T$, so $(X^TX)^{-1} = 1/\sum_{i=1}^n x_i^2$ and hence $Var(\beta_a) = \sigma^2/\sum_{i=1}^n x_i^2$.
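As a quick numerical sanity check (a simulation sketch with arbitrary, assumed values of $x_i$, $\beta$, and $\sigma$, not part of the original derivation), we can simulate the model many times and compare the empirical mean and variance of $\beta_a$ against $\beta$ and $\sigma^2/\sum_i x_i^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed fixed design and parameters (chosen arbitrarily for illustration)
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
beta, sigma = 2.0, 0.7
n_reps = 200_000

# Simulate Y_i = beta * x_i + eps_i for many replications at once,
# then form beta_a = sum(x_i Y_i) / sum(x_i^2) for each replication.
eps = rng.normal(0.0, sigma, size=(n_reps, x.size))
Y = beta * x + eps
beta_a = (Y @ x) / np.dot(x, x)

print(beta_a.mean())            # empirical mean: should be close to beta
print(beta_a.var())             # empirical variance: close to sigma^2 / sum(x_i^2)
print(sigma**2 / np.dot(x, x))  # theoretical variance
```

With $2\times 10^5$ replications the empirical mean lands within a few thousandths of $\beta$ and the empirical variance matches $\sigma^2/\sum_i x_i^2$ to about the same accuracy.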