I'm trying to solve the following problem:
My thinking is that if the standard deviation of the error term falls, then the standard errors of the regression estimators would also fall (as the regression would perform better).
You are right, e.g., for the slope
$$
Var(\hat{\beta}_1) =\frac{1}{(\sum(x_i - \bar{x})^2)^2}Var\left( \sum (x_i - \bar{x})y_i \right) = \frac{\sigma^2}{\sum(x_i - \bar{x})^2},
$$
where $Var(\epsilon_i) = \sigma^2$. For the intercept you have
$$
Var(\hat{\beta}_0) =\frac{\sigma^2}{n} + \frac{\sigma^2\bar{x}^2}{\sum(x_i - \bar{x})^2},
$$
thus reducing the variance of $\epsilon_i$ will reduce the standard error of the OLS estimators.
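A quick way to see this numerically is a simulation: generate data from a simple linear model at two different error standard deviations $\sigma$ and compare the theoretical standard errors from the formulas above with the empirical spread of the OLS estimates over many samples. This is a minimal sketch; the model parameters, sample size, and number of replications are arbitrary choices for illustration.

```python
import numpy as np

# Simulate y = beta0 + beta1 * x + eps for two error scales sigma,
# and compare theoretical vs. empirical standard errors of the OLS
# slope and intercept. (All numeric settings here are illustrative.)
rng = np.random.default_rng(0)
n, beta0, beta1 = 200, 1.0, 2.0
x = rng.uniform(0, 10, n)          # fixed design across replications
sxx = np.sum((x - x.mean()) ** 2)  # sum of squared deviations of x

for sigma in (3.0, 1.0):
    # Theoretical standard errors from the variance formulas above
    se_slope = sigma / np.sqrt(sxx)
    se_intercept = sigma * np.sqrt(1 / n + x.mean() ** 2 / sxx)

    # Empirical spread of the OLS estimates over many simulated samples
    slopes, intercepts = [], []
    for _ in range(2000):
        y = beta0 + beta1 * x + rng.normal(0, sigma, n)
        b1 = np.sum((x - x.mean()) * y) / sxx   # OLS slope
        b0 = y.mean() - b1 * x.mean()           # OLS intercept
        slopes.append(b1)
        intercepts.append(b0)

    print(f"sigma={sigma}: SE(slope) theory={se_slope:.4f}, "
          f"empirical={np.std(slopes):.4f}; "
          f"SE(intercept) theory={se_intercept:.4f}, "
          f"empirical={np.std(intercepts):.4f}")
```

Halving or tripling $\sigma$ scales both standard errors by the same factor, since $\sigma$ enters each formula multiplicatively, and the empirical values track the theoretical ones closely.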