Why does changing the value of the intercept in linear regression not affect variance of residuals?


In http://people.duke.edu/~rnau/regnotes.htm#constant, it states

Most multiple regression models include a constant term (i.e., an "intercept"), since this ensures that the model will be unbiased--i.e., the mean of the residuals will be exactly zero. (The coefficients in a regression model are estimated by least squares--i.e., minimizing the mean squared error. Now, the mean squared error is equal to the variance of the errors plus the square of their mean: this is a mathematical identity. Changing the value of the constant in the model changes the mean of the errors but doesn't affect the variance.)

A residual is defined as $r = y - \hat{y}$. The above seems to be saying that if you have an equation $\hat{y} = mx + b$, then keeping $m$ fixed and changing the value of $b$ will not affect the variance of $r$. If that reading is correct, I am confused: changing the value of $b$ simply shifts $\hat{y}$ up or down, so why wouldn't the variance of the residuals change?

I also know the following identity: $$ \text{Var}(r) = E[r^2] - E[r]^2 \\ MSE = E[r^2] \\ \therefore MSE = \text{Var}(r) + E[r]^2 $$

Changing $b$ will change $E[r]$. That's clear to me. But I still don't understand why the variance of the residual is not changing. How can I go about proving this?

1 Answer


Let $c$ be a constant added to the intercept $b$. Shifting the intercept by $c$ shifts every prediction by $c$, and hence every residual by the same constant: $r \mapsto r - c$. Since $c$ does not depend on $y$, $$E[(r-c)^2] = E[r^2] - 2cE[r] + c^2 \\ E[r-c]^2 = \left(E[r] - c\right)^2 = E[r]^2 - 2cE[r] + c^2$$ Subtracting, $$\text{Var}(r-c) = E[(r-c)^2] - E[r-c]^2 = E[r^2] - E[r]^2 = \text{Var}(r),$$ so the variance of the residuals is unchanged. The mean of the residuals does shift (by $-c$), and with it the MSE, exactly as in the decomposition $MSE = \text{Var}(r) + E[r]^2$ from the question.
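A quick numerical sketch of this (using numpy, with made-up data and a made-up slope/intercept, so just an illustration): fix the slope, shift only the intercept, and watch the residual mean and MSE move while the variance stays put.

```python
import numpy as np

# Hypothetical data: y roughly follows 2x + 1 plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2.0 * x + 1.0 + rng.normal(size=1000)

m, b = 2.0, 1.0                        # keep the slope m fixed, vary only b
for c in (0.0, 0.5, -3.0):
    r = y - (m * x + (b + c))          # residuals after shifting b by c
    mse = np.mean(r ** 2)
    # the identity MSE = Var(r) + E[r]^2 holds for every shift c
    assert np.isclose(mse, np.var(r) + np.mean(r) ** 2)
    print(f"c={c:+.1f}  mean={np.mean(r):+.4f}  var={np.var(r):.4f}  mse={mse:.4f}")
```

The printed `var` column is identical for every `c`, while `mean` and `mse` change, matching the algebra above. (Note `np.var` computes the population variance, which is the $E[r^2] - E[r]^2$ form used here.)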