Linear Regression - Proof Sum Adds to Zero


In linear regression, why is $\sum(X_{i} - \mu_{x}) = 0$? I understand why the residuals sum to zero, i.e. $\sum (Y_i - \hat{Y}_i) = \sum e_i = 0$, but why does this other fact hold?


Best answer

Just break up the sum into two sums, and substitute the definition of $\mu_x$. More explicitly, $$\sum_{i=1}^n (X_i - \mu_x) = \sum_{i=1}^n X_i - \sum_{i=1}^n \mu_x = \left[\sum_{i=1}^n X_i\right] - n\mu_x = \left[\sum_{i=1}^n X_i\right] - n\times \left[\frac{1}{n}\sum_{i=1}^n X_i\right] = 0.$$
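The identity can also be checked numerically; a minimal sketch in plain Python (the sample values are arbitrary, and the sum is zero only up to floating-point rounding):

```python
# Arbitrary sample data; any values work.
x = [2.0, 5.0, 7.0, 11.0]

# Sample mean: (1/n) * sum of the observations.
mu_x = sum(x) / len(x)

# Sum of deviations from the mean; mathematically exactly zero.
deviation_sum = sum(xi - mu_x for xi in x)
print(deviation_sum)
```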

Another answer

Let $\mu_x$ denote the mean of the observations $X_i$. Then $X_i-\mu_x$ is the deviation of $X_i$ from the mean, sometimes called the "error". A positive value of $X_i-\mu_x$ means $X_i$ lies above the mean, while a negative value means it lies below. The mean is precisely the value at which the deviations above it balance the deviations below it, so the deviations must sum to zero. Hence $\sum(X_{i} - \mu_{x})=0$.
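A small numeric example (values chosen for illustration) makes the balancing concrete. For $X = (1, 2, 6)$ the mean is $\mu_x = 3$, so the deviations are $-2$, $-1$, and $3$, and $$\sum_{i=1}^3 (X_i - \mu_x) = (1-3)+(2-3)+(6-3) = -2 - 1 + 3 = 0.$$ The two deviations below the mean exactly cancel the one deviation above it.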