Smart way to compute Residual Sum of Squares (RSS) in Multiple Linear Regression


Is there any smarter way to compute the Residual Sum of Squares (RSS) in multiple linear regression, other than fitting the model -> finding the coefficients -> finding the fitted values -> finding the residuals -> taking the norm of the residuals? I need only the RSS and nothing else. For example, in best subset selection we need to determine the RSS of many reduced models.



[update] Oops, I just saw your comment to @Nameless that you have 10000 variables. So I think this covariance approach is useless here too. Should I delete the answer? [/update]

One method needs only the inversion of the covariance matrix; I don't know whether this is already smart enough.

Construct the data matrix $D$ whose top row is the row vector of $Y$-values, followed by the row vectors of the $X$-variables/values. If the $X$- and $Y$-variables are not centered, append one more row containing only 1s. (If you have, say, 3 $X$-variables and $n$ cases, you then have a $4 \times n$ or $5 \times n$ matrix.)

Then compute the dot product of $D$ with itself, $C = D \cdot D^t$, and the inverse $B = C^{-1}$. Then take the reciprocal of the top-left entry of $B$, say $s = 1/B_{1,1}$. This $s$ is the sum of squares of the residuals.
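A minimal NumPy sketch of the recipe above (the simulated data and variable names are my own, not from the question). The identity holds because, by the block-inverse formula, $B_{1,1}$ is the reciprocal of the Schur complement of the $X$-block in $C$, which is exactly the RSS of $Y$ regressed on the $X$-rows; the check against `np.linalg.lstsq` confirms this numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3                                   # cases and X-variables (illustrative sizes)
X = rng.standard_normal((p, n))                # X-variables as rows, one column per case
Y = 2.0 + X.T @ np.array([1.0, -0.5, 0.3]) + rng.standard_normal(n)

# Data matrix D: top row Y, then the X-rows, then a row of 1s (variables not centered)
D = np.vstack([Y, X, np.ones(n)])              # shape (p + 2, n)
C = D @ D.T                                    # Gram ("covariation") matrix
B = np.linalg.inv(C)
rss_trick = 1.0 / B[0, 0]                      # reciprocal of the top-left entry

# Reference: the usual fit of Y on [X^T, 1] and its residual sum of squares
A = np.column_stack([X.T, np.ones(n)])
beta, res, *_ = np.linalg.lstsq(A, Y, rcond=None)
rss_direct = float(res[0]) if res.size else float(np.sum((Y - A @ beta) ** 2))

assert np.isclose(rss_trick, rss_direct)
```

For best subset selection, one advantage of this formulation is that each candidate subset only requires inverting the corresponding principal submatrix of $C$ (plus the $Y$ row/column), rather than refitting from the raw $n$-case data.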