Constrained Linear Regression, how to find Regression SS?


I'm trying to understand the concept of Constrained Regression.

I have the constraint $C\beta=d$ for some $c\times n$ matrix $C$, $n\times 1$ vector $\beta$, and $c$-vector $d$. I need to minimize with respect to $\beta$ the Lagrangian

$$(y-X\beta)^T(y-X\beta)-\lambda^T(C\beta-d)$$ in order to find the Residual SS.

I find the estimated constrained beta to be $$\hat\beta_c=(X^TX)^{-1}X^Ty+(X^TX)^{-1}C^T(C(X^TX)^{-1}C^T)^{-1}(d-C(X^TX)^{-1}X^Ty)$$ or, equivalently, $$\hat\beta_c=\hat\beta +(X^TX)^{-1}C^T(C(X^TX)^{-1}C^T)^{-1}(d-C\hat\beta)$$
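As a sanity check, this estimator can be verified numerically. Here is a minimal NumPy sketch (the data, the constraint matrix $C$, and all variable names are made up for illustration): the adjusted estimate should satisfy the constraint exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy data (not from the post)
n_obs, n = 50, 3
X = rng.normal(size=(n_obs, n))
y = rng.normal(size=n_obs)

# One hypothetical linear constraint: beta_2 + beta_3 = 1, i.e. C beta = d
C = np.array([[0.0, 1.0, 1.0]])   # c x n with c = 1
d = np.array([1.0])

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y      # unconstrained OLS estimate

# Constrained estimator from the second formula above:
# beta_c = beta_hat + (X'X)^{-1} C' (C (X'X)^{-1} C')^{-1} (d - C beta_hat)
adjust = XtX_inv @ C.T @ np.linalg.inv(C @ XtX_inv @ C.T) @ (d - C @ beta_hat)
beta_c = beta_hat + adjust

assert np.allclose(C @ beta_c, d)   # the constraint holds exactly
```

With random data the unconstrained $\hat\beta$ will not satisfy the constraint, while $\hat\beta_c$ does by construction.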

The constrained Residual SS is, defining $P_c=(X^TX)^{-1}C^T(C(X^TX)^{-1}C^T)^{-1}$,

$$(y-X\hat\beta)^T(y-X\hat\beta)+(d-C\hat\beta)^TP_c^TX^TXP_c(d-C\hat\beta)$$
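This decomposition can also be checked numerically: the constrained residual sum of squares computed directly from $\hat\beta_c$ should match the unconstrained RSS plus the quadratic penalty term (again, the toy data and constraint below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n = 60, 4
X = rng.normal(size=(n_obs, n))
y = rng.normal(size=n_obs)

# Hypothetical constraint C beta = d
C = np.array([[1.0, -1.0, 0.0, 0.0]])
d = np.array([0.5])

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
P_c = XtX_inv @ C.T @ np.linalg.inv(C @ XtX_inv @ C.T)
beta_c = beta_hat + P_c @ (d - C @ beta_hat)

# Constrained residual SS computed directly
rss_c_direct = (y - X @ beta_c) @ (y - X @ beta_c)

# Formula from the post: unconstrained RSS + penalty term
rss_u = (y - X @ beta_hat) @ (y - X @ beta_hat)
delta = d - C @ beta_hat
rss_c_formula = rss_u + delta @ P_c.T @ X.T @ X @ P_c @ delta

assert np.allclose(rss_c_direct, rss_c_formula)
```

The cross term vanishes because the OLS residual $y-X\hat\beta$ is orthogonal to the columns of $X$, which is why the two pieces add up exactly.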

Now I'm failing to find the Regression SS.

I want to split the above into (constrained) Total SS about the mean $=$ Regression SS $+$ Residual SS.

My book says that if $d=0$ then $\hat\beta^TX^TX\hat\beta$ is the Regression SS, but I don't understand why.

BEST ANSWER

Note that from the F-test you know (or you can show directly) that
$$ ResSS_r - ResSS_u = (C\hat{\beta} - d )'(C(X'X)^{-1}C')^{-1}(C\hat{\beta} - d). $$

Now recall that $SST = ResSS + RegSS$, and for the null model, i.e., when $d=0$, we have $SST = ResSS_r$ because $\hat{y}=\bar{y}$. So
$$ ResSS_r - ResSS_u = SST - ResSS_u = RegSS_u. $$

For the algebra, take $d=0$ with $C$ invertible:
$$ RegSS_u = (C\hat{\beta})'(C(X'X)^{-1}C')^{-1}(C\hat{\beta}) = \hat{\beta}'C'(C')^{-1}(X'X)C^{-1}C\hat{\beta} = \hat{\beta}'X'X\hat{\beta}. $$
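The last algebraic step, that $(C\hat\beta)'(C(X'X)^{-1}C')^{-1}(C\hat\beta) = \hat\beta'X'X\hat\beta$ for any invertible $C$, is easy to confirm numerically; the random $C$ below is an arbitrary illustration, chosen only to be (almost surely) invertible:

```python
import numpy as np

rng = np.random.default_rng(2)
n_obs, n = 40, 3
X = rng.normal(size=(n_obs, n))
y = rng.normal(size=n_obs)

XtX = X.T @ X
beta_hat = np.linalg.solve(XtX, X.T @ y)

# Any invertible C works; the identity does not depend on which one.
C = rng.normal(size=(n, n))
assert abs(np.linalg.det(C)) > 1e-8   # invertible with probability 1

lhs = (C @ beta_hat) @ np.linalg.inv(C @ np.linalg.inv(XtX) @ C.T) @ (C @ beta_hat)
rhs = beta_hat @ XtX @ beta_hat

assert np.allclose(lhs, rhs)
```

The identity follows because $(C(X'X)^{-1}C')^{-1} = (C')^{-1}(X'X)C^{-1}$ when $C$ is square and invertible, so the $C$ factors cancel.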

EDIT


Note that $H_0: C\beta = d$; that is, you are testing whether some linear transformation of the coefficients equals $d$. Here $C$ must be invertible. When $d=0$ you have a homogeneous linear system of the form $C\beta=0$, which you test using $C\hat{\beta}$. As long as $C$ is invertible it can take various forms, because in any case the system has only the trivial solution $\beta=0$, which is exactly what you are testing when $d=0$.