Consider the linear model $Y=X\beta+\varepsilon$, where $Y$ is an $n \times 1$ vector, $X$ is a known $n \times p$ matrix, and $\varepsilon$ is an $n \times 1$ vector of random errors following a normal distribution with mean zero and constant variance $\sigma^2$. Suppose that the number of model parameters equals the sample size; that is, $n=p$. Prove that if $n=p$, the error sum of squares $Y'(I-X(X'X)^{-1}X')Y$ equals $0$.
My thinking: if $n=p$ (so $X$ is square, and the existence of $(X'X)^{-1}$ forces $X$ itself to be invertible), then $(X'X)^{-1}=X^{-1}(X')^{-1}$, so the hat matrix $X(X'X)^{-1}X'$ collapses to the identity and the proof is finished. But this is a problem from a past comprehensive exam. Did I miss something?
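A quick numerical sanity check of this direct argument (the matrix $X$ and seed below are arbitrary choices, not part of the problem): with a square invertible $X$, the hat matrix should equal the identity and the error sum of squares should vanish.

```python
import numpy as np

# Arbitrary example with n = p = 4; a random Gaussian matrix is
# almost surely invertible.
rng = np.random.default_rng(0)
n = p = 4
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)

# When X is square and invertible, (X'X)^{-1} = X^{-1}(X')^{-1},
# so the hat matrix H = X (X'X)^{-1} X' collapses to the identity.
H = X @ np.linalg.inv(X.T @ X) @ X.T
sse = Y @ (np.eye(n) - H) @ Y  # error sum of squares

print(sse)  # numerically indistinguishable from 0
```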
Consider the Singular Value Decomposition $X=L D R^T$. Recall that the left and right matrices are orthogonal (unitary in the complex case), so $L^{-1}=L^T$, and likewise for $R$. Then,
$Y^T\bigl(I-X(X^T X)^{-1} X^T\bigr) Y$

$= Y^T \bigl(I- L D R^T ( R D L^T L D R^T)^{-1} R D L^T\bigr) Y$

$= Y^T \bigl(I- L D R^T ( R D^2 R^T)^{-1} R D L^T \bigr) Y$

And if $X$ is full rank (equivalently, invertible; equivalently, all singular values are nonzero):

$= Y^T \bigl(I - L D R^T (R D^{-2} R^T) R D L^T\bigr) Y$

$= Y^T \bigl(I - L D D^{-2} D L^T\bigr) Y \qquad \text{(since } R^T R = I\text{)}$

$= Y^T (I - L L^T) Y = Y^T(I - I)Y = 0.$
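The SVD cancellation above can also be checked numerically (the dimensions and seed here are arbitrary choices for illustration): building $(X'X)^{-1}$ as $R D^{-2} R^T$ and forming the hat matrix should again give the identity.

```python
import numpy as np

# Arbitrary example with n = p = 5.
rng = np.random.default_rng(1)
n = p = 5
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)

# X = L D R^T; numpy.linalg.svd returns L, the singular values d, and R^T.
L, d, Rt = np.linalg.svd(X)

# (X'X)^{-1} = (R D^2 R^T)^{-1} = R D^{-2} R^T, valid since all d > 0 here.
XtX_inv = Rt.T @ np.diag(d**-2) @ Rt

# Hat matrix: L D R^T (R D^{-2} R^T) R D L^T cancels to L L^T = I.
H = X @ XtX_inv @ X.T
sse = Y @ (np.eye(n) - H) @ Y

print(sse)  # numerically indistinguishable from 0
```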