I have some formulas that I am trying to cancel down, from
$$X[(X'X)^{-1}X'X](X'X)^{-1}X'$$
The formula cancels down to,
$$X[I](X'X)^{-1}X'$$
My question is: what rule am I using to get here?
My algebra is:
$$X^{-1}(X')^{-1}X'X$$ $$X^{-1}(X^{-1})'X'X$$
Then can I write,
$$X^{-1}X'X^{-1}X'$$
$$II$$ $$I$$
Is my algebra correct?
Here is the full question I am answering, if that adds clarity:
Let $X$ be an $n \times k$ matrix, $I$ an $n \times n$ identity matrix, and let $M = I - X(X'X)^{-1}X'$. Assume that the $k \times k$ matrix $X'X$ is invertible ($X$ has full column rank $k$, i.e. its $k$ columns are linearly independent). (i) Show that $M$ is symmetric and idempotent. (ii) Show that $MX = 0$.
Your question, apparently, is "why is $(X'X)^{-1}(X'X)=I$"?
That is the definition of what the inverse does. $A^{-1}A=AA^{-1}=I$, where you're looking at $A=X'X$.
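A quick numerical check of this rule (a sketch using NumPy; the matrix $X$ here is an arbitrary tall example, which has full column rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))   # n x k with n > k; X itself has no inverse

XtX = X.T @ X                     # the k x k matrix X'X, invertible here
product = np.linalg.inv(XtX) @ XtX

# (X'X)^{-1}(X'X) = I by the definition of the inverse, with A = X'X
print(np.allclose(product, np.eye(3)))  # True
```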
No. $X$ is $n \times k$ and need not be square, so $X^{-1}$ does not exist in general, and the step $(X'X)^{-1} = X^{-1}(X')^{-1}$ is not valid.
However, there may be interesting misconceptions to address in the last two points above. Could you explain a little more why you made these two steps? Maybe I will see the problem and be able to explain a little better.
If, in fact, $A$ were nonsingular and square, then $(A'A)^{-1}A'A=A^{-1}(A')^{-1}A'A=A^{-1}IA=A^{-1}A=I$ would be a valid line of deduction. (This is a special case of the more general fact that Digitalis mentioned, that for invertible square matrices $A,B$, $(AB)^{-1}=B^{-1}A^{-1}$.) But as I mentioned, part of the usefulness of this pseudoinverse is that $A$ doesn't have to be square.
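For the original expression, one way to write the cancellation is to use only $(X'X)^{-1}(X'X)=I$, never $X^{-1}$:
$$X\big[(X'X)^{-1}(X'X)\big](X'X)^{-1}X' = X[I](X'X)^{-1}X' = X(X'X)^{-1}X',$$
and the same rule gives part (ii) of the exercise:
$$MX = X - X(X'X)^{-1}X'X = X - XI = 0.$$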