I do not understand why $(X^TX)^{-1}X^T \left[(X^TX)^{-1}X^T\right]^T = (X^TX)^{-1}$.
I keep trying to work it out and I just don't see it. This equality shows up when deriving the variance of the coefficient estimates in linear regression, where we have to "square" $(X^TX)^{-1}X^T$ in the sense of multiplying it by its own transpose.
Would someone be able to show me, or help me really understand, why this is true?
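As a quick sanity check (not a proof), the identity can be verified numerically for an arbitrary full-rank design matrix; this sketch assumes NumPy and a randomly generated $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))  # arbitrary full-rank 10x3 design matrix

# A = (X'X)^{-1} X', the "hat" matrix applied to Y to get beta-hat
A = np.linalg.inv(X.T @ X) @ X.T

# "Squaring" A means A @ A.T here -- A is 3x10, so A @ A does not even
# conform dimensionally.  A @ A.T collapses to (X'X)^{-1}:
lhs = A @ A.T
rhs = np.linalg.inv(X.T @ X)

print(np.allclose(lhs, rhs))  # True
```

The key step is that the inner $X^T X$ produced by $A A^T$ cancels against one of the $(X^TX)^{-1}$ factors, which is exactly what the derivation below does symbolically.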
If you are interested in the variance of the coefficients: we have $$ \boldsymbol{Y} \sim \mathcal{N}(X\boldsymbol{\beta},\sigma^2 I) $$ and $$ \hat{\boldsymbol{\beta}} = (X^T X)^{-1} X^T \boldsymbol{Y}, $$ so we need the variance of $\hat{\boldsymbol{\beta}}$. Using $Var(A\boldsymbol{Y}) = A \, Var(\boldsymbol{Y}) \, A^T$ for a fixed matrix $A$:

\begin{align*}
Var(\hat{\boldsymbol{\beta}}) &= Var((X^T X)^{-1} X^T \boldsymbol{Y}) \\
&= [(X^T X)^{-1} X^T] \, Var(\boldsymbol{Y}) \, [(X^T X)^{-1} X^T]^T \\
&= [(X^T X)^{-1} X^T] \, \sigma^2 I \, (X^T)^T ((X^T X)^{-1})^T \\
&= \sigma^2 (X^T X)^{-1} (X^T X) ((X^T X)^{-1})^T \\
&= \sigma^2 ((X^T X)^{-1})^T \\
&= \sigma^2 ((X^T X)^T)^{-1} \\
&= \sigma^2 (X^T (X^T)^T)^{-1} \\
&= \sigma^2 (X^T X)^{-1}
\end{align*}
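The end result $Var(\hat{\boldsymbol{\beta}}) = \sigma^2 (X^T X)^{-1}$ can also be checked by simulation: draw many response vectors $\boldsymbol{Y}$ from the model, compute $\hat{\boldsymbol{\beta}}$ for each, and compare the empirical covariance of the estimates to the formula. A minimal sketch, assuming NumPy and arbitrary choices of $n$, $p$, $\sigma$, and $\boldsymbol{\beta}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 8, 2, 1.5                 # arbitrary small example
X = rng.normal(size=(n, p))             # fixed design matrix
beta = np.array([2.0, -1.0])            # true coefficients

# Simulate many datasets Y = X beta + noise; each row of Y is one draw
reps = 200_000
Y = X @ beta + sigma * rng.normal(size=(reps, n))

# OLS estimate for every draw at once: beta_hat = (X'X)^{-1} X' Y
A = np.linalg.inv(X.T @ X) @ X.T
beta_hats = Y @ A.T                     # shape (reps, p), one estimate per row

empirical = np.cov(beta_hats, rowvar=False)
theoretical = sigma**2 * np.linalg.inv(X.T @ X)
print(np.round(empirical, 3))
print(np.round(theoretical, 3))         # the two matrices agree closely
```

With 200,000 replications the Monte Carlo error is small, so the empirical covariance matrix matches $\sigma^2 (X^T X)^{-1}$ to a few decimal places.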