Suppose that $X$ and $Z$ are matrices with the same number of rows. Let $$ D = \left[\begin{array}{cc} X' X & X'Z \\ Z'X & Z'Z \end{array} \right]^{-1} - \left[\begin{array}{cc} (X' X)^{-1} & 0 \\ 0 & 0 \end{array} \right],$$ where all inverses are assumed to exist and the zeros represent zero matrices of suitable dimensions. How can we prove that $D$ is positive semidefinite?
Positive semidefinite ordering for covariance matrices
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 solutions below.
I will assume $Z$ is a column vector.
Let $P_{\perp} = I - X(X^TX)^{-1}X^T$. This is the orthogonal projector onto the orthogonal complement of the column space of $X$; it is symmetric and idempotent.
Let $\hat{\beta} = (X^TX)^{-1}X^TZ$. This is the vector of coefficients obtained from regressing $Z$ on $X$.
I will assume $Z$ does not lie in the column space of $X$, so $S^2 = \|P_{\perp}Z\|^2 = Z^TP_{\perp}Z = Z^TZ - Z^TX(X^TX)^{-1}X^TZ \neq 0$. Note that $S^2$ is the residual sum of squares from this regression.
Recall the formula for inverting a partitioned matrix (the blocks $A, B, C, D$ below are generic and unrelated to the $D$ defined in the question): $$ \left( \begin{matrix} A & B\\ C & D \end{matrix}\right)^{-1} = \left( \begin{matrix} A^{-1} + A^{-1}B (D - CA^{-1}B)^{-1}CA^{-1} & -A^{-1}B(D - CA^{-1}B)^{-1}\\ -(D - CA^{-1}B)^{-1}CA^{-1} & (D - CA^{-1}B)^{-1} \end{matrix}\right). $$
Applying this formula we get $$ \left(\begin{matrix} X^TX & X^TZ\\ Z^TX & Z^TZ\end{matrix}\right)^{-1}=\left(\begin{matrix}(X^TX)^{-1} + \dfrac{\hat{\beta}\hat{\beta}^T}{S^2} & -\dfrac{\hat{\beta}}{S^2}\\ \dfrac{-\hat{\beta}^T}{S^2} & \dfrac{1}{S^2}\end{matrix}\right).$$
So $$ D = \left( \begin{matrix} \dfrac{\hat{\beta}\hat{\beta}^T}{S^2} & -\dfrac{\hat{\beta}}{S^2}\\ -\dfrac{\hat{\beta}^T}{S^2} & \dfrac{1}{S^2} \end{matrix} \right) = \left( \begin{matrix} \dfrac{\hat{\beta}}{S} \\ \dfrac{-1}{S} \end{matrix}\right) \left( \begin{matrix} \dfrac{\hat{\beta}^T}{S} & \dfrac{-1}{S} \end{matrix}\right), $$ so $D$ is positive semidefinite, being of the form $uu^T$.
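As a numerical sanity check (not part of the proof), the rank-one expression for $D$ can be verified with random data in NumPy; all names below are illustrative.

```python
import numpy as np

# Random instance of the setup: X a full-column-rank design, Z a single column
rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))
Z = rng.standard_normal((n, 1))

M = np.hstack([X, Z])
U = M.T @ M                                    # the partitioned matrix [[X'X, X'Z], [Z'X, Z'Z]]

# D = U^{-1} - [[(X'X)^{-1}, 0], [0, 0]]
D = np.linalg.inv(U)
D[:p, :p] -= np.linalg.inv(X.T @ X)

beta = np.linalg.solve(X.T @ X, X.T @ Z)       # coefficients from regressing Z on X
S2 = (Z.T @ Z - Z.T @ X @ beta).item()         # residual sum of squares
u = np.vstack([beta, [[-1.0]]]) / np.sqrt(S2)  # u = (beta_hat/S, -1/S)

assert np.allclose(D, u @ u.T)                  # D = u u^T
assert np.all(np.linalg.eigvalsh(D) >= -1e-10)  # hence D is PSD (up to rounding)
```

With probability one a random $Z$ does not lie in the column space of $X$, so the assumption $S^2 \neq 0$ holds for this random instance.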
An alternative argument does not require $Z$ to be a single column. $U = \left(\begin{matrix} X^{T}X & X^TZ \\ Z^TX & Z^TZ \end{matrix}\right) = \left( \begin{matrix} X^T \\ Z^T\end{matrix}\right) \left(\begin{matrix} X & Z \end{matrix} \right) $ is positive semidefinite, being of the form $M^TM$, and since we are given it is invertible it must be positive definite.
Consequently, for any invertible matrix $P$ the matrix $PUP^T$ is also positive definite. In particular, for $P=\left( \begin{matrix} I & 0 \\ -Z^TX(X^TX)^{-1} & I \end{matrix} \right)$ the matrix $$PUP^T = \left( \begin{matrix} X^TX & 0 \\ 0 & Z^TZ - Z^TX(X^TX)^{-1}X^TZ\end{matrix}\right)$$ is positive definite, and hence both $X^TX$ and $W = Z^TZ - Z^TX(X^TX)^{-1}X^TZ$ (say) must be symmetric positive definite matrices.
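This block diagonalization can itself be checked numerically; the sketch below assumes nothing beyond random full-rank $X$ and $Z$, with illustrative names.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 15, 3, 2
X = rng.standard_normal((n, p))
Z = rng.standard_normal((n, q))                # Z may have several columns here

U = np.block([[X.T @ X, X.T @ Z],
              [Z.T @ X, Z.T @ Z]])
P = np.block([[np.eye(p), np.zeros((p, q))],
              [-Z.T @ X @ np.linalg.inv(X.T @ X), np.eye(q)]])

# Schur complement W = Z'Z - Z'X (X'X)^{-1} X'Z
W = Z.T @ Z - Z.T @ X @ np.linalg.solve(X.T @ X, X.T @ Z)
block_diag = np.block([[X.T @ X, np.zeros((p, q))],
                       [np.zeros((q, p)), W]])

assert np.allclose(P @ U @ P.T, block_diag)    # P U P^T is block diagonal
assert np.all(np.linalg.eigvalsh(W) > 0)       # W is positive definite
```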
Inverting the above we have $(P^{-1})^{T} \left(\begin{matrix} X^{T}X & X^TZ \\ Z^TX & Z^TZ \end{matrix}\right)^{-1} P^{-1} = \left( \begin{matrix} (X^TX)^{-1} & 0 \\ 0 & 0\end{matrix}\right) + \left(\begin{matrix} 0 & 0 \\ 0 & W^{-1}\end{matrix}\right). $
Since $P^{-1} = \left( \begin{matrix} I & 0 \\ Z^TX(X^TX)^{-1} & I \end{matrix} \right) $ we also have $(P^{-1})^T \left( \begin{matrix} (X^TX)^{-1} & 0 \\ 0 & 0\end{matrix}\right) P^{-1} = \left( \begin{matrix} (X^TX)^{-1} & 0 \\ 0 & 0\end{matrix}\right) $.
So we have the identity $$ (P^{-1})^{T}\left(\left(\begin{matrix} X^{T}X & X^TZ \\ Z^TX & Z^TZ \end{matrix}\right)^{-1} - \left( \begin{matrix} (X^TX)^{-1} & 0 \\ 0 & 0\end{matrix}\right) \right) P^{-1} = \left(\begin{matrix} 0 & 0 \\ 0 & W^{-1}\end{matrix}\right)$$
that is
$$ \left(\begin{matrix} X^{T}X & X^TZ \\ Z^TX & Z^TZ \end{matrix}\right)^{-1} - \left( \begin{matrix} (X^TX)^{-1} & 0 \\ 0 & 0\end{matrix}\right) = P^T \left( \begin{matrix} 0 & 0 \\ 0 & W^{-1} \end{matrix} \right) P. $$
Since $W$ is a symmetric positive definite matrix, so is $W^{-1}$; the right-hand side is a congruence of the positive semidefinite matrix $\left(\begin{matrix} 0 & 0 \\ 0 & W^{-1}\end{matrix}\right)$ and is therefore positive semidefinite, which proves the claim.
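Putting the pieces together, the final identity and the positive semidefiniteness of $D$ can also be confirmed numerically (a sketch with illustrative names and random full-rank data):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, q = 15, 3, 2
X = rng.standard_normal((n, p))
Z = rng.standard_normal((n, q))

U = np.block([[X.T @ X, X.T @ Z],
              [Z.T @ X, Z.T @ Z]])
P = np.block([[np.eye(p), np.zeros((p, q))],
              [-Z.T @ X @ np.linalg.inv(X.T @ X), np.eye(q)]])
W = Z.T @ Z - Z.T @ X @ np.linalg.solve(X.T @ X, X.T @ Z)

# D = U^{-1} - [[(X'X)^{-1}, 0], [0, 0]]
D = np.linalg.inv(U)
D[:p, :p] -= np.linalg.inv(X.T @ X)

# Right-hand side of the identity: P^T diag(0, W^{-1}) P
rhs = P.T @ np.block([[np.zeros((p, p)), np.zeros((p, q))],
                      [np.zeros((q, p)), np.linalg.inv(W)]]) @ P

assert np.allclose(D, rhs)                      # the identity holds
assert np.all(np.linalg.eigvalsh(D) >= -1e-10)  # D is positive semidefinite
```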