Let $A$ be a matrix. What are some necessary/sufficient conditions for the Gram matrix $A^T A$ to be invertible?
This question came up when I was trying to learn about least-squares regression. Is it true that a regression matrix will always have $A^TA$ invertible?
I’m assuming that you’re talking about matrices over a field, e.g. $\mathbb R$ or $\mathbb C$, so that the various definitions of “rank” coincide.
$A^\top A$ is invertible iff it has full rank. It has the same rank as $A$, because the two have the same null space: if $A^\top Ax=0$, then $x^\top A^\top Ax=\|Ax\|^2=0$, so $Ax=0$. So if $A$ is $m\times n$ (so that $A^\top A$ is $n\times n$), then $A^\top A$ is invertible iff $m\ge n$ and $A$ has rank $n$, i.e. iff the columns of $A$ are linearly independent. (Over $\mathbb C$, use the conjugate transpose $A^*A$; the norm argument above needs it.)
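As a quick numerical sanity check (a sketch with NumPy; the matrices are just illustrative examples), you can verify that $A^\top A$ has the same rank as $A$, and is invertible exactly when the columns of $A$ are independent:

```python
import numpy as np

# Tall 4x2 matrix with full column rank (rank 2).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [2.0, -1.0]])
G = A.T @ A  # 2x2 Gram matrix

print(np.linalg.matrix_rank(A))  # 2
print(np.linalg.matrix_rank(G))  # 2 -- same rank as A
print(np.linalg.det(G))          # nonzero, so G is invertible

# Rank-deficient case: second column is twice the first.
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
H = B.T @ B
print(np.linalg.matrix_rank(H))  # 1 -- H is singular
```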
I’m not sure what you mean by a “regression matrix”. If you perform linear regression, $A^\top A$ may well be singular, e.g. if you don’t have enough distinct data points, or if some of the basis functions whose linear combination you’re fitting are linearly dependent.
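Both failure modes are easy to reproduce (a sketch with NumPy; the data and basis functions are made up for illustration):

```python
import numpy as np

# Linearly dependent basis functions: fitting y ~ c0*1 + c1*x + c2*(2x).
# The third column of the design matrix is twice the second.
x = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([np.ones_like(x), x, 2.0 * x])  # 4x3, rank 2

G = A.T @ A
print(np.linalg.matrix_rank(G))  # 2 < 3, so G is singular

# Too few points: two data points cannot determine three coefficients,
# so the 2x3 design matrix has rank at most 2 and A2.T @ A2 is singular.
x2 = np.array([0.0, 1.0])
A2 = np.column_stack([np.ones_like(x2), x2, x2 ** 2])
print(np.linalg.matrix_rank(A2.T @ A2))  # 2 < 3
```

In either case the normal equations $A^\top A c = A^\top y$ have infinitely many solutions rather than a unique least-squares fit.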