Suppose $X\sim \mathcal{N} _p (0, \Sigma )$. I am not sure why $X^{\top} \Sigma ^{-1} X$ follows a $\chi ^2$-distribution with $p$ degrees of freedom.
I think it has something to do with the square root of $\Sigma ^{-1}$, since $$\Sigma ^{-1/2} X \sim \mathcal{N} _p (0, \Sigma ^{-1/2} \Sigma (\Sigma ^{-1/2})^{\top}) = \mathcal{N} _p (0, \Sigma ^{-1/2} \Sigma ^{1/2} \Sigma ^{1/2} \Sigma ^{-1/2} ) = \mathcal{N} _p (0, I_p),$$ but how do I know that $\Sigma$ and $\Sigma ^{-1}$ have square roots, and how do I know that $\Sigma ^{-1/2}$ is symmetric?
In the case where $\Sigma$ is singular, the number of degrees of freedom in the chi-square distribution is smaller than $p;$ in any case it's the rank of $\Sigma.$
You have $\Sigma=\operatorname E((X-\mu)(X-\mu)^\top) = \operatorname E(XX^\top)$ where $\mu=0$ is the $p\times 1$ column vector $\operatorname E(X).$
From that it is obvious that $\Sigma$ is symmetric. It is also easy to show that $\Sigma$ is positive semidefinite: $$ a^\top \Sigma a = \operatorname{var}(a^\top X) \ge 0 $$ for any $p\times 1$ constant (i.e. non-random) vector $a.$ The random variable $a^\top X$ is scalar-valued, so its variance is a non-negative scalar; it is strictly positive for every $a\ne0$ precisely when $\Sigma$ is non-singular, which is what makes $\Sigma$ positive definite in that case.
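The identity $a^\top \Sigma a = \operatorname{var}(a^\top X)$ is easy to check numerically. Here is a minimal sketch, using an arbitrary made-up $3\times 3$ covariance matrix and vector $a$ (both are just illustrative choices, not anything from the problem):

```python
import numpy as np

# Hypothetical positive-definite covariance matrix (illustrative choice).
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])

rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)  # rows are draws of X

a = np.array([1.0, -2.0, 0.5])   # arbitrary constant vector
quad_form = a @ Sigma @ a        # a^T Sigma a
sample_var = (X @ a).var()       # var(a^T X) estimated from simulation

# The two agree up to sampling error, and both are strictly positive
# because this Sigma is non-singular.
print(quad_form, sample_var)
```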
A theorem of linear algebra (the spectral theorem for real symmetric matrices) says that since $\Sigma$ is symmetric and all of its entries are real, there is some orthogonal matrix $G$ (i.e. a matrix $G$ for which $G^\top G = GG^\top = I_p$) and some diagonal matrix $\Lambda$ such that $\Sigma = G\Lambda G^\top.$
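In NumPy this decomposition is exactly what `np.linalg.eigh` computes for a symmetric matrix. A quick sketch, again with a made-up $\Sigma$:

```python
import numpy as np

# Hypothetical symmetric covariance matrix (illustrative choice).
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])

eigvals, G = np.linalg.eigh(Sigma)  # columns of G are orthonormal eigenvectors
Lam = np.diag(eigvals)

# G is orthogonal and Sigma = G Lam G^T, up to floating-point error.
print(np.allclose(G.T @ G, np.eye(3)))
print(np.allclose(G @ Lam @ G.T, Sigma))
```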
The diagonal entries of $\Lambda$ are non-negative since they are the variances of the components of $G^\top X;$ they are strictly positive because $\Sigma$ is non-singular.
So now replace the positive numbers that are the diagonal entries of $\Lambda$ with their square roots, call the resulting matrix $\Lambda^{1/2},$ and try to show that $G\Lambda^{1/2} G^\top$ is a symmetric positive-definite square root of $\Sigma.$ (The same construction applied to $\Sigma^{-1} = G\Lambda^{-1}G^\top$ gives the symmetric matrix $\Sigma^{-1/2} = G\Lambda^{-1/2}G^\top$ used in the question.)
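Putting the pieces together, here is a numerical sketch (same illustrative $\Sigma$ as above): build $\Sigma^{1/2} = G\Lambda^{1/2}G^\top,$ check that it is a symmetric square root, and verify by simulation that $X^\top \Sigma^{-1} X$ behaves like a $\chi^2_p$ variable, whose mean is $p$ and whose variance is $2p$:

```python
import numpy as np

# Hypothetical covariance matrix (illustrative choice).
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
p = Sigma.shape[0]

eigvals, G = np.linalg.eigh(Sigma)
Sigma_half = G @ np.diag(np.sqrt(eigvals)) @ G.T   # G Lam^{1/2} G^T

# Sigma_half is symmetric and squares to Sigma.
print(np.allclose(Sigma_half, Sigma_half.T))
print(np.allclose(Sigma_half @ Sigma_half, Sigma))

# Simulate X ~ N_p(0, Sigma) and compute the quadratic form X^T Sigma^{-1} X
# for each draw; its sample mean and variance should be near p and 2p.
rng = np.random.default_rng(1)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=500_000)
Sigma_inv = np.linalg.inv(Sigma)
q = np.einsum('ij,jk,ik->i', X, Sigma_inv, X)      # row-wise quadratic forms
print(q.mean(), q.var())
```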