I am hoping to find a general algorithm to decide if a quadratic form is positive definite. My approach has been as follows:
Let $q$ be a quadratic form, and write $q(h) = h^{T}Ah$ with $A$ symmetric. By the spectral theorem we can write $A = P^{T} \Lambda P$, where $P$ is orthogonal and $\Lambda$ is diagonal. Letting $k = Ph$, we change basis and get $$q(h) = h^{T}P^{T} \Lambda Ph = (Ph)^{T} \Lambda (Ph) = k^{T} \Lambda k.$$ Thus $$q(h) = k_{1}^2\lambda_{1} + \ldots + k_{n}^2\lambda_{n},$$ where $k = Ph$ and the $\lambda_{i}$ are the eigenvalues of $A$.
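To make the change of basis concrete, here is a small numerical sanity check (the matrix and test vector are my own example; NumPy's `eigh` returns the decomposition in the $A = V \Lambda V^{T}$ convention, so $P$ is its transpose):

```python
import numpy as np

# Example symmetric matrix, so the spectral theorem applies.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

lam, V = np.linalg.eigh(A)          # numpy returns A = V @ diag(lam) @ V.T
P = V.T                             # so A = P.T @ diag(lam) @ P

h = np.array([1.0, -2.0])           # an arbitrary test vector
k = P @ h                           # coordinates of h in the eigenbasis

q_direct = h @ A @ h                # q(h) = h^T A h
q_eigen = np.sum(lam * k**2)        # q(h) = sum_i lambda_i k_i^2

print(np.isclose(q_direct, q_eigen))   # True
```

Both expressions evaluate the same quadratic form, one in the original coordinates and one in the eigenbasis.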
What I want is a general method or algorithm for determining whether the simplified $q(h)$ is always positive. I am not familiar with analysis, but this problem reminds me of the intermediate value theorem; I am not sure how to adapt it to this general case.
Here is a possible solution I came up with while thinking about it more.
Theorem?: $q$ is positive definite iff all eigenvalues of $A$ are positive.
One direction is easy. Suppose every $\lambda_{i} > 0$ and $h \neq 0$. Since $P$ is invertible, $k = Ph \neq 0$, so some $k_{i}^{2} > 0$ and hence $q(h) = \sum_{i} \lambda_{i}k_{i}^{2} > 0$. (Note that "no negative eigenvalue" is not enough: a zero eigenvalue makes $q$ only positive semidefinite, since $q$ vanishes on the corresponding eigenvector.) Now suppose some eigenvalue $\lambda_{j} \leq 0$. We must show that $q$ is not positive definite.
Consider the vector $x = P^{T}e_{j}$, so that $Px = e_{j}$, where $e_{j}$ denotes the $j$-th standard basis vector. This exists since $P$ is orthogonal, hence invertible. We get $q(x) = \lambda_{j} \leq 0$, and so $q$ is not positive definite, as desired.
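The eigenvalue criterion translates directly into code. A minimal sketch (the function name and tolerance are my own choices):

```python
import numpy as np

def is_positive_definite(A, tol=1e-12):
    """Return True iff the symmetric matrix A is positive definite,
    i.e. all of its eigenvalues are strictly positive."""
    A = np.asarray(A, dtype=float)
    if not np.allclose(A, A.T):
        raise ValueError("A must be symmetric")
    eigenvalues = np.linalg.eigvalsh(A)   # real eigenvalues of a symmetric matrix
    return bool(np.all(eigenvalues > tol))

print(is_positive_definite([[2, 1], [1, 3]]))   # True: both eigenvalues positive
print(is_positive_definite([[1, 2], [2, 1]]))   # False: eigenvalues are 3 and -1
```

In numerical practice one often instead attempts a Cholesky factorization (`np.linalg.cholesky`), which succeeds exactly when the matrix is positive definite and is cheaper than a full eigendecomposition.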
It is well known that a symmetric matrix (or quadratic form) is positive definite if and only if all its eigenvalues are positive. There is an even simpler criterion based on Descartes' rule of signs: let $X^n+a_1X^{n-1}+\ldots+a_n$ be the characteristic polynomial of a real symmetric matrix $A$ (so all eigenvalues are real). Then all eigenvalues of $A$ are positive if and only if the coefficients $a_i$ alternate in sign, i.e. $a_i(-1)^i>0$ for $i=1,\ldots,n$.
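A sketch of this coefficient-sign criterion, for illustration only (it uses `np.poly` to get the characteristic polynomial, which internally computes eigenvalues anyway, so in a real application one would obtain the coefficients another way, e.g. by cofactor expansion):

```python
import numpy as np

def is_positive_definite_by_signs(A):
    """Check a_i * (-1)^i > 0 for the characteristic polynomial
    X^n + a_1 X^{n-1} + ... + a_n of a symmetric matrix A."""
    A = np.asarray(A, dtype=float)
    coeffs = np.poly(A)                    # [1, a_1, ..., a_n]
    a = coeffs[1:]                         # a_1, ..., a_n
    signs = (-1.0) ** np.arange(1, len(a) + 1)
    return bool(np.all(a * signs > 0))

# char. poly X^2 - 5X + 5: coefficients alternate, so positive definite
print(is_positive_definite_by_signs([[2, 1], [1, 3]]))   # True
# char. poly X^2 - 2X - 3: signs do not alternate
print(is_positive_definite_by_signs([[1, 2], [2, 1]]))   # False
```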