This has essentially been asked before here, but I believe I need 50 reputation to comment there. I also have some questions of my own.
My proof outline (forward/necessary direction): Call the symmetric matrix $A$. Write the quadratic form for $A$ as $x^{t}Ax$, where the superscript $t$ denotes transpose. $A$ p.d. (positive definite) means $x^{t}Ax >0 \ \forall x\neq 0$.
If $v$ is an eigenvector of $A$ with associated eigenvalue $\lambda$, then $v^t Av = v^t (\lambda v) = \lambda \|v\|^2 > 0$, and since $\|v\|^2 > 0$ this forces $\lambda > 0$. $\therefore$ all eigenvalues are positive.
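Not part of the proof, but here is a quick numerical sanity check of the forward direction (the matrix below is just an arbitrary example built to be positive definite):

```python
import numpy as np

# Build a symmetric positive definite matrix: B^T B is positive
# semidefinite for any B, and adding I makes it positive definite.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)

# Positive definiteness: x^T A x > 0 for any nonzero x.
x = rng.standard_normal(4)
quad = x @ A @ x
print(quad > 0)  # True

# All eigenvalues of a symmetric p.d. matrix are positive.
eigvals = np.linalg.eigvalsh(A)
print(np.all(eigvals > 0))  # True
```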
Any hints for the reverse direction? Perhaps I need to write $A$ as $PDP^{-1}$, where $D$ is a diagonal matrix of the eigenvalues of $A$ and the columns of $P$ are eigenvectors?
Also, a more general but probably important question: since the statement does not assume that $A$ is real (in addition to symmetric), does the possibility of complex entries introduce any complications? Do I need to show that the eigenvalues are real?
Thanks.
This is a (sketch of a) proof when the symmetric matrix $A$ is real. By the spectral theorem, let $u_1, \ldots, u_n$ be orthonormal eigenvectors corresponding to the positive eigenvalues $\lambda_1, \ldots, \lambda_n$ of the real symmetric matrix $A$. Also, let $z = c_1 u_1 + \cdots + c_nu_n$ be an arbitrary real $n \times 1$ vector with $z\neq \vec 0$. Thus, we have:
$$z^TAz = (P^Tz)^T D\,(P^Tz) = \begin{pmatrix} c_1 & c_2 & \cdots & c_n \end{pmatrix} \begin{pmatrix} \lambda_1 & 0 & \cdots & 0\\ 0 & \lambda_2 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 &\cdots & \lambda_n \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix} = \lambda_1 c_1^2 +\cdots + \lambda_n c_n^2,$$
where $A = PDP^{-1} = PDP^T$ and $P^Tz = (c_1, \ldots, c_n)^T$ because the $u_i$ are orthonormal, i.e. $\|u_i\|_2^2 = u_i^T u_i = 1$ and $u_i^T u_j = 0$ for $i \neq j$. This sum is clearly positive, since all $\lambda_i$'s are positive and at least one $c_i \neq 0$ (because $z \neq \vec 0$).
I think you can fill in the details.
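The reverse direction can also be checked numerically: pick positive eigenvalues, an orthonormal $P$ (here obtained from a QR factorization, just for illustration), form $A = PDP^T$, and verify that $z^TAz = \sum_i \lambda_i c_i^2 > 0$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Orthonormal eigenvectors: take Q from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
lam = np.array([0.5, 2.0, 3.0])   # positive eigenvalues
A = Q @ np.diag(lam) @ Q.T        # real symmetric with eigenvalues lam

# Expand an arbitrary nonzero z in the eigenbasis: z = sum_i c_i u_i.
z = rng.standard_normal(3)
c = Q.T @ z                       # coordinates c_i = u_i^T z

# z^T A z equals sum_i lam_i c_i^2, which is positive.
quad = z @ A @ z
print(np.isclose(quad, np.sum(lam * c**2)))  # True
print(quad > 0)                              # True
```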
Just to clear things up:
Because matrix $A$ is real and symmetric, its eigenvectors can be chosen orthonormal, so it can be written in the form $$A = P \cdot D \cdot P^{-1} = P \cdot D \cdot P^T,$$ where the columns of $P$ contain the right eigenvectors of matrix $A$ and $P^{-1} \,(= P^T)$ contains the left eigenvectors as its rows. Thus, if the $u_i$'s are the right eigenvectors, then the $u_i^T$'s are the left eigenvectors of $A$. Then it holds: $$u_i^T \cdot u_j = \begin{cases} 1, & i = j\\ 0,& i\neq j \end{cases}$$
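This decomposition is exactly what NumPy's `numpy.linalg.eigh` computes for a real symmetric matrix; a quick check that the returned $P$ is orthonormal and that $A = PDP^T$:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                 # real symmetric (not necessarily p.d.)

# eigh returns eigenvalues w and an orthonormal eigenvector matrix P.
w, P = np.linalg.eigh(A)

print(np.allclose(P.T @ P, np.eye(4)))        # P^{-1} = P^T
print(np.allclose(A, P @ np.diag(w) @ P.T))   # A = P D P^T
```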