My professor stated the following in my linear algebra class, but I am not sure how to prove it.
If $A$ is any square matrix, then it can be decomposed into a symmetric and an anti-symmetric part as follows:
$$ A = \frac{A+A^T}{2} + \frac{A-A^T}{2} $$
Then the claim is that, for any $x\neq 0$, if
$$ x^T \left(\frac{A+A^T}{2}\right) x \geq 0 $$
then $x^T A x \geq 0$. I am not sure why this holds.
Distributing, we have
$$ x^T \left(\frac{A+A^T}{2}\right) x = \frac{1}{2}\left(x^T A x + x^T A^T x\right). $$
Now consider $x^T A^T x$. Since it is a scalar (a $1\times 1$ matrix), it equals its own transpose, so
$$ x^T A^T x = \left(x^T A^T x\right)^T = x^T \left(A^T\right)^T \left(x^T\right)^T = x^T A x, $$
and therefore
$$ x^T \left(\frac{A+A^T}{2}\right) x = \frac{1}{2}\left(x^T A x + x^T A x\right) = x^T A x. $$
So the two quadratic forms are equal, and the implication follows immediately (it is in fact an equivalence). Put another way, the anti-symmetric part contributes nothing: $x^T \left(\frac{A-A^T}{2}\right) x = 0$ by the same scalar-transpose argument.
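The identity can also be checked numerically. Below is a minimal pure-Python sketch; the particular matrix $A$ and vector $x$ are arbitrary examples I chose for illustration, not from the question. It verifies that $x^T A x$ agrees with the quadratic form of the symmetric part, and that the anti-symmetric part contributes exactly zero.

```python
def transpose(M):
    # rows become columns
    return [list(row) for row in zip(*M)]

def mat_vec(M, v):
    # matrix-vector product M v
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def quad_form(x, M):
    # computes the quadratic form x^T M x
    return sum(a * b for a, b in zip(x, mat_vec(M, x)))

# Arbitrary example: a deliberately non-symmetric matrix and a vector
A = [[1.0, 2.0],
     [5.0, 3.0]]
x = [2.0, -1.0]

At = transpose(A)
sym  = [[(A[i][j] + At[i][j]) / 2 for j in range(2)] for i in range(2)]
anti = [[(A[i][j] - At[i][j]) / 2 for j in range(2)] for i in range(2)]

print(quad_form(x, A))     # x^T A x
print(quad_form(x, sym))   # same value: the symmetric part carries the form
print(quad_form(x, anti))  # 0.0: the anti-symmetric part contributes nothing
```

Here `quad_form(x, A)` and `quad_form(x, sym)` both evaluate to $-7$, while `quad_form(x, anti)` is $0$, matching the derivation above.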