For a given $n \times n$ matrix $A$, the characteristic polynomial of $A$ is $\lambda^n+a_{n-1}\lambda^{n-1}+\cdots+a_1\lambda+a_0$. I am curious to know whether we can upper bound the coefficients of this polynomial using the singular values of $A$. This is obviously possible when $A$ is symmetric, due to the relation $|\lambda_i| = \sigma_i$.
When $A$ is not symmetric, this is still possible for $|a_0| = |\text{det}(A)| = |\prod_{i=1}^n \lambda_i| \leq \prod_{i=1}^n \sigma_i$, as well as for $|a_{n-1}| = |\text{trace}(A)| = |\sum_{i=1}^n \lambda_i| \leq \sum_{i=1}^n \sigma_i$. The second inequality can be found in Theorem 2.4 of this document. What about the other coefficients? I am particularly interested in $a_{n-2} = \sum_{i<j}\lambda_i\lambda_j$.
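For concreteness, here is a minimal NumPy sketch checking these two known bounds on a random matrix (the matrix size, seed, and floating-point tolerance are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

sigma = np.linalg.svd(A, compute_uv=False)

# |a_0| = |det(A)| = |prod_i lambda_i| <= prod_i sigma_i
det_ok = abs(np.linalg.det(A)) <= np.prod(sigma) + 1e-9

# |a_{n-1}| = |trace(A)| = |sum_i lambda_i| <= sum_i sigma_i
trace_ok = abs(np.trace(A)) <= sigma.sum() + 1e-9

print(det_ok, trace_ok)
```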
Update: It would be very interesting if it holds that $|a_{n-2}| \leq \sum_{i<j}\sigma_i\sigma_j$. I am testing this conjecture with random matrices and so far I haven't found a counterexample.
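The random testing mentioned above can be sketched as follows in NumPy (the trial count, matrix size, and tolerance are arbitrary choices; `np.poly(A)` returns the monic characteristic polynomial coefficients $[1, a_{n-1}, \dots, a_0]$, so index 2 is $a_{n-2}$):

```python
import numpy as np

rng = np.random.default_rng(0)

def bound_holds(A):
    # a_{n-2}: coefficient of lambda^{n-2} in the monic characteristic polynomial
    a_n2 = np.poly(A)[2].real
    s = np.linalg.svd(A, compute_uv=False)
    # sum_{i<j} sigma_i * sigma_j = ((sum_i sigma_i)^2 - sum_i sigma_i^2) / 2
    bound = (s.sum() ** 2 - (s ** 2).sum()) / 2
    return abs(a_n2) <= bound + 1e-9  # small slack for floating point

counterexamples = sum(
    not bound_holds(rng.standard_normal((5, 5))) for _ in range(200)
)
print(counterexamples)  # 0 means no counterexample found in this run
```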
Any insights will be greatly appreciated.
Update: The above conjecture is now proved in this post.
Generally speaking, you should be able to obtain bounds via the Gershgorin circle theorem (see https://en.wikipedia.org/wiki/Gershgorin_circle_theorem, or Horn and Johnson's book for a more detailed analysis).
For $a_{n-2}=\sum_{i<j} \lambda_i \lambda_j$, we can obtain a more explicit expression by noticing that $\sum_{i=1}^n \sum_{j=1}^{n} \lambda_i \lambda_j=\big(\sum_{i=1}^n \lambda_i\big)\big(\sum_{j=1}^n \lambda_j\big)=\operatorname{tr}(A)^2$, and therefore $\sum_{i<j}\lambda_i \lambda_j =\frac{1}{2}\big(\operatorname{tr}(A)^2-\sum_{i=1}^n \lambda_i^2\big)$. But each $\lambda_i^2$ is an eigenvalue of $A^2$, so $\sum_{i=1}^n \lambda_i^2=\operatorname{tr}(A^2)$, and we can simplify this to: \begin{align*} a_{n-2}=\sum_{i<j}\lambda_i \lambda_j=\frac{1}{2}\big(\operatorname{tr}(A)^2-\operatorname{tr}(A^2)\big). \end{align*}
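This identity is easy to verify numerically; a quick NumPy sketch (matrix size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))

# Coefficient of lambda^{n-2} in the monic characteristic polynomial:
# np.poly(A) returns [1, a_{n-1}, ..., a_0].
a_n2 = np.poly(A)[2].real

# The closed form a_{n-2} = (tr(A)^2 - tr(A^2)) / 2
closed_form = (np.trace(A) ** 2 - np.trace(A @ A)) / 2

print(np.isclose(a_n2, closed_form))  # True
```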