Relation between Sum and Product of Matrix Eigenvalues


For any real square matrix $A$, let $\alpha$ be the smallest eigenvalue of $\frac{1}{2}\left( A^{T}+A\right)$, and let $\beta$ be the largest eigenvalue of $A^{T}A$. Does the following inequality hold? \begin{equation} \alpha \leq \sqrt{\beta } \end{equation}

The result is clearly true for symmetric matrices. How about general square matrices?

On BEST ANSWER

The inequality is true for all real square matrices. If $\alpha$ is negative, there is nothing to prove. If $\alpha$ is nonnegative, then $H=\frac12(A+A^T)$ is positive semidefinite, and $\alpha$ is also the minimum singular value of $H$ (for a positive semidefinite matrix, eigenvalues and singular values coincide). Note also that $\sqrt{\beta}$ is precisely the maximum singular value of $A$, so we are comparing the maximum singular value of $A$ with the minimum singular value of $H$. Since the $k$-th largest singular value of a matrix $A$ is always greater than or equal to the $k$-th largest singular value of $H$ when the latter is positive semidefinite (see my answer in another thread, for instance), the result follows.
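The singular-value dominance invoked above can be checked numerically. Below is a hedged sketch using NumPy (not part of the original answer): the construction $A = P + S$, with $P = B^TB$ positive semidefinite and $S$ skew-symmetric, is just one convenient way to produce matrices whose symmetric part $H = P$ is positive semidefinite.

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(500):
    n = int(rng.integers(2, 6))
    # Build A = P + S with P = B^T B positive semidefinite and S skew-symmetric,
    # so that the symmetric part H = (A + A^T)/2 equals P.
    B = rng.standard_normal((n, n))
    P = B.T @ B
    S0 = rng.standard_normal((n, n))
    S = (S0 - S0.T) / 2
    A = P + S

    sv_A = np.linalg.svd(A, compute_uv=False)    # singular values, descending
    sv_H = np.sort(np.linalg.eigvalsh(P))[::-1]  # eigenvalues of PSD H = its singular values
    # k-th largest singular value of A dominates k-th largest singular value of H
    assert np.all(sv_A >= sv_H - 1e-9)
print("singular-value dominance holds on all samples")
```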

Edit. The last part of the above answer may be overkill, as it makes use of a much stronger result. Below is a simpler argument. We only need to prove that $\lambda_\min(H)\le\sigma_1(A)$ when $H$ is positive semidefinite. Since $x^THx=x^TAx$ for every $x$ (the skew-symmetric part of $A$ contributes nothing to the quadratic form), this is rather obvious: $$ \lambda_\min(H)=\min_{\|x\|=1}x^THx = \min_{\|x\|=1}x^TAx\le\min_{\|x\|=1}\|x\|\|Ax\|=\min_{\|x\|=1}\|Ax\|\le\max_{\|x\|=1}\|Ax\|=\sigma_1(A). $$ (All norms in the above line refer to the Euclidean 2-norm; the first inequality is Cauchy–Schwarz.)
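As a final sanity check, the original inequality $\alpha\le\sqrt\beta$ can be verified directly on random matrices. The script below is a hypothetical NumPy sketch, not from the original answer; it relies on `eigvalsh` returning eigenvalues in ascending order.

```python
import numpy as np

# Check alpha <= sqrt(beta) for many random real square matrices.
rng = np.random.default_rng(0)
for _ in range(1000):
    n = int(rng.integers(2, 7))
    A = rng.standard_normal((n, n))
    H = (A + A.T) / 2
    alpha = np.linalg.eigvalsh(H)[0]        # smallest eigenvalue of (A + A^T)/2
    beta = np.linalg.eigvalsh(A.T @ A)[-1]  # largest eigenvalue of A^T A
    assert alpha <= np.sqrt(beta) + 1e-12, (alpha, np.sqrt(beta))
print("alpha <= sqrt(beta) on all samples")
```

For a skew-symmetric $A$ the bound is far from tight: $\alpha=0$ while $\sqrt\beta$ can be arbitrarily large.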