Inequality for eigenvalues of two positive definite matrices


Let $A\in\mathbb{R}^{n\times n}$ be such that $A+A^{\rm T}>0$. I am trying to prove (or disprove by counterexample) that \begin{equation} \big(\lambda_{\max}(A+A^{\rm T})\big)^2\geq4\lambda_{\min}(A^{\rm T}A). \end{equation} Numerical investigations suggest that the inequality holds, but I have no idea how to prove it. Any help is appreciated. Thanks.

On BEST ANSWER

The statement is false. Consider $$ A = t\pmatrix{0&-1\\1&0} + \epsilon I, \qquad t,\epsilon > 0. $$ Note that $A + A^{\rm T} = 2\epsilon I > 0$ (so the hypothesis is satisfied) and $A^{\rm T}A = (t^2 + \epsilon^2)I$, since the skew-symmetric and identity parts contribute no cross term. As such, we have $$ \big[\lambda_{\max}(A + A^{\rm T})\big]^2 = 4\epsilon^2, \qquad 4\lambda_{\min}(A^{\rm T}A) = 4(t^2 + \epsilon^2). $$ Since $t > 0$, we have $4\epsilon^2 < 4(t^2 + \epsilon^2)$, so the desired inequality does not hold in general.
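The counterexample is easy to check numerically; here is a minimal NumPy sketch (the particular values $t = 1$, $\epsilon = 0.1$ are an arbitrary choice, not from the answer):

```python
import numpy as np

# Counterexample family from the answer: A = t*[[0,-1],[1,0]] + eps*I
t, eps = 1.0, 0.1
A = t * np.array([[0.0, -1.0], [1.0, 0.0]]) + eps * np.eye(2)

# Hypothesis: A + A^T is positive definite (here it equals 2*eps*I).
assert np.min(np.linalg.eigvalsh(A + A.T)) > 0

lhs = np.max(np.linalg.eigvalsh(A + A.T)) ** 2  # (lambda_max(A + A^T))^2 = 4*eps^2
rhs = 4 * np.min(np.linalg.eigvalsh(A.T @ A))   # 4*lambda_min(A^T A) = 4*(t^2 + eps^2)

print(lhs, rhs)   # approximately 0.04 vs 4.04
assert lhs < rhs  # the proposed inequality fails
```

Increasing `t` (or shrinking `eps`) makes the violation arbitrarily large, which matches the algebra above.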