If $A=\begin{bmatrix}a&b\\b&c\end{bmatrix}$, show that $2|b|$ is at most the difference of the eigenvalues


Let $A=\begin{bmatrix}a&b\\b&c\end{bmatrix}$, and let $\alpha$ and $\beta$ be its eigenvalues, with $\alpha\ge\beta$. Show that $2|b|\le\alpha-\beta$.

I first tried to find the eigenvalues by solving the characteristic equation:

$$(a-\lambda)(c-\lambda)-b^2=0$$ $$\lambda^2+(-a-c)\lambda+ac-b^2=0$$

We could then substitute each eigenvalue to get:

$$\alpha^2+(-a-c)\alpha=b^2-ac$$

and

$$\beta^2+(-a-c)\beta=b^2-ac$$

But I don't know how to proceed from here.

Best answer:

The (nonnegative) difference of the roots of a quadratic $$x^2 + Ax + B = 0$$ is given by $$\sqrt{A^2 - 4B}.$$ (Just use the quadratic formula.)
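Spelling out the quadratic-formula step: the two roots are $$x_\pm = \frac{-A \pm \sqrt{A^2 - 4B}}{2},$$ so their difference is $$x_+ - x_- = \frac{2\sqrt{A^2 - 4B}}{2} = \sqrt{A^2 - 4B}.$$ (Here the discriminant is nonnegative because the roots are real.)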
In this case, we get the difference $\alpha - \beta$ to be $$\sqrt{(a+c)^2 - 4ac + 4b^2} = \sqrt{(a - c)^2 + 4b^2} \ge \sqrt{4b^2} = 2|b|.$$
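As a sanity check (not part of the proof), here is a quick numeric sketch using NumPy that spot-checks the inequality on random symmetric matrices; the symbols `a`, `b`, `c` match the question, and the tolerance `1e-9` is just to absorb floating-point error:

```python
import numpy as np

# Spot-check 2|b| <= alpha - beta for random symmetric 2x2 matrices.
rng = np.random.default_rng(0)
for _ in range(1000):
    a, b, c = rng.uniform(-10, 10, size=3)
    A = np.array([[a, b], [b, c]])
    # eigvalsh returns eigenvalues of a symmetric matrix in ascending order
    beta, alpha = np.linalg.eigvalsh(A)
    assert 2 * abs(b) <= alpha - beta + 1e-9
print("ok")
```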