Proving a statement in linear algebra involving inequalities, determinants, and eigenvalues.

Suppose $A = \begin{bmatrix} x & 1\\ y & 0\end{bmatrix}, B = \begin{bmatrix} z & 1\\ w & 0\end{bmatrix}$, for $x,y,z,w \in \Bbb{R}$.

I have observed by considering many examples of $x,y,z,w$ that:

If all the eigenvalues of $A^2B$ and $AB^2$ are less than one in absolute value, then $\det(AB+A+I)<0$ and $\det(BA+B+I)<0$ cannot both hold.

How can this actually be proved?
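Before searching for a pen-and-paper argument, the implication can be spot-checked numerically. Below is a minimal sketch; the sampling range $[-2,2]$, the number of trials, and the small negativity tolerance (to guard against floating-point noise near a zero determinant) are all my own choices, not part of the original claim.

```python
import numpy as np

rng = np.random.default_rng(0)

def implication_holds(x, y, z, w, tol=1e-9):
    """Check the claimed implication for one choice of (x, y, z, w)."""
    A = np.array([[x, 1.0], [y, 0.0]])
    B = np.array([[z, 1.0], [w, 0.0]])
    I = np.eye(2)
    # Hypothesis: all eigenvalues of A^2 B and A B^2 have absolute value < 1.
    if (max(abs(np.linalg.eigvals(A @ A @ B))) < 1 and
            max(abs(np.linalg.eigvals(A @ B @ B))) < 1):
        # Conclusion: the two determinants cannot both be negative
        # (tolerance guards against rounding at a zero determinant).
        return not (np.linalg.det(A @ B + A + I) < -tol and
                    np.linalg.det(B @ A + B + I) < -tol)
    return True  # hypothesis fails, so the implication is vacuously true

# Random search for a counterexample over x, y, z, w in [-2, 2].
assert all(implication_holds(*rng.uniform(-2, 2, size=4))
           for _ in range(100_000))
```

A passing sweep is of course only evidence, not a proof, but it makes the conjecture worth pursuing analytically.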

It was shown to be true in the case $y=x,\ w=z$ in A pen-and-paper proof for a matrix implication. It remains to prove the claim when $y \neq x$ or $w \neq z$.

A computational proof using a computer algebra package was given in https://mathoverflow.net/questions/435267/proof-of-a-matrix-implication/435689#435689

But I am looking for a formal, analytical proof of this statement that can be carried out with pen and paper.