If every eigenvalue of matrix $AB$ is an eigenvalue of matrix $BA$, does that mean they have the same eigenvalues?


From this question: Do $AB$ and $BA$ have the same eigenvalues?, I am confused about this solution.

Why is it sufficient to show that every eigenvalue of $AB$ is an eigenvalue of $BA$? That alone doesn't seem to prove that every eigenvalue of $BA$ is an eigenvalue of $AB$. Why don't you need to show the other direction as well?

Also, is this true only for products of the form $AB$ and $BA$, or does something similar hold for arbitrary matrices $C, D \in \mathbb{R}^{n \times n}$?

A more general result:

If $A$ and $B$ are $n \times n$ matrices, then we have for a scalar $\lambda \ne 0$:

$ \lambda$ is an eigenvalue of $AB$ iff $ \lambda$ is an eigenvalue of $BA$ .

Proof: if $\lambda$ is an eigenvalue of $AB$, then there is $x \ne 0$ such that

$(*)$ $ABx= \lambda x$.

Let $y := Bx$. Then $y \ne 0$ (otherwise we would get from $(*)$ that $\lambda = 0$ or $x = 0$).

Now we have

$$ BAy=BABx=B(ABx)=B( \lambda x)=\lambda Bx = \lambda y.$$

It follows that $\lambda$ is an eigenvalue of $BA$.
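The construction in this proof can be checked numerically. Here is a small NumPy sketch (my addition, not part of the original answer): take random square $A$ and $B$, pick an eigenpair $(\lambda, x)$ of $AB$ with $\lambda \ne 0$, set $y = Bx$, and confirm that $BAy = \lambda y$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Pick an eigenpair (lam, x) of AB with lam != 0
# (the eigenvalue of largest modulus is nonzero for generic random A, B).
vals, vecs = np.linalg.eig(A @ B)
i = int(np.argmax(np.abs(vals)))
lam, x = vals[i], vecs[:, i]

# The proof's construction: y = Bx is nonzero and satisfies (BA)y = lam*y.
y = B @ x
assert np.linalg.norm(y) > 1e-10           # y != 0, as the proof argues
assert np.allclose(B @ A @ y, lam * y)     # lam is an eigenvalue of BA
```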

There are 3 best solutions below

Answer 1:

It has now been proved that if $\lambda \ne 0$ is an eigenvalue of $AB$, then it is an eigenvalue of $BA$.

We can apply the same result with the roles of $A$ and $B$ exchanged. That is:

"If $\lambda$ is an eigenvalue of $BA$ then it is an eigenvalue of $AB$."

So, the proof is complete.
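As a side note (this example is my addition, not part of the thread): the hypothesis $\lambda \ne 0$ in the general result cannot be dropped if $A$ and $B$ are allowed to be rectangular. For $A$ of size $m \times n$ and $B$ of size $n \times m$, $AB$ and $BA$ still share all nonzero eigenvalues, but the multiplicity of the eigenvalue $0$ can differ. A minimal NumPy sketch:

```python
import numpy as np

# A is 1x2 and B is 2x1, so AB is 1x1 while BA is 2x2.
A = np.array([[1.0, 2.0]])
B = np.array([[3.0], [4.0]])

ev_AB = np.linalg.eigvals(A @ B)   # spectrum of the 1x1 matrix [[11]]
ev_BA = np.linalg.eigvals(B @ A)   # spectrum of a rank-1 2x2 matrix

# The nonzero eigenvalue is shared...
assert np.isclose(np.max(np.abs(ev_AB)), np.max(np.abs(ev_BA)))
# ...but BA additionally has the eigenvalue 0, which the 1x1 matrix AB does not.
assert np.isclose(np.min(np.abs(ev_BA)), 0.0)
```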

Answer 2:

It is by symmetry: we write it out explicitly by switching the roles of $A$ and $B$.

Proof: if $\lambda$ is an eigenvalue of $BA$, then there is $x \ne 0$ such that

$(*)$ $BAx= \lambda x$.

Let $y := Ax$. Then $y \ne 0$ (otherwise we would get from $(*)$ that $\lambda = 0$ or $x = 0$).

Now we have

$$ ABy=ABAx=A(BAx)=A( \lambda x)=\lambda Ax = \lambda y.$$

It follows that $\lambda$ is an eigenvalue of $AB$.

Answer 3:

Moreover, the matrices $AB$ and $BA$ have the same characteristic polynomial. The base field $F$ is assumed to be arbitrary.

1) If one of the matrices, say $A$, is invertible, then $BA = A^{-1}(AB)A$, so $AB$ and $BA$ are similar and we are done.

2) Neither $A$ nor $B$ is invertible. Consider the transcendental extension $F(\varepsilon)$, that is, the field of all rational functions $\frac{P(\varepsilon)}{Q(\varepsilon)}$, where $P$ and $Q$ are polynomials over $F$. Let $A_\varepsilon = A - \varepsilon I$; it is invertible over $F(\varepsilon)$, since $\det(A - \varepsilon I)$ is a nonzero polynomial in $\varepsilon$ and $\varepsilon$ is transcendental over $F$. So by part 1), $|A_\varepsilon B - \lambda I| = |BA_\varepsilon - \lambda I|$, from which we get $|AB - \lambda I - \varepsilon B| = |BA - \lambda I - \varepsilon B|$. Consider both sides of this equality as polynomials in the variable $\varepsilon$ (with $\lambda$ fixed in the algebraic closure of $F$): they are equal iff all their coefficients are equal, so in particular the free terms must be equal, and the free terms are $|AB - \lambda I|$ and $|BA - \lambda I|$.
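The key determinant identity in case 2) can be illustrated numerically (my sketch, not part of the answer): take a singular $A$, a small $\varepsilon > 0$, and check that $|A_\varepsilon B - \lambda I| = |BA_\varepsilon - \lambda I|$, i.e. $|AB - \lambda I - \varepsilon B| = |BA - \lambda I - \varepsilon B|$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Make A singular (its last row is a sum of two others) and take any B.
A = rng.standard_normal((n, n))
A[-1] = A[0] + A[1]
B = rng.standard_normal((n, n))
assert abs(np.linalg.det(A)) < 1e-10   # A is not invertible

lam, eps = 0.7, 1e-3
I = np.eye(n)
A_eps = A - eps * I   # invertible for this small eps

# |A_eps B - lam I| = |B A_eps - lam I|,
# i.e. |AB - lam I - eps B| = |BA - lam I - eps B|.
lhs = np.linalg.det(A_eps @ B - lam * I)
rhs = np.linalg.det(B @ A_eps - lam * I)
assert np.isclose(lhs, rhs)
```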

PS. If the base field $F$ is assumed real or complex, part 2) can be simplified. Choose $\varepsilon > 0$ small enough that $\varepsilon$ is not an eigenvalue of $A$ (there are only finitely many eigenvalues), so that $A - \varepsilon I$ is invertible; then, in the equality $|AB - \lambda I - \varepsilon B| = |BA - \lambda I - \varepsilon B|$, send $\varepsilon$ to $0$ and use continuity of the determinant.
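The conclusion of this answer, that $AB$ and $BA$ have the same characteristic polynomial, can also be spot-checked numerically over $\mathbb{R}$; a sketch using NumPy's `np.poly` (my addition):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# np.poly builds the coefficients of the monic characteristic polynomial
# (highest degree first) from a matrix's eigenvalues.
p_AB = np.poly(A @ B)
p_BA = np.poly(B @ A)

# AB and BA share the full characteristic polynomial,
# hence the same eigenvalues with the same multiplicities.
assert np.allclose(p_AB, p_BA)
```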