$ST$ and $TS$ have the same eigenvalues.


I am required to prove that if $S$ and $T$ are linear operators on a vector space $V$, then $ST$ and $TS$ have the same eigenvalues. Could you please provide some hints to get me going, without revealing the complete solution?

In addition it would be helpful if you did not refer to characteristic polynomials or determinants in your answer.


Best answer

Say $\lambda$ is an eigenvalue of $ST$; there exists $x\ne0$ such that $$STx=\lambda x.$$

If you let $y=Tx$ then it follows that $$TSy=\lambda y.$$

No, that's not a proof: $TSy=\lambda y$ alone does not show that $\lambda$ is an eigenvalue of $TS$. Exercise (which you should do before reading on): why not?

The answer: we need to know that $y=Tx\ne0$ before we can conclude that $\lambda$ is an eigenvalue of $TS$.

The actual proof splits into two cases.

First assume $\lambda\ne0$. Then the argument above works: $STx=\lambda x\ne0$, hence $y=Tx\ne0$.
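As a quick numerical sanity check of this step (a sketch using NumPy with random matrices, not part of the proof): if $(\lambda, x)$ is an eigenpair of $ST$ with $\lambda\ne0$, then $y=Tx$ is a genuine eigenvector of $TS$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
S = rng.standard_normal((n, n))
T = rng.standard_normal((n, n))

# Pick an eigenpair (lam, x) of ST; the one of largest modulus is nonzero
# (almost surely, for random S and T).
eigvals, eigvecs = np.linalg.eig(S @ T)
i = np.argmax(np.abs(eigvals))
lam, x = eigvals[i], eigvecs[:, i]

# y = Tx is nonzero (since lam * x = STx != 0) and is an eigenvector of TS.
y = T @ x
assert np.linalg.norm(y) > 1e-10
assert np.allclose(T @ S @ y, lam * y)
```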

Now assume $0$ is an eigenvalue of $ST$. This says precisely that $ST$ is not invertible. Hence $S$ and $T$ cannot both be invertible, hence (at least in the finite-dimensional case) $TS$ is not invertible, so $0$ is an eigenvalue of $TS$.
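The zero-eigenvalue case can also be checked numerically (again just a sketch: a singular $T$ forces both $ST$ and $TS$ to be singular in finite dimensions). To stay in the spirit of the question, the check below looks at the smallest eigenvalue modulus rather than the determinant.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
S = rng.standard_normal((n, n))
T = rng.standard_normal((n, n))
T[:, 0] = 0.0  # make T singular, hence neither ST nor TS is invertible

# 0 is an eigenvalue of ST ...
assert np.min(np.abs(np.linalg.eigvals(S @ T))) < 1e-8
# ... and, in finite dimensions, of TS as well.
assert np.min(np.abs(np.linalg.eigvals(T @ S))) < 1e-8
```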

(If $TS$ is invertible then $T$ must be surjective and $S$ must be injective; hence in the finite-dimensional case they are both invertible.)

Note We need to assume $V$ has finite dimension or the result is false. Let $V$ be the space of all one-sided sequences $v=(v_1,\dots)$; let $Sv=(v_2,v_3,\dots)$ and $Tv=(0,v_1,v_2,\dots)$. Then $ST$ is the identity but $TS$ has $0$ for an eigenvalue.
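The shift-operator counterexample can be modeled directly, representing a sequence as a function from a 1-based index to a value (a sketch; the choice of sample sequence is arbitrary):

```python
# A sequence v = (v_1, v_2, ...) is modeled as a function i -> v_i, i >= 1.
def S(v):  # left shift: Sv = (v_2, v_3, ...)
    return lambda i: v(i + 1)

def T(v):  # right shift: Tv = (0, v_1, v_2, ...)
    return lambda i: 0.0 if i == 1 else v(i - 1)

v = lambda i: 1.0 / i  # an arbitrary sample sequence

# ST is the identity:
assert all(S(T(v))(i) == v(i) for i in range(1, 10))

# but TS sends e_1 = (1, 0, 0, ...) to the zero sequence,
# so 0 is an eigenvalue of TS:
e1 = lambda i: 1.0 if i == 1 else 0.0
assert all(T(S(e1))(i) == 0.0 for i in range(1, 10))
```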

Another answer

A proof for matrices (using determinants, but valid for rectangular $A$ and $B$) follows.

Consider the two factorizations
$$ \begin{bmatrix}I_n&-A\\0&\lambda I_m\end{bmatrix} \begin{bmatrix}\lambda I_n&A\\B&I_m\end{bmatrix} =\begin{bmatrix}\lambda I_n-AB&0\\\lambda B&\lambda I_m\end{bmatrix} $$
and
$$ \begin{bmatrix}I_n&0\\-B&\lambda I_m\end{bmatrix} \begin{bmatrix}\lambda I_n&A\\B&I_m\end{bmatrix} =\begin{bmatrix}\lambda I_n&A\\0&\lambda I_m-BA\end{bmatrix}. $$
The two left-hand sides have equal determinants: each left factor has determinant $\lambda^m$, and the middle factor is the same in both. Hence the right-hand sides have equal determinants as well:
$$ \lambda^m\det(\lambda I_n-AB)=\lambda^n\det(\lambda I_m-BA). $$
Thus, for square matrices ($m=n$), the characteristic polynomials of $AB$ and $BA$ coincide, so the eigenvalues are identical.
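The determinant identity can be spot-checked numerically for rectangular $A$ and $B$ (a sketch with random matrices and an arbitrary nonzero $\lambda$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 5
A = rng.standard_normal((n, m))  # A is n x m
B = rng.standard_normal((m, n))  # B is m x n
lam = 1.7                        # arbitrary nonzero test value

# lambda^m det(lambda I_n - AB) == lambda^n det(lambda I_m - BA)
lhs = lam**m * np.linalg.det(lam * np.eye(n) - A @ B)
rhs = lam**n * np.linalg.det(lam * np.eye(m) - B @ A)
assert np.isclose(lhs, rhs)
```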