Prove that commuting diagonalizable operators are simultaneously diagonalizable


Let $V$ be a finite-dimensional vector space over $F$, and let $T,S\colon V \to V$ be diagonalizable linear operators such that $ST=TS$. I need to prove that $S+T$ and $TS$ are also diagonalizable.

For that I have the following guidance:

I need to show that from $ST = TS$ it can be inferred that $T, S$ are simultaneously diagonalizable, i.e., that there is a basis $B$ of $V$ in which both operators are represented by diagonal matrices simultaneously. I am to show this in the following way:

  • firstly, show that for every $\lambda$, $S(T-\lambda I)=(T-\lambda I)S$ holds;

  • secondly, show that $W_\lambda = \operatorname{ker}(T-\lambda I)$ is an $S$-invariant subspace;

  • finally, show by induction on the dimension $n$ of the space that $T, S$ are simultaneously diagonalizable, i.e., show that $\dim(W_\lambda) < \dim(V)$ (whenever $T$ is not a scalar multiple of the identity) and use the induction hypothesis to conclude that the restrictions of $T, S$ to $W_\lambda$, i.e., $T_{|W_{\lambda}}$ and $S_{|W_{\lambda}}$, are simultaneously diagonalizable.

    In the induction step I need to combine the bases of the subspaces $W_\lambda$ obtained previously.

I did the first two points easily, but I got stuck on the final induction, because I don't understand very well what I should do there. Does anyone see what is meant?

Accepted answer

For the final part, I would personally use an argument based on the minimal polynomial. (I honestly can't see how you would solve this by induction on the dimension of $V$.)

Since $T$ is diagonalizable, we can write $$ V = {\rm ker}(T - \lambda_1 I) \oplus \dots \oplus {\rm ker}(T - \lambda_k I),$$ where $\lambda_1, \dots, \lambda_k$ are the distinct eigenvalues of $T$.

You've already shown that each ${\rm ker}(T - \lambda_i I)$ is invariant under the action of $S$. So it makes sense to talk about the "restriction" of $S$ to each ${\rm ker}(T - \lambda_i I)$.
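This invariance can be checked numerically. The following NumPy sketch is my own illustration (not part of the original answer): it builds a hypothetical commuting pair $T, S$ by conjugating two diagonal matrices with the same invertible $P$, and verifies that $S$ maps $W_\lambda = \ker(T - \lambda I)$ into itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: conjugating two diagonal matrices by the same
# invertible P yields diagonalizable operators T and S with TS = ST.
P = rng.standard_normal((4, 4))
Pinv = np.linalg.inv(P)
T = P @ np.diag([2.0, 2.0, 5.0, 7.0]) @ Pinv
S = P @ np.diag([1.0, 3.0, 3.0, 4.0]) @ Pinv
assert np.allclose(T @ S, S @ T)

lam = 2.0
K = T - lam * np.eye(4)   # W_lam = ker(T - 2I) is spanned by P[:, 0], P[:, 1]
for i in (0, 1):
    v = P[:, i]
    assert np.allclose(K @ v, 0)         # v lies in W_lam
    assert np.allclose(K @ (S @ v), 0)   # S v stays in W_lam: W_lam is S-invariant
```

Here the columns of $P$ are, by construction, a common eigenbasis, which is exactly what the rest of the argument recovers in general.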

Now recall that a linear operator is diagonalisable if and only if its minimal polynomial is a product of distinct linear factors.
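As a quick sanity check of this criterion (again my own NumPy illustration, not from the answer): a projection matrix is annihilated by $x(x-1)$, a product of distinct linear factors, and is diagonalisable; a nonzero nilpotent matrix has minimal polynomial $x^2$, a repeated factor, and is not.

```python
import numpy as np

# Projection: A^2 = A, so x(x-1) annihilates A. Distinct linear
# factors -> A is diagonalisable (eigenvalues 0 and 1).
A = np.array([[1.0, 1.0], [0.0, 0.0]])
assert np.allclose(A @ (A - np.eye(2)), 0)

# Nilpotent Jordan block: minimal polynomial is x^2 (x alone does not
# annihilate it). Repeated factor -> N is not diagonalisable.
N = np.array([[0.0, 1.0], [0.0, 0.0]])
assert np.allclose(N @ N, 0)
assert not np.allclose(N, 0)
```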

And observe that the minimal polynomial of $S$ over $V$ is the (monic) least common multiple of the minimal polynomials of the restrictions of $S$ to the individual subspaces ${\rm ker}(T - \lambda_i I)$: a polynomial $p$ satisfies $p(S) = 0$ on $V$ exactly when it annihilates the restriction of $S$ to each summand.

Since $S$ is diagonalisable over $V$, its minimal polynomial over $V$ is a product of distinct linear factors, so it must be the case that the minimal polynomials of the restrictions of $S$ to the individual subspaces are also products of distinct linear factors. Hence the restrictions of $S$ to each ${\rm ker}(T - \lambda_i I)$ are diagonalisable.

Thus, each subspace ${\rm ker}(T - \lambda_i I)$ has a basis consisting of eigenvectors $v_{i, 1}, \dots, v_{i, d_i}$ of $S$ (where $d_i = {\rm dim}({\rm ker}(T - \lambda_i I))$). Since they lie in ${\rm ker}(T - \lambda_i I)$, the vectors $v_{i, 1}, \dots, v_{i, d_i}$ are also eigenvectors of $T$, with eigenvalue $\lambda_i$.

Combining the basis vectors from these subspaces, we get a set of vectors $$ v_{1, 1}, \dots, v_{1, d_1}, v_{2, 1}, \dots, v_{2, d_2},\dots, v_{k, 1},\dots, v_{k, d_k},$$ which forms a basis for the whole of $V$, and each vector in this basis is an eigenvector of both $S$ and $T$. Thus $S$ and $T$ are simultaneously diagonalisable. In this common basis, $S+T$ and $TS$ are represented by a sum and a product of diagonal matrices respectively, hence by diagonal matrices, so $S+T$ and $TS$ are diagonalisable, as required.
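The whole construction can be carried out numerically. This NumPy sketch is my own addition: starting from a hypothetical commuting pair (built, as above, by conjugating diagonal matrices), it refines each eigenspace of $T$ by eigenvectors of the restricted $S$, assembles the common basis $B$, and verifies that $T$, $S$, $S+T$, and $TS$ are all diagonal in that basis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical commuting diagonalizable pair (eigenvalue 2 of T is repeated).
P = rng.standard_normal((4, 4))
Pinv = np.linalg.inv(P)
T = P @ np.diag([2.0, 2.0, 5.0, 7.0]) @ Pinv
S = P @ np.diag([1.0, 3.0, 3.0, 4.0]) @ Pinv

# For each eigenvalue of T, diagonalize the restriction of S to that eigenspace.
evals, evecs = np.linalg.eig(T)
cols = []
for lam in (2.0, 5.0, 7.0):                  # the eigenvalues used to build T
    W = evecs[:, np.isclose(evals, lam)]     # basis of ker(T - lam I)
    M = np.linalg.pinv(W) @ S @ W            # matrix of S restricted to W_lam
    _, Q = np.linalg.eig(M)
    cols.append(W @ Q)                       # eigenvectors of S inside W_lam
B = np.hstack(cols)                          # common eigenbasis
Binv = np.linalg.inv(B)

for A in (T, S, S + T, T @ S):
    D = Binv @ A @ B
    assert np.allclose(D, np.diag(np.diag(D)))   # diagonal in the common basis
```

The loop mirrors the proof exactly: each block $M$ is diagonalisable because $S$ is, and gluing the refined bases gives simultaneous diagonalization of both operators at once.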