Let $V$ be a vector space of finite dimension and let $T,S$ be diagonalizable linear transformations from $V$ to itself. I need to prove that if $TS=ST$, then every eigenspace $V_\lambda$ of $S$ is $T$-invariant and the restriction of $T$ to $V_\lambda$ ($T|_{V_\lambda}:V_{\lambda}\rightarrow V_{\lambda}$) is diagonalizable. In addition, I need to show that there is a basis $B$ of $V$ such that $[S]_{B}^{B}$ and $[T]_{B}^{B}$ are both diagonal if and only if $TS=ST$.
Ok, so first let $v\in V_\lambda$. From $TS=ST$ we get $S(T(v))=T(S(v))=\lambda T(v)$, so $T(v)\in V_\lambda$ (it is either $0$ or an eigenvector of $S$ with eigenvalue $\lambda$), and we get what we want. I want to use that in order to get the second claim, I just don't know how. One direction of the "iff" is obvious; the other one is trickier for me.
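The invariance argument above can be checked numerically. This is a hypothetical example (the matrices $S$, $T$ and the vector $v$ are my own choices, picked so that $TS=ST$ holds): we take $v$ in the eigenspace $V_1$ of $S$ and verify that $T(v)$ stays in $V_1$.

```python
import numpy as np

# Hypothetical commuting diagonalizable matrices:
# S has eigenvalue 1 on span(e1, e2) and eigenvalue 2 on span(e3);
# T is block-diagonal with the same blocks, which forces TS = ST.
S = np.diag([1.0, 1.0, 2.0])
T = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

assert np.allclose(S @ T, T @ S)  # the hypothesis TS = ST

# v lies in the eigenspace V_1 of S (eigenvalue 1).
v = np.array([1.0, -1.0, 0.0])
assert np.allclose(S @ v, 1.0 * v)

# T-invariance: S(Tv) = 1 * Tv, i.e. Tv is still in V_1.
Tv = T @ v
assert np.allclose(S @ Tv, 1.0 * Tv)
```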
This answer is basically the same as Paul Garrett's. --- First I'll state the question as follows.
Let $V$ be a finite dimensional vector space over a field $K$, and let $S$ and $T$ be diagonalizable endomorphisms of $V$. We say that $S$ and $T$ are simultaneously diagonalizable if (and only if) there is a basis of $V$ which diagonalizes both. The theorem is:

> $S$ and $T$ are simultaneously diagonalizable if and only if they commute.
If $S$ and $T$ are simultaneously diagonalizable, they clearly commute. For the converse, I'll just refer to Theorem 5.1 of *The minimal polynomial and some applications* by Keith Conrad.
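The converse direction can also be illustrated computationally. Below is a minimal sketch, under the simplifying assumption that $S$ and $T$ are real symmetric (so `numpy.linalg.eigh` applies); the function name `simultaneous_eigenbasis` and the example matrices are my own. It follows the strategy of the question: group the eigenvectors of $S$ by eigenvalue, restrict $T$ to each eigenspace (which is legitimate because the eigenspaces are $T$-invariant), and diagonalize each restriction.

```python
import numpy as np

def simultaneous_eigenbasis(S, T, tol=1e-9):
    """Sketch: common eigenbasis of two commuting diagonalizable
    real-symmetric matrices. Columns of the result are common
    eigenvectors of S and T."""
    assert np.allclose(S @ T, T @ S), "S and T must commute"
    vals_S, vecs_S = np.linalg.eigh(S)  # eigenvalues in ascending order
    blocks = []
    i, n = 0, len(vals_S)
    while i < n:
        # Group the columns belonging to one (numerically equal) eigenvalue.
        j = i
        while j < n and abs(vals_S[j] - vals_S[i]) < tol:
            j += 1
        P = vecs_S[:, i:j]           # orthonormal basis of this eigenspace
        Tres = P.T @ T @ P           # restriction of T to the eigenspace
        _, Q = np.linalg.eigh(Tres)  # diagonalize the restriction
        blocks.append(P @ Q)
        i = j
    return np.hstack(blocks)

# Usage with hypothetical commuting symmetric matrices:
S = np.diag([1.0, 1.0, 2.0])
T = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
B = simultaneous_eigenbasis(S, T)
DS = B.T @ S @ B  # both conjugates come out (numerically) diagonal
DT = B.T @ T @ B
```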
EDIT. The key statement to prove the above theorem is Theorem 4.11 of Keith Conrad's text, which says that a linear operator on $V$ is diagonalizable if and only if its minimal polynomial is a product of distinct linear factors in $F[T]$.

[Here $F$ is the ground field, $T$ is an indeterminate (not the operator above), and $V$ is finite dimensional.]
The key point in proving Theorem 4.11 is to check the equality $$V=E_{\lambda_1}+\cdots+E_{\lambda_r},$$ where the $\lambda_i$ are the distinct eigenvalues of the operator $A$ and the $E_{\lambda_i}$ are the corresponding eigenspaces. One can prove this by using Lagrange's interpolation formula: put $$f:=\sum_{i=1}^r\ \prod_{j\not=i}\ \frac{T-\lambda_j}{\lambda_i-\lambda_j}\ \in F[T]$$ and observe that $f(A)$ is the identity operator on $V$: since $f(\lambda_i)=1$ for every $i$, the polynomial $f-1$ is divisible by the minimal polynomial $\prod_i(T-\lambda_i)$, and each summand of $f(A)v$ lies in the corresponding $E_{\lambda_i}$.
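The interpolation identity is easy to test numerically. A small sketch with a hypothetical diagonalizable matrix `A` (diagonal here, for simplicity, with repeated eigenvalue $1$): building $f$ from the distinct eigenvalues and evaluating it at `A` gives the identity matrix.

```python
import numpy as np

# Hypothetical diagonalizable A; its distinct eigenvalues are 1, 2, 5.
A = np.diag([1.0, 1.0, 2.0, 5.0])
lams = [1.0, 2.0, 5.0]

# f(A) = sum_i prod_{j != i} (A - lam_j I) / (lam_i - lam_j)
n = A.shape[0]
fA = np.zeros_like(A)
for i, li in enumerate(lams):
    term = np.eye(n)
    for j, lj in enumerate(lams):
        if j != i:
            term = term @ (A - lj * np.eye(n)) / (li - lj)
    fA += term

# f interpolates the constant 1 at every eigenvalue, so f(A) = I.
assert np.allclose(fA, np.eye(n))
```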