The following is an exercise:
Thm.: $A_{n \times n}$ is diagonalizable iff its minimal polynomial $m_A(t)$ splits into distinct linear factors.
Use this theorem to prove that if $A, B \in \mathbb{F}^{n \times n}$ are diagonalizable matrices with $AB=BA$, then there exists an invertible $P$ such that $P^{-1}AP,\, P^{-1}BP$ are diagonal matrices. In other words, $A$ and $B$ have $n$ common linearly independent eigenvectors.
Could anyone give me a hint on how to start this proof?
I'll answer in the language of linear maps, which translates easily to the required statement for matrices. If $T \colon V \rightarrow V$ is diagonalizable and $W \subseteq V$ is $T$-invariant, then $T|_W$ is also diagonalizable. The reason is that the minimal polynomial of $T|_W$ divides the minimal polynomial of $T$ (indeed, $m_T(T|_W) = m_T(T)|_W = 0$, so $m_{T|_W} \mid m_T$), and so if the minimal polynomial of $T$ splits into distinct linear factors, so does the minimal polynomial of $T|_W$, which by the theorem implies that $T|_W$ is diagonalizable.
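For a concrete instance of this lemma (the numbers are chosen only for illustration): take
$$T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{pmatrix} \text{ on } V = \mathbb{R}^3, \qquad W = \operatorname{span}(e_2, e_3),$$
which is $T$-invariant. Then $m_T(t) = (t-1)(t-2)$ while $m_{T|_W}(t) = t-2$, a (here proper) divisor of $m_T(t)$ that splits into distinct linear factors, and $T|_W = 2I$ is indeed diagonalizable.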
Now suppose $T,S \colon V \rightarrow V$ are diagonalizable and commute ($TS = ST$). Let $\lambda_1, \dots, \lambda_k$ be the distinct eigenvalues of $T$ and let $V_i = \ker(T - \lambda_i I)$ be the corresponding eigenspaces. If $v \in V_i$ then $Tv = \lambda_i v$ and $T(Sv) = S(Tv) = S(\lambda_i v) = \lambda_i (Sv)$, so $Sv \in V_i$. Thus each $V_i$ is $S$-invariant, and since $S$ is diagonalizable, so is $S|_{V_i}$ for each $1 \leq i \leq k$. Now choose, for each $1 \leq i \leq k$, a basis of $V_i$ consisting of eigenvectors of $S|_{V_i}$. Since $T$ is diagonalizable, $V = V_1 \oplus \cdots \oplus V_k$, so the union of these bases is a basis of $V$. Each vector in it lies in some $V_i$, hence is a $\lambda_i$-eigenvector of $T$ as well as an eigenvector of $S$, so we have obtained a basis of common eigenvectors, which shows that $S$ and $T$ are simultaneously diagonalizable.
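The construction above can be carried out numerically. Below is a minimal NumPy sketch on a hypothetical pair of commuting matrices (the matrices and tolerances are my own choices; both are symmetric here so `np.linalg.eigh` applies, whereas a general diagonalizable pair would need `np.linalg.eig`):

```python
import numpy as np

# Assumed example: commuting, diagonalizable (symmetric) matrices.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
B = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
assert np.allclose(A @ B, B @ A)  # AB = BA

P_cols = []
for lam in np.unique(np.round(np.linalg.eigvalsh(A), 8)):
    # Orthonormal basis of the eigenspace V_i = ker(A - lam*I),
    # read off from the right singular vectors with zero singular value.
    _, s, Vt = np.linalg.svd(A - lam * np.eye(3))
    basis = Vt[s < 1e-10].T                # columns span V_i

    # Matrix of B restricted to V_i (V_i is B-invariant since AB = BA).
    B_restricted = basis.T @ B @ basis
    _, U = np.linalg.eigh(B_restricted)    # diagonalize B|_{V_i}
    P_cols.append(basis @ U)               # eigenvectors of both A and B

P = np.hstack(P_cols)
DA = np.linalg.inv(P) @ A @ P
DB = np.linalg.inv(P) @ B @ P

# Both conjugates are diagonal (up to floating-point noise).
assert np.allclose(DA, np.diag(np.diag(DA)))
assert np.allclose(DB, np.diag(np.diag(DB)))
```

The columns of `P` are exactly the common eigenvectors from the proof: within each eigenspace of $A$ they are chosen to also be eigenvectors of $B$, so a single change of basis diagonalizes both matrices.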