Let $\{X_i\}$ be a set of $n\times n$ matrices with $[X_i, X_j] = X_i X_j - X_j X_i = 0$ for all $i, j$. A theorem by Schur shows they can be simultaneously brought to the form:
$$ X_i = \alpha \left[ \begin{matrix} I_{n/2} & M_i \\ 0 & I_{n/2} \end{matrix} \right]$$
where $M_i$ is an $\frac{n}{2} \times \frac{n}{2}$ matrix. How can one find the basis that brings them to this form? How computationally hard is this calculation?
After some work, I think I found something.
We can easily find $\alpha$: for every $i$ we have $\alpha = \operatorname{tr}(X_i)/n$.
After finding $\alpha$, we can normalize $X_i' = X_i/\alpha$. I will assume $\alpha = 1$ from now on.
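A minimal numpy sketch of these two steps (the list `Xs` of commuting arrays is hypothetical, and I assume every $X_i$ really has the form above, so that $\operatorname{tr}(X_i) = \alpha n$):

```python
import numpy as np

def normalize(Xs):
    """Recover alpha from the trace of any X_i and rescale the whole set.

    Assumes each X_i = alpha * (I + strictly upper block), so that
    tr(X_i) = alpha * n for every i.
    """
    n = Xs[0].shape[0]
    alpha = np.trace(Xs[0]) / n
    return alpha, [X / alpha for X in Xs]
```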
There are at least $n/2$ linearly independent vectors $v$ such that, for every $i$: $$ X_i v = v$$
We can find them by solving $(X_i - I)v = 0$ for any $i$ (intersecting the null spaces over all $i$ if a single matrix does not already pin down the common fixed space). This is an $O(n^3)$ calculation as far as I can see.
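Concretely, the null space can be read off from an SVD of $X_i - I$, which is the $O(n^3)$ step. A sketch under that assumption (the tolerance `tol` is an arbitrary numerical cutoff, not something prescribed by the problem):

```python
import numpy as np

def fixed_vectors(X, tol=1e-10):
    """Orthonormal basis of ker(X - I) via SVD (an O(n^3) computation).

    Right-singular vectors whose singular value is numerically zero
    satisfy (X - I) v = 0, i.e. X v = v.
    """
    n = X.shape[0]
    _, s, Vh = np.linalg.svd(X - np.eye(n))
    return Vh[s < tol].conj().T  # columns are the fixed vectors
```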
Using the $n/2$ vectors we found, together with any $n/2$ further independent vectors completing a basis $P$, every $X_i$ takes the form: $$ P^{-1}X_iP = \left(\begin{matrix} I & M_i\\ 0 & S_i \end{matrix} \right)$$
where $S_i$ is a diagonalisable matrix whose only eigenvalue is 1. We get the final form of the $X_i$ by mutually diagonalising the $S_i$ (which is standard practice); a sketch of both steps follows.
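Here is a hedged sketch of these last two steps, assuming `V` holds the $n/2$ common fixed vectors as orthonormal columns. Completing `V` to a basis is done with a QR factorization (the random filler columns are just one convenient choice), and the commuting $S_i$ are diagonalised together through a generic random linear combination, one standard trick for mutual diagonalisation (a robust implementation would recurse on repeated eigenvalues):

```python
import numpy as np

def block_form(Xs, V):
    """Complete the fixed vectors V (columns) to a basis P and conjugate.

    Since QR preserves the span of the leading columns, the first n/2
    columns of P still span the fixed space, so P^{-1} X_i P has the
    block form [[I, M_i], [0, S_i]].
    """
    n = Xs[0].shape[0]
    rng = np.random.default_rng(0)                      # arbitrary seed
    filler = rng.standard_normal((n, n - V.shape[1]))   # generic complement
    P, _ = np.linalg.qr(np.hstack([V, filler]))
    return P, [np.linalg.solve(P, X @ P) for X in Xs]

def mutually_diagonalise(Ss, seed=0):
    """Simultaneously diagonalise commuting diagonalisable matrices.

    The eigenvectors of a generic random linear combination of the S_i
    diagonalise every S_i at once.
    """
    rng = np.random.default_rng(seed)
    C = sum(rng.standard_normal() * S for S in Ss)
    _, W = np.linalg.eig(C)
    return W, [np.linalg.solve(W, S @ W) for S in Ss]   # each ~ diagonal
```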