I am having some trouble finding a basis of eigenvectors that diagonalizes two matrices simultaneously.
I have found two bases of eigenvectors for two 3x3 matrices.
I can't seem to find an algorithm or any discussions on MSE on how to do this.
So my idea was to arrange all 6 vectors as the rows of a matrix and row-reduce (equivalently, perform column operations on the 6 vectors written as columns) until I find 3 linearly independent rows. These three new row vectors form a basis for $R^3$, but that basis no longer quite diagonalizes my matrices, so not all of its members are still eigenvectors.
But I feel this guessed algorithm almost works: in the new basis, my matrices $A$ and $B$ were almost diagonal, each just one nonzero entry away from being diagonal.
Do you know of an algorithm that computes an explicit basis simultaneously diagonalizing two matrices? I have only seen proofs on MSE of the existence of such a basis, which I understand and have verified. But I want to compute the basis itself.
Thanks,
EDIT: the two matrices commute.
Consider for example
$$ A = \pmatrix{4 & 6 & 0\cr -1 & -1 & 0\cr 3 & 6 & 1\cr}, B = \pmatrix{8 & 6 & -2\cr -2 & 1 & 1\cr 4 & 6 & 2\cr} $$
which do commute and are both diagonalizable: $A$ has eigenvalues $1$ (with multiplicity $2$) and $2$, and $B$ has eigenvalues $3$ and $4$ (with multiplicity $2$).
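These claims are easy to verify numerically; here is a quick check with numpy (the library choice is mine, not part of the question):

```python
import numpy as np

A = np.array([[4, 6, 0], [-1, -1, 0], [3, 6, 1]])
B = np.array([[8, 6, -2], [-2, 1, 1], [4, 6, 2]])

# The matrices commute -- a necessary condition for a common eigenbasis.
assert np.array_equal(A @ B, B @ A)

# Eigenvalues: 1, 1, 2 for A and 3, 4, 4 for B (up to round-off).
print(np.sort(np.linalg.eigvals(A).real))
print(np.sort(np.linalg.eigvals(B).real))
```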
Eigenvectors for the simple eigenvalues should be in the basis: say $$ u = \pmatrix{3 \cr -1\cr 3\cr}, \ v = \pmatrix{2 \cr -1\cr 2\cr} $$ (for eigenvalue $2$ of $A$ and eigenvalue $3$ of $B$, respectively). Then $u$ is also an eigenvector of $B$ for eigenvalue $4$, and $v$ is also an eigenvector of $A$ for eigenvalue $1$.
The third member of the basis, say $w$, must be another eigenvector of $A$ for eigenvalue $1$ and of $B$ for eigenvalue $4$. Therefore it satisfies $(A - I) w = 0$ and $(B - 4I) w = 0$. Putting $A - I$ on top of $B-4I$ and row-reducing the resulting $6 \times 3$ matrix, we find a solution: $$ w = \pmatrix{2\cr -1\cr 1\cr}$$
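The same recipe works in general for a pair of commuting diagonalizable matrices: for each pair of eigenvalues $(\lambda, \mu)$, stack $A - \lambda I$ on top of $B - \mu I$ and take the null space of the stacked matrix; the nonzero null spaces, taken together, give the common eigenbasis. Here is a numpy sketch of that procedure (the SVD-based null-space helper and the tolerances are my own ad-hoc choices, not part of the answer above):

```python
import numpy as np

A = np.array([[4., 6., 0.], [-1., -1., 0.], [3., 6., 1.]])
B = np.array([[8., 6., -2.], [-2., 1., 1.], [4., 6., 2.]])

def null_space(M, tol=1e-10):
    """Orthonormal basis for the null space of M, computed via SVD."""
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # columns span the null space

# Distinct eigenvalues of each matrix (rounded to merge repeats).
eigs_A = np.unique(np.round(np.linalg.eigvals(A).real, 8))
eigs_B = np.unique(np.round(np.linalg.eigvals(B).real, 8))

# For each eigenvalue pair (lam, mu), the common eigenvectors form
# the null space of the stacked 6x3 matrix [A - lam*I; B - mu*I].
cols = []
for lam in eigs_A:
    for mu in eigs_B:
        stacked = np.vstack([A - lam * np.eye(3), B - mu * np.eye(3)])
        cols.append(null_space(stacked))

P = np.hstack(cols)           # common eigenbasis, one eigenvector per column
D_A = np.linalg.inv(P) @ A @ P
D_B = np.linalg.inv(P) @ B @ P
# Both D_A and D_B are now diagonal (up to floating-point noise).
```

For this example the loop finds one common eigenvector for each of the pairs $(1,3)$, $(1,4)$, and $(2,4)$, so $P^{-1}AP = \mathrm{diag}(1,1,2)$ and $P^{-1}BP = \mathrm{diag}(3,4,4)$.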