Let $A$ and $B$ be $n\times n$ matrices, each with $n$ distinct eigenvalues. Prove that $A$ and $B$ have the same eigenvectors if and only if $AB=BA$.


I have been working on this problem for an hour and I need some help. I'm terrible at proving things, and I was hoping someone could help me refine what I have so far.

$A = PDP^{-1}$ for some diagonal matrix $D$ and invertible matrix $P$.

The columns of $P$ are precisely the eigenvectors of $A$.

From $AB=BA$, I get that $PDP^{-1}B = BPDP^{-1}$, so

$DP^{-1}BP = P^{-1}BPD$

I'm going to let $D' = P^{-1}BP$, so that $DD' = D'D$.

Using the lemma below (which I'm having trouble proving), $D'$ is diagonal, so

$B = PD'P^{-1}$, which implies that $B$ has the same eigenvectors as $A$.

I think what I'm missing is proving that if $D$ is a diagonal matrix with distinct diagonal entries and $DX=XD$, then $X$ is diagonal as well. But I don't know how to go about this, or even where in the proof it belongs.
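For what it's worth, the lemma follows from comparing entries: $(DX)_{ij} = d_i x_{ij}$ while $(XD)_{ij} = x_{ij} d_j$, so $DX = XD$ forces $(d_i - d_j)x_{ij} = 0$, and distinct diagonal entries kill every off-diagonal $x_{ij}$. A quick numerical sanity check of that entrywise identity (names and matrices here are illustrative, not from the problem):

```python
import numpy as np

d = np.array([1.0, 2.0, 3.0])          # distinct diagonal entries
D = np.diag(d)
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))        # a generic (non-diagonal) matrix

# Entrywise, (DX - XD)[i, j] = (d_i - d_j) * X[i, j]:
assert np.allclose(D @ X - X @ D, (d[:, None] - d[None, :]) * X)

# So DX = XD holds exactly when the off-diagonal entries of X vanish;
# projecting X onto the diagonal matrices restores commutation:
X_diag = np.diag(np.diag(X))
assert np.allclose(D @ X_diag, X_diag @ D)
```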


BEST ANSWER

Let $v_1,\dots,v_n$ be a basis of the eigenvectors of $A$ with eigenvalues $\lambda_1,\dots,\lambda_n$, and assume $AB=BA$.
Then $BAv_i=\lambda_i\,Bv_i$. Suppose $Bv_i=\sum_j\beta_jv_j$, so that $$ABv_i=\sum_j\lambda_j\beta_jv_j\qquad\text{and}\qquad BAv_i=\sum_j\lambda_i\beta_jv_j\,.$$ Now $BAv_i=ABv_i$ implies $\sum_{j\ne i}(\lambda_j-\lambda_i)\beta_jv_j = 0$, hence by linear independence and $\lambda_i\ne\lambda_j$ for $j\ne i$, we get $\beta_j=0$ for $j\ne i$, proving that $v_i$ is indeed an eigenvector of $B$.
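This direction can be checked numerically: build commuting $A$ and $B$ with distinct eigenvalues and verify that every eigenvector of $A$ is also an eigenvector of $B$ (the construction via a shared random $P$ is illustrative, not part of the answer):

```python
import numpy as np

# Conjugate two diagonal matrices with distinct entries by the same P,
# which guarantees AB = BA.
rng = np.random.default_rng(1)
P = rng.standard_normal((3, 3))
Pinv = np.linalg.inv(P)
A = P @ np.diag([1.0, 2.0, 3.0]) @ Pinv
B = P @ np.diag([5.0, -1.0, 4.0]) @ Pinv

assert np.allclose(A @ B, B @ A)       # they commute

eigvals, V = np.linalg.eig(A)          # columns of V are eigenvectors of A
for i in range(3):
    v = V[:, i]
    Bv = B @ v
    # Bv is parallel to v, i.e. v is an eigenvector of B as well:
    mu = (v.conj() @ Bv) / (v.conj() @ v)
    assert np.allclose(Bv, mu * v)
```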


If $A$ and $B$ have the same eigenvectors, then $A=PD_AP^{-1}$ and $B=PD_BP^{-1}$ for the same $P$. This gives $AB = PD_AD_BP^{-1} = PD_BD_AP^{-1}$, since diagonal matrices commute, and thus $AB = PD_BP^{-1}PD_AP^{-1} = BA$.
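A minimal numerical check of this converse, with illustrative choices of $P$, $D_A$, and $D_B$:

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0]])             # any invertible P works
Pinv = np.linalg.inv(P)
D_A = np.diag([2.0, 5.0])
D_B = np.diag([-1.0, 3.0])

assert np.allclose(D_A @ D_B, D_B @ D_A)   # diagonal matrices commute

A = P @ D_A @ Pinv
B = P @ D_B @ Pinv
assert np.allclose(A @ B, B @ A)           # hence A and B commute
```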