Similarity of matrices based on polynomials


For $A,B\in \mathbb{R}^{n,n}$, suppose the characteristic polynomials satisfy $p_A(x)=p_B(x)=(x-\lambda_1)(x-\lambda_2)\cdots(x-\lambda_n)$, where $\lambda_i\neq\lambda_j$ for $i\neq j$. Prove that $A$ and $B$ are similar matrices.


BEST ANSWER

The criterion that

$p_A(x) = p_B(x) = \displaystyle \prod_1^n (x - x_i) \tag 1$

with

$i \ne j \Longrightarrow x_i \ne x_j \tag 2$

implies that the eigenvalues of $A$ and $B$ are distinct; hence each matrix is similar to the diagonal matrix

$D = [\delta_{kl}x_l]; \tag 3$

thus there exist invertible matrices $P$ and $Q$ such that

$PAP^{-1} = D = QBQ^{-1}; \tag 4$

but this implies

$A = P^{-1}QBQ^{-1}P = (P^{-1}Q)B(P^{-1}Q)^{-1}, \tag 5$

which shows that $A$ and $B$ are similar. $OE\Delta$.
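The chain of similarities above can be checked numerically. In this sketch the diagonal matrix $D$ and the random invertible matrices $P$, $Q$ are my own illustrative choices (Python/NumPy assumed), not part of the answer:

```python
# Numerical sketch: if P A P^{-1} = D = Q B Q^{-1}, then S = P^{-1} Q
# conjugates B to A, i.e. A = S B S^{-1}.
import numpy as np

rng = np.random.default_rng(1)
D = np.diag([1.0, 2.0, 3.0])               # distinct eigenvalues on the diagonal

P = rng.standard_normal((3, 3))            # invertible with probability 1
Q = rng.standard_normal((3, 3))

A = np.linalg.inv(P) @ D @ P               # so that P A P^{-1} = D
B = np.linalg.inv(Q) @ D @ Q               # so that Q B Q^{-1} = D

S = np.linalg.inv(P) @ Q                   # candidate similarity A = S B S^{-1}
print(np.allclose(A, S @ B @ np.linalg.inv(S)))   # True
```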

Nota Bene: In a comment to this answer, our OP avan1235 asks why $A$ and $B$ are similar to $D$; this may be seen as follows. Considering first the matrix $A$, we see that since the $x_i$ are distinct, each corresponds to a distinct eigenvector $\vec e_i$:

$A \vec e_i = x_i \vec e_i; \tag 6$

now we may form the matrix $E$ whose columns are the $\vec e_i$:

$E = [\vec e_1 \; \vec e_2 \; \ldots \; \vec e_n ]; \tag 7$

it is easy to see that

$AE = [A\vec e_1 \; A\vec e_2 \; \ldots \; A\vec e_n ] = [x_1 \vec e_1 \; x_2 \vec e_2 \; \ldots \; x_n \vec e_n ]; \tag 8$

it is also easy to see that

$ED = [x_1 \vec e_1 \; x_2 \vec e_2 \; \ldots \; x_n \vec e_n ]; \tag 9$

thus,

$AE = ED; \tag{10}$

now since the $x_i$ are distinct, the $\vec e_i$ are linearly independent, whence $E$ is an invertible matrix; therefore we have

$E^{-1}AE = D; \tag{11}$

the same logic applies of course to $B$. End of Note.
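The construction in the note can also be verified numerically; the $2\times 2$ test matrix below is my own example, not from the answer:

```python
# Sketch of the note's construction: the columns of E are eigenvectors of A,
# AE = ED holds, and E^{-1} A E recovers the diagonal matrix of eigenvalues.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                 # distinct eigenvalues 2 and 3

evals, E = np.linalg.eig(A)                # columns of E satisfy A e_i = x_i e_i
D = np.diag(evals)

print(np.allclose(A @ E, E @ D))                    # True: AE = ED
print(np.allclose(np.linalg.inv(E) @ A @ E, D))     # True: E^{-1} A E = D
```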


As noted in the comments, $A$ and $B$ are similar to the same diagonal matrix, namely the diagonal matrix of eigenvalues. This holds whenever there is a basis consisting of eigenvectors.

It's easy to see that $P^{-1}AP=D$, where the columns of $P$ are the eigenvectors and $D$ is diagonal with the eigenvalues on the diagonal.


We can prove that if $p_A(t)=(t-\lambda_1)(t-\lambda_2)\cdots(t-\lambda_n)$, where $\lambda_i\ne \lambda_j$ for $i\ne j$, then nonzero eigenvectors $x_i\in\ker(A-\lambda_iI)$, $i\le n$, are linearly independent and hence form a basis.

We proceed by induction on $k$; the base case is clear, since an eigenvector $x_1$ is nonzero. Assume $x_i$, $i<k$, are linearly independent, and consider $$ \sum_{i=1}^k \alpha_i x_i =0.\tag{*} $$ Multiplying on the left by $A$ gives $$ \sum_{i=1}^k \alpha_i \lambda_i x_i =0. $$ Subtracting $\lambda_k$ times $\text{(*)}$ from this yields $$ \sum_{i=1}^{k-1} \alpha_i (\lambda_i-\lambda_k) x_i =0. $$ Since the $x_i$, $i<k$, are linearly independent, it follows that $\alpha_i(\lambda_i-\lambda_k)=0$ for all $i<k$. Since $\lambda_i\ne \lambda_k$, we have $\alpha_i=0$ for all $i<k$, and then $\text{(*)}$ gives $\alpha_k x_k=0$, so $\alpha_k=0$. This shows the $x_i$, $i\le k$, are linearly independent, and by induction the $x_i$, $i\le n$, are linearly independent.

Now, since there exists a basis $\{x_i, i\le n\}$ consisting of eigenvectors of $A$, it follows that $A$ is diagonalizable.
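The independence claim can be sanity-checked numerically; the triangular test matrix below is my own choice, not from the answer:

```python
# For a matrix with n distinct eigenvalues, the n eigenvectors are linearly
# independent, i.e. the matrix whose columns are the eigenvectors has full rank.
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 4.0, 1.0],
              [0.0, 0.0, 6.0]])            # distinct eigenvalues 1, 4, 6

evals, X = np.linalg.eig(A)                # columns of X are eigenvectors x_i
print(np.linalg.matrix_rank(X))            # 3: full rank, so independent
```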


Matrices are similar precisely when they are related by a change of basis, and similarity is an equivalence relation; so it suffices to show that a (possibly different) change of basis applied to each matrix results in the same matrix in both cases. A matrix is diagonalisable if and only if some change of basis (namely, one to a basis of eigenvectors) results in a diagonal matrix, and the diagonal entries of that matrix are then the eigenvalues associated with the respective eigenvectors of the basis used.

You appear to know the result that whenever the characteristic polynomial of a matrix splits into distinct factors $x-\lambda_i$ as given in the question, then the matrix is diagonalisable with those $\lambda_i$ as eigenvalues. Since you are given that this holds for $A$ and $B$, both are similar to the same diagonal matrix, and you are done.