Matrix representation, linear transformation, general question on linear algebra


Let $X$ be a finite-dimensional vector space over $K$ and let $T:X\rightarrow X$ be a linear transformation on $X$. If $\alpha, \beta$ are two different bases for $X$, then we know that the matrix representation $A=[T]_\alpha$ is similar to $B=[T]_\beta$. Is the converse true? That is, suppose $A$, $B$ are $n\times n$ matrices ($n=\dim X$) with $A=[T]_\alpha$ and $A$ similar to $B$; does there exist a basis $\beta$ such that $B=[T]_\beta$?

Accepted answer:

Yes. Let $A$ represent $T$ in the basis $e_1,e_2,\ldots,e_n$, and identify each $n\times n$ matrix with the linear map on $X$ that it represents in this basis (so $T(e_j)=A(e_j)$). Since $A$ and $B$ are similar, there is an invertible matrix $P$ such that $A=PBP^{-1}$. Then $f_j:=P(e_j)$, $j=1,\ldots,n$, is another basis, since $P$ is invertible. Recall that a matrix representation is determined column by column: if $A=(a_{ij})$ then

$$A(e_j)=\sum_{i}{a_{ij}e_i}.$$

Write $B=(b_{ij})$. The claim is that $B$ represents $T$ in the basis $(f_j)$, that is,

$$T(f_j)=\sum_{i}{b_{ij}f_i}.$$

Indeed, since $A=PBP^{-1}$ gives $AP=PB$,

$$T(f_j)=A(P(e_j))=P(B(e_j))=P\left(\sum_{i}{b_{ij}e_i}\right)=\sum_{i}{b_{ij}P(e_i)}=\sum_{i}{b_{ij}f_i},$$

so $B=[T]_\beta$ for the basis $\beta=(f_1,\ldots,f_n)$.
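The construction can be checked numerically. Here is a small NumPy sketch (the specific matrices $A$ and $P$ are made up for illustration): take any invertible $P$, form the similar matrix $B=P^{-1}AP$, and verify that $B$ represents the operator $T$ (the one acting as $A$ in the standard basis) in the basis whose vectors are the columns of $P$.

```python
import numpy as np

# Made-up example: T is the operator acting as A in the standard basis of R^2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])      # any invertible matrix

B = np.linalg.inv(P) @ A @ P    # then A = P B P^{-1}, so A and B are similar

# The basis from the proof: f_j = P(e_j), i.e. the columns of P.
F = P

# Verify T(f_j) = sum_i b_{ij} f_i for each j, which in matrix form
# reads A @ F == F @ B (column j of each side is the two sums).
assert np.allclose(A @ F, F @ B)
print("B represents T in the basis f_1, f_2")
```

The key identity being tested is $AP=PB$, which is just the similarity relation $A=PBP^{-1}$ rewritten; column $j$ of $AP$ is $T(f_j)$ and column $j$ of $PB$ is $\sum_i b_{ij}f_i$.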