What is the relation among similar linear transformations (matrices)?


Recently I found the following definition: two linear transformations $\alpha$ and $\beta$ are said to be similar if there exists a third linear transformation $\gamma$ such that $\alpha = \gamma^{-1}\beta\gamma$. So I wonder: what is the relation between similar linear transformations? What does it mean for them to be similar? Do they have the same eigenvalues?

3 Answers

Best Answer

Let $V$ and $W$ be vector spaces over a field $\Bbb{F}$, and let $T:V \to V$ and $S: W \to W$ be linear transformations which are similar, so that there exists a linear isomorphism $\gamma: V \to W$ such that $T = \gamma^{-1} \circ S \circ \gamma $. Strictly speaking, $T$ and $S$ are of course different functions (unless $V=W$ and $\gamma = \text{id}$).

However, consider the following analogy: similarity of linear maps is a bit like wearing sunglasses versus not wearing them. If you wear sunglasses, everything appears darker; if you don't, things appear brighter. There may be days when you prefer to wear them (e.g. if it's sunny outside) and days when you might not want to wear them at all. But on the whole, the actual things you see (cars, buildings, etc.) are the same; it's just a matter of "how" things appear. This is exactly what happens with similar linear maps: there may be times when you prefer to work with $T$ as opposed to $S$ (or vice versa), but on the whole, both $T$ and $S$ have many of the same properties. So, you should think of $S$ and $T$ as giving you information "from different perspectives".

Since $S$ and $T$ are linear maps which can be obtained from one another by composing with isomorphisms, pretty much any property of one map can be rephrased in terms of the other. For example, we can prove that the kernels and images are isomorphic: \begin{align} \ker(T) \cong \ker(S) \quad \text{and} \quad \text{im}(T) \cong \text{im}(S) \end{align} (use $\gamma$ or $\gamma^{-1}$ to explicitly construct the isomorphisms). It might be helpful to view the relationship in this commutative diagram:

$\require{AMScd}$ \begin{CD} V @>{T}>> V \\ @V{\gamma}VV @VV{\gamma}V \\ W @>>{S }> W \end{CD}

So far there has been no need at all to assume $V$ and $W$ are finite-dimensional. The finite-dimensional case is especially useful for applications because it says that the nullities (dimensions of the kernels) and the ranks (dimensions of the images) of similar maps are the same.
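For concreteness, here is a small numerical sketch of that finite-dimensional statement; the matrices `A` and `P` below are made-up example data (with `P` playing the role of $\gamma$), not anything from the question.

```python
import numpy as np

# A minimal sketch: B = P^{-1} A P is similar to A, so rank and nullity agree.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],    # second row = 2 * first row, so rank(A) = 2, nullity = 1
              [0., 1., 1.]])
P = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])   # invertible change-of-basis matrix (plays the role of gamma)
B = np.linalg.inv(P) @ A @ P   # a matrix similar to A

rank = lambda M: np.linalg.matrix_rank(M, tol=1e-9)  # loose tolerance to absorb round-off
print(rank(A), rank(B))            # ranks agree: 2 2
print(3 - rank(A), 3 - rank(B))    # nullities agree: 1 1
```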

Continuing in this theme, for any $\lambda \in \Bbb{F}$, since $T$ and $S$ are similar, it follows that $(T-\lambda I_V)$ and $(S - \lambda I_W)$ are also similar (via $\gamma$). Hence, by the argument above, the kernels of these maps are isomorphic. But notice that $\ker(T - \lambda I_V)$ is precisely the eigenspace of $T$ corresponding to $\lambda$. Hence, this argument shows that for any scalar $\lambda$, the eigenspaces of $S$ and $T$ are isomorphic (in finite dimensions, this is equivalent to having the same dimension). From this, you can conclude that $T$ is diagonalizable if and only if $S$ is diagonalizable; in fact, my argument above regarding the eigenspaces being isomorphic gives you a nice computational recipe: if you have a basis $\beta_{V,T} = \{v_1, \dots, v_n\}$ of $V$ consisting of eigenvectors of $T$, then $\beta_{W,S} = \{\gamma(v_1), \dots, \gamma(v_n)\}$ will be a basis of $W$ consisting of eigenvectors of $S$.
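Here is that recipe written out in coordinates as a hedged sketch; the matrices `S_mat` and `G` below are my own example data, with `G` playing the role of $\gamma$ and `T_mat` $= G^{-1} S G$ playing the role of $T$.

```python
import numpy as np

# If T = G^{-1} S G and v is an eigenvector of T for lam, then G v is an
# eigenvector of S for the same lam.
S_mat = np.diag([1., 2., 3.])            # S: diagonal, so its eigenvalues are obvious
G = np.array([[2., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])             # invertible, plays the role of gamma
T_mat = np.linalg.inv(G) @ S_mat @ G     # similar to S_mat

eigvals, eigvecs = np.linalg.eig(T_mat)
for lam, v in zip(eigvals, eigvecs.T):
    w = G @ v                                # push the eigenvector of T through gamma
    assert np.allclose(S_mat @ w, lam * w)   # w is an eigenvector of S for the same lam
```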

We can keep this argument going: for any positive integer $k$, the maps $(T-\lambda I_V)^k$ and $(S-\lambda I_W)^k$ are also similar (again via $\gamma$). Hence, their kernels are also isomorphic (i.e. the generalized eigenspaces are isomorphic). A similar argument to the previous paragraph shows that if you have a Jordan basis $J_{V,T}$ of $V$ for the map $T$, then $J_{W,S} := \gamma(J_{V,T})$ will be a Jordan basis of $W$ for the map $S$.
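A quick numerical sanity check of the generalized-eigenspace statement (the matrices `J` and `P` below are invented example data; `J` is a single Jordan block with eigenvalue $2$):

```python
import numpy as np

# J is a single 3x3 Jordan block with eigenvalue 2; B = P^{-1} J P is similar to it.
# dim ker (M - 2I)^k = 3 - rank((M - 2I)^k), and these dimensions should agree for all k.
J = np.array([[2., 1., 0.],
              [0., 2., 1.],
              [0., 0., 2.]])
P = np.array([[1., 0., 1.],
              [1., 1., 0.],
              [0., 1., 1.]])             # invertible, plays the role of gamma
B = np.linalg.inv(P) @ J @ P

rank = lambda M: np.linalg.matrix_rank(M, tol=1e-9)   # loose tolerance to absorb round-off
for k in (1, 2, 3):
    dim_J = 3 - rank(np.linalg.matrix_power(J - 2 * np.eye(3), k))
    dim_B = 3 - rank(np.linalg.matrix_power(B - 2 * np.eye(3), k))
    print(k, dim_J, dim_B)               # expect 1 1, then 2 2, then 3 3
```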

So, briefly summarizing some nice common properties: if $T$ and $S$ are similar maps, then they have the same characteristic polynomial, the same minimal polynomial, the same diagonal form (if they're diagonalizable), the same Jordan canonical form (if it exists), and the same rational canonical form (among many other shared properties).
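To make one of these invariants concrete, here is a small check that similar matrices share eigenvalues, and hence the characteristic polynomial; the matrices `A` and `P` below are just example data.

```python
import numpy as np

# B = P^{-1} A P is similar to A, so eigenvalues and characteristic polynomial agree.
A = np.array([[4., 1.],
              [2., 3.]])                  # eigenvalues 5 and 2 (trace 7, det 10)
P = np.array([[1., 2.],
              [1., 3.]])                  # invertible, det = 1
B = np.linalg.inv(P) @ A @ P

print(np.sort(np.linalg.eigvals(A)))      # [2. 5.]
print(np.sort(np.linalg.eigvals(B)))      # same eigenvalues, up to round-off
print(np.poly(A))                         # characteristic polynomial coefficients: [1. -7. 10.]
print(np.poly(B))                         # same coefficients for the similar matrix
```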


BTW, you may be interested to take a look at this answer of mine, where I explain how this particular idea is used in the context of differential geometry and tangent spaces to perform a lot of computations (even if you do not know anything about manifolds, I think the linear-algebraic parts of my answer should be understandable).

Answer

A linear operator is represented in different bases by similar matrices; conversely, two similar matrices represent the same linear operator with respect to two different bases. Therefore, similar matrices have the same eigenvalues, the same dimensions of eigenspaces, and the same characteristic and minimal polynomials.

Answer

Two linear maps $\alpha$ and $\beta$ are similar, i.e. there exists an invertible $\gamma$ such that $\beta = \gamma^{-1}\alpha\gamma$, if and only if they represent the same linear map with respect to different bases.

Let $A$ and $C$ be the matrices representing $\alpha$ and $\gamma$ respectively (with respect to a basis $\mathbf e_1,...,\mathbf e_n$).

Consider a new basis $$\mathbf f_j=\sum_{i=1}^nC_{i,j}\mathbf e_i$$ for $j=1,...,n$.

Suppose that $B$ is the matrix representing the map $\alpha$ with respect to the new basis $\mathbf f_1,...,\mathbf f_n$.

On the one hand, $$\alpha (\mathbf f_j)=\sum_{i=1}^nC_{i,j}\,\alpha(\mathbf e_i)=\sum_{i=1}^n \sum_{k=1}^nC_{i,j}A_{k,i}\mathbf e_k=\sum_{k=1}^n[AC]_{k,j}\mathbf e_k.$$

On the other hand, $$\alpha (\mathbf f_j)=\sum_{i=1}^nB_{i,j}\mathbf f_i=\sum_{i=1}^n\sum_{k=1}^nB_{i,j}C_{k,i}\mathbf e_k=\sum_{k=1}^n[CB]_{k,j}\mathbf e_k.$$

Since this is true for all $j=1,...,n$, it follows that $AC=CB$. Since $C$ is invertible, $B=C^{-1}AC$. Therefore $B$, the matrix representing $\beta$ in the basis $\mathbf e_1,...,\mathbf e_n$, is also the matrix representing $\alpha$ in the basis $\mathbf f_1,...,\mathbf f_n$; in other words, $\beta$ acts exactly as $\alpha$ does after the change of basis from $\mathbf e_1,...,\mathbf e_n$ to $\mathbf f_1,...,\mathbf f_n$.
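For what it's worth, here is a short numerical check of this computation; the entries of $A$ and $C$ below are my own example data, while the relations $AC = CB$ and $B = C^{-1}AC$ are exactly the ones derived above.

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])            # matrix of alpha with respect to e_1, e_2
C = np.array([[1., 1.],
              [1., 2.]])            # column j holds the e-coordinates of the new basis vector f_j
B = np.linalg.inv(C) @ A @ C        # matrix of alpha with respect to f_1, f_2

# Column j of AC is alpha(f_j) in e-coordinates; column j of CB expresses the same vector
# via its f-coordinates B[:, j] converted back with C.  The derivation says these agree.
assert np.allclose(A @ C, C @ B)
print(B)                            # B = C^{-1} A C, the matrix of alpha in the f-basis
```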