Finding the associated matrix of a linear transformation to calculate the characteristic polynomial


Let $T : M_{n \times n}(\Bbb R) \to M_{n \times n}(\Bbb R)$ be the function given by $T(A)=A^t$ (the transpose of $A$).

I need to find the minimal polynomial and the characteristic polynomial of $T$. To find the characteristic polynomial, I'm trying to find the matrix associated to $T$. I did it for the case $n=2$. The coordinate vector of a matrix $\left( \begin{array}{cc} a & b \\ c & d \end{array} \right)$ in the canonical basis $\beta$ is $[X]_\beta=\left( \begin{array}{c} a \\ b \\ c \\ d \end{array} \right)$. Let

$$A = \left( \begin{array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 &0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 &0 & 1 \end{array} \right)$$

Then

$$A[X]_\beta=[A^t]_\beta=\left( \begin{array}{c} a \\ c \\ b \\ d \end{array} \right).$$

So, $A$ is the matrix associated to $T$. I know I can do the same for any $n$, but I don't know how to generalize it, and I need the general matrix to find the characteristic polynomial by solving $\det (\lambda I-A)=0$. Maybe there's an easier way to find the characteristic polynomial; if you know one, please let me know. If not, how can I generalize this matrix $A$ to any $n$ and compute $\det (\lambda I-A)=0$? Thanks so much for your help.
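For what it's worth, the $n=2$ construction in the question can be sketched programmatically. Below is a minimal NumPy sketch (the helper name `transpose_matrix` is mine, not from the post), with the basis ordered row by row exactly as above:

```python
import numpy as np

def transpose_matrix(n):
    """Matrix of T(A) = A^t in the canonical basis of M_n(R),
    with basis vectors ordered row by row (entry (i, j) at position i*n + j)."""
    M = np.zeros((n * n, n * n))
    for i in range(n):
        for j in range(n):
            # T sends the basis matrix e_{ij} to e_{ji}
            M[j * n + i, i * n + j] = 1
    return M

# For n = 2 this reproduces the 4x4 matrix A above.
print(transpose_matrix(2))
```

Since $T$ merely permutes the basis matrices, the result is always a permutation matrix, which is what makes the generalization to arbitrary $n$ mechanical.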

3 Answers

BEST ANSWER

Simply note that $T^2=I$ but $T\ne I$.

This gives you the minimal polynomial of $T$. The characteristic polynomial has the same irreducible factors as the minimal polynomial, and its degree equals the dimension of the space $T$ acts on; to pin down the multiplicities of the factors, count the dimensions of the $+1$ and $-1$ eigenspaces.
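As a sanity check (not part of the answer), both claims can be verified with a small SymPy computation; here I rebuild the matrix of $T$ for $n=3$, with the basis ordered row by row:

```python
import sympy as sp

n = 3
lam = sp.symbols('lambda')

# Permutation matrix of the transpose map on M_3(R), basis ordered row by row
T = sp.zeros(n * n, n * n)
for i in range(n):
    for j in range(n):
        T[j * n + i, i * n + j] = 1

# T^2 = I but T != I, so the minimal polynomial is x^2 - 1
assert T * T == sp.eye(n * n) and T != sp.eye(n * n)

# Characteristic polynomial: (lambda - 1)^6 * (lambda + 1)^3 for n = 3
p = sp.factor((lam * sp.eye(n * n) - T).det())
print(p)
```

The exponents $6$ and $3$ are the dimensions of the symmetric and antisymmetric subspaces of $M_{3\times 3}(\Bbb R)$, in line with the hint above.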

ANSWER

It might be interesting to write down the matrix of $T$. Any $A\in M_{n}$ has the form
$$ A=\sum_{i,j}A_{ij}e_{ij}\in M_{n}, $$
which is identified with
$$ A=\sum_{i,j}A_{ij}e_{i}\otimes e_{j}\in\mathbb{R}^{n^{2}}. $$
Now $T:\mathbb{R}^{n^{2}}\rightarrow\mathbb{R}^{n^{2}}$ acts by $e_{j}\otimes e_{i}\mapsto e_{i}\otimes e_{j}$, i.e.,
$$ T=\sum_{i,j}e_{ij}\otimes e_{ji}. $$
For example, when $n=2$,
\begin{eqnarray*} e_{11}\otimes e_{11} & = & \left[\begin{array}{cccc} 1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end{array}\right]\\ e_{12}\otimes e_{21} & = & \left[\begin{array}{cccc} 0 & 0 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end{array}\right]\\ e_{21}\otimes e_{12} & = & \left[\begin{array}{cccc} 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 0 \end{array}\right]\\ e_{22}\otimes e_{22} & = & \left[\begin{array}{cccc} 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 1 \end{array}\right] \end{eqnarray*}
and $\sum_{i,j=1}^{2}e_{ij}\otimes e_{ji}$ is the matrix of $T$ in the original post. The general case is similar.
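The tensor-product formula $T=\sum_{i,j}e_{ij}\otimes e_{ji}$ can be sketched directly with NumPy's Kronecker product (the helper `E` is mine, for the matrix units $e_{ij}$):

```python
import numpy as np

def E(i, j, n):
    """Matrix unit e_{ij} in M_n: 1 in position (i, j), 0 elsewhere."""
    m = np.zeros((n, n))
    m[i, j] = 1
    return m

n = 2
# T = sum over i, j of e_{ij} (x) e_{ji}, realized via the Kronecker product
T = sum(np.kron(E(i, j, n), E(j, i, n)) for i in range(n) for j in range(n))
print(T)
```

For $n=2$ the sum reproduces the $4\times 4$ matrix $A$ from the original post.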

ANSWER

As in lhf's answer, for the minimal polynomial simply observe that $T\ne{\rm Id}$ but $T^2={\rm Id}$.

The key here is to first consider the simplest transformations under the transpose. The two simplest cases are those of symmetric and antisymmetric matrices: $A=A^t$ and $A=-A^t$ respectively.

Show that every matrix can be written uniquely as the sum of a symmetric and an antisymmetric matrix (the operative word here is "polarization"; in particular, consider the maps $A\mapsto (A\pm A^t)/2$).

What are the eigenvalues of the transpose map $A\mapsto A^t$ on the space of symmetric matrices? And on the space of antisymmetric matrices? What are the dimensions of these two subspaces?
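A small NumPy sketch of the polarization hint, for a random $4\times 4$ matrix (the variable names are mine):

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))

# Polarization: the unique splitting A = S + K, S symmetric, K antisymmetric
S = (A + A.T) / 2
K = (A - A.T) / 2
assert np.allclose(S + K, A)

# The transpose map fixes S (eigenvalue +1) and negates K (eigenvalue -1)
assert np.allclose(S.T, S) and np.allclose(K.T, -K)

# Eigenspace dimensions: n(n+1)/2 symmetric, n(n-1)/2 antisymmetric
dim_sym, dim_antisym = n * (n + 1) // 2, n * (n - 1) // 2
print(dim_sym, dim_antisym)
```

Together with the minimal polynomial $x^2-1$, these two dimensions give the multiplicities in the characteristic polynomial $(\lambda-1)^{n(n+1)/2}(\lambda+1)^{n(n-1)/2}$.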