Linear transformation given by matrix multiplication


*[problem statement image]*

I don't think I quite understand this linear transformation. To me, it just looks as if $T_A = A$. Is there an application of Cayley–Hamilton that can help me with this problem, or a way to conceptualize the matrix of a linear transformation?


Best Answer

"$T_A = A$" is a statement that only makes sense if you think of their actions on matrices, i.e. $T_A$ acts by left multiplication by $A$: $T_A(B) = AB$. They are definitely not equal, since $T_A$ is a linear transformation and $A$ is a matrix. Even if you argue that $T_A$ is a matrix via a choice of basis, it is then an $n^2 \times n^2$ matrix (the space of $n \times n$ matrices has dimension $n^2$), so it is in no way clear that $T_A$ and $A$ must have any further similarities, let alone that $T_A = A$.

How you solve this question is then entirely a matter of the definitions of $f(A)$ and $f(T_A)$. If $f(x) = \sum_{k=0}^n c_k x^k$ where the $c_k$ are scalar constants, then $f(A) = \sum_{k=0}^n c_k A^k$ and $f(T_A) = \sum_{k=0}^n c_k (T_A)^k$, where $(T_A)^k$ denotes the $k$-fold composition $T_A \circ \cdots \circ T_A$.

Hence, for every matrix $B$, $$ f(T_A)(B) = \sum_{k=0}^n c_k (T_A)^k(B) = \sum_{k=0}^n c_k (\underbrace{T_A \circ \cdots \circ T_A}_{k \text{ times}})(B) = \sum_{k=0}^n c_k A^k B = \left(\sum_{k=0}^n c_k A^k\right) B = f(A)\, B, $$

where $f(T_A)$ is a linear transformation, and $f(A)$ is a matrix.
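This identity is easy to sanity-check numerically. Below is a minimal NumPy sketch (not part of the original answer; the polynomial and the size $n$ are arbitrary choices) comparing $f(T_A)(B)$, computed by repeatedly applying $T_A(B) = AB$, against the matrix product $f(A)\,B$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Example polynomial f(x) = 2 + 3x - x^2, coefficients c_0, c_1, c_2.
coeffs = [2.0, 3.0, -1.0]

def f_TA(B):
    """f(T_A)(B) = sum_k c_k (T_A)^k(B), with (T_A)^k(B) = A^k B
    computed by k-fold application of T_A."""
    out = np.zeros_like(B)
    Bk = B.copy()            # (T_A)^0(B) = B
    for c in coeffs:
        out += c * Bk
        Bk = A @ Bk          # apply T_A once more
    return out

# f(A) as a matrix: sum_k c_k A^k.
f_A = sum(c * np.linalg.matrix_power(A, k) for k, c in enumerate(coeffs))

# The identity f(T_A)(B) = f(A) B:
assert np.allclose(f_TA(B), f_A @ B)
```

The same check works for any polynomial and any $B$, since the derivation above never used anything specific about $f$ or $B$.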

Now, if $f(T_A) = 0$, then $f(T_A)(B) = 0$ for all $B$; by the identity above, $f(A) B = 0$ for all $B$, and taking $B = I$ gives $f(A) = 0$. Arguing in reverse, one sees that $f(T_A) = 0 \iff f(A) = 0_{n \times n}$. (Note: $0_{n \times n}$ denotes the zero matrix; the other $0$ is the zero linear transformation.)
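As a concrete instance of this equivalence (and of the Cayley–Hamilton connection raised in the question), take $f$ to be the characteristic polynomial of $A$: then $f(A) = 0$ by Cayley–Hamilton, and the equivalence predicts $f(T_A) = 0$ as well. A NumPy sketch (not from the original answer), with `np.poly` supplying the characteristic-polynomial coefficients:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))

# np.poly returns the characteristic polynomial of A, highest degree first.
p = np.poly(A)

def apply_poly_TA(coeffs, B):
    """Evaluate f(T_A)(B) = sum_k c_k A^k B by Horner's rule in T_A,
    with coeffs listed highest degree first."""
    out = np.zeros_like(B)
    for c in coeffs:
        out = A @ out + c * B
    return out

# Cayley–Hamilton: p(A) = 0 up to floating-point error.
p_A = sum(c * np.linalg.matrix_power(A, k)
          for k, c in enumerate(reversed(p)))
assert np.allclose(p_A, 0, atol=1e-8)

# Hence p(T_A) = 0: it kills an arbitrary B.
B = rng.standard_normal((n, n))
assert np.allclose(apply_poly_TA(p, B), 0, atol=1e-8)
```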

Now, it should be clear that $A$ and $T_A$ have the same minimal polynomial, since the set of polynomials they satisfy is the same. However, the characteristic polynomials need not match: the characteristic polynomial of $A$ has degree $n$, while that of $T_A$, an operator on an $n^2$-dimensional space, has degree $n^2$.
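To make the degree count concrete: identifying an $n \times n$ matrix $B$ with its column-stacked vector $\operatorname{vec}(B)$, the matrix of $T_A$ is the Kronecker product $I_n \otimes A$, which is $n^2 \times n^2$. The sketch below (NumPy; not part of the original answer) checks this representation and that each eigenvalue of $A$ appears $n$ times among the eigenvalues of $T_A$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))

# With column-major vectorisation, vec(A B) = (I ⊗ A) vec(B), so the
# matrix of T_A in the standard basis is kron(I, A), of size n^2 x n^2.
M = np.kron(np.eye(n), A)

# Sanity check: M acting on vec(B) agrees with T_A(B) = A B.
B = rng.standard_normal((n, n))
assert np.allclose(M @ B.flatten(order="F"),
                   (A @ B).flatten(order="F"))

# Characteristic polynomial degree = matrix size: n for A, n^2 for T_A.
assert A.shape == (n, n) and M.shape == (n ** 2, n ** 2)

# Each eigenvalue of A occurs n times as an eigenvalue of T_A, which is
# why the minimal polynomials agree while the characteristic ones cannot.
eig_A = np.linalg.eigvals(A)
eig_M = np.linalg.eigvals(M)
for lam in eig_A:
    assert np.sum(np.abs(eig_M - lam) < 1e-8) == n
```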