Given $\Delta_{T}(x)=m_{T}(x)=(x-\alpha)^{n}$ and $TS = ST$, where $T,S$ are linear maps, prove that there is a polynomial $f(x)$ such that $S=f(T)$.


Problem: Let $V$ denote a vector space over $\mathbb{C}$. Let $T: V \rightarrow V$ denote a linear map such that $\Delta_{T}(x)=m_{T}(x)=(x-\alpha)^{n}$ for some $\alpha \in \mathbb{C} .$ Let $S: V \rightarrow V$ denote a linear map such that $T S=S T$. Prove that there is a polynomial $f(x)$ such that $S=f(T)$.

Note:

  • $ \Delta_T $ means characteristic polynomial of $ T $
  • $ m_T $ means the minimal polynomial of $ T $.

Attempt: Since the vector space is over $\mathbb{C}$, there exists a Jordanizing basis $E$ such that $[T]_E$ is in Jordan form. Moreover, since the minimal polynomial has degree $n$ and its only root is $\alpha$, the Jordan form consists of a single block: $[T]_E = J_n(\alpha)$. Suppose $\alpha \neq 0$. Since $\alpha$ is an eigenvalue, there exists a corresponding eigenvector $v \neq 0$ such that $Tv = \alpha v$. Since $\operatorname{rank}[T]_E = n$, I can extend to a basis of $V$, which we can denote $B = \{ v, v_1, v_2, \ldots, v_{n-1} \}$. (I have been stuck on this question for a long time and have no idea what to do. How do I continue and use $TS = ST$ to prove the existence of the polynomial?)

Thanks for help in advance!

Best answer:

Hint:

By putting $T$ in Jordan form and conjugating both matrices with the appropriate invertible matrix, we can assume that $$T = \begin{bmatrix} \alpha & 1 & 0 & \cdots & 0 & 0\\ 0 & \alpha & 1 & \cdots & 0 & 0\\ 0 & 0 & \alpha & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & 0 & \cdots & \alpha & 1\\ 0 & 0 & 0 & \cdots & 0 & \alpha \end{bmatrix} = \alpha I + N$$

where $N$ is the matrix with ones on the superdiagonal. Since $S$ commutes with $T$ and trivially with $\alpha I$, it also commutes with $N = T - \alpha I$: $$SN = S(T - \alpha I) = ST - \alpha S = TS - \alpha S = (T - \alpha I)S = NS.$$ Now show that $S$ has to be upper-triangular and constant on each of the diagonals $$j = i, \quad j=i+1,\quad j=i+2,\quad \ldots, \quad j=i+(n-1),$$ i.e. $$S = \begin{bmatrix} a_0 & a_1 & a_2 & \cdots & a_{n-2} & a_{n-1}\\ 0 & a_0 & a_1 & \cdots & a_{n-3} & a_{n-2}\\ 0 & 0 & a_0 & \cdots & a_{n-4} & a_{n-3}\\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & 0 & \cdots & a_0 & a_1\\ 0 & 0 & 0 & \cdots & 0 & a_0 \end{bmatrix}$$ Compute the powers of $N$ (each $N^k$ has ones on the $k$-th superdiagonal and zeros elsewhere, and $N^n = 0$) and then notice that $$S= a_0 I + a_1 N + a_2 N^2 + \cdots + a_{n-2}N^{n-2} + a_{n-1}N^{n-1}$$ and therefore $S = p(N)$ for some polynomial $p$.
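As a numerical sanity check of this step (a sketch using NumPy; the choice $n = 5$ is arbitrary), one can verify that the space of matrices commuting with $N$ has dimension exactly $n$, matching the $n$ free coefficients $a_0, \ldots, a_{n-1}$, and that every polynomial in $N$ does commute with $N$:

```python
import numpy as np

n = 5
# N: ones on the superdiagonal, the nilpotent part of the Jordan block
N = np.diag(np.ones(n - 1), k=1)

# The map S -> NS - SN is linear; in vectorized form it is represented by
# the n^2 x n^2 matrix below, and its null space is the commutant of N.
M = np.kron(np.eye(n), N) - np.kron(N.T, np.eye(n))
commutant_dim = n * n - np.linalg.matrix_rank(M)
print(commutant_dim)  # 5: one dimension per coefficient a_0, ..., a_{n-1}

# Any polynomial in N (here with random coefficients) commutes with N.
S = sum(np.random.rand() * np.linalg.matrix_power(N, k) for k in range(n))
print(np.allclose(S @ N, N @ S))  # True
```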

Now, we wish $S$ to be a polynomial in $T = \alpha I + N$, but this is easy. The desired polynomial is $$q(x) := p(x-\alpha)$$ since then $$q(T) = q(\alpha I + N) = p(\alpha I + N - \alpha I) = p(N) = S$$ which concludes the proof.
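To see the whole construction in action, here is a small numerical sketch (the values of $n$, $\alpha$, and the coefficients $a_k$ are arbitrary): build a Toeplitz upper-triangular $S$, read off its coefficients $a_k = S_{0,k}$ from the first row, and check that $q(T) = p(T - \alpha I) = S$.

```python
import numpy as np

n, alpha = 4, 2.0
N = np.diag(np.ones(n - 1), k=1)   # nilpotent superdiagonal shift
T = alpha * np.eye(n) + N          # single Jordan block J_n(alpha)

# An upper-triangular Toeplitz S, i.e. S = a_0 I + a_1 N + ... + a_{n-1} N^{n-1}
a = np.array([3.0, -1.0, 0.5, 2.0])
S = sum(a[k] * np.linalg.matrix_power(N, k) for k in range(n))
assert np.allclose(T @ S, S @ T)   # S commutes with T

# Coefficients of p sit in the first row of S: a_k = S[0, k].
# Then q(x) = p(x - alpha), so q(T) = p(T - alpha I) should recover S.
qT = sum(S[0, k] * np.linalg.matrix_power(T - alpha * np.eye(n), k) for k in range(n))
print(np.allclose(qT, S))  # True: S = q(T)
```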