Let $B$ be a matrix in the vector space $\mathcal{M}_{n\times n}(\mathbb{F})$, where $\mathbb{F}$ is a field, and let $L_B:\mathcal{M}_{n\times n}(\mathbb{F})\to \mathcal{M}_{n\times n}(\mathbb{F})$ be the linear map defined by $L_B(A)=BA$. Show that $\det(L_B)=(\det(B))^n$, and that $L_B$ and $B$ have the same minimal polynomial.
The first part is an exercise from Hoffman & Kunze's *Linear Algebra*. I found that we can use the ordered basis $\mathcal{B}=\{E_{1,1},\ldots,E_{n,1},\ldots,E_{1,n},\ldots,E_{n,n}\}$ to obtain $$[L_B]_\mathcal{B} = \begin{bmatrix} B & 0 & \dots & 0\\ 0 & B & \dots &0\\ \vdots&\vdots&\ddots & \vdots\\ 0 & 0 & \dots &B \end{bmatrix}.$$ Therefore, $\det(L_B) = (\det(B))^n$.
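As a numerical sanity check (a sketch, not part of the proof): with the column-stacking convention $\operatorname{vec}(BA)=(I_n\otimes B)\operatorname{vec}(A)$, the matrix of $L_B$ in the basis $\mathcal{B}$ above is exactly the Kronecker product $I_n\otimes B$, i.e. the block-diagonal matrix with $n$ copies of $B$. The random matrix below is an arbitrary choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
B = rng.standard_normal((n, n))

# In the ordered basis E_{1,1},...,E_{n,1},...,E_{1,n},...,E_{n,n},
# the matrix of L_B is I_n kron B = diag(B, ..., B).
M = np.kron(np.eye(n), B)

# det(L_B) should equal det(B)^n.
assert np.allclose(np.linalg.det(M), np.linalg.det(B) ** n)
```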
Now, let $m_{L_B}$ and $m_B$ be the minimal polynomials of $L_B$ and $B$, respectively. Since $$0_{n^2\times n^2}=m_{L_B}(L_B)=m_{L_B}\left( \begin{bmatrix} B & 0 & \dots & 0\\ 0 & B & \dots &0\\ \vdots&\vdots&\ddots & \vdots\\ 0 & 0 & \dots &B \end{bmatrix} \right) = \begin{bmatrix} m_{L_B}(B) & 0 & \dots & 0\\ 0 & m_{L_B}(B) & \dots &0\\ \vdots&\vdots&\ddots & \vdots\\ 0 & 0 & \dots &m_{L_B}(B) \end{bmatrix},$$ we have that $m_{L_B}(B)=0_{n\times n}$. Hence, $m_{L_B}\mid m_B$.
Similarly, $$m_B(L_B)=m_{B}\left( \begin{bmatrix} B & 0 & \dots & 0\\ 0 & B & \dots &0\\ \vdots&\vdots&\ddots & \vdots\\ 0 & 0 & \dots &B \end{bmatrix} \right) = \begin{bmatrix} m_{B}(B) & 0 & \dots & 0\\ 0 & m_{B}(B) & \dots &0\\ \vdots&\vdots&\ddots & \vdots\\ 0 & 0 & \dots &m_{B}(B) \end{bmatrix}=0_{n^2\times n^2}$$ implies that $m_B\mid m_{L_B}$.
We conclude that $m_B = m_{L_B}$, because both of them are monic polynomials.
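A quick numerical illustration of the key fact used above, that a polynomial annihilates $L_B$ exactly when it annihilates $B$: evaluating the characteristic polynomial of $B$ (which annihilates $B$ by Cayley–Hamilton) at the block-diagonal matrix of $L_B$ also gives zero. This is only a sketch with a random matrix; the helper `poly_eval` is my own.

```python
import numpy as np

def poly_eval(coeffs, X):
    """Evaluate a polynomial (highest-degree coefficient first) at a square matrix, via Horner."""
    Y = np.zeros_like(X)
    for c in coeffs:
        Y = Y @ X + c * np.eye(X.shape[0])
    return Y

rng = np.random.default_rng(1)
n = 3
B = rng.standard_normal((n, n))
M = np.kron(np.eye(n), B)      # matrix of L_B in the basis from the question

coeffs = np.poly(B)            # coefficients of the characteristic polynomial of B
assert np.allclose(poly_eval(coeffs, B), 0)   # Cayley-Hamilton: p(B) = 0
assert np.allclose(poly_eval(coeffs, M), 0)   # the same polynomial annihilates L_B
```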
I have three questions:
- Is this solution correct?
- If we have a block diagonal matrix, can we always say that $$p\left( \begin{bmatrix} A&0\\0&B\end{bmatrix} \right) = \begin{bmatrix} p(A)&0\\0&p(B)\end{bmatrix}?$$
- If $X$ is a matrix that represents a linear map $T$, why is $m_T(X)=0$? I know that $m_T(T)=0$, but I don't understand why evaluating that polynomial at $X$ also gives $0$.
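For what it's worth, the block-diagonal identity from the second question is easy to check numerically in small cases (the matrices and the polynomial below are arbitrary choices; `poly_eval` is my own helper):

```python
import numpy as np

def poly_eval(coeffs, X):
    """Evaluate a polynomial (highest-degree coefficient first) at a square matrix, via Horner."""
    Y = np.zeros_like(X)
    for c in coeffs:
        Y = Y @ X + c * np.eye(X.shape[0])
    return Y

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((3, 3))
D = np.block([[A, np.zeros((2, 3))],
              [np.zeros((3, 2)), B]])      # block-diagonal diag(A, B)

p = np.array([1.0, -2.0, 3.0, 1.0])        # arbitrary test polynomial
expected = np.block([[poly_eval(p, A), np.zeros((2, 3))],
                     [np.zeros((3, 2)), poly_eval(p, B)]])
assert np.allclose(poly_eval(p, D), expected)   # p(diag(A,B)) = diag(p(A), p(B))
```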
To answer your individual questions:
To elaborate on the remark I made in point 1, you don't need to work with matrices at all to show that $L_B$ has the same minimal polynomial as $B$ (well, of course $B$ is a matrix, but we can work equivalently with the linear operator $F^n\to F^n$ given by left multiplication by $B$). Since operator composition/matrix multiplication is associative, $(L_B)^k$ is the map $A\mapsto B^kA$, and by linearity this extends to $P[L_B]\colon A\mapsto P[B]A$ for any polynomial $P$. To show that $P[L_B]=0$ if and only if $P[B]=0$, the "if" direction is now obvious. For the "only if" direction, suppose $P[B]\neq0$; then taking $A=I$ one gets $P[L_B](I)=P[B]I=P[B]\neq0$, which suffices to establish $P[L_B]\neq0$, and we are done. (To complete the proof: since the same set of polynomials annihilates $L_B$ and $B$, their minimal polynomials coincide.)
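The basis-free identity $P[L_B]\colon A\mapsto P[B]A$ can also be checked directly by iterating the operator, with no matrix of $L_B$ ever formed. This is a sketch with random data; the helper names are my own.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
B = rng.standard_normal((n, n))
A = rng.standard_normal((n, n))

L_B = lambda X: B @ X                  # the operator L_B, applied as a function

def apply_poly_to_operator(coeffs, op, X):
    """Compute P[op](X) for P = c_0 + c_1 t + ... (lowest-degree coefficient first)."""
    result = np.zeros_like(X)
    term = X                           # op^0 applied to X
    for c in coeffs:
        result = result + c * term
        term = op(term)                # op^(k+1) applied to X
    return result

def poly_eval_matrix(coeffs, M):
    """Evaluate the same polynomial at a matrix (lowest-degree coefficient first)."""
    result = np.zeros_like(M)
    power = np.eye(M.shape[0])
    for c in coeffs:
        result = result + c * power
        power = power @ M
    return result

p = np.array([2.0, -1.0, 0.5])         # arbitrary polynomial 2 - t + 0.5 t^2
assert np.allclose(apply_poly_to_operator(p, L_B, A),
                   poly_eval_matrix(p, B) @ A)   # P[L_B](A) = P[B] A
```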