When are minimal and characteristic polynomials the same?


Assume that we are working over a complex space $W$ of dimension $n$. When would an operator on this space have the same characteristic and minimal polynomial?

I think the easy case is when the operator has $n$ distinct eigenvalues, but what if it is diagonalizable? Is that sufficient, or can there be cases (with repeated eigenvalues) where the characteristic polynomial does not equal the minimal polynomial? What are the general conditions under which the equality holds? Is it possible to define these polynomials without the use of determinants? (I am working from Axler, who avoids them.)

Thanks.

There are 4 answers below.

BEST ANSWER

Theorem. Let $T$ be an operator on the finite dimensional complex vector space $\mathbf{W}$. The characteristic polynomial of $T$ equals the minimal polynomial of $T$ if and only if the dimension of each eigenspace of $T$ is $1$.

Proof. Let the characteristic and minimal polynomial be, respectively, $\chi(t)$ and $\mu(t)$, with $$\begin{align*} \chi(t) &= (t-\lambda_1)^{a_1}\cdots (t-\lambda_k)^{a_k}\\ \mu(t) &= (t-\lambda_1)^{b_1}\cdots (t-\lambda_k)^{b_k}, \end{align*}$$ where $1\leq b_i\leq a_i$ for each $i$. Then $b_i$ is the size of the largest Jordan block associated to $\lambda_i$ in the Jordan canonical form of $T$, and the sum of the sizes of the Jordan blocks associated to $\lambda_i$ is equal to $a_i$. Hence, $b_i=a_i$ if and only if $T$ has a unique Jordan block associated to $\lambda_i$. Since the dimension of $E_{\lambda_i}$ is equal to the number of Jordan blocks associated to $\lambda_i$ in the Jordan canonical form of $T$, it follows that $b_i=a_i$ if and only if $\dim(E_{\lambda_i})=1$. QED
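The theorem can be checked on small examples. Below is a sketch in SymPy; the helper `minimal_poly` is ad hoc rather than a library call (it looks for the first linear dependence among the powers of $A$), and the matrices are arbitrary examples:

```python
from sympy import Matrix, eye, symbols, Poly, expand

x = symbols('x')

def minimal_poly(A):
    # Find the least d with I, A, ..., A^d linearly dependent; the
    # dependence, normalized to be monic, is the minimal polynomial.
    n = A.rows
    cols, P = [eye(n).vec()], eye(n)
    for d in range(1, n + 1):
        P = P * A
        cols.append(P.vec())
        ns = Matrix.hstack(*cols).nullspace()
        if ns:
            c = ns[0] / ns[0][d]          # make the leading coefficient 1
            return Poly(sum(c[i] * x**i for i in range(d + 1)), x)

# One Jordan block per eigenvalue: every eigenspace is 1-dimensional,
# so the characteristic and minimal polynomials coincide.
A = Matrix([[2, 1, 0], [0, 2, 0], [0, 0, 3]])
assert expand(A.charpoly(x).as_expr() - minimal_poly(A).as_expr()) == 0
assert len((A - 2*eye(3)).nullspace()) == 1   # dim E_2 = 1

# Two Jordan blocks for the eigenvalue 2: dim E_2 = 2, and they differ.
B = Matrix([[2, 0, 0], [0, 2, 0], [0, 0, 3]])
assert expand(B.charpoly(x).as_expr() - minimal_poly(B).as_expr()) != 0
assert len((B - 2*eye(3)).nullspace()) == 2
```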

In particular, if the matrix has $n$ distinct eigenvalues, then each eigenvalue has a one-dimensional eigenspace.

Also in particular,

Corollary. Let $T$ be a diagonalizable operator on a finite dimensional vector space $\mathbf{W}$. The characteristic polynomial of $T$ equals the minimal polynomial of $T$ if and only if the number of distinct eigenvalues of $T$ is $\dim(\mathbf{W})$.
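A minimal numerical illustration of the corollary (the matrices are arbitrary examples): for a diagonalizable matrix, the minimal polynomial is the product of $(x-\lambda)$ over the distinct eigenvalues $\lambda$, so a repeated eigenvalue forces $\deg m < \dim(\mathbf{W})$.

```python
import numpy as np

B = np.diag([2., 2., 3.])          # diagonalizable, only 2 distinct eigenvalues
I = np.eye(3)
# (x-2)(x-3) already annihilates B, so deg m = 2 < 3 = deg chi: m != chi.
assert np.allclose((B - 2*I) @ (B - 3*I), 0)

A = np.diag([1., 2., 3.])          # 3 distinct eigenvalues: m = chi
# No proper subproduct of (x-1)(x-2)(x-3) annihilates A:
assert not np.allclose((A - 1*I) @ (A - 2*I), 0)
```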

Using the Rational Canonical Form instead, we obtain:

Theorem. Let $W$ be a finite dimensional vector space over the field $\mathbf{F}$, and $T$ an operator on $W$. Let $\chi(t)$ be the characteristic polynomial of $T$, and assume that the factorization of $\chi(t)$ into irreducibles over $\mathbf{F}$ is $$\chi(t) = \phi_1(t)^{a_1}\cdots \phi_k(t)^{a_k}.$$ Then the minimal polynomial of $T$ equals the characteristic polynomial of $T$ if and only if $\dim(\mathrm{ker}(\phi_i(T)))=\deg(\phi_i(t))$ for $i=1,\ldots,k$.

Proof. Proceed as above, using the Rational Canonical forms instead. The exponent $b_i$ of $\phi_i(t)$ in the minimal polynomial gives the largest power of $\phi_i(t)$ that has a companion block in the Rational canonical form, and $\frac{1}{d_i}\dim(\mathrm{ker}(\phi_i(T)))$ (where $d_i=\deg(\phi_i)$) is the number of companion blocks. QED
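A sketch of this criterion over $\mathbf{F}=\mathbb{Q}$ with the irreducible factor $\phi(t)=t^2+1$ (the companion blocks below are written out by hand, not produced by any canned RCF routine):

```python
from sympy import Matrix, eye

# C: companion matrix of (t^2+1)^2 = t^4 + 2t^2 + 1 -- a single RCF block,
# so its minimal polynomial equals its characteristic polynomial.
C = Matrix([[0, 0, 0, -1],
            [1, 0, 0,  0],
            [0, 1, 0, -2],
            [0, 0, 1,  0]])
assert len((C**2 + eye(4)).nullspace()) == 2   # dim ker phi(C) = deg phi = 2

# D: two companion blocks of t^2+1 -- minimal poly t^2+1 != char poly (t^2+1)^2.
D = Matrix([[0, -1, 0,  0],
            [1,  0, 0,  0],
            [0,  0, 0, -1],
            [0,  0, 1,  0]])
assert len((D**2 + eye(4)).nullspace()) == 4   # 4 > deg phi = 2
```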

ANSWER

The following equivalent criteria, valid over an arbitrary field, are short to state. Whether any one of the conditions is easy to test computationally may depend on the situation, though 2. is in principle always doable.

Proposition. The following are equivalent for a linear operator on a vector space of nonzero finite dimension.

  1. The minimal polynomial is equal to the characteristic polynomial.
  2. The list of invariant factors has length one.
  3. The Rational Canonical Form has a single block.
  4. The operator has a matrix similar to a companion matrix.
  5. There exists a (so-called cyclic) vector whose successive images under the operator span the whole space.

Points 1. and 2. are equivalent because the minimal polynomial is the largest invariant factor and the characteristic polynomial is the product of all invariant factors. The invariant factors are in bijection with the blocks of the Rational Canonical Form, giving the equivalence of 2. and 3. These blocks are companion matrices, so 3. implies 4., and by the uniqueness of the RCF, 4. also implies 3. (every companion matrix is its own RCF). Finally, 4. implies 5. (take the first basis vector as cyclic vector), and 5. implies 4. by taking as basis the $n$ successive images (counting from the $0$th) of a cyclic vector.
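Condition 5. is easy to test numerically: $v$ is cyclic exactly when the Krylov matrix $[v, Av, \dots, A^{n-1}v]$ has rank $n$. A sketch with NumPy (the matrices are arbitrary small examples):

```python
import numpy as np

def has_cyclic_vector_at(A, v):
    """Check whether the Krylov matrix [v, Av, ..., A^{n-1} v] has full rank."""
    n = A.shape[0]
    K = np.column_stack([np.linalg.matrix_power(A, k) @ v for k in range(n)])
    return np.linalg.matrix_rank(K) == n

C = np.array([[0., 0., -5.],
              [1., 0., -1.],
              [0., 1.,  3.]])      # companion matrix of t^3 - 3t^2 + t + 5
e1 = np.array([1., 0., 0.])
print(has_cyclic_vector_at(C, e1))   # True: e1 is cyclic for a companion matrix

D = np.diag([2., 2.])
# Dv = 2v for every v, so no vector can be cyclic for D:
print(has_cyclic_vector_at(D, np.array([1., 1.])))  # False
```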

ANSWER

Here is a generalization to principal ideal domains.

Let $A$ be a principal ideal domain, $p$ an irreducible element of $A$, and $M$ a finitely generated $A$-module annihilated by some power of $p$.

Then there is a unique nondecreasing tuple $(n_1,\dots,n_k)$ of positive integers such that $M$ is isomorphic to the direct sum of the $A/(p^{n_i})$.

The characteristic ideal of $M$ is $(p^s)$, where $s$ is the sum of the $n_i$; and the annihilator of $M$ is $(p^{n_k})$. Let $\phi$ be the endomorphism $x\mapsto px$ of $M$.

The following conditions are clearly equivalent:

  • $k=1$,

  • $s=n_k$,

  • $\text{Ker }\phi\simeq A/(p)$,

  • $\text{Coker }\phi\simeq A/(p)$.
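A brute-force check of the first and third conditions in a tiny case (the choice $A=\mathbb{Z}$, $p=2$, $M=\mathbb{Z}/2\oplus\mathbb{Z}/8$ is an arbitrary example): here $k=2$, so $\text{Ker }\phi\simeq (A/(p))^2$ rather than $A/(p)$, and indeed the kernel of $x\mapsto px$ has $p^2$ elements.

```python
from itertools import product

moduli = (2, 8)   # M = Z/2 + Z/8, i.e. (n_1, n_2) = (1, 3) for p = 2
p = 2
# Enumerate the kernel of multiplication by p on M:
kernel = [m for m in product(*(range(q) for q in moduli))
          if all(p * x % q == 0 for x, q in zip(m, moduli))]
print(len(kernel))  # 4 == p**2, one copy of A/(p) per summand, so k = 2
```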

ANSWER

Let $A$ be any $n\times n$ matrix over an arbitrary field $K$, and denote by $f(x)$ and $m(x)$ its characteristic and minimal polynomials, respectively. It is clear that these two polynomials coincide if and only if $\deg m(x)=n$. The following theorem involves some interesting properties of the centralizer $C(A)=\{X\in K^{n\times n}\mid AX=XA\}$.

Theorem. The following statements are equivalent for $A$:

(1) $\deg m(x)=n$;

(2) viewed over an algebraic closure $L$ of $K$, the Jordan normal form of $A$ consists of Jordan blocks with pairwise distinct eigenvalues (one block per eigenvalue);

(3) $C(A)=\{p(A)\mid p \text{ is a polynomial over } K\}$;

(4) $\dim C(A)=n$.

(Statement 2 is inspired by an article linked in the comments of Hamou's answer, which proves that $\dim C(A)\geqslant n$ always holds and that $\dim C(A)$ does not depend on the base field.)

Proof. (2)$\Rightarrow$(1) is easy and (1)$\Rightarrow$(2) is proved by contradiction (omitted). To show the equivalence of (2), (3), and (4), we derive a formula for $\dim C(A)$ in terms of the Jordan normal form of $A$. Suppose that $A$ has $s$ distinct eigenvalues $\lambda_1,\cdots,\lambda_s$ and $$A=\mathop{\mathrm{diag}}(J_{\lambda_1}(n_{11}),\cdots,J_{\lambda_1}(n_{1r_1}),\cdots,J_{\lambda_s}(n_{sr_s})),$$ where $J_\lambda(n)$ is the $n\times n$ Jordan block with eigenvalue $\lambda$: $$J_\lambda(n)=\begin{bmatrix}\lambda&1\\&\lambda&1\\&&\ddots&\ddots\\&&&\lambda&1\\&&&&\lambda\end{bmatrix}.$$ Let $J_i=\mathop{\mathrm{diag}}(J_{\lambda_i}(n_{i1}),\cdots,J_{\lambda_i}(n_{ir_i}))$ and denote its size by $d_i=n_{i1}+\cdots+n_{ir_i}$.

Viewing $A$ as a linear transformation on $L^n$, one decomposes $L^n$ into a direct sum of root subspaces of $A$, each invariant under $A$. If $B\in C(A)$, then $B$ leaves those root subspaces invariant, too. So $B$ is a block diagonal matrix with $s$ blocks: $$B=\mathop{\mathrm{diag}}(B_1,\cdots,B_s),$$ where $B_i$ is of size $d_i$, the dimension of the root subspace of eigenvalue $\lambda_i$. Partition $B_i$ into the same shape as $J_i$: $$B_i=\begin{bmatrix}B_{i,11}&\cdots&B_{i,1r_i}\\\vdots&\ddots&\vdots\\B_{i,r_i1}&\cdots&B_{i,r_ir_i}\end{bmatrix},$$ where $B_{i,jk}$ is of size $n_{ij}\times n_{ik}$. The equation $B_iJ_i=J_iB_i$ is equivalent to $r_i^2$ smaller and independent equations $$J_{\lambda_i}(n_{ij})B_{i,jk}=B_{i,jk}J_{\lambda_i}(n_{ik}),\quad 1\leqslant j,k\leqslant r_i.$$ The solution space of each such equation has dimension $\min(n_{ij},n_{ik})$, and hence $$\dim C(J_i)=\sum_{j,k=1}^{r_i}\min(n_{ij},n_{ik})\geqslant\sum_{j=1}^{r_i}n_{ij}=d_i,$$ with equality if and only if $r_i=1$. Consequently $$\dim C(A)=\sum_{i=1}^s\sum_{j,k=1}^{r_i}\min(n_{ij},n_{ik})\geqslant\sum_{i=1}^sd_i=n,$$ where $s$ is the number of distinct eigenvalues of $A$, and equality holds if and only if each $r_i$ equals $1$.

Now one easily sees that $\dim C(A)=n$ if and only if each $r_i$ equals $1$, which is exactly Statement 2. Thus (4)$\Rightarrow$(2), and the equivalence of (1)$\sim$(4) follows.
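The quantity $\dim C(A)$ can be computed directly by vectorizing $AX=XA$ (a sketch; the test matrices are arbitrary examples): with column-major vectorization, $\mathrm{vec}(AX-XA)=(I\otimes A-A^{\mathsf T}\otimes I)\,\mathrm{vec}(X)$, so $\dim C(A)$ is the nullity of that $n^2\times n^2$ matrix.

```python
import numpy as np

def centralizer_dim(A):
    """dim C(A) = nullity of X -> AX - XA, computed via Kronecker products."""
    n = A.shape[0]
    L = np.kron(np.eye(n), A) - np.kron(A.T, np.eye(n))
    return n * n - np.linalg.matrix_rank(L)

A = np.diag([1., 2., 3.])        # n distinct eigenvalues: deg m(x) = n
print(centralizer_dim(A))        # 3 == n, consistent with (1) <=> (4)

B = np.diag([2., 2., 3.])        # eigenvalue 2 has two Jordan blocks
print(centralizer_dim(B))        # 5 > n, so deg m(x) < n for B
```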