Let $A$ be an $n \times n$ matrix with entries in $\mathbb{C}$. Suppose that the characteristic polynomial of $A$ equals the minimal polynomial of $A$, namely $$p_{A}(t) = m_{A}(t) = (t - \lambda)^{n}.$$ What can be said about the number of linearly independent eigenvectors for the eigenvalue $\lambda$?
What is the number of linearly independent eigenvectors of a complex matrix when the characteristic and minimal polynomials are the same?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 solutions below.
If minimal and characteristic polynomials are equal, then all eigenvalues have an eigenspace of dimension$~1$, and the number of linearly independent eigenvectors one can choose is therefore equal to the number of distinct eigenvalues. In the example of the question this number is$~1$ (provided that $n>0$).
To see why eigenspaces must have dimension (no more than)$~1$, I'll show the more general statement that, under the hypothesis of equal minimal and characteristic polynomials, any divisor $D$ of the characteristic polynomial $\chi_T$ of $T$ satisfies $\dim(\ker D[T])\leq\deg D$ (one actually has equality, but this inequality suffices here). This is more general since taking $D=X-\lambda$ for some eigenvalue$~\lambda$, one gets the statement about eigenspaces.

For the proof, decompose the quotient polynomial $\chi_T/D$ into irreducible factors $P_i$, so that $\chi_T=P_1\ldots P_kD$; since we are working over$~\Bbb C$, all factors $P_i$ are of degree$~1$ (but the argument can be made to work for irreducible factors of higher degree as well, when working over more general fields). In the chain of subspaces $$\def\Im{\operatorname{Im}} \Im(D[T])\supseteq\Im(P_kD[T])\supseteq\cdots \supseteq\Im(P_2\ldots P_kD[T])\supseteq\Im(P_1\ldots P_kD[T])=\{0\} $$ (the final equality is the Cayley-Hamilton theorem), no two successive image subspaces can be equal: if adding the factor $P_i$ left the image unchanged, then one could also remove the factor $P_i$ from the subsequent expression without changing the denoted image subspace, leading to the conclusion that $P_1\ldots\widehat{P_i}\ldots P_kD[T]=0$ (the hat meaning removal of that factor), which would contradict the hypothesis that $\chi_T$ is the minimal polynomial.

Therefore the dimension decreases by at least$~1$ at each "$\supseteq$" in the chain, giving the inequality $\dim(\Im(D[T]))\geq k$, and hence by rank-nullity $\dim(\ker(D[T]))\leq n-k=\deg(D)$, as desired.
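As a quick numerical sanity check (not part of the proof), one can take a single $n \times n$ Jordan block, whose characteristic and minimal polynomials are both $(t-\lambda)^n$, and verify that its eigenspace for $\lambda$ is one-dimensional. The sketch below uses NumPy; the helper `jordan_block` is just an illustrative name, not a library function.

```python
import numpy as np

def jordan_block(lam, n):
    """n x n Jordan block: lam on the diagonal, 1 on the superdiagonal."""
    return lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)

n, lam = 5, 2.0
A = jordan_block(lam, n)

# dim ker(A - lam*I) = n - rank(A - lam*I); for a single Jordan block this is 1.
rank = np.linalg.matrix_rank(A - lam * np.eye(n))
eigenspace_dim = n - rank
print(eigenspace_dim)  # 1
```

The eigenspace dimension comes out as $1$, matching the claim that a matrix with $p_A = m_A = (t-\lambda)^n$ has exactly one linearly independent eigenvector.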
I would look at $\dim \ker (A-\lambda)^m$ for $m = 0, 1, \ldots, n$.
If $\dim \ker (A-\lambda)^{k+1}=\dim \ker (A-\lambda)^k$, then $\dim \ker (A-\lambda)^{k+1+j}=\dim \ker (A-\lambda)^{k+j}$ for all $j\geq 0$: once the chain of kernels stalls, it stays constant.
And hence, if the minimal polynomial is $(t-\lambda)^n$, then $\dim \ker (A-\lambda)^m=m$ for all $m\le n$: if any of the $n$ inclusions in the chain $\ker(A-\lambda)^0 \subseteq \cdots \subseteq \ker(A-\lambda)^n$ failed to be strict, the chain would stall early and force $(A-\lambda)^{n-1}=0$, contradicting minimality; so each step raises the dimension by exactly $1$. In particular the eigenspace $\ker(A-\lambda)$ has dimension $1$.
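This dimension count can be checked numerically on a concrete example. Below, a single Jordan block serves as a matrix whose minimal polynomial is $(t-\lambda)^n$ (an assumed choice of test matrix), and the kernel dimensions are read off via ranks with NumPy.

```python
import numpy as np

n, lam = 5, 3.0
# Single Jordan block: minimal polynomial is (t - lam)^n.
A = lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)

N = A - lam * np.eye(n)
# dim ker N^m = n - rank(N^m) for m = 0, ..., n.
dims = [n - np.linalg.matrix_rank(np.linalg.matrix_power(N, m)) for m in range(n + 1)]
print(dims)  # [0, 1, 2, 3, 4, 5]: dim ker (A - lam)^m = m for every m <= n
```

Each power knocks out exactly one more dimension, matching the strict chain described above.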
One way of answering this question is by looking at the Jordan normal form of the matrix.
Since the characteristic polynomial of your matrix can be written as a product of linear factors, one can construct the Jordan normal form. Note that the dimension of each eigenspace is the same whether we look at the matrix $A$ in its given form or at its Jordan normal form (it is invariant under changes of basis).
One might now look at the exponent of the eigenvalue $\lambda$ in the factorization of the minimal polynomial, since it equals the size of the largest Jordan block for that eigenvalue. As the exponent of $\lambda$ is $n$ in this case, the largest Jordan block must have size $n$, which is the size of the whole matrix. That means the Jordan normal form of $A$ consists of exactly one Jordan block.
This is what the Jordan normal form of the matrix $A$ looks like:
$$S \cdot A \cdot S^{-1} = \begin{pmatrix} \lambda & 1 & & & \\ & \lambda & 1 & & \\ & & \ddots & \ddots & \\ & & & \lambda & 1 \\ & & & & \lambda \end{pmatrix} $$
Now the number of Jordan blocks for an eigenvalue $\lambda$ in a Jordan normal form equals the dimension of the eigenspace of that eigenvalue, so $\dim(\operatorname{Eig}(A,\lambda))=1$, which gives us the number of linearly independent eigenvectors: there is exactly one.
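The basis-invariance claim above can be illustrated numerically: conjugating a single Jordan block by an arbitrary invertible matrix does not change the eigenspace dimension. The sketch below (with an assumed random seed and sizes, purely for illustration) uses NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 4, 1.5
J = lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)  # single Jordan block

# Conjugate by a random invertible S; A and J share the same Jordan form.
S = rng.standard_normal((n, n))
A = np.linalg.inv(S) @ J @ S

# One Jordan block <=> one linearly independent eigenvector.
dim_eig = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(dim_eig)  # 1
```

The conjugated matrix still has a one-dimensional eigenspace for $\lambda$, as the Jordan-block count predicts.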