Proving that the spectrum of $A\in End(V)$ is contained in the set of roots of polynomials for which $p(A)=0$


I want to prove the lemma:

$$\tag{1}\sigma(A)\subseteq R(p)\ \forall\ p:p(A)=0$$

Where $\sigma(A)$ is the spectrum of eigenvalues of $A\in End(V)$ and $R(p)$ the set of roots of the polynomial $p$.

My attempt:

We can write: \begin{equation} \tag{2} p(A)=\sum_{i=0}^{n} \alpha_{i} A^{i} \end{equation} where $\alpha_i\in \mathcal{F}$ are scalars and $A^i$ denotes $A$ composed with itself $i$ times (with $A^0$ the identity). Let $v\in V\backslash\{0\}$ be an eigenvector of $A$ with eigenvalue $\lambda$. We then have: $$A^iv=\lambda^iv\\\Longrightarrow p(A) v=\sum_{i=0}^{n} \alpha_{i} A^{i} v=\sum_{i=0}^{n} \alpha_{i} \lambda^{i} v=p(\lambda) v,$$ where $p(\lambda)$ is the polynomial of eq. $(2)$ evaluated at the scalar $\lambda\in\mathcal{F}$ instead of the operator $A$. Since $p(A)=0$ and $v\in V\backslash \{0\}$, we can conclude: $$p(\lambda)=0,$$ which shows that the eigenvalue is in $R(p)$.
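The lemma can be checked numerically. Below is a minimal sketch with a matrix and polynomial of my own choosing (not from the question): $A$ is annihilated by $p(x)=x^2-5x+6=(x-2)(x-3)$, and its eigenvalues indeed land among the roots $\{2,3\}$.

```python
import numpy as np

# Hypothetical example: A is annihilated by p(x) = x^2 - 5x + 6 = (x-2)(x-3).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def eval_poly(coeffs, M):
    """Evaluate a polynomial (highest-degree coefficient first) at a square
    matrix M, using Horner's method with matrix products instead of powers."""
    result = np.zeros_like(M)
    for c in coeffs:
        result = result @ M + c * np.eye(M.shape[0])
    return result

p = [1.0, -5.0, 6.0]                      # coefficients of x^2 - 5x + 6
assert np.allclose(eval_poly(p, A), 0)    # p(A) = 0, so p annihilates A

# Every eigenvalue of A must then be a root of p:
for lam in np.linalg.eigvals(A):
    assert abs(np.polyval(p, lam)) < 1e-9
```

Here the eigenvalues of $A$ are $2$ and $3$, exactly the roots of $p$, consistent with $\sigma(A)\subseteq R(p)$.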

Are there any flaws in this proof, and is it complete?


The Cayley–Hamilton theorem answers your question: whenever a polynomial evaluated at $A$ gives back the zero matrix, the minimal polynomial of $A$ divides it. The minimal polynomial already has all the eigenvalues of $A$ among its roots, so you can conclude.
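A numerical illustration of this route, assuming a generic matrix of my own choosing: by Cayley–Hamilton the characteristic polynomial annihilates $A$, and its roots are precisely the eigenvalues, so the spectrum sits inside the roots of that annihilating polynomial.

```python
import numpy as np

# Sketch: the characteristic polynomial of A annihilates A (Cayley-Hamilton),
# and its roots recover the spectrum of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# np.poly(A) returns the coefficients of det(xI - A), highest degree first.
char_coeffs = np.poly(A)

def eval_poly(coeffs, M):
    """Horner's method with matrix multiplication in place of scalar powers."""
    result = np.zeros_like(M)
    for c in coeffs:
        result = result @ M + c * np.eye(M.shape[0])
    return result

# Cayley-Hamilton: the characteristic polynomial evaluated at A is the zero matrix.
assert np.allclose(eval_poly(char_coeffs, A), 0)

# Its roots coincide with the eigenvalues of A:
assert np.allclose(np.sort_complex(np.roots(char_coeffs)),
                   np.sort_complex(np.linalg.eigvals(A)))
```

Since the minimal polynomial divides any annihilating polynomial and already contains every eigenvalue among its roots, the same containment $\sigma(A)\subseteq R(p)$ follows for any $p$ with $p(A)=0$.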