$f(\lambda)$ eigenvalue of $f(A)$ always implies that $\lambda$ eigenvalue of $A$?


Let $A$ be an $n \times n$ matrix over the field $\mathbb{F}$, and let $f(A)$ be the matrix polynomial $$ f(A)=\sum_{j=0}^k a_j A^j \qquad (k\leq n,\ \ a_k \neq 0).$$

It is rather obvious that if $(\lambda,v)$ is an eigenpair of $A$, then $(f(\lambda),v)$ is an eigenpair of $f(A)$.
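The forward direction can be checked numerically; here is a small sketch with a hypothetical matrix $A$ and polynomial $f(x)=x^2+4x+1$ (both chosen only for illustration):

```python
import numpy as np

# If (lam, v) is an eigenpair of A, then (f(lam), v) is an eigenpair of f(A).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def f_scalar(x):
    # f(x) = x^2 + 4x + 1, evaluated at a scalar
    return x**2 + 4*x + 1

def f_matrix(X):
    # the same polynomial, with matrix powers and a_0 * I
    return X @ X + 4*X + np.eye(X.shape[0])

lam, v = 3.0, np.array([1.0, 1.0])          # eigenpair of A: A @ v = 3 v
assert np.allclose(A @ v, lam * v)
assert np.allclose(f_matrix(A) @ v, f_scalar(lam) * v)   # f(A) v = f(lam) v
```

The check works because $A^j v = \lambda^j v$ for every power $j$, so each term of the sum acts on $v$ by the scalar $a_j \lambda^j$.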


Is the converse implication also true? That is, if $f(\lambda)$ is an eigenvalue of $f(A)$, does it follow that $\lambda$ is an eigenvalue of $A$? Comments and suggestions are very welcome.


There are 3 best solutions below

BEST ANSWER

Here is a general result:

If $\mathbb{F}$ is an algebraically closed field, or $A$ is triangularizable over $\mathbb{F}$, then $\mu$ is an eigenvalue of $f(A)$ if and only if $\mu=f(\lambda)$ for some eigenvalue $\lambda$ of $A$.
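This spectral-mapping statement can be illustrated for a triangularizable $A$; the matrix and polynomial below are hypothetical choices. Since $A$ is upper triangular, its eigenvalues are the diagonal entries $1, 2, 5$, and the claim is that the spectrum of $f(A)$ is exactly $\{f(1), f(2), f(5)\}$:

```python
import numpy as np

# A is upper triangular, hence triangularizable, with spectrum {1, 2, 5}.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 5.0]])

def f_scalar(x):                 # f(x) = x^3 - 2x + 5
    return x**3 - 2*x + 5

def f_matrix(X):                 # same polynomial with matrix powers
    return X @ X @ X - 2*X + 5*np.eye(X.shape[0])

spec_A = np.array([1.0, 2.0, 5.0])
lhs = np.sort(np.linalg.eigvals(f_matrix(A)).real)   # spec(f(A))
rhs = np.sort(f_scalar(spec_A))                      # f(spec(A)) = {4, 9, 120}
assert np.allclose(lhs, rhs)
```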

See a proof here.


Just take $A=\operatorname{diag}\{1, 1\}$ (i.e. the $2\times 2$ identity matrix) and $f(x) = x^2$. Then $f(-1)=1$ is an eigenvalue of $f(A) = A$, although $-1$ is not an eigenvalue of $A$.
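The counterexample above can be verified numerically in a few lines:

```python
import numpy as np

# A = I_2 and f(x) = x^2, so f(A) = A^2 = I_2.
A = np.eye(2)
fA = A @ A

mu = (-1.0)**2                                # f(-1) = 1
eigs_fA = np.linalg.eigvals(fA)
assert np.any(np.isclose(eigs_fA, mu))        # f(-1) IS an eigenvalue of f(A)

eigs_A = np.linalg.eigvals(A)
assert not np.any(np.isclose(eigs_A, -1.0))   # but -1 is NOT an eigenvalue of A
```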


No, this is not true in general. Suppose $f$ is a polynomial of degree $n+1$ with $n+1$ distinct roots, chosen so that its roots include every eigenvalue of $A$ (there are at most $n$ of them) together with at least one point $\lambda_0$ that is not an eigenvalue of $A$.

Then $f(A)$ is singular, so $f(\lambda_0)=0$ is an eigenvalue of $f(A)$, yet $\lambda_0$ is not an eigenvalue of $A$. The counting reason: $|f^{-1}(0)|=n+1$, but $A$ has at most $n$ eigenvalues, so not every preimage of $0$ can be an eigenvalue of $A$.
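This degree-counting construction can be made concrete; the $2\times 2$ matrix and the extra root $3$ below are hypothetical choices. With $A$ having eigenvalues $1$ and $2$, take $f(x)=(x-1)(x-2)(x-3)$ of degree $n+1=3$ with distinct roots $1,2,3$:

```python
import numpy as np

# A has eigenvalues 1 and 2; f(x) = (x-1)(x-2)(x-3) vanishes on both,
# so f(A) = 0 and f(3) = 0 is an eigenvalue of f(A), yet 3 is not
# an eigenvalue of A.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
I = np.eye(2)

fA = (A - 1*I) @ (A - 2*I) @ (A - 3*I)
assert np.allclose(fA, 0)                                 # f(A) is the zero matrix

f3 = (3-1)*(3-2)*(3-3)                                    # f(3) = 0
assert np.any(np.isclose(np.linalg.eigvals(fA), f3))      # 0 is an eigenvalue of f(A)
assert not np.any(np.isclose(np.linalg.eigvals(A), 3.0))  # 3 is not an eigenvalue of A
```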