Show that a linear map on a finite-dimensional complex vector space always has an eigenvalue.


What is an alternative proof that a linear map $T$ on a finite dimensional complex vector space $V$ with dimension $n$ always has an eigenvalue?

Here is the original proof idea:

We take a nonzero vector $v\in V$. Then the $n+1$ vectors $\{v,T(v),\dots,T^n(v)\}$ are linearly dependent, so there exist scalars $a_i$, not all zero, such that $\sum_{i=0}^n a_iT^i(v)=0$.

Let $m$ be the largest index such that $a_m\neq 0$. Then $m\neq 0$: if $m=0$, we would have $a_0v=0$ with $a_0\neq 0$, contradicting $v\neq 0$. By the fundamental theorem of algebra, the polynomial $a_0+a_1z+\dots+a_mz^m$ factors into linear terms $c(z-c_1)\cdots(z-c_m)$ with $c\neq 0$. Then $c(T-c_1I)\cdots(T-c_mI)(v)=0$ with $v\neq 0$, so $(T-c_iI)$ is not injective for some $i$, and that $c_i$ is an eigenvalue of $T$.
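The steps of this proof can be traced numerically. The following is a sketch only, assuming NumPy is available; the matrix $T$ and vector $v$ here are arbitrary illustrative choices, not part of the original argument:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# An arbitrary complex matrix T and nonzero vector v, for illustration.
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# The n+1 vectors v, Tv, ..., T^n v in C^n must be linearly dependent.
cols = [v]
for _ in range(n):
    cols.append(T @ cols[-1])
A = np.column_stack(cols)            # n x (n+1) matrix

# A dependence vector a with A @ a ~ 0 spans the null space of A;
# the last right singular vector gives it numerically.
a = np.linalg.svd(A)[2][-1].conj()

# Largest index m with a_m != 0 (up to numerical tolerance).
m = max(i for i in range(n + 1) if abs(a[i]) > 1e-8)

# Roots c_1, ..., c_m of a_0 + a_1 z + ... + a_m z^m.
roots = np.roots(a[m::-1])           # np.roots wants highest degree first

# Some factor T - c_i I must be singular: that c_i is an eigenvalue.
min_det = min(abs(np.linalg.det(T - c * np.eye(n))) for c in roots)
```

Here `min_det` should be numerically zero, confirming that at least one root $c_i$ makes $T-c_iI$ singular.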

There are two answers below.

Accepted answer:

The assertion that you are trying to prove is equivalent to the fundamental theorem of algebra. To be more precise:

Proposition: Given a field $k$, the following assertions are equivalent:

  • Every non-constant polynomial with coefficients in $k$ has a root in $k$.

  • For each $n\in\mathbb N$, every endomorphism $T$ of $k^n$ has an eigenvalue.

In fact, you already know that the first condition implies the second one. On the other hand, suppose that the second condition holds, and let $P(x)$ be a non-constant polynomial with coefficients in $k$. You want to prove that it has a root in $k$, and you can assume without loss of generality that it is monic. But then $P(x)$ is the characteristic polynomial of its companion matrix. Since this matrix has an eigenvalue, $P(x)$ has a root in $k$.
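The companion-matrix step can be checked concretely. A minimal sketch, assuming NumPy; the polynomial $(x-1)(x-2)(x-3)$ is an arbitrary example chosen here:

```python
import numpy as np

# Example monic polynomial p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3).
# Coefficients a_0, a_1, a_2 of p(x) = x^3 + a_2 x^2 + a_1 x + a_0.
a = [-6.0, 11.0, -6.0]
n = len(a)

# Companion matrix: ones on the subdiagonal, last column -a_0, ..., -a_{n-1}.
# Its characteristic polynomial is p, so its eigenvalues are the roots of p.
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[:, -1] = [-c for c in a]

roots = np.sort(np.linalg.eigvals(C).real)
```

The computed eigenvalues of `C` recover the roots $1, 2, 3$ of $p$, which is exactly the equivalence the proposition uses.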

So, if you wish to avoid the fundamental theorem of algebra, you will have to use something essentially equivalent to it.

Second answer:

A short proof would be: regard $T$ as an $n\times n$ matrix $M$ and consider $f(\lambda)=\det(\lambda I-M)$. This is a polynomial of degree $n\geq1$ over $\mathbb{C}$, so by the fundamental theorem of algebra it has a root, which is an eigenvalue.

The fundamental theorem of algebra is essential here, as one needs the polynomial $f(\lambda)$ to have a root. Without it (for instance, over $\mathbb{R}$):

$M=\begin{pmatrix} 0&-1 \\ 1&0\end{pmatrix}$ is a counterexample: its characteristic polynomial $\lambda^2+1$ has no real root, so its eigenvalues $\pm i$ are not real and $M$ has no eigenvalue as an operator on $\mathbb{R}^2$.
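One can verify this numerically (again a sketch, assuming NumPy): computed over $\mathbb{C}$, the eigenvalues of this rotation matrix are $\pm i$, neither of which is real.

```python
import numpy as np

# 90-degree rotation of the plane; characteristic polynomial x^2 + 1.
M = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Over C the eigenvalues are +-i. Neither is real, so M has no
# eigenvalue as an operator on R^2.
eigs = sorted(np.linalg.eigvals(M), key=lambda z: z.imag)
```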