Equal eigenvalues if and only if the sum of two eigenvectors is an eigenvector.


Let V be a finite dimensional vector space over $ \Bbb C$, and let $T \in L(V )$.

Let $v$ be an eigenvector for $T$ with eigenvalue $\lambda$, let $w \in V$ be an eigenvector for $T$ with eigenvalue $\mu$, and suppose that $v \neq -w$. Prove that $v + w$ is an eigenvector for $T$ if and only if $\lambda = \mu$.

I understand the condition that $v \neq -w$, but something fundamental is missing from my understanding of eigenvalues: why is it that $v + w$ is not an eigenvector? I mean, if $\lambda$ and $\mu$ are the same, then $v+w$ points in the same direction as $v$ or as $w$ (possibly they both point in the same direction), but if not, I see no reason why $v+w$ isn't an eigenvector.
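A concrete numerical check may help build intuition here. The example below is my own (not from the question): it takes $T = \operatorname{diag}(1, 2)$ on $\Bbb C^2$, with $v = e_1$ (eigenvalue $1$) and $w = e_2$ (eigenvalue $2$), and verifies that $T(v+w)$ is not a scalar multiple of $v+w$, so $v+w$ is not an eigenvector when $\lambda \neq \mu$.

```python
import numpy as np

# Hypothetical example: T = diag(1, 2), v = e1 (eigenvalue 1), w = e2 (eigenvalue 2).
T = np.diag([1.0, 2.0])
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

s = v + w      # s = (1, 1)
Ts = T @ s     # T s = (1, 2)

# s is an eigenvector iff T s is parallel to s, i.e. the 2x2 determinant
# det([Ts | s]) vanishes. Here it does not:
cross = Ts[0] * s[1] - Ts[1] * s[0]
print(cross)   # nonzero, so v + w is not an eigenvector of T
```

Geometrically, $T$ stretches the $w$-component twice as much as the $v$-component, so $T(v+w)$ tilts toward $w$ and leaves the line spanned by $v+w$.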



On BEST ANSWER

Suppose there is $\;\alpha\in\Bbb C\;$ s.t.

$$\alpha(v+w)=T(v+w)=Tv+Tw=\lambda v+\mu w\implies (\alpha-\lambda)v=-(\alpha-\mu)w$$

Can you take it from here?
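For completeness, here is one standard way to finish the hint, using the fact (not stated in the answer, but standard) that eigenvectors belonging to distinct eigenvalues are linearly independent:

```latex
% If \lambda \neq \mu, then v and w are linearly independent, so
% (\alpha-\lambda)v = -(\alpha-\mu)w forces both coefficients to vanish:
\[
\alpha - \lambda = 0 \quad\text{and}\quad \alpha - \mu = 0
\;\Longrightarrow\; \lambda = \alpha = \mu,
\]
% contradicting \lambda \neq \mu. Hence if v + w is an eigenvector,
% then \lambda = \mu. Conversely, if \lambda = \mu, then
\[
T(v+w) = \lambda v + \mu w = \lambda(v+w),
\]
% and v + w \neq 0 because v \neq -w, so v + w is an eigenvector.
```

Note where $v \neq -w$ is used: it guarantees $v+w \neq 0$, which is required since the zero vector is excluded from being an eigenvector.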