Suppose $v,w, v + w$ are all eigenvectors of the linear operator $\phi:V \to V$. Prove that $v, w, v + w$ all have the same eigenvalue.


The Problem:

Just as in the title: suppose $v,w, v + w$ are all eigenvectors of the linear operator $\phi:V \to V$. Prove that $v, w, v + w$ all have the same eigenvalue.

My Approach:

Let $\phi(v) = \alpha v$, $\phi(w) = \beta w$, and $\phi(v + w) = \gamma (v + w)$. We then have that $$ \phi(v) + \phi(w) = \gamma v + \gamma w \implies \alpha v + \beta w = \gamma v + \gamma w \implies (\alpha - \gamma)v = (\gamma - \beta)w. $$ This means that $v$ and $w$ are scalar multiples of one another; say, $w = \lambda v$...

I feel like this is supposed to tell me something. My thought is that, since $v$ and $w$ are linearly dependent, they must occupy the same eigenspace; but I can't seem to prove that...

4 Answers

Accepted answer:

From $(\alpha-\gamma)v=(\gamma-\beta)w$ you cannot necessarily conclude that $v$ and $w$ are scalar multiples of one another. There are two possibilities:

  1. If $\alpha-\gamma=0$, then the LHS is the zero vector. Since $w \neq 0$ (it is an eigenvector), it must also be that $\gamma-\beta=0$. So $\alpha=\beta=\gamma$.
  2. If $\alpha-\gamma \neq 0$, then the LHS is not zero, so the RHS is not zero either (in particular $\gamma-\beta \neq 0$). So $v=\frac{\gamma-\beta}{\alpha-\gamma} w = cw$ where $c=\frac{\gamma-\beta}{\alpha-\gamma} \neq 0$. Now re-write the equation $\phi(v)=\alpha v$ using $v=cw$, and similarly rewrite $\phi(v+w)=\gamma(v+w)$. This will show you $\alpha=\beta=\gamma$.
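The rewriting step left as an exercise in case 2 can be checked symbolically. A sketch with `sympy`, working with coefficients of $w$ (since $w \neq 0$, it can be cancelled); the symbol names are mine, not from the answer:

```python
import sympy as sp

# A sketch of case 2: v = c*w with c != 0 (and c != -1, since v + w != 0).
alpha, beta, gamma = sp.symbols('alpha beta gamma')
c = sp.symbols('c', nonzero=True)

# phi(v) = alpha*v with v = c*w and phi(w) = beta*w gives c*beta*w = c*alpha*w;
# cancelling w leaves the scalar equation below.
eq1 = sp.Eq(c * beta, c * alpha)
# phi(v + w) = gamma*(v + w) with v + w = (c + 1)*w gives (c + 1)*beta = (c + 1)*gamma.
eq2 = sp.Eq((c + 1) * beta, (c + 1) * gamma)

# Solving for alpha and gamma in terms of beta collapses everything to one value.
sol = sp.solve([eq1, eq2], [alpha, gamma], dict=True)
print(sol)
```

The generic solution has `alpha` and `gamma` both equal to `beta`, i.e. $\alpha=\beta=\gamma$.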
Another answer:

If $v,w$ are not linearly independent, the result is easy: writing $w=cv$ with $c \neq 0$, we get $\phi(w)=\phi(cv)=c\phi(v)=c\alpha v=\alpha w$, so $\beta=\alpha$; and since $v+w=(1+c)v$ is a nonzero multiple of $v$, also $\gamma=\alpha$.

Otherwise, $\alpha v+\beta w=\phi(v+w)=\gamma(v+w)$ gives $(\alpha-\gamma)v+(\beta-\gamma)w=0$; since $v,w$ are linearly independent, $\alpha=\beta=\gamma$.

Another answer:

Apply $\phi$ once more on $(\alpha-\gamma)v = (\gamma-\beta)w$ to obtain $$(\alpha-\gamma)\alpha v = (\gamma-\beta)\beta w$$ On the other hand, multiplying the first identity by $\alpha$ yields $$(\alpha-\gamma)\alpha v = (\gamma-\beta)\alpha w$$ so $$(\gamma-\beta)\alpha w = (\gamma-\beta)\beta w \implies (\gamma-\beta)(\alpha-\beta)w = 0$$

Since $w$ is an eigenvector, we have $w \ne 0$ so $(\gamma-\beta)(\alpha-\beta)=0$.

Hence $\alpha = \beta$ or $\beta = \gamma$. If $\beta = \gamma$, then $(\alpha-\gamma)v = 0$ with $v \neq 0$ forces $\alpha = \gamma$. If $\alpha = \beta$, the original relation becomes $(\alpha-\gamma)(v+w) = 0$ with $v+w \neq 0$, so again $\alpha = \gamma$. Either way $\alpha=\beta=\gamma$.
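The scalar bookkeeping in this answer can be replayed with `sympy`; treating $v$ and $w$ as commuting placeholders is harmless here, since only their coefficients are manipulated. A sketch:

```python
import sympy as sp

alpha, beta, gamma, v, w = sp.symbols('alpha beta gamma v w')

# Starting relation: (alpha - gamma)*v - (gamma - beta)*w == 0.
rel = (alpha - gamma) * v - (gamma - beta) * w
# Applying phi sends v -> alpha*v and w -> beta*w inside the relation:
rel_phi = (alpha - gamma) * alpha * v - (gamma - beta) * beta * w
# Subtracting alpha times the original relation kills the v-term...
diff = sp.expand(rel_phi - alpha * rel)
# ...and what remains is exactly (gamma - beta)*(alpha - beta)*w.
assert sp.expand(diff - (gamma - beta) * (alpha - beta) * w) == 0
```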

Another answer:

Suppose $\phi(v) = \lambda v$, $\phi(w) = \mu w$ and $\phi(v + w) = \kappa(v + w)$, where $\lambda$, $\mu$, and $\kappa$ are all scalars. Then we have $$ \lambda v + \mu w = \phi(v + w) = \kappa(v + w).$$ We therefore deduce that $$ (\kappa - \lambda) v + (\kappa - \mu) w = 0.$$ If $v$ and $w$ are linearly independent, then $\kappa=\lambda=\mu$. If not, then $w = cv$ for some nonzero scalar $c$, so $\phi(w) = c\phi(v) = c\lambda v = \lambda w$; thus $v$ and $w$ lie in the same eigenspace, and since $v + w$ is a nonzero vector of that eigenspace, the result follows.
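A quick numerical sanity check of the statement itself; a sketch with `numpy`, where the matrices are my own illustrative choices, not from the question:

```python
import numpy as np

v, w = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Distinct eigenvalues: v + w is NOT an eigenvector of A, consistent
# with the result (if it were, all three eigenvalues would coincide).
A = np.diag([2.0, 3.0])
s = A @ (v + w)                       # (2, 3), not a multiple of (1, 1)
assert not np.allclose(s, s[0] * (v + w))

# Equal eigenvalues: v + w IS an eigenvector, with the common eigenvalue.
B = 2.0 * np.eye(2)
assert np.allclose(B @ (v + w), 2.0 * (v + w))
```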