Some insight regarding a difficult problem on Linear Operators.


I am required to prove the following theorem; however, despite thinking about the problem for some time, I have not been able to come up with a satisfactory solution.

Could you please provide some hints to get me going? Please do not present the complete solution.

Theorem. Let $V$ be a finite-dimensional vector space and let $T$ be a linear operator on $V$. If every subspace $U$ of $V$ with $\dim U = \dim V - 1$ is invariant under $T$, then $T$ is a scalar multiple of the identity operator.

Best Answer

The case $\dim V\le 1$ is trivial. Suppose $\dim V\ge2$ and consider a basis $\{v_1,v_2,\dots,v_n\}$ of $V$.

Consider $T(v_1)=\alpha_1v_1+\alpha_2v_2+\alpha_3v_3+\dots+\alpha_nv_n$; by assumption, $\langle v_1,v_3,\dots,v_n\rangle$ is $T$-invariant, so $\alpha_2=0$.

Similarly, $\langle v_1,v_2,v_4,\dots,v_n\rangle$ is $T$-invariant, so $\alpha_3=0$ and, doing the same for the other vectors, $T(v_1)=\alpha_1v_1$.
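
As a concrete sanity check of this coordinate argument, here is a minimal sympy sketch; the helper `is_invariant` and the example matrices are my own illustration, not part of the answer. It confirms that a matrix with $\alpha_2\ne0$ fails to leave $\langle v_1,v_3\rangle$ invariant, while a scalar matrix leaves every subspace invariant.

```python
import sympy as sp

def is_invariant(T, vectors):
    """True iff span(vectors) is invariant under T: adjoining T*v to the
    spanning set must not increase the rank, for every spanning vector v."""
    span = sp.Matrix.hstack(*vectors)
    r = span.rank()
    return all(sp.Matrix.hstack(span, T * v).rank() == r for v in vectors)

e1, e3 = sp.eye(3).col(0), sp.eye(3).col(2)

T = sp.Matrix([[2, 0, 0],
               [5, 2, 0],
               [0, 0, 2]])        # alpha_2 = 5: T*e1 escapes <e1, e3>

print(is_invariant(T, [e1, e3]))              # False
print(is_invariant(7 * sp.eye(3), [e1, e3]))  # True: scalars preserve everything
```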

On the other hand, $v_1$ can be taken to be any nonzero vector of $V$, since any nonzero vector extends to a basis. Thus every nonzero vector of $V$ is an eigenvector of $T$.

The conclusion should now be easy.

Another Answer

It might be easier for you to handle the dual problem. Assume $W$ is a finite dimensional vector space and $S \colon W \rightarrow W$ is a linear operator such that each $0 \neq w \in W$ is an eigenvector of $S$. Show that $S$ is a scalar multiple of the identity.

How is this related to your problem? Take $W = V^{*}$ and $S = T^{*}$. Given a nonzero linear functional $\varphi \in W = V^{*}$, the subspace $U = \ker \varphi$ has codimension one, so by assumption it is $T$-invariant. This means that if $u \in U$ (so that $\varphi(u) = 0$), then $Tu \in U$, hence $\varphi(Tu) = T^{*}(\varphi)(u) = 0$; in other words, $U = \ker \varphi \subseteq \ker T^{*}(\varphi)$. Since $\varphi \neq 0$, this forces $T^{*}(\varphi)$ to be a multiple of $\varphi$, i.e. $\varphi$ is an eigenvector of $T^{*}$.
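
Here is a minimal coordinate sketch of this step, assuming the standard identification of functionals with row vectors, so that the pullback $T^{*}(\varphi) = \varphi \circ T$ becomes the row-vector product $\varphi T$; the matrices and the helper `pullback` are my own illustrative choices.

```python
import sympy as sp

def pullback(phi, T):
    """T^*(phi) = phi o T; with phi stored as a row vector this is phi * T."""
    return phi * T

phi = sp.Matrix([[1, -2]])            # a nonzero functional on F^2

T = 3 * sp.eye(2)                     # scalar operator: 3 * identity
print(pullback(phi, T) == 3 * phi)    # True: phi is an eigenvector of T^*

J = sp.Matrix([[1, 1],
               [0, 1]])               # non-scalar (a Jordan block)
print(pullback(phi, J))               # Matrix([[1, -1]]): not a multiple of phi
```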

Now, how to approach the dual problem? The assumption says that $Sw = \lambda(w) w$ for every $0 \neq w \in W$, where $\lambda \colon W \setminus \{ 0_W \} \rightarrow \mathbb{F}$ is some function. By linearity, we have

$$ S(cw) = \lambda(cw) (cw) = cS(w) = c\lambda(w) w, \\ S(w + w') = \lambda(w + w')(w + w') = S(w) + S(w') = \lambda(w)w + \lambda(w')w'. $$

Try to use these identities (valid whenever all the vectors involved are nonzero) to deduce that $\lambda$ must actually be constant.
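
For intuition, here is a small exact-arithmetic check of the setup; the helper `eigenvalue_of` is my own sketch, not part of the answer. It reads off $\lambda(w)$ from $Sw = \lambda(w)w$ and confirms that for a scalar operator the value is the same across several test vectors.

```python
import sympy as sp

def eigenvalue_of(S, w):
    """Return lambda with S*w == lambda*w, or None if w is not an eigenvector."""
    Sw = S * w
    i = next(k for k in range(w.rows) if w[k] != 0)  # a nonzero coordinate of w
    lam = Sw[i] / w[i]
    return lam if Sw == lam * w else None

S = 5 * sp.eye(3)
tests = [sp.Matrix([1, 0, 0]), sp.Matrix([2, -1, 3]), sp.Matrix([0, 7, 7])]
print([eigenvalue_of(S, w) for w in tests])   # [5, 5, 5]: lambda is constant
```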