Small question about Eigenvectors as Basis for $V$


Suppose $\beta = (v_{1}, \dots, v_{n})$ is a basis for a vector space $V$ such that $v_{j}$ is an eigenvector of the operator $T \in \text{End}(V)$ for $j = 1,\dots,n$.

Is it then correct to say that all the vectors in $V$ are eigenvectors of $T$, since they all lie in the span of $(v_{1}, \dots, v_{n})$, and so the entire vector space is $T$-invariant?

If we have another operator $G \in \text{End}(V)$ such that each $v_{j} \in \beta$ is also an eigenvector of $G$, but perhaps with eigenvalues different from those of $T$, can we then say that $T$ and $G$ commute? After all, on every $v \in V$ the action of both operators is really just a scaling, so it shouldn't matter by which scalar I multiply first.

I am trying to come up with a proof for a problem, but feel like I am missing something.

Best Answer:

As noted in the comments, your mistake comes from the fact that the sum of two eigenvectors (with different eigenvalues) is not necessarily an eigenvector.
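A quick numeric sketch of this (the matrix and vectors here are illustrative, not from the question): take $T$ with eigenvectors $e_1, e_2$ and distinct eigenvalues $2$ and $3$; their sum is not an eigenvector.

```python
import numpy as np

# Hypothetical example: T has eigenvectors e1, e2 with distinct
# eigenvalues 2 and 3.
T = np.diag([2.0, 3.0])
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

v = e1 + e2   # = (1, 1)
Tv = T @ v    # = (2, 3)

# v is an eigenvector iff Tv is a scalar multiple of v; here it is not:
# matching the first coordinate would force the scalar to be 2,
# but 2*v = (2, 2) != (2, 3).
is_scalar_multiple = np.allclose(Tv, (Tv[0] / v[0]) * v)
print(is_scalar_multiple)  # False
```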

To wrap it up, we can show that if the whole space is made of eigenvectors of $T$, then $T$ is a homothety, i.e. a scalar multiple of the identity. With quantifiers, we have: $$ \forall x\in V, \exists\lambda_x\in \mathbb{K},T(x)=\lambda_x x $$ and we want to show that: $$ \exists\lambda\in \mathbb{K},\forall x\in V,T(x)=\lambda x $$ Then, all we have to do is to show that $\lambda_x=\lambda_y$ for all nonzero $x$ and $y$ in $V$.

  • If $(x,y)$ is linearly dependent (with $x$ and $y$ nonzero), then $x=\mu y$ for some $\mu\in \mathbb{K}\setminus\{0\}$. Then $T(x)=\lambda_x x = \lambda_x \mu y$, but also $T(x)=T(\mu y)=\mu T(y) = \mu \lambda_y y$. Since $\mu y \neq 0$, we get $\lambda_x=\lambda_y$.

  • If $(x,y)$ is linearly independent, then $T(x+y)=\lambda_{x+y} (x+y) = \lambda_{x+y} x + \lambda_{x+y} y$ and $T(x+y)=T(x)+T(y)=\lambda_x x + \lambda_y y$; comparing coefficients (using linear independence) gives $\lambda_x =\lambda_{x+y} =\lambda_y$.


To come back to what you wrote about two endomorphisms sharing the same eigenvectors: your intuition is right. It works because, since there is a basis of eigenvectors for each, both $T$ and $G$ are diagonalizable in the same basis, whose change-of-basis matrix we denote by $P\in \mathrm{GL}_n(\mathbb{K})$.

Writing $T = PD_TP^{-1}$ and $G = PD_GP^{-1}$ with $D_T$, $D_G$ diagonal, we get $TG =PD_TP^{-1} \cdot PD_GP^{-1} = PD_TD_GP^{-1}= PD_GD_TP^{-1} =GT$, because any two diagonal matrices commute.
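This is easy to check numerically. A minimal sketch (the matrices $P$, $D_T$, $D_G$ below are arbitrary illustrative choices, with $P$ assumed invertible): build $T$ and $G$ from the same eigenbasis but different eigenvalues, and verify $TG = GT$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared eigenbasis: the columns of P (a random matrix, assumed
# invertible, which holds almost surely for a random real matrix).
P = rng.standard_normal((4, 4))
P_inv = np.linalg.inv(P)

D_T = np.diag([1.0, 2.0, 3.0, 4.0])   # eigenvalues of T
D_G = np.diag([5.0, -1.0, 0.5, 2.0])  # different eigenvalues for G

# T and G are diagonalizable in the same basis.
T = P @ D_T @ P_inv
G = P @ D_G @ P_inv

# The two products agree even though T != G.
print(np.allclose(T @ G, G @ T))  # True
```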