Proving that $U$ is $T$-invariant if and only if $U$ is $S$-invariant


Not very familiar with this section.
Given that $T$ is a linear map on a finite-dimensional vector space $V$, and $S = T - \lambda I$, prove that
a subspace $U$ of $V$ is invariant under $S$ if and only if it is invariant under $T$.

This is what I did (very little).
Assume $U$ is $T$-invariant. Then we have $T(U) \subseteq U$.
Let $x \in T(U)$; then $x \in U$.
Now using the given definition, $x \in T(U) \implies x \in (S+\lambda I)(U)$.
So
$x \in S(U) + \lambda I(U) \subseteq U$... and I don't know how to go on from here.


3 Answers

Best answer:

Let $x\in U$. We have

$$S(x)\in U\iff T(x)-\lambda x\in U\iff T(x)\in U,$$ where the last equivalence holds because $\lambda x\in U$ (as $U$ is a subspace). Hence $U$ is $S$-invariant iff $U$ is $T$-invariant.
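The equivalence above can be checked numerically on a hypothetical example (the matrix `T`, the scalar `lam`, and the subspace `U = span{(1, 0)}` below are my own choices, not from the question):

```python
import numpy as np

# T acts on R^2 by the matrix below; U = span{(1, 0)} is T-invariant
# because the first column of T lies in U. With S = T - lam*I, the
# argument above says U must be S-invariant as well.
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])
lam = 5.0
S = T - lam * np.eye(2)

u = np.array([1.0, 0.0])   # basis vector of U
Tu = T @ u                 # = (2, 0), still in U
Su = S @ u                 # = Tu - lam*u = (-3, 0), still in U

# "lies in U" here means the second coordinate vanishes
assert Tu[1] == 0.0 and Su[1] == 0.0
assert np.allclose(Su, Tu - lam * u)
```

Note that `lam` need not be an eigenvalue of `T`: the equivalence holds for any scalar $\lambda$.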


Conversely, if $S(U)\subseteq U$, then for every $x\in U$ we have $S(x)=T(x)-\lambda x\in U$, and hence $T(x)=S(x)+\lambda x\in U$, since $U$ is a vector subspace and $S(x),\lambda x\in U$.


Let $U$ be $T$-invariant and let $x \in U$. Then $Tx \in U$ since $U$ is $T$-invariant, and $-\lambda x \in U$ since $U$ is a linear subspace. Hence $Sx = (T-\lambda I)x = Tx + (-\lambda)x \in U$ (again because $U$ is a linear subspace).

For the other direction, consider $\mu = - \lambda$ and apply the same argument for $T= S - \mu I$.
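The converse direction can likewise be checked on a hypothetical example (the matrix `S`, the scalar `lam`, and `U = span{(1, 0)}` are illustrative choices of mine): start from an $S$-invariant subspace and recover $T = S + \lambda I$, i.e. $T = S - \mu I$ with $\mu = -\lambda$.

```python
import numpy as np

# U = span{(1, 0)} is S-invariant (first column of S lies in U).
# Since S = T - lam*I, we have T = S + lam*I, and the same argument
# with mu = -lam shows U is T-invariant.
S = np.array([[-1.0, 4.0],
              [ 0.0, 2.0]])
lam = 3.0
T = S + lam * np.eye(2)    # so S = T - lam*I, as in the problem

u = np.array([1.0, 0.0])   # basis vector of U
Su = S @ u                 # = (-1, 0), in U
Tu = T @ u                 # = Su + lam*u = (2, 0), in U

assert Su[1] == 0.0 and Tu[1] == 0.0
assert np.allclose(Tu, Su + lam * u)
```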