Determining whether or not the range of a linear transformation is linearly independent.


Let $W$ be a vector space and let $T: W \to W$ be a linear transformation. Suppose that $n$ and $m$ are positive integers.

(True or False) If there exists a $w\in W$ such that $T^nw\neq0$ and $T^{n+1}w=0$, then $\{w,Tw,...,T^nw\}$ is linearly independent.

(True or False) Let $V$ be the subspace formed by the span of the vectors in the previous question. If there exists a $v \in W$ such that $T^mv\notin V$ and $T^{m+1}v \in V$, then $\{w,Tw,...,T^nw,v,Tv,...,T^mv\}$ is linearly independent.

I was hoping for some hints on how to get started. I've been thinking that the first question is an application of Cayley-Hamilton. But I'm not sure how to start the second one at all.

On BEST ANSWER

Both statements are true. Assume $$0 = \alpha_0w + \alpha_1 Tw + \alpha_2T^2w + \cdots + \alpha_n T^nw$$

and apply $T^{n}$ to both sides of the equality. We get $$0 = \alpha_0T^n w + \alpha_1 \underbrace{T^{n+1}w}_{= 0} + \cdots + \alpha_n \underbrace{T^{2n}w}_{= 0} = \alpha_0\underbrace{T^n w}_{\ne 0}$$ We conclude $\alpha_0 = 0$ and proceed inductively:

Assume that $\alpha_0 = \cdots = \alpha_{k-1} = 0$ and let's show that $\alpha_{k} = 0$.

We have

$$0 = \alpha_kT^{k}w + \cdots + \alpha_nT^{n}w$$

Applying $T^{n-k}$ we obtain

$$0 = \alpha_kT^nw + \alpha_{k+1}\underbrace{T^{n+1}w}_{=0} + \cdots + \alpha_n\underbrace{T^{2n-k}w}_{=0} = \alpha_k\underbrace{T^nw}_{\ne 0}$$

and thus $\alpha_k = 0$. Hence we conclude $\alpha_0 = \cdots = \alpha_n = 0$ so the set $\{w, Tw, \ldots, T^nw\}$ is linearly independent.
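As a concrete sanity check (not part of the argument above), here is a small Python sketch of the first statement. The $3\times 3$ shift matrix chosen for $T$ and the vector $w = e_1$ are my own illustrative choices: $T$ is nilpotent with $T^2w \neq 0$ and $T^3w = 0$, so $n = 2$ and the chain $\{w, Tw, T^2w\}$ should have full rank.

```python
# Hypothetical example: T is the 3x3 "shift" matrix (T e1 = e2, T e2 = e3,
# T e3 = 0), and w = e1, so n = 2 in the statement above.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def rank(vectors):
    """Rank of a list of vectors via Gaussian elimination."""
    rows = [list(map(float, v)) for v in vectors]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > 1e-9), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > 1e-9:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

T = [[0, 0, 0],
     [1, 0, 0],
     [0, 1, 0]]
w = [1, 0, 0]

Tw = mat_vec(T, w)
T2w = mat_vec(T, Tw)
T3w = mat_vec(T, T2w)

# Hypotheses of the statement: T^2 w != 0 and T^3 w = 0.
assert any(T2w) and not any(T3w)

# Conclusion: {w, Tw, T^2 w} has rank 3, i.e. is linearly independent.
print(rank([w, Tw, T2w]))
```

Any nilpotent operator together with a vector of maximal "height" would do equally well; the shift matrix just makes the chain easy to read off.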


For the second statement, assume

$$0 = \alpha_0w + \alpha_1 Tw + \alpha_2T^2w + \cdots + \alpha_n T^nw + \beta_0v + \beta_1 Tv + \beta_2 T^2v + \cdots + \beta_m T^mv$$

and apply $T^{m}$ to both sides of the equality. Note that $V = \operatorname{span}\{w, Tw, \ldots, T^nw\}$ is $T$-invariant (since $T^{n+1}w = 0$), so every $\alpha_j T^{m+j}w$ lies in $V$; moreover $T^{m+1}v \in V$ together with $T$-invariance gives $T^{m+j}v \in V$ for all $j \ge 1$. Hence

$$0 = (\text{something in $V$}) + \beta_0 \underbrace{T^{m}v}_{\notin V} $$

Since $T^{m}v \notin V$ we conclude $\beta_0 = 0$ and proceed inductively:

Assume $\beta_0 = \cdots = \beta_{k-1} = 0$ so we have

$$0 = \alpha_0w + \alpha_1 Tw + \alpha_2T^2w + \cdots + \alpha_n T^nw + \beta_kT^kv + \cdots + \beta_m T^mv$$

Applying $T^{m-k}$ we obtain

$$0 = (\text{something in $V$}) + \beta_k \underbrace{T^{m}v}_{\notin V} $$

Since $T^{m}v \notin V$ we conclude $\beta_k = 0$. Therefore $\beta_0 = \cdots = \beta_m = 0$.

Now we have

$$0 = \alpha_0w + \alpha_1 Tw + \alpha_2T^2w + \cdots + \alpha_n T^nw$$

and by the linear independence of $\{w, Tw, \ldots, T^nw\}$ we finally conclude $\alpha_0 = \cdots = \alpha_n = 0$ so all scalars are $0$.

Hence, $\{w, Tw, \ldots, T^nw, v, Tv, \ldots, T^mv\}$ is linearly independent.
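The second statement can also be sanity-checked numerically. In the sketch below (my own hypothetical setup, not from the answer), $T$ is the $5\times 5$ shift matrix $Te_i = e_{i+1}$, $Te_5 = 0$. Taking $w = e_3$ gives $n = 2$ and $V = \operatorname{span}\{e_3, e_4, e_5\}$; taking $v = e_1$ gives $m = 1$, since $Tv = e_2 \notin V$ but $T^2v = e_3 \in V$. The combined set should then have rank $5$.

```python
# Hypothetical example for the second statement: T is the 5x5 shift matrix.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def rank(vectors):
    """Rank of a list of vectors via Gaussian elimination."""
    rows = [list(map(float, v)) for v in vectors]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > 1e-9), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > 1e-9:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

dim = 5
# Shift matrix: T e_i = e_{i+1}, T e_5 = 0 (0-indexed below).
T = [[1 if j == i - 1 else 0 for j in range(dim)] for i in range(dim)]
e = lambda i: [1 if j == i else 0 for j in range(dim)]

# w = e_3: T^2 w = e_5 != 0 and T^3 w = 0, so n = 2.
w = e(2)
chain_w = [w, mat_vec(T, w), mat_vec(T, mat_vec(T, w))]

# v = e_1: T v = e_2 is not in V, but T^2 v = e_3 is, so m = 1.
v = e(0)
chain_v = [v, mat_vec(T, v)]

# Conclusion: the combined chain {w, Tw, T^2w, v, Tv} has rank 5.
print(rank(chain_w + chain_v))
```

With the shift matrix the chains are just rearranged standard basis vectors, so independence is visible by inspection; the code mirrors the structure of the proof rather than adding anything new.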