Linear Algebra - Proof on Linear Transformations and Independence


Let $T: \Bbb{R}^n \rightarrow \Bbb{R}^n$ be an invertible linear transformation. Prove that $\{\vec{v}_1,\vec{v}_2,\dots,\vec{v}_n\}$ is a linearly independent set in $\Bbb{R}^n$ if and only if $\{T(\vec{v}_1),T(\vec{v}_2),\dots,T(\vec{v}_n)\}$ is a linearly independent set in $\Bbb{R}^n$.

So I know that for a set of vectors to be linearly independent, no vector in the set can be written as a linear combination of the others; equivalently, $t_1 \vec{v}_1 + t_2 \vec{v}_2 + \cdots + t_n \vec{v}_n = \vec{0}$ only when every $t_i = 0$. So can I say something along these lines: suppose $t_1 \vec{v}_1 + t_2 \vec{v}_2 + \cdots + t_n \vec{v}_n = \vec{0}$; then applying $T$ and using linearity, this becomes $t_1 T(\vec{v}_1) + t_2 T(\vec{v}_2) + \cdots + t_n T(\vec{v}_n) = \vec{0}$. So then is that it?
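As a quick numerical sanity check of the claim (not a proof), an invertible matrix should preserve the rank of a set of vectors. The matrices `A` (playing the role of $T$) and `V` below are hypothetical examples chosen for illustration:

```python
import numpy as np

# A hypothetical invertible T on R^3, represented by the matrix A
# (any matrix with nonzero determinant works).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
assert abs(np.linalg.det(A)) > 1e-9  # det(A) = 5, so T is invertible

# A linearly independent set {v1, v2, v3}, stored as the columns of V.
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Linear independence of n vectors in R^n <=> the matrix has full rank.
print(np.linalg.matrix_rank(V))      # rank of {v_i}      -> 3
print(np.linalg.matrix_rank(A @ V))  # rank of {T(v_i)}   -> 3
```

Both ranks come out full, consistent with the statement that an invertible $T$ sends independent sets to independent sets.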


1 Answer


For $\Longrightarrow$: You are on the right track!

$t_1T(v_1)+t_2T(v_2)+\cdots+t_nT(v_n)=0$ implies, by linearity, $$T(t_1v_1+\cdots+t_nv_n)=0=T(0),$$ and so, since the invertible map $T$ is one-to-one, $$t_1v_1+\cdots+t_nv_n=0,$$

and by independence of the $v_i$, $$t_1=t_2=\cdots=t_n=0.$$


For $\Longleftarrow$: Let $\alpha_1v_1+\alpha_2v_2+\cdots+\alpha_nv_n=0$, apply $T$ to both sides, and use independence of the $T(v_i)$'s to get the result!
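Spelled out, that step runs: applying $T$ to both sides and using linearity gives $$T(\alpha_1v_1+\cdots+\alpha_nv_n)=T(0)=0 \;\Longrightarrow\; \alpha_1T(v_1)+\cdots+\alpha_nT(v_n)=0,$$ and independence of the $T(v_i)$ forces $\alpha_1=\alpha_2=\cdots=\alpha_n=0$, so the $v_i$ are linearly independent. Note that this direction uses only linearity of $T$, not invertibility.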