Let $L:V\to W$ be an injective linear transformation and $U$ a linearly independent subset of the vector space $V$. Show that $L(U)$ is linearly independent.
Proof: Since $U$ is linearly independent, for distinct $s_1,\dots,s_n \in U$ the equation $$s_1x_1 + \dots + s_nx_n = 0$$
has only the trivial solution. So every term $s_ix_i = 0$, and since $L$ is linear we have $L(s_ix_i) = L(0) = 0$. From injectivity it follows that $$L(s_1x_1) = L(s_nx_n) \implies s_1x_1 = s_nx_n$$
Now each vector $s_ix_i$ is mapped to $0$, and so for the image of the set $U$
$$L(s_1x_1) + \dots + L(s_nx_n) = L(s_1x_1 + \dots + s_nx_n) = L(0) = 0$$
when $x_i = 0$ $(i = 1,\dots,n)$. Therefore $L(U)$ is linearly independent. What is the significance of $U \subset V$? Other than the fact that, since $L:V\to W$ is injective, it is certainly injective on the elements of $U$ as well. Does this show that a linearly independent set is mapped onto another linearly independent set iff the linear transformation is injective? (... if $L$ is not injective, then $L(s_1x_1) = L(s_nx_n)$ does not imply $s_1x_1 = s_nx_n$, so we could have a non-trivial solution.)
It's not enough to use that $L$ restricted to $U$ is injective, i.e. that the $Ls_i$ are pairwise distinct for $s_i\in U$: e.g. for $V=W=\Bbb R^2$ and $L(x,y) =(x+2y,\,x+2y)$, the standard basis vectors $(1,0),\ (0,1)$ are mapped to $(1,1)$ and $(2,2)$ — distinct but linearly dependent (parallel) vectors.
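A quick numerical sketch of this counterexample (nothing here beyond the map defined above; the helper names are my own):

```python
# The map L(x, y) = (x + 2y, x + 2y) from the answer: not injective on R^2,
# yet injective on the standard basis — the images differ but are parallel.

def L(v):
    x, y = v
    return (x + 2*y, x + 2*y)

e1, e2 = (1, 0), (0, 1)
Le1, Le2 = L(e1), L(e2)

print(Le1, Le2)  # distinct images: (1, 1) and (2, 2)

# Nontrivial vanishing combination 2*L(e1) - L(e2) = 0,
# so {L(e1), L(e2)} is linearly dependent.
print(tuple(2*a - b for a, b in zip(Le1, Le2)))  # (0, 0)
```

So injectivity of $L$ on the set $U$ alone does not guarantee independence of the images; the kernel condition below is what does the work.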
You should start the proof by assuming $$Ls_1x_1+Ls_2x_2+\dots +Ls_nx_n=0$$ for some distinct elements $s_i\in U$ and scalars $x_i\in\Bbb R$, and then deduce that all $x_i=0$, using that the $s_i$ are linearly independent and that $\ker L=\{0\}$.
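Written out, the suggested deduction runs as follows (a sketch using only linearity of $L$ and $\ker L=\{0\}$, which is equivalent to injectivity):

```latex
\[
0 = \sum_{i=1}^{n} Ls_i\,x_i
  = L\Bigl(\sum_{i=1}^{n} s_i x_i\Bigr)
\ \implies\
\sum_{i=1}^{n} s_i x_i \in \ker L = \{0\}
\ \implies\
\sum_{i=1}^{n} s_i x_i = 0,
\]
% and since the s_i are linearly independent, every x_i = 0.
```

Since the $s_i$ are linearly independent, the last equation forces $x_1=\dots=x_n=0$, which is exactly the definition of $L(U)$ being linearly independent.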