I have this question from a practice exam I was studying with.
Let $v_1,\dots,v_k$ be linearly independent vectors in $\mathbb{R}^n$ and suppose $A \in M_{m \times n}(\mathbb{R})$ has linearly independent columns. Prove that the vectors $Av_1,\dots,Av_k \in \mathbb{R}^m$ are linearly independent.
I feel like this proof is very simple but I can't quite piece it together.
I know the vectors $Av_1,\dots,Av_k$ are the columns of the product
$$A\begin{pmatrix} v_1 & \cdots & v_k\end{pmatrix}$$
but I'm not sure where to go from here. I am VERY tempted to say that since the columns of $A$ and the vectors $v_1,\dots,v_k$ are both linearly independent, the products must be too, but I don't think that's a valid proof because I'm not actually using any definitions. I also can't appeal to invertibility, since $A$ is $m \times n$ and isn't guaranteed to be square.
Can anyone help me figure this out? Thanks!
Asserting that the columns of $A$ are linearly independent is the same as asserting that, if $\{e_1,e_2,\ldots,e_n\}$ is the standard basis of $\Bbb R^n$, then the set $\{Ae_1,Ae_2,\ldots,Ae_n\}$ is linearly independent. But then $A$ is the matrix of an injective linear map: if$$v=\alpha_1e_1+\alpha_2e_2+\cdots+\alpha_ne_n$$is such that $Av=0$, then$$\alpha_1Ae_1+\alpha_2Ae_2+\cdots+\alpha_nAe_n=0$$and therefore each $\alpha_k$ is $0$; in other words, $v=0$.

And an injective linear map sends linearly independent sets to linearly independent sets: if $c_1Av_1+\cdots+c_kAv_k=0$, then $A(c_1v_1+\cdots+c_kv_k)=0$, so $c_1v_1+\cdots+c_kv_k=0$ by injectivity, and the linear independence of $v_1,\dots,v_k$ forces each $c_i$ to be $0$.
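As a numerical sanity check (not a proof, and with hypothetical random matrices standing in for $A$ and the $v_i$), you can verify the claim with NumPy: a random $A$ with full column rank is injective, so $\operatorname{rank}(AV)$ should equal $k$ when the columns of $V$ are independent.

```python
import numpy as np

# Sanity check of the claim: if A has linearly independent columns and
# v_1, ..., v_k are linearly independent, then Av_1, ..., Av_k are too,
# i.e. rank(A @ V) == k where V has the v_i as columns.
rng = np.random.default_rng(0)

m, n, k = 5, 3, 2
A = rng.standard_normal((m, n))   # random m x n matrix; almost surely full column rank
V = rng.standard_normal((n, k))   # columns v_1, ..., v_k; almost surely independent

assert np.linalg.matrix_rank(A) == n   # columns of A are linearly independent
assert np.linalg.matrix_rank(V) == k   # the v_i are linearly independent
print(np.linalg.matrix_rank(A @ V))   # equals k, so the Av_i are independent
```

Of course this only spot-checks one random instance; the proof above is what establishes it in general.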