I want to prove the following lemma:
Let $B$ be a basis for $V$ and let $T_B: B \rightarrow W$ be a map. Then there exists a unique linear map $T_V: V \rightarrow W$ which extends $T_B$, that is, such that $T_V(b)=T_B(b)$ for all $b \in B$.
My attempt:
Any vector $v\in V$ can be written as a linear combination of the basis elements: $$v=\sum_{i=1}^{n} \alpha_{i} b_{i}, \quad b_{i} \in B$$
where $\alpha_i$ are the coordinates of $v$ with respect to $B$. We can therefore write the image of $v$ under $T_V$ as: $$T_{V}(v)=T_{V}\left(\sum_{i=1}^{n} \alpha_{i} b_{i}\right)=\sum_{i=1}^{n} \alpha_{i} T_{V}\left(b_{i}\right)$$
where in the last step I used the assumed linearity of $T_V$. This means that if $T_V(b_i)=T_B(b_i)$ for all $b_i\in B$, then the map $T_B$ has been 'extended' to $T_V$. To prove that $T_V$ is unique, we consider another linear map $T^{'}_V:V\rightarrow W$ that agrees with $T_V$ on the basis elements, $T_V^{'}(b_i)=T_V(b_i)=T_B(b_i)$ for all $b_i \in B$. Then: $$ T_{V}^{'}(v)=\sum_{i=1}^{n} \alpha_{i} T^{'}_{V}\left(b_{i}\right)=\sum_{i=1}^{n} \alpha_{i} T_{V}\left(b_{i}\right)=T_V(v). $$ Is this sufficient to prove the lemma? Is something missing, unclear, or incorrect in my attempt at a proof?
All you need to do is recall that a basis is linearly independent, so every vector is a *unique* linear combination of elements of the basis; this is exactly what makes $T_V$ well defined. It is easy to verify the linearity of $T_{V}$, and uniqueness follows from the fact that the zero map is the only linear map that vanishes on every element of the basis.
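The extension procedure can also be sanity-checked numerically. Here is a minimal NumPy sketch (the particular basis of $\mathbb{R}^2$, the prescribed images in $\mathbb{R}^3$, and the helper name `T_V` are illustrative assumptions, not part of the question): the coordinates $\alpha_i$ of $v$ are recovered by solving against the basis matrix, which is possible precisely because $B$ is a basis.

```python
import numpy as np

# A (non-standard) basis of R^2, stored as the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Prescribed images T_B(b_1), T_B(b_2) in R^3, stored as columns.
TB = np.array([[1.0, 0.0],
               [2.0, 1.0],
               [0.0, 3.0]])

def T_V(v):
    """Extend T_B linearly: write v = sum_i alpha_i b_i,
    then map it to sum_i alpha_i T_B(b_i)."""
    alpha = np.linalg.solve(B, v)  # coordinates are unique since B is a basis
    return TB @ alpha

# T_V agrees with T_B on the basis vectors ...
assert np.allclose(T_V(B[:, 0]), TB[:, 0])
assert np.allclose(T_V(B[:, 1]), TB[:, 1])

# ... and is linear: T_V(2u + 3w) = 2 T_V(u) + 3 T_V(w).
u, w = np.array([1.0, 2.0]), np.array([-1.0, 0.5])
assert np.allclose(T_V(2*u + 3*w), 2*T_V(u) + 3*T_V(w))
```

Since `np.linalg.solve(B, v)` is itself linear in `v`, the linearity check holds for any pair of vectors, mirroring the argument in the answer.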