I read the following result:
A list $(v_1,v_2,\ldots,v_n)$ of vectors is linearly independent if and only if each $v$ in their span has a unique representation as a linear combination of them.
I wish to strengthen one direction of this statement. I propose the following:
If there is a vector $b$ that has a unique representation as a linear combination of the $v_i$'s, then $(v_1,v_2,\ldots,v_n)$ is linearly independent.
The motivation comes from the very definition of linear independence, in which this vector $b$ is taken to be the vector $0$.
The argument is as follows:
Fix the vector $b\in\operatorname{span}(v_1,v_2,\ldots,v_n)$. If the vectors were linearly dependent, then the vector $0$ would have more than one representation as a linear combination of them. As a result, the vector $b=b+0$ would also have more than one representation. Contraposition yields the result.
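To see the contrapositive in action on a concrete (hypothetical) example, take the dependent list $v_1=(1,0)$, $v_2=(0,1)$, $v_3=(1,1)$ in $\mathbb{R}^2$ and the vector $b=(2,3)$; adding the dependence relation $v_1+v_2-v_3=0$ to one representation of $b$ produces a second one. The vectors and coefficients below are chosen only for illustration, and the sketch just verifies this numerically:

```python
import numpy as np

# A dependent list in R^2: v3 = v1 + v2, so v1 + v2 - v3 = 0
# is a nontrivial representation of the zero vector.
v1, v2, v3 = np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])
b = np.array([2.0, 3.0])

# One representation of b, and a second obtained by adding
# the coefficients of the dependence relation (b = b + 0).
coeffs_a = np.array([2.0, 3.0, 0.0])
coeffs_b = coeffs_a + np.array([1.0, 1.0, -1.0])

V = np.column_stack([v1, v2, v3])  # columns are v1, v2, v3
rep_a = V @ coeffs_a
rep_b = V @ coeffs_b

# Distinct coefficient lists, yet both give b.
print(np.allclose(rep_a, b), np.allclose(rep_b, b))
```

Both checks print `True`, so $b$ has (at least) two different representations as a linear combination of the $v_k$'s, exactly as the argument predicts for a dependent list.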
Is this result true after all? Or did I make a false argument? Any comment on this will be much appreciated.
Your statement is true and your proof is correct.
As an alternative proof, let $\beta_1,\ldots,\beta_n$ be scalars such that $$b=\beta_1v_1+\cdots+\beta_nv_n\tag1$$is the only way of expressing $b$ as a linear combination of the $v_k$'s. I will (directly) prove that the $v_k$'s are linearly independent. Let $\alpha_1,\ldots,\alpha_n$ be such that $\alpha_1v_1+\cdots+\alpha_nv_n=0$. Then$$b=b+0=(\alpha_1+\beta_1)v_1+\cdots+(\alpha_n+\beta_n)v_n.$$But we are assuming that $(1)$ is the only way of expressing $b$ as a linear combination of the $v_k$'s. Therefore$$\alpha_1+\beta_1=\beta_1,\alpha_2+\beta_2=\beta_2,\ldots,\alpha_n+\beta_n=\beta_n$$and, of course, this means that $\alpha_1=\cdots=\alpha_n=0$.