My strategy for this part is a proof by contradiction: assume the columns of the matrix $A$ are linearly independent, but that $AX=V$ does not have a unique solution.
Then there are two solutions $X_{1} \neq X_{2}$ such that $AX_{1}=V$ and $AX_{2}=V$. It follows that $A(X_{1}-X_{2})=0$.
Writing $X_{1}-X_{2}=(\alpha_{1},\dots,\alpha_{n})^{T}$ and letting $a_{1},\dots,a_{n}$ denote the columns of $A$, this becomes the linear combination $\alpha_{1}a_{1}+\dots+\alpha_{n}a_{n}=0$.
Since the columns of $A$ are linearly independent, each $\alpha_{i}$ must be zero, and thus $X_{1}=X_{2}$.
Can I conclude from this result that $AX=B$ has a unique solution $X$, and therefore that $A$ is invertible?
If the columns are linearly independent, then (for a square $A$) the map $x \mapsto Ax$ is injective, and hence also surjective, so the transformation is invertible; that is, there exists $B=A^{-1}$ such that $x=By$ whenever $Ax=y$.
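As a quick numerical sanity check (my own sketch, not part of the original argument; the specific matrix is just an arbitrary example), one can verify with numpy that a square matrix with linearly independent columns has full rank, is invertible, and that $AX=V$ is solved uniquely by $X=A^{-1}V$:

```python
import numpy as np

# Example square matrix whose columns are linearly independent
# (chosen arbitrarily for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Columns are independent iff the rank equals the number of columns.
assert np.linalg.matrix_rank(A) == A.shape[1]

V = np.array([1.0, 2.0])
X = np.linalg.solve(A, V)   # the unique solution of A X = V
B = np.linalg.inv(A)        # B = A^{-1} exists since A has full rank

# B V recovers the same solution, i.e. X = A^{-1} V.
assert np.allclose(B @ V, X)
assert np.allclose(A @ X, V)
```

Of course this only illustrates the claim for one matrix; the proof above is what establishes it in general.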