This is my first time asking a question on Stack Exchange, so please bear with me.
Sometimes I see vectors placed in matrices as rows, e.g. for simultaneous equations, but when checking for linear independence the vectors are inserted as columns. I know this is because we test $a_1\vec v_1 + a_2\vec v_2 + a_3\vec v_3 = \vec 0$, where the $a_i$ are scalars, so I am adding the $x_1$-entries from each vector, then the $x_2$-entries, and so on. But given a coefficient matrix, how do I know whether the vectors are stored in row or column format, in other words whether they sit horizontally or vertically within the matrix?
This is one of the challenges of linear algebra, so you're asking a good question. Of course, given a question you want to solve, you can always set it up so that either way works, but I agree that it is more natural one way than the other.
You already pointed out that when you think of a system of equations $A\vec x = \vec b$, we treat $\vec x$ and $\vec b$ as column vectors and the coefficients of each equation become the row vectors of $A$. On the other hand, the columns of $A$ still have a meaning: The vectors $\vec b$ for which this system of equations is consistent (i.e., has a solution) form precisely the column space (span of the columns) of $A$.
Indeed, solving for $\vec x$ in $A\vec x=\vec b$ tells you the coefficients of the respective column vectors that give you the linear combination $\vec b$: If $\vec a_j$ are the columns of $A$, then the equation $A\vec x=\vec b$ can be rewritten as $\sum x_j\vec a_j = \vec b$. This then leads naturally to the question of linear dependence/independence to which you already alluded.
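You can see the identity $A\vec x = \sum x_j\vec a_j$ numerically, too. Here's a small sketch (assuming NumPy; the particular matrix and coefficients are made up for illustration):

```python
import numpy as np

# Hypothetical 3x2 example: the columns a_j of A are the vectors being combined.
A = np.array([[1., 0.],
              [1., 1.],
              [0., -1.]])
x = np.array([2., 3.])   # the coefficients x_j

# A @ x is exactly the linear combination x_1*a_1 + x_2*a_2.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(np.allclose(A @ x, combo))   # True
```

So $A\vec x=\vec b$ is solvable exactly when $\vec b$ lies in the column space of $A$, which is the column-space remark above in computational form.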
Suppose I hand you vectors $\vec v_1,\dots,\vec v_m\in \Bbb R^n$ and ask you to find the orthogonal complement of the subspace they span; that is, I want you to find all vectors $\vec x\in\Bbb R^n$ so that $\vec v_i\cdot\vec x=0$ for all $i=1,\dots,m$. Well, then, if you put the $\vec v_i$ in as the rows of an $m\times n$ matrix $A$, you're looking for the nullspace of $A$ (i.e., all solutions of $A\vec x=\vec 0$). But you could put them in as the columns of an $n\times m$ matrix and find the so-called left nullspace (i.e., all solutions of $\vec x{}^\top\!\!A = \vec 0{} ^\top$).
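To make the rows-versus-columns symmetry concrete, here's a sketch (assuming NumPy, with two made-up vectors in $\Bbb R^3$) that computes the nullspace from the row form and checks that the column form gives the same subspace:

```python
import numpy as np

# Hypothetical example: two vectors v_1, v_2 in R^3 as the rows of A.
A = np.array([[1., 1., 0.],
              [0., 1., -1.]])

# Nullspace of A via SVD: right singular vectors whose singular value is zero.
_, s, Vt = np.linalg.svd(A)
sigma = np.zeros(Vt.shape[0]); sigma[:len(s)] = s
ns = Vt[sigma < 1e-10]            # basis rows for the nullspace of A

# Every nullspace vector is orthogonal to each row v_i of A.
print(np.allclose(A @ ns.T, 0))   # True

# Putting the v_i in as the *columns* of B = A.T instead: the left nullspace
# of B (solutions of x^T B = 0^T) is exactly the same subspace.
print(np.allclose(ns @ A.T, 0))   # True
```

Either arrangement encodes the same orthogonality conditions; only which side of the matrix $\vec x$ multiplies changes.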
Here's one for you. Suppose I give you two planes in $\Bbb R^3$, one spanned by $(1,1,0)$ and $(0,1,-1)$, and the other spanned by $(1,0,1)$ and $(2,-1,1)$. Can you find all the vectors that are in both planes?
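If you want to check your answer numerically afterwards, here's one possible sketch (assuming NumPy): a vector lies in both planes iff $a\vec u_1+b\vec u_2 = c\vec w_1+d\vec w_2$ for some scalars, i.e. $(a,b,c,d)$ is in the nullspace of the matrix whose columns are $\vec u_1,\vec u_2,-\vec w_1,-\vec w_2$.

```python
import numpy as np

# Vectors spanning the two planes from the problem above.
u1, u2 = np.array([1., 1., 0.]), np.array([0., 1., -1.])
w1, w2 = np.array([1., 0., 1.]), np.array([2., -1., 1.])

# Columns u1, u2, -w1, -w2: nullspace vectors (a, b, c, d) satisfy
# a*u1 + b*u2 = c*w1 + d*w2, i.e. they describe vectors in both planes.
M = np.column_stack([u1, u2, -w1, -w2])

# Nullspace via SVD, as before.
_, s, Vt = np.linalg.svd(M)
sigma = np.zeros(Vt.shape[0]); sigma[:len(s)] = s
basis = Vt[sigma < 1e-10]

for a, b, c, d in basis:
    print(a * u1 + b * u2)   # a direction vector for the intersection
```

Notice this is the columns convention again: the unknowns are coefficients of a linear combination, so the given vectors go in as columns.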