I apologize for the triviality of my question, but it has been bugging me: why does the standard for multiplying a matrix by a vector require the vector to be a column matrix? To me it seems more natural to write the vector horizontally and then match its components against the matrix, which gives the same results. Was it just the preference of whoever defined the notation, or is there some reason for the choice? Not only does it make matrix-by-column-matrix (vector) multiplication awkward, but it also seems to produce an unintuitive way of multiplying matrices by matrices and vectors by vectors (specifically the dot product). Please respond without using matrix multiplication in your answer, if possible, because matrix multiplication is often defined in terms of vector notation.
Thanks, Jackson
The main idea is that matrix multiplication needs to represent composition of linear functions.
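To make this concrete, here is a small numerical sketch (using NumPy's `@` operator, which implements the usual row-by-column convention; the matrix shapes are arbitrary choices for illustration). Composing the two linear maps first and then applying the result gives the same answer as applying them one at a time, which is exactly what $(AB)v = A(Bv)$ says:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))  # a linear map f: R^3 -> R^2
B = rng.standard_normal((3, 4))  # a linear map g: R^4 -> R^3
v = rng.standard_normal(4)

# Composing first, then applying: (A B) v.
composed_first = (A @ B) @ v
# Applying g, then f: A (B v).
applied_in_turn = A @ (B @ v)

assert np.allclose(composed_first, applied_in_turn)
```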
I think you will have trouble making matrix multiplication satisfy basic function-composition properties such as associativity if you do "row-against-row" matrix multiplication rather than "row-by-column" matrix multiplication. Doing "column-by-row" would work, though, since that is just the usual convention with the roles of rows and columns swapped.
Consider that with row-against-row multiplication you have less flexibility in chaining together matrices of different sizes: every matrix in a product like $ABCD$ would have to have the same number of columns.
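A small NumPy sketch of the shape bookkeeping (the dimensions are made up for illustration). Under the usual convention only adjacent inner dimensions must match, so shapes can telescope; a "row-against-row" product, which pairs row $i$ of the left factor with row $j$ of the right factor, is what the usual notation writes as $AB^{\mathsf T}$, and it exists only when both factors have the same number of columns:

```python
import numpy as np

# Row-by-column: (2x3)(3x4)(4x5)(5x1) -> (2x1); shapes telescope.
A = np.ones((2, 3))
B = np.ones((3, 4))
C = np.ones((4, 5))
D = np.ones((5, 1))
product = A @ B @ C @ D
assert product.shape == (2, 1)

# Row-against-row: in the usual notation this is E @ F.T, so E and F
# must share a column count (here 3), however many rows each has.
E = np.ones((2, 3))
F = np.ones((4, 3))
row_against_row = E @ F.T
assert row_against_row.shape == (2, 4)
```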