For matrices, we justify row operations by drawing parallels with the operations used when solving a system of equations, namely:

1. Interchanging rows = interchanging equations.
2. Adding a multiple of one row to another = adding a multiple of one equation to another.
3. Multiplying all terms of a row by a nonzero constant = multiplying both sides of an equation by that constant.
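(The three parallels above can be checked numerically; here is a minimal NumPy sketch, with a made-up $2\times 2$ system, showing that each row operation leaves the solution unchanged.)

```python
import numpy as np

# Illustrative 2x2 system A x = b (the values are made up for the demo).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])
x = np.linalg.solve(A, b)

# 1. Interchange rows 0 and 1 (and the corresponding rhs entries).
A1, b1 = A[[1, 0]], b[[1, 0]]

# 2. Add 3 * row 0 to row 1 (doing the same on the rhs).
A2, b2 = A.copy(), b.copy()
A2[1] += 3 * A2[0]
b2[1] += 3 * b2[0]

# 3. Multiply row 0 by the nonzero constant 5 (same on the rhs).
A3, b3 = A.copy(), b.copy()
A3[0] *= 5
b3[0] *= 5

# Each operation leaves the solution of the system unchanged.
for Ai, bi in [(A1, b1), (A2, b2), (A3, b3)]:
    assert np.allclose(np.linalg.solve(Ai, bi), x)
```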
Now, we can do much the same with columns. What's the logic supporting column operations? Can we draw a parallel to solving a system of equations for column operations as well? Thanks!
The system of equations traditionally represented by the matrix equation $A\overline x = b$ can also be represented by the transposed equation $\overline x^t A^t = b^t$. Now the variable $\overline x^t$ is a row vector, it is the columns of $A^t$ that correspond to equations, and column operations on $A^t$ correspond to elementary operations on the original system of equations.
So doing column operations on a matrix corresponds to doing elementary operations on the system of equations that is traditionally represented by the transpose of that matrix.
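To make this concrete, here is a small NumPy sketch (with a made-up, non-symmetric matrix): performing a column operation on $A^t$ produces exactly the transpose of the matrix obtained by the corresponding row operation on $A$, and the solution of the system is unchanged.

```python
import numpy as np

# Hypothetical 2x2 system A x = b (values chosen for illustration).
A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
b = np.array([5.0, 11.0])
x = np.linalg.solve(A, b)

# Row operation on the original system: add 2 * row 0 to row 1.
A_row, b_row = A.copy(), b.copy()
A_row[1] += 2 * A_row[0]
b_row[1] += 2 * b_row[0]

# The same elementary operation done as a COLUMN operation on the
# transposed system x^t A^t = b^t: add 2 * column 0 to column 1.
At_col, bt_col = A.T.copy(), b.copy()
At_col[:, 1] += 2 * At_col[:, 0]
bt_col[1] += 2 * bt_col[0]

# The column-operated transpose equals the transpose of the
# row-operated matrix, and the solution is unchanged.
assert np.allclose(At_col, A_row.T)
assert np.allclose(np.linalg.solve(A_row, b_row), x)
```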