I hadn't realized this until, while studying linear algebra, I read a line that shocked me:
"Row operations CAN change the column space of a matrix."
Basically, I have accustomed myself to viewing the solution of matrix equations as a series of transformations. For example, when I come across an equation $Ax=b$, I interpret it as: the vector $x$ was transformed by the matrix $A$, so what must that vector $x$ have been?
Normally, in linear algebra, we solve such equations by reducing the matrix to echelon form or reduced echelon form. Note that the column space of $A$ is the span of the columns of $A$.
This is where the confusion lies. Say $b$ is in the column space of $A$, so $Ax=b$ has a solution. Then we row reduce $A$ to an echelon form $B$; the column space has possibly changed and may no longer contain $b$, on top of which the vector $b$ itself has been row reduced in the same way. What guarantee is there that the reduced $b$ lies in the column space of $B$?
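A small numeric sketch of this exact worry (using NumPy purely for illustration; the matrices here are my own toy example, not from any textbook): a single row operation visibly changes the column space, yet because the operation is an invertible matrix applied to *both* sides, the same $x$ still solves the reduced system.

```python
import numpy as np

# Toy example: both columns of A are (1, 1), so Col(A) is the line spanned by (1, 1).
A = np.array([[1., 1.],
              [1., 1.]])
b = np.array([2., 2.])            # b is in Col(A), since b = A @ [1, 1]

# Elementary row operation R2 <- R2 - R1, written as an invertible matrix E.
E = np.array([[ 1., 0.],
              [-1., 1.]])
R = E @ A                          # echelon form [[1, 1], [0, 0]]
c = E @ b                          # reduced right-hand side [2, 0]

# Col(R) is the line spanned by (1, 0) -- a DIFFERENT space than Col(A).
# But solvability is preserved: Ax = b if and only if (EA)x = Eb,
# because E is invertible, so the same x works for both systems.
x = np.array([1., 1.])
assert np.allclose(A @ x, b)       # x solves the original system
assert np.allclose(R @ x, c)       # the SAME x solves the reduced system
```

So the guarantee is not that the column space stays put (it doesn't), but that the row operation is applied to the augmented system as a whole, keeping the solution set intact.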
Now there is a second confusion: if row operations do change the column space of a matrix, then the pivot columns of the reduced row echelon form of $A$ need not span the same space as the columns of $A$. How, then, do the pivot positions give us a basis for the column space of $A$?
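The resolution of this second confusion is that the basis consists of the pivot columns taken from the *original* $A$, not from the reduced matrix. A sketch (using SymPy for exact row reduction; the matrix is an arbitrary example of mine):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 7]])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()               # pivots == (0, 2)

# The pivot columns of R span a different space than Col(A) in general.
# The basis for Col(A) is the pivot COLUMNS of the ORIGINAL A:
basis = [A.col(j) for j in pivots]  # columns (1, 2) and (3, 7)
```

Row reduction only *locates* which columns are independent (the pivot positions); linear dependence relations among columns are preserved by row operations, even though the column space itself is not.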
EDIT: more confusion here. When we're finding the null space of $A$, we row reduce it and express the pivot variables in terms of the free variables, and the resulting vectors span the null space of $A$. But how does this all make sense, or how is this even equivalent, when the column space of $A$ has changed?
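The key here is that row operations, while they may change the column space, never change the *null space*: $Ax = 0$ if and only if $(EA)x = 0$ for invertible $E$. A quick check of this (SymPy again, same illustrative matrix as above; not part of the original question):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 7]])
R, _ = A.rref()                    # R = [[1, 2, 0], [0, 0, 1]]

# The null space computed from the rref is ALSO the null space of A itself,
# because Ax = 0 and Rx = 0 have exactly the same solutions.
for v in A.nullspace():
    assert A * v == sp.zeros(2, 1)  # v kills the original matrix
    assert R * v == sp.zeros(2, 1)  # and the row-reduced one
```

So reading the free-variable combinations off the rref is legitimate: the rref describes the same homogeneous solution set as $A$, even though its columns span a different space.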
There are three kinds of row operation: (1) multiply a row by a nonzero number, (2) swap two rows, (3) add a multiple of one row to another row. Each of these is equivalent to multiplication by an "elementary matrix", the matrix we get by applying that row operation to the identity matrix.

For example, multiplying the second row of a 3 by 3 matrix by $a$ is equivalent to multiplying that matrix by $\begin{bmatrix}1 & 0 & 0 \\ 0 & a & 0 \\ 0 & 0 & 1\end{bmatrix}$. Similarly, swapping the second and third rows is equivalent to multiplying by $\begin{bmatrix}1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0\end{bmatrix}$, and adding $a$ times the third row to the first row is equivalent to multiplying by $\begin{bmatrix}1 & 0 & a \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}$.

So row reducing an invertible matrix $A$ to the identity matrix is equivalent to multiplying it by a series of elementary matrices whose product with $A$ gives the identity matrix. The product of those elementary matrices is therefore $A^{-1}$. Applying those same row operations to the right side of the equation is, then, the same as multiplying both sides of the equation by $A^{-1}$.
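This answer can be checked numerically. Below is a sketch (NumPy, with a small invertible matrix I chose for illustration) where three elementary matrices reduce $A$ to the identity, and their product turns out to be exactly $A^{-1}$:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])

# Each elementary matrix is the identity with one row operation applied:
E1 = np.array([[1., 0.], [-0.5, 1.]])   # R2 <- R2 - (1/2) R1
E2 = np.array([[1., -2.], [0., 1.]])    # R1 <- R1 - 2 R2
E3 = np.array([[0.5, 0.], [0., 2.]])    # scale R1 by 1/2 and R2 by 2

# Applying the operations in order reduces A to the identity...
assert np.allclose(E3 @ E2 @ E1 @ A, np.eye(2))

# ...so the product of the elementary matrices is the inverse of A.
assert np.allclose(E3 @ E2 @ E1, np.linalg.inv(A))
```

And this is why applying the same operations to $b$ is safe: the combined effect on the augmented system $Ax = b$ is multiplying both sides by $A^{-1}$, giving $x = A^{-1}b$.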