In Werner Greub's book *Linear Algebra*, on page 96, the following is given:
The elementary transformations on a matrix are:
(I.1) Interchange of two vectors $X_i$ and $X_j$ $(i \not= j)$.

(I.2) Interchange of two vectors $Y_k$ and $Y_l$ $(k \not= l)$.

(II.1) Adding to a vector $X_i$ an arbitrary multiple of a vector $X_j$ $(j \not= i)$.

(II.2) Adding to a vector $Y_k$ an arbitrary multiple of a vector $Y_l$ $(l \not= k)$.
It is easy to see that the four above transformations have the following effect on the matrix $M (\phi)$:
(I.1) Interchange of the rows $i$ and $j$.

(I.2) Interchange of the columns $k$ and $l$.

(II.1) Replacement of the row-vector $a_i$ by $a_i + \lambda a_j$ $(j \not= i)$.

(II.2) Replacement of the column-vector $b_k$ by $b_k + \lambda b_l$ $(l \not= k)$.
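To make the four effects concrete, here is a small NumPy sketch (the matrix entries are arbitrary, chosen only for illustration). Each transformation is realized as left or right multiplication by an invertible "elementary" matrix, which already hints at what is preserved: since those matrices are invertible, quantities like the rank cannot change.

```python
import numpy as np

# A sample 3x3 matrix standing in for M(phi); the values are arbitrary.
M = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])

# (I.1) Swap rows 0 and 1: left-multiply by the permutation matrix P.
P = np.eye(3)
P[[0, 1]] = P[[1, 0]]          # swap rows 0 and 1 of the identity
assert np.allclose(P @ M, M[[1, 0, 2]])

# (I.2) Swap columns 0 and 1: right-multiply by the same matrix.
assert np.allclose(M @ P, M[:, [1, 0, 2]])

# (II.1) Row 0 <- row 0 + lam * row 1: left-multiply by E = I + lam*e_0 e_1^T.
lam = 2.0
E = np.eye(3)
E[0, 1] = lam
assert np.allclose((E @ M)[0], M[0] + lam * M[1])

# (II.2) Column 0 <- column 0 + lam * column 2: right-multiply instead.
F = np.eye(3)
F[2, 0] = lam
assert np.allclose((M @ F)[:, 0], M[:, 0] + lam * M[:, 2])

# All four elementary matrices are invertible, so the rank never changes.
for T in (P @ M, M @ P, E @ M, M @ F):
    assert np.linalg.matrix_rank(T) == np.linalg.matrix_rank(M)
```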
I have no problem understanding the operations themselves. What I am asking is: what do these operations preserve, especially in view of the fact that the matrices are representations of linear mappings between vector spaces?
For example, though I haven't studied it in detail, in Gaussian elimination we basically apply these elementary transformations to the coefficient matrix of a system of equations, and the book says that after those operations the resulting system is equivalent to the old one. I don't see why, because I don't know what the transformations do and do not preserve when we apply them to a matrix.
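For the Gaussian-elimination case, the key fact is that each row operation is left multiplication of the augmented system by an invertible elementary matrix $E$, so $Ax = b$ and $(EA)x = Eb$ have exactly the same solutions. A minimal sketch (the system here is an arbitrary example I made up):

```python
import numpy as np

# A system A x = b with a unique solution; the entries are arbitrary.
A = np.array([[2., 1.],
              [1., 3.]])
b = np.array([5., 10.])
x = np.linalg.solve(A, b)

# One Gaussian-elimination step, row 1 <- row 1 - (1/2) * row 0,
# written as left multiplication by an invertible elementary matrix E.
E = np.array([[ 1.0, 0.],
              [-0.5, 1.]])
A2, b2 = E @ A, E @ b

# The transformed system (E A) x = E b has the same solution set,
# because E is invertible: E A x = E b  <=>  A x = b.
assert np.allclose(np.linalg.solve(A2, b2), x)
```

So row operations preserve the solution set of the system (and the rank and row space of the matrix), which is exactly what "the new system is equivalent to the old one" means.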
Another example: in (II.2), when we add $\lambda b_l$ to the $k$th column, the new matrix sends the $k$th basis vector to $\phi_{old}(x_k) + \lambda \phi_{old}(x_l)$, so the image has changed, hence the map has changed, at least as I see it.
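A way to reconcile this observation (sketched below with arbitrary example entries): the column operation is right multiplication $M F$ with $F = I + \lambda\, e_l e_k^T$ invertible. Read against the *old* basis, $MF$ is indeed a different map; but $\phi(x_k) + \lambda\phi(x_l) = \phi(x_k + \lambda x_l)$, so $MF$ is the *same* $\phi$ written in a changed domain basis $x_k' = x_k + \lambda x_l$:

```python
import numpy as np

# M represents phi in fixed bases; the column operation b_0 <- b_0 + lam*b_2
# is right multiplication by the invertible matrix F = I + lam * e_2 e_0^T.
M = np.array([[1., 0., 2.],
              [0., 1., 1.]])
lam = 3.0
F = np.eye(3)
F[2, 0] = lam
M2 = M @ F

# Relative to the ORIGINAL basis, M2 is a different map: the image of the
# first basis vector has changed, just as the question observes.
assert not np.allclose(M2[:, 0], M[:, 0])

# But the "new" image of x_1 is exactly phi applied to x_1 + lam * x_3,
# so M2 is the same phi expressed in the changed basis x_1' = x_1 + lam*x_3.
e1, e3 = np.eye(3)[0], np.eye(3)[2]
assert np.allclose(M2 @ e1, M @ (e1 + lam * e3))

# Since F is invertible, the rank (the intrinsic datum) is unchanged.
assert np.linalg.matrix_rank(M2) == np.linalg.matrix_rank(M)
```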
Edit:
For those who stumbled over these elementary transformations as I did, please see this question for insight.