I came across a problem to find $A^{-1}$ if a matrix $A$ satisfies
$$A\begin{bmatrix} 3 &1 & 2\\ 3 & 0 &2 \\ -1 & 1 &0 \end{bmatrix}=\begin{bmatrix} 0 & 1 &0 \\ 0 & 0&1 \\ 1&0 &0 \end{bmatrix}$$
I just used simple column interchanges on the right-hand-side matrix, as follows:
interchange $C_1$ and $C_2$, and then interchange $C_2$ and $C_3$, to get the identity matrix.
If we perform the same operations on the left-hand-side matrix, we get a new matrix $$C=\begin{bmatrix} 1 &2 &3 \\ 0& 2 &3 \\ 1 &0 & -1 \end{bmatrix}$$
and magically that is $A^{-1}$.
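As a sanity check (using NumPy, which is not part of the original question): since $A$ is determined by $AX = Y$ as $A = YX^{-1}$, we can recover $A$ numerically and confirm that $AC$ is the identity.

```python
import numpy as np

# The matrices from the equation A X = Y
X = np.array([[3, 1, 2],
              [3, 0, 2],
              [-1, 1, 0]])
Y = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
# The matrix obtained by the two column interchanges
C = np.array([[1, 2, 3],
              [0, 2, 3],
              [1, 0, -1]])

# A X = Y implies A = Y X^{-1}
A = Y @ np.linalg.inv(X)

print(np.allclose(A @ C, np.eye(3)))  # True: C really is A^{-1}
```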
What is the concept behind this?
But if I do the same interchanges on the rows, I get a different matrix.
Interchanging columns is equivalent to post-multiplying by a (permutation) matrix, so you are reducing $AX = Y$ to $AXZ = YZ$ with $YZ = I$. That means $Z = Y^{-1}$, and hence $A^{-1} = XY^{-1} = XZ$.
Interchanging rows is pre-multiplying, but that does not help here, since pre-multiplication would have to act on $A$, not on $X$.
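The post-multiplication argument can be made concrete (a NumPy sketch, not from the original answer): each column swap is post-multiplication by an elementary permutation matrix, and their product is the $Z$ with $YZ = I$.

```python
import numpy as np

X = np.array([[3, 1, 2],
              [3, 0, 2],
              [-1, 1, 0]])
Y = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])

# Elementary permutation matrices: M @ P reorders the columns of M
P12 = np.eye(3, dtype=int)[:, [1, 0, 2]]  # swap columns 1 and 2
P23 = np.eye(3, dtype=int)[:, [0, 2, 1]]  # swap columns 2 and 3
Z = P12 @ P23                             # both swaps, in order

print(np.array_equal(Y @ Z, np.eye(3)))   # True: Y Z = I, so Z = Y^{-1}
print(X @ Z)                              # the matrix C from the question
```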