I need to solve a system of equations $Ax=b$, where $A$ is a matrix, $x$ is an unknown vector, and $b$ is a known vector. I initially set up an augmented matrix $[A|b]$ and thought I could solve for $x$ by row-reducing the augmented matrix. My professor told me I should just multiply $A^{-1}b$ to get $x$, which makes sense... but why wouldn't row-reducing the augmented matrix work, and how would you prove it?
Thank you for your help!
Row reduction does work for solving a system like the one you mention; in fact, when $A$ is invertible it produces exactly the same answer as $A^{-1}b$. You can prove this using the equations themselves: $$a_{11}x_1+\dots+a_{1n}x_n=b_1$$ $$\vdots$$ $$a_{n1}x_1+\dots+a_{nn}x_n=b_n.$$ Each elementary row operation is reversible, so each reduction step produces a system with exactly the same solution set as the one before it. So if you can get the matrix into row echelon form and you still have a solvable system, then any of its solutions will be a solution to your original equation. In particular, if the matrix is invertible you can continue to reduced row echelon form and arrive at $$x_1+0x_2+\dots+0x_n=b_1'$$ $$0x_1+x_2+0x_3+\dots+0x_n=b_2'$$ $$\vdots$$ $$0x_1+\dots+0x_{n-1}+x_n=b_n'.$$ So $x=b'$, where $b'$ is the last column of the augmented matrix after row reduction. And since an invertible system has a unique solution, this $b'$ must equal $A^{-1}b$: the two methods agree.
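If it helps to see this numerically, here is a quick sketch in Python/NumPy (my own illustration, not part of the original question): a small Gauss-Jordan elimination on the augmented matrix $[A|b]$, compared against multiplying by $A^{-1}$. The matrix and vector are made-up example values.

```python
import numpy as np

def solve_by_row_reduction(A, b):
    """Gauss-Jordan elimination on the augmented matrix [A|b].

    Assumes A is square and invertible; uses partial pivoting
    for numerical stability.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, b.reshape(n, 1)])  # augmented matrix [A|b]
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]            # scale so the pivot entry is 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear the rest of the column
    return M[:, -1]                      # [I | b']  ->  x = b'

# Example system: 2x + y = 3,  x + 3y = 5
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])

x_rref = solve_by_row_reduction(A, b)
x_inv = np.linalg.inv(A) @ b
print(np.allclose(x_rref, x_inv))  # the two methods agree
```

Both approaches return the same $x$, as the argument above predicts. (In practice, by the way, numerical libraries solve $Ax=b$ by elimination-style factorizations rather than by forming $A^{-1}$ explicitly.)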