Is there some way to get the solutions to a matrix equation without using Gaussian elimination?
In other words, if we have a matrix $A$, we can multiply it by an invertible (change-of-basis) matrix $P$ to get $PA = B$. Suppose I don't know the solutions of the system for $A$. I imagine that I could somehow change the basis of $A$ to get a matrix $B$ from which I can read off the solutions. Is something like this possible? I'm mainly wondering whether there is some way to solve the system without Gaussian elimination.
Suppose we have a linear system in $\mathrm x \in \mathbb R^n$
$$\mathrm A \mathrm x = \mathrm b$$
where $\mathrm A \in \mathbb R^{m \times n}$ and $\mathrm b \in \mathbb R^m$ are given. We build the objective function
$$f (\mathrm x) := \frac 12 \| \mathrm A \mathrm x - \mathrm b \|_2^2$$
whose gradient is
$$\nabla f (\mathrm x) = \mathrm A^{\top} (\mathrm A \mathrm x - \mathrm b)$$
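As a quick sanity check, the gradient formula can be compared against central finite differences on a small random instance (the matrix, vector, and test point below are made up for illustration; since $f$ is quadratic, central differences are exact up to floating-point roundoff):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # example data, not from the question
b = rng.standard_normal(5)
x = rng.standard_normal(3)

f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = A.T @ (A @ x - b)          # the analytic gradient above

# central finite differences along each coordinate direction
eps = 1e-6
fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
               for e in np.eye(3)])
print(np.max(np.abs(grad - fd)))  # should be near machine precision
```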
which vanishes precisely when the famous "normal equations"
$$\mathrm A^{\top} \mathrm A \mathrm x = \mathrm A^{\top} \mathrm b$$
are satisfied; their solution is the least-squares solution. Running continuous-time gradient descent,
$$\dot{\mathrm x} = -\nabla f (\mathrm x)$$
we obtain the ODE
$$\dot{\mathrm x} + \mathrm A^{\top} \mathrm A \mathrm x = \mathrm A^{\top} \mathrm b$$
We can now use numerical methods for ODEs to find the least-squares solution; for example, forward Euler with step size $h$ gives the iteration $\mathrm x_{k+1} = \mathrm x_k - h \, \mathrm A^{\top} (\mathrm A \mathrm x_k - \mathrm b)$, which is exactly gradient descent with a fixed step. If the original linear system, $\mathrm A \mathrm x = \mathrm b$, is consistent, then the least-squares solution is also a solution to $\mathrm A \mathrm x = \mathrm b$.
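A minimal sketch of the whole pipeline, on a small consistent system chosen for illustration (here $\lambda_{\max}(\mathrm A^{\top}\mathrm A) = 3$, so forward Euler is stable for $h < 2/3$):

```python
import numpy as np

# small consistent example system (made up for illustration)
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
b = np.array([1., 2., 3.])

# forward Euler on  x' = -A^T (A x - b), i.e. gradient descent with step h.
# Stability requires h < 2 / lambda_max(A^T A); here lambda_max = 3.
x = np.zeros(2)
h = 0.1
for _ in range(500):
    x = x - h * A.T @ (A @ x - b)

print(x)                      # converges to the least-squares solution [1., 2.]
print(np.allclose(A @ x, b))  # True: the system is consistent, so the
                              # least-squares solution solves it exactly
```

Note that no Gaussian elimination is performed anywhere: the solution emerges purely from repeated matrix-vector products.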