Is there a way to solve a linear system without using Gaussian elimination?


Is there some way to get the solution of a linear system without using Gaussian elimination?

In other words, if we have a matrix $A$, we can multiply it by a change-of-basis matrix $P$ to get $PA = B$. Suppose I don't know the solutions of the system with coefficient matrix $A$. I imagine that I could change the basis of $A$ to get a matrix $B$ from which I can read off the solutions. Is something like this possible? I'm mainly wondering whether there is some way to solve the system without Gaussian elimination.

3 Answers

BEST ANSWER

Suppose we have a linear system in $\mathrm x \in \mathbb R^n$

$$\mathrm A \mathrm x = \mathrm b$$

where $\mathrm A \in \mathbb R^{m \times n}$ and $\mathrm b \in \mathbb R^m$ are given. We build the objective function

$$f (\mathrm x) := \frac 12 \| \mathrm A \mathrm x - \mathrm b \|_2^2$$

whose gradient is

$$\nabla f (\mathrm x) = \mathrm A^{\top} (\mathrm A \mathrm x - \mathrm b)$$

$$\mathrm A^{\top} \mathrm A \mathrm x = \mathrm A^{\top} \mathrm b$$

whose solution is the least-squares solution. Performing continuous-time gradient descent (gradient flow),

$$\dot{\mathrm x} = -\nabla f (\mathrm x)$$

we obtain the ODE

$$\dot{\mathrm x} + \mathrm A^{\top} \mathrm A \mathrm x = \mathrm A^{\top} \mathrm b$$

We can now use numerical methods for ODEs to find the least-squares solution. If the original linear system, $\mathrm A \mathrm x = \mathrm b$, is consistent, then the least-squares solution is also a solution to $\mathrm A \mathrm x = \mathrm b$.
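As a rough sketch of this idea, the gradient flow above can be discretized with the forward Euler method (which, for this ODE, is exactly gradient descent with a fixed step size). The matrix, right-hand side, step size, and iteration count below are illustrative choices, not part of the answer:

```python
import numpy as np

# A small consistent system (values chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true  # consistent by construction

# Forward-Euler discretization of  x' = -A^T (A x - b).
# Stability requires the step h < 2 / lambda_max(A^T A);
# we take h = 1 / lambda_max(A^T A) to be safe.
h = 1.0 / np.linalg.norm(A.T @ A, 2)
x = np.zeros(2)
for _ in range(5000):
    x = x - h * (A.T @ (A @ x - b))

print(x)  # converges toward the least-squares solution x_true
```

Since the system here is consistent, the limit of the flow coincides with a solution of $\mathrm A \mathrm x = \mathrm b$ itself, matching the remark above.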

ANSWER

Given $\mathrm A \in \mathbb R^{m \times n}$ with full column rank, we can use Gram-Schmidt to compute the (reduced) QR decomposition $\mathrm A = \mathrm Q \mathrm R$, where $\mathrm Q \in \mathbb R^{m \times n}$ has orthonormal columns and $\mathrm R \in \mathbb R^{n \times n}$ is upper triangular. Hence, the linear system

$$\mathrm A \mathrm x = \mathrm b$$

becomes $\mathrm Q \mathrm R \mathrm x = \mathrm b$. Since $\mathrm Q$ has orthonormal columns, i.e., $\mathrm Q^{\top} \mathrm Q = \mathrm I_n$, left-multiplying by $\mathrm Q^{\top}$ yields the upper triangular system

$$\mathrm R \mathrm x = \mathrm Q^{\top}\mathrm b$$

which can be solved via back substitution.
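A minimal sketch of this pipeline, with a hand-rolled classical Gram-Schmidt and back substitution (the helper names and the 2×2 test system are illustrative, not part of the answer):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Reduced QR via classical Gram-Schmidt (assumes full column rank)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # projection coefficient
            v -= R[i, j] * Q[:, i]        # remove component along q_i
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

def back_substitution(R, y):
    """Solve the upper-triangular system R x = y from the bottom row up."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([4.0, 7.0])

Q, R = gram_schmidt_qr(A)
x = back_substitution(R, Q.T @ b)  # solve R x = Q^T b
print(x)
```

In practice one would use Householder reflections (as `numpy.linalg.qr` does) rather than classical Gram-Schmidt, which loses orthogonality in floating point; the structure of the solve is the same.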

ANSWER

As a joke, some say that all people who study numerical linear algebra do is solve $Ax = b$. Numerical analysts also make fun of people who think Gaussian elimination is the way to go. Some of the most prominent methods for linear systems are projection methods, in particular Krylov subspace methods. Take a look at the text *Numerical Methods in Matrix Computations* by Åke Björck for an extensive review of a large number of methods for solving linear systems.
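To make "Krylov subspace method" concrete, here is a sketch of the conjugate gradient method, one of the best-known examples; it applies to symmetric positive definite systems (for a general $A$ one would instead use a method such as GMRES, or apply CG to the normal equations). The function name and the small test system are illustrative assumptions:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Conjugate gradient for symmetric positive definite A.

    The k-th iterate lies in the Krylov subspace
    span{b, Ab, A^2 b, ..., A^(k-1) b}; in exact arithmetic
    it converges in at most n iterations.
    """
    n = len(b)
    max_iter = max_iter if max_iter is not None else n
    x = np.zeros(n)
    r = b - A @ x        # residual
    p = r.copy()         # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # A-conjugate direction update
        rs = rs_new
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])  # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)
```

Unlike Gaussian elimination, CG only needs matrix-vector products $A p$, which is why Krylov methods dominate for large sparse systems.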