How is the Gram-Schmidt process valid? Doesn't it alter the original problem that we wanted to solve?


In our numerical analysis class, we learned about orthonormal vectors and why they're so convenient for solving the matrix equation $A\vec{x} = \vec{b}$:

$\vec{b} = x_1\vec{a_1} + x_2\vec{a_2} + ... + x_k\vec{a_k}$

If the columns of $A$ form an orthonormal set of vectors, then we can find each $x_i$ in a stepwise fashion by "abusing" the fact that $\vec{a_i}\cdot\vec{a_j} = 0$ if $i \neq j$ and $1$ if $i = j$. For example, we can easily find $x_1$ by taking the dot product of both sides with $\vec{a_1}$:

$\vec{a_1}\cdot\vec{b} = x_1(\vec{a_1}\cdot\vec{a_1}) + x_2(\vec{a_1}\cdot\vec{a_2}) + ... + x_k(\vec{a_1}\cdot\vec{a_k}) = x_1(1) + 0 + ... + 0$.
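To see this numerically, here's a small sketch (using NumPy; the particular vectors and coefficients are just made-up examples) of recovering the coefficients of $\vec{b}$ with nothing but dot products:

```python
import numpy as np

# Two orthonormal vectors in R^2 (a rotated standard basis).
theta = 0.3
a1 = np.array([np.cos(theta), np.sin(theta)])
a2 = np.array([-np.sin(theta), np.cos(theta)])

# Build b with known coefficients x1 = 2, x2 = -1.
b = 2 * a1 - 1 * a2

# Each coefficient falls out of a single dot product,
# since a1.a1 = a2.a2 = 1 and a1.a2 = 0.
x1 = a1 @ b   # ≈ 2.0
x2 = a2 @ b   # ≈ -1.0
```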

But oftentimes, the columns of $A$ are not orthogonal (or orthonormal). We learned the Gram-Schmidt process for transforming a given set of linearly independent (but not orthogonal) vectors into an orthonormal set. Our instructor then told us that we can use this to solve the original problem.
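For reference, here's roughly how I understand the process; this is a minimal classical Gram-Schmidt sketch, assuming the input columns are linearly independent:

```python
import numpy as np

def gram_schmidt(A):
    """Return a matrix Q whose columns are orthonormal and span
    the same space as the (linearly independent) columns of A."""
    A = np.asarray(A, dtype=float)
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        v = A[:, j].copy()
        # Subtract the components along the previously built columns q_i.
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        # Normalize what's left.
        Q[:, j] = v / np.linalg.norm(v)
    return Q
```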

Here's my issue with that: don't we end up solving a different problem? The original problem asks us to solve $A\vec{x} = \vec{b}$. If we replace $A$ with another matrix whose columns are orthonormal, haven't we changed the problem we're working with?
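To make the worry concrete, here's a small sketch (using `np.linalg.qr` as a stand-in for Gram-Schmidt; the matrix and right-hand side are just an example) where the coefficients with respect to the orthonormalized columns come out different from the original solution:

```python
import numpy as np

# A matrix with non-orthogonal columns, and some right-hand side.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
b = np.array([3.0, 1.0])

# Coefficients for the original problem A x = b.
x = np.linalg.solve(A, b)   # -> [2., 1.]

# Orthonormalize the columns (np.linalg.qr plays the role of Gram-Schmidt here).
Q, _ = np.linalg.qr(A)

# Coefficients of b in the *new* basis: y = Q^T b, since Q^T Q = I.
y = Q.T @ b

# y is not the same vector as x: the expansion uses different basis vectors.
```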