I want to show that if I have a matrix $A \in Mat_{m,n}(\mathbb{R})$ of rank $n$, and we let $(b_1,b_2,b_3,...,b_n)$ be an orthonormal basis for the column space Col(A) (w.r.t. the standard dot product on $\mathbb{R}^m$), then there exists a set of vectors $(c_1,c_2,c_3,...,c_n)$ such that $A\cdot c_i = b_i$ for $i=1,2,3,...,n$.
My intuition is that each $c_i$ should come from some kind of inverse construction that extracts the component of each $a_i$ in the direction of $b_i$, but I'm not sure how to make this concrete or how to turn it into a proof.
Here is another way of thinking about it.
Let $ B $ be the matrix whose columns are your orthonormal basis for the column space of $ A $. Then each column of $ A $ can be written as a linear combination of the columns of $ B $. If $ A = \left( \begin{array}{c c c} a_1 & a_2 & \cdots \end{array} \right) $ exposes the columns of $ A $, then for each $ i $ there is a vector $ x_i $ such that $ B x_i = a_i $. But then $$ \left( \begin{array}{c c c} B x_1 & B x_2 & \cdots \end{array} \right) = \left( \begin{array}{c c c} a_1 & a_2 & \cdots \end{array} \right) $$ or, equivalently, $$ B \left( \begin{array}{c c c} x_1 & x_2 & \cdots \end{array} \right) = \left( \begin{array}{c c c} a_1 & a_2 & \cdots \end{array} \right) $$ so with $$ X = \left( \begin{array}{c c c} x_1 & x_2 & \cdots \end{array} \right) $$ we have $ B X = A $. Since the columns of $ B $ are orthonormal, $ B^T B = I $, which gives the explicit formula $ X = B^T A $. Now use the rank hypothesis: $ X $ is $ n \times n $, and because $ A $ has rank $ n $ and $ A = B X $, the matrix $ X $ must also have rank $ n $, so it is invertible. Multiplying $ B X = A $ on the right by $ X^{-1} $ gives $ A X^{-1} = B $. So the matrix you want is $$ C = X^{-1} = (B^T A)^{-1}, $$ whose columns $ c_i $ satisfy $ A c_i = b_i $ for $ i = 1, \ldots, n $.
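The construction above can be checked numerically. Here is a small NumPy sketch (the dimensions and the random matrix are just illustrative); it builds an orthonormal basis $B$ for Col(A) via a reduced QR factorization, forms $X = B^T A$, and verifies that $C = X^{-1}$ satisfies $AC = B$:

```python
import numpy as np

# Illustrative example: a random 5x3 matrix, which has rank 3
# with probability 1.
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))

# Orthonormal basis for Col(A): reduced QR gives A = B R with
# B an m x n matrix satisfying B^T B = I.
B, _ = np.linalg.qr(A)

# Coordinates of the columns of A in the basis B: from B X = A
# and B^T B = I, we get X = B^T A.
X = B.T @ A

# Since A has rank n, X is invertible, and C = X^{-1} is the
# matrix whose columns c_i satisfy A c_i = b_i.
C = np.linalg.inv(X)

# Check: A C = B X X^{-1} = B.
print(np.allclose(A @ C, B))  # True
```

Here the QR factorization plays the role of the orthonormal basis; any other orthonormal basis of Col(A) would work the same way, just with a different $X$.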