Showing that there exist vectors which, when multiplied by a matrix, give an orthonormal basis for its column space


I want to show that if $A \in \operatorname{Mat}_{m,n}(\mathbb{R})$ is a matrix of rank $n$, and $(b_1, b_2, b_3, \dots, b_n)$ is an orthonormal basis for $\operatorname{Col}(A)$ (with respect to the standard dot product on $\mathbb{R}^m$), then there exist vectors $(c_1, c_2, c_3, \dots, c_n)$ such that $A \cdot c_i = b_i$ for $i = 1, 2, 3, \dots, n$.

My intuition is that the vectors $c_i$ should come from some kind of inverse that extracts, from each column $a_i$, the component in the direction of $b_i$, but I'm not sure how to make this concrete or how to proceed with the proof.
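As a numerical sanity check of the claim (not a proof), one can take the orthonormal basis produced by a QR factorization, where $A = QR$ with $Q$'s columns orthonormal and $R$ invertible when $\operatorname{rank}(A) = n$; then $C = R^{-1}$ does the job. A sketch with numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))   # full column rank with probability 1

# QR gives A = Q R, where the columns of Q are an orthonormal
# basis for Col(A) and R is n x n upper triangular.
Q, R = np.linalg.qr(A)

# Since rank(A) = n, R is invertible, and C = R^{-1} satisfies
# A C = Q, i.e. A c_i = b_i for each basis vector b_i = Q[:, i].
C = np.linalg.inv(R)
assert np.allclose(A @ C, Q)
```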

Best answer:

Here is another way of thinking about it.

Let $ B $ be an orthonormal basis for the column space of $ A $. Then each column of $ A $ can be written as a linear combination of the columns of $ B $. If $ A = \left( \begin{array}{c c c} a_1 & a_2 & \cdots \end{array} \right) $ exposes the columns of $ A $ then there is a vector $ x_i $ such that $ B x_i = a_i $. But then $$ \left( \begin{array}{c c c} B x_1 & B x_2 & \cdots \end{array} \right) = \left( \begin{array}{c c c} a_1 & a_2 & \cdots \end{array} \right) $$ or, equivalently, $$ B \left( \begin{array}{c c c} x_1 & x_2 & \cdots \end{array} \right) = \left( \begin{array}{c c c} a_1 & a_2 & \cdots \end{array} \right) $$ So, the matrix you want is given by $$ X = \left( \begin{array}{c c c} x_1 & x_2 & \cdots \end{array} \right) $$.

Another answer:

If you think about the linear map $L: \mathbb{R}^n \to \mathbb{R}^m$ that corresponds to your matrix $A$, you have the identity $\text{Col}(A) = \text{Im}(L)$: in words, the column space of your matrix is equal to the image of the map. Since $b_1, \dotsc, b_n \in \text{Im}(L)$, there are, by definition of $\text{Im}(L)$, vectors $c_1, \dotsc, c_n$ in the domain $\mathbb{R}^n$ that map to them, i.e. $b_j = L(c_j)$ for each $j$ (this is a property of images of maps in general, not just of linear maps on vector spaces).

Translate $L$ back into its matrix representation and you get that $b_j = A\cdot c_j$ for all $j = 1, \dotsc, n$, as desired.
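Since each $b_j$ lies in the image of $A$, a preimage $c_j$ can be computed in practice; least squares is one way to find it, and it recovers $b_j$ exactly here because $b_j$ is in the column space. A sketch (the QR factorization is just one choice of orthonormal basis):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 3
A = rng.standard_normal((m, n))
B, _ = np.linalg.qr(A)   # columns b_1..b_n: an orthonormal basis of Col(A)

# Each b_j is in Im(A), so A c_j = b_j has a solution; least squares
# finds it exactly, and stacking the c_j as columns gives C.
C = np.linalg.lstsq(A, B, rcond=None)[0]
assert np.allclose(A @ C, B)
```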