Orthonormal matrices


Is there a more direct reason for the following:

If the columns of an $n\times n$ matrix are orthonormal, then its rows are also orthonormal.

The standard proof shows that a left inverse of a square matrix is the same as its right inverse, and thereby concludes that if $Q^TQ = I$, then $QQ^T = I$. This seems to be more of an algebraic manipulation. Can someone offer me a geometric insight?


Thanks
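As a quick numerical illustration of the claim (a sketch only, using a $2\times 2$ rotation matrix, whose columns are orthonormal by construction):

```python
import math

# A 2x2 rotation matrix: its columns are orthonormal by construction.
t = 0.7
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

cols = [[Q[i][j] for i in range(2)] for j in range(2)]
rows = Q

# Columns are orthonormal ...
assert abs(dot(cols[0], cols[0]) - 1) < 1e-12
assert abs(dot(cols[1], cols[1]) - 1) < 1e-12
assert abs(dot(cols[0], cols[1])) < 1e-12

# ... and, as claimed, so are the rows.
assert abs(dot(rows[0], rows[0]) - 1) < 1e-12
assert abs(dot(rows[1], rows[1]) - 1) < 1e-12
assert abs(dot(rows[0], rows[1])) < 1e-12
```

Of course this only checks one example; the question is why it must always hold.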


There are 2 answers below.


This is not really a geometric insight, but at least it is a proof at the vector level.

Let $v_1,\ldots,v_n$ be the columns of $Q$, and $w_1,\ldots,w_n$ its rows. Let also $e_1,\ldots,e_n$ be the canonical basis. Then we have
$$ v_j=Qe_j,\qquad w_j=Q^Te_j,\qquad j=1,\ldots,n. $$
And, for all $j,k$,
$$ \langle v_j,e_k\rangle = \langle Qe_j,e_k\rangle=\langle e_j,Q^Te_k\rangle=\langle e_j,w_k\rangle. $$
Note that for any $x,y$ we have, using that $(e_j)$ is an orthonormal basis,
$$ \langle x,y\rangle = \sum_{s,t}\langle x,e_s\rangle\langle y,e_t\rangle\langle e_s,e_t\rangle = \sum_t\langle x,e_t\rangle\langle y,e_t\rangle. $$
Below we will use this equality first for the basis $(e_j)$ and then for the basis $(v_j)$. Then
$$ \langle w_j,w_k\rangle=\sum_t\langle w_j,e_t\rangle \langle w_k,e_t\rangle =\sum_t\langle e_j,v_t\rangle\langle e_k,v_t\rangle=\langle e_j,e_k\rangle, $$
showing that $w_1,\ldots,w_n$ is orthonormal.
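The key identity above, $\langle w_j,w_k\rangle=\sum_t\langle e_j,v_t\rangle\langle e_k,v_t\rangle=\langle e_j,e_k\rangle$, can be checked numerically. A minimal sketch using NumPy, with a matrix of orthonormal columns obtained from a QR factorization (an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random matrix with orthonormal columns via QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

n = Q.shape[0]
# Rows w_j are Q[j]; columns v_t are Q[:, t], so <e_j, v_t> = Q[j, t].
for j in range(n):
    for k in range(n):
        lhs = Q[j] @ Q[k]                                 # <w_j, w_k>
        rhs = sum(Q[j, t] * Q[k, t] for t in range(n))    # sum_t <e_j,v_t><e_k,v_t>
        assert abs(lhs - rhs) < 1e-12
        # ... and both equal <e_j, e_k>, i.e. 1 if j == k, else 0.
        assert abs(lhs - (1.0 if j == k else 0.0)) < 1e-12
```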


If the columns of $Q$ are orthonormal, its transpose is its inverse, and vice versa. Understanding this is the key to your question. You are asking for a geometric understanding of this fact, rather than a symbolic-algebra one. So let's give that a try.

Because the columns are orthonormal, what does $Q^T$ do to the $i$th column of $Q$? If you are in the habit of seeing matrix-vector multiplication as a sequence of dot products of rows with that vector (which captures the degree to which the vector is parallel to each of the rows), then, since the rows of $Q^T$ are the columns of $Q$, you see that $Q^T$ takes the $i$th column to the $i$th standard basis vector.

But then what does $Q$ do to the $i$th standard basis vector? Again, if you see matrix-vector multiplication as a sequence of dot products, then $Q$ applied to the $i$th standard basis vector picks out the $i$th component of each of $Q$'s rows, yielding $Q$'s $i$th column.

We've just established that $Q$ and $Q^T$, as actions on vectors, exchange the set of columns of $Q$ with the set of standard basis vectors. So they are inverse actions, i.e. $QQ^T = I$. And since the $(j,k)$ entry of $QQ^T$ is the dot product of the $j$th and $k$th rows of $Q$, this says exactly that the rows are orthonormal.
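The two exchanges described above can be sketched numerically. A minimal check, again using a random matrix with orthonormal columns from a QR factorization (an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Random 3x3 matrix with orthonormal columns.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

n = Q.shape[0]
I = np.eye(n)

for i in range(n):
    col_i = Q[:, i]
    e_i = I[:, i]
    # Q^T sends the i-th column of Q to the i-th standard basis vector ...
    assert np.allclose(Q.T @ col_i, e_i)
    # ... and Q sends the i-th standard basis vector back to that column.
    assert np.allclose(Q @ e_i, col_i)

# Inverse actions on a basis means inverse matrices: Q Q^T = I.
assert np.allclose(Q @ Q.T, I)
```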