Does $AA^T$ = I iff A is an orthogonal matrix?


I know that if $A$ is an orthogonal matrix, then $AA^T = I$.

However, is it possible for a non-orthogonal square matrix to satisfy $AA^T = I$ as well?

A square matrix of size $n$ is orthogonal if its rows and its columns each form an orthonormal basis of ${\bf R}^n$.


There are 4 best solutions below


No, the rows (or columns) of $A$ are normalised and orthogonal if and only if $AA^T=I$.

Let the rows of $A$ be $v_i$. Then $a_{ij} = (v_i)_j$, the $j$th component of $v_i$ so $$ (AA^T)_{ij} = \sum_{k=1}^n a_{ik} a_{jk} = \sum_{k=1}^n (v_{i})_k (v_j)_k = v_i \cdot v_j, $$ which is $(I)_{ij}=\delta_{ij}$ if and only if the rows of $A$ are normalised and are (pairwise) orthogonal.

The left- and right-inverses of a square matrix coincide, so $A^TA=I$ as well, and the same argument shows that the columns of $A$ are also orthogonal and normalised.
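As a quick numerical sanity check of the argument above (a sketch using NumPy; the rotation matrix is just an illustrative choice of a matrix with orthonormal rows):

```python
import numpy as np

# A 2x2 rotation matrix: its rows are orthonormal by construction.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Both products equal the identity, confirming that the
# left- and right-inverses of a square matrix coincide.
print(np.allclose(A @ A.T, np.eye(2)))  # True
print(np.allclose(A.T @ A, np.eye(2)))  # True
```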


Hint: $$ AA^T=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}a&c\\b&d\end{bmatrix}= \begin{bmatrix}a^2+b^2&ac+bd\\ac+bd&c^2+d^2\end{bmatrix}\\ =\begin{bmatrix}v_1\cdot v_1 &v_1\cdot v_2\\v_1\cdot v_2&v_2\cdot v_2\end{bmatrix} $$ where $$ A=\begin{bmatrix}-&v_1&-\\ -&v_2&-\end{bmatrix}=\begin{bmatrix}a&b\\c&d\end{bmatrix} $$
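To see the hint in action, here is a small check (a sketch; the values of $a, b, c, d$ are chosen arbitrarily) that the entries of $AA^T$ really are the pairwise dot products of the rows:

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0
A = np.array([[a, b], [c, d]])
v1, v2 = A[0], A[1]  # the rows of A

# The (i,j) entry of A A^T is the dot product of row i with row j.
P = A @ A.T
print(np.allclose(P, [[v1 @ v1, v1 @ v2],
                      [v1 @ v2, v2 @ v2]]))  # True
```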


First of all, let's get the definitions straight:

  • A basis $b_1, \dots, b_n$ of an inner product space is called orthogonal if each pair $b_i, b_j$ with $i \neq j$ is orthogonal, i.e., if $\langle b_i, b_j \rangle = 0$.

  • A basis $b_1, \dots, b_n$ of an inner product space is called orthonormal if it is orthogonal and every $b_i$ is a unit vector, i.e., if $\langle b_i, b_j \rangle = \delta_{ij}$.

  • A square matrix $(A_{ij})_{1 \leq i,j \leq n}$ over an inner product space is called orthogonal if its columns and rows both form orthonormal bases.

Now, if you work out what it means for the columns of $A$ to be orthonormal, that comes out as $A^T A = I_n$. Likewise, the rows of $A$ are orthonormal if and only if $A A^T = I_n$. Both conditions are equivalent to $A$ being invertible with $A^{-1} = A^T$.

So, if a square matrix satisfies $A A^T = I_n$ (i.e., its rows form an orthonormal basis), then $A^{-1} = A^T$. Therefore, also $A^T A = I_n$ (i.e., its columns form an orthonormal basis) and hence $A$ is an orthogonal matrix.

(Of course, the argument is now hidden in the fact that a left-inverse of a square matrix is automatically a right-inverse and vice versa. For that, see for instance If $AB = I$ then $BA = I$)
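The implication can be checked numerically (a sketch; the QR factorization is just a convenient way to produce a matrix whose rows happen to be orthonormal):

```python
import numpy as np

rng = np.random.default_rng(0)
# QR factorization of a random square matrix yields an orthogonal factor Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Q Q^T = I implies Q^T Q = I: a left-inverse of a square
# matrix is automatically a right-inverse.
print(np.allclose(Q @ Q.T, np.eye(4)))  # True
print(np.allclose(Q.T @ Q, np.eye(4)))  # True
```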


The $(i,j)$th element of the product $AA^T$ is just the dot product of the $i$th and $j$th rows of $A$. Thus, the condition $AA^T=I$ means that $A_i \cdot A_j=\delta_{ij}$, that is, that the (square of the) norm of each row is $1$ and that the dot product of different rows is $0$, i.e., the rows are orthonormal.