I was fiddling with some $n\times n$ square matrices and I came across a matrix $A$ where the transpose of $A$ multiplied by $A$ gives the identity scaled by the determinant of $A$, that is, $A^TA = \det(A)I$. Does this mean anything significant? Would love to hear your thoughts.
Does $A^TA = \det(A) I$ imply anything significant?
245 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · There are 3 best solutions below
I'll assume we are working over $\mathbb{R}$. Let me try and solve a slightly more general problem and then see what this implies for your problem. Assume we have a matrix $A \in M_n(\mathbb{R})$ such that $AA^T = cI$ for some $c \in \mathbb{R}$. What can we say about $A$?
First, the matrix $AA^T$ is positive semi-definite, so $c \geq 0$. Let us denote the rows of $A$ by $A_i$. The $(i,i)$-entry of $AA^T$ is $\left< A_i, A_i \right> = \| A_i \|^2$ (the squared length of $A_i$ with respect to the standard inner product on $\mathbb{R}^n$). If $c = 0$ then this implies that $A_i = 0$ for $1 \leq i \leq n$, so $A = 0$. If $c > 0$, we can define $B := \frac{A}{\sqrt{c}}$ and then $BB^T = I$.
Real matrices $B \in M_n(\mathbb{R})$ which satisfy $BB^T = I$ are called orthogonal matrices. They are invertible and preserve the lengths of vectors in $\mathbb{R}^n$ and the angles between them (with respect to the standard notions of length and angle). We have shown that $A = \sqrt{c}\, B$, so $A$ is a positive scalar times an orthogonal matrix. Such matrices preserve angles but not necessarily lengths (they scale lengths by $\sqrt{c}$).
Let us return now to your question. Assume $A^TA = \det(A)I$. (For square matrices this is equivalent to $AA^T = \det(A)I$: if $\det(A) \neq 0$ then $A$ is invertible and $AA^T = A(A^TA)A^{-1} = \det(A)I$, while if $\det(A) = 0$ then $A^TA = 0$ forces $A = 0$.) By what we have shown, we either have $A = 0$ or $\det(A) > 0$. Taking the determinant of both sides, we get
$$ \det(AA^T) = \det(A)\det(A^T) = \det(A)^2 = \det(\det(A)I) = \det(A)^n. $$
If $n \neq 2$ then this implies that $\det(A) = 1$, so $AA^T = I$ and $A$ is an orthogonal matrix (or else $A = 0$). If $n = 2$ then this equation tells us nothing new, but our previous discussion shows that $A$ is a non-negative number times an orthogonal matrix, so geometrically it preserves angles but not necessarily lengths of vectors (unless, again, $A = 0$). To summarize:
- If $n \neq 2$ then $A$ must be orthogonal with $\det(A) = 1$ or $A = 0$.
- If $n = 2$ then $A$ must be of the form $A = cB$ where $c \geq 0$ and $B$ is orthogonal with $\det(B) = 1$.
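The two cases in this summary can be spot-checked numerically. Here is a small NumPy sketch (the helper name `satisfies_condition` is mine, not from the answer): a scaled rotation works for $n = 2$, while for $n = 3$ scaling by $c \neq 1$ breaks the condition.

```python
import numpy as np

def satisfies_condition(A, tol=1e-9):
    """Check whether A @ A.T == det(A) * I (illustrative helper)."""
    n = A.shape[0]
    return np.allclose(A @ A.T, np.linalg.det(A) * np.eye(n), atol=tol)

# n = 2: any non-negative multiple of a rotation works.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # det(R) = 1
print(satisfies_condition(3.5 * R))   # True: A A^T = 12.25 I and det(A) = 12.25

# n = 3: only det-1 orthogonal matrices (and A = 0) qualify.
Rz = np.eye(3)
Rz[:2, :2] = R                        # rotation about the z-axis, det = 1
print(satisfies_condition(Rz))        # True
print(satisfies_condition(2.0 * Rz))  # False: A A^T = 4 I but det(A) = 8
```

The $n = 3$ failure is exactly the determinant count from the answer: $\det(A)^2 = \det(A)^n$ forces $\det(A) = 1$ when $n \neq 2$.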
Actually, yes, in my opinion. It means that $A$ is an angle-preserving linear transformation (or, more rigorously, a matrix representation of one), which is pretty neat. In general, we can characterize all angle-preserving linear transformations on a real finite-dimensional Euclidean space as the scalar multiples of orthogonal transformations, and $A$ is clearly such a matrix (assuming $A \neq 0$). I'll prove the characterization below.
Proof. Suppose $A$ preserves angles: for all nonzero $y$ and $w$, the angle between $Ay$ and $Aw$ equals that between $y$ and $w$, that is, $\displaystyle \frac{\langle Ay, Aw \rangle}{\|Ay \| \|Aw \|} = \frac{\langle y, w \rangle}{\|y\| \|w\|}$. Now, select an orthonormal basis $\{v_i\}_{i=1}^n$ for the Euclidean space $V$. Then $\{Av_i\}_{i=1}^n$ forms an orthogonal basis. This follows simply from the angle-preserving nature of $A$, for $0 = \langle v_i, v_j \rangle \Rightarrow \langle Av_i, Av_j \rangle = 0$, and $n=\dim(V)$ nonzero pairwise-orthogonal vectors form a basis for $V$. Denote the $Av_i$ by $u_i$.
Now, taking $\{v_i\}$ to be the standard basis, so that the $u_i$ are the columns of $A$, we have $A^TA = \begin{pmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_n^T \end{pmatrix} \begin{pmatrix} u_1 & u_2 & \cdots & u_n \end{pmatrix} = {\begin{pmatrix} \|u_1\|^2 & 0 & \cdots & 0 \\ 0 & \ddots & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & \|u_n\|^2 \end{pmatrix}}$. The claim is that $\|u_i\|^2 = \|u_j\|^2$ for all $i, j$, for then $A^TA$ is a scalar multiple of the identity, and hence $A$ is a scalar multiple of an orthogonal transformation.
Now, we have
$$ \frac{\langle v_i, v_i + v_j \rangle}{\|v_i\|\,\|v_i + v_j\|} = \frac{\langle v_i , v_i \rangle + \langle v_i, v_j \rangle}{\|v_i\|\, \|v_i + v_j\|} = \frac{1}{\|v_i + v_j \|} = \frac{1}{\sqrt{\langle v_i, v_i \rangle + 2\langle v_i, v_j \rangle + \langle v_j, v_j \rangle}} = \frac{1}{\sqrt{2}}. $$
Since $A$ is angle-preserving, we have
$$ \frac{1}{\sqrt{2}} = \frac{\langle u_i, u_i + u_j \rangle}{\|u_i\|\,\|u_i + u_j\|} = \frac{\langle u_i, u_i \rangle + \langle u_i, u_j \rangle}{\|u_i\|\,\|u_i + u_j\|} = \frac{\|u_i\|^2}{\|u_i\|\,\|u_i + u_j\|} = \frac{\|u_i\|}{\|u_i + u_j\|}. $$
Cross-multiplying and squaring gives $\|u_i + u_j \|^2 = 2 \|u_i\|^2$, so
$$ 2\langle u_i, u_i \rangle = \langle u_i + u_j , u_i + u_j \rangle = \langle u_i, u_i \rangle + 2 \langle u_i, u_j \rangle + \langle u_j, u_j \rangle = \langle u_i, u_i \rangle + \langle u_j, u_j \rangle, $$
and therefore $\langle u_i, u_i \rangle = \langle u_j, u_j \rangle$, that is, $\|u_i\|^2 = \|u_j\|^2$. Thus the claim is proven.
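The characterization can be illustrated numerically: a scalar multiple of an orthogonal matrix preserves the cosine of the angle between vectors, and $A^TA$ comes out as a scalar multiple of the identity, just as the proof concludes. A small sketch (the orthogonal $Q$ comes from a QR factorization of a random matrix, an assumption of this example, not the answer):

```python
import numpy as np

rng = np.random.default_rng(0)

# A scalar multiple of an orthogonal matrix: 2.5 times an orthogonal Q
# obtained from the QR factorization of a random 4x4 matrix.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = 2.5 * Q

def cos_angle(y, w):
    """Cosine of the angle between y and w."""
    return y @ w / (np.linalg.norm(y) * np.linalg.norm(w))

# Angles are preserved for arbitrary vectors...
y, w = rng.standard_normal(4), rng.standard_normal(4)
print(np.isclose(cos_angle(A @ y, A @ w), cos_angle(y, w)))  # True

# ...and A^T A is a scalar multiple of the identity (c = 2.5^2).
print(np.allclose(A.T @ A, 2.5**2 * np.eye(4)))  # True
```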
It does have some implications for the properties of $\bf A$ (source: Wikipedia's article on the adjugate matrix). A standard identity is:
$${\bf A} \text{adj}({\bf A}) = \det({\bf A}){\bf I}$$
And rewriting, assuming $\bf A$ is invertible:
$$\text{adj}({\bf A}) = \det({\bf A}){\bf A}^{-1}$$
Comparing this with your condition ${\bf A}^T{\bf A} = \det({\bf A}){\bf I}$, which for invertible $\bf A$ gives ${\bf A}^T = \det({\bf A}){\bf A}^{-1}$, we see that $\text{adj}({\bf A}) = {\bf A}^T$. Equivalently, the cofactor matrix of $\bf A$ is $\bf A$ itself: every entry of your matrix equals its own cofactor.
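The adjugate identity is easy to verify numerically. NumPy has no built-in adjugate, so the sketch below computes it via $\det({\bf A}){\bf A}^{-1}$ (valid only for invertible $\bf A$), then checks that a $3\times 3$ rotation, which satisfies your condition, indeed has $\text{adj}({\bf A}) = {\bf A}^T$:

```python
import numpy as np

def adjugate(A):
    """adj(A) = det(A) * A^{-1}, valid for invertible A."""
    return np.linalg.det(A) * np.linalg.inv(A)

# A 3x3 rotation about the z-axis satisfies A A^T = det(A) I = I.
theta = 0.4
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# The standard identity A adj(A) = det(A) I ...
print(np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(3)))  # True

# ... combined with A A^T = det(A) I gives adj(A) = A^T.
print(np.allclose(adjugate(A), A.T))                               # True
```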