An invertible matrix is orthogonal if and only if the inverse is equal to the transpose on nonzero elements


Let $A$ be an invertible real matrix, and suppose that $(A^{-1})_{i,j} = (A^{T})_{i,j}$ whenever $(A^{T})_{i,j}\ne 0$. Is it true that $A$ is orthogonal?

I found this statement in a paper without proof. The converse is true for obvious reasons.

Writing out $AA^{-1} = A^{-1}A = I$, one finds that all rows and columns of $A$ have unit norm, but not much more. Unfortunately this is not enough to conclude that $A$ is orthogonal (there are explicit counterexamples). Can you help?
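To illustrate why unit-norm rows and columns alone do not force orthogonality, here is a quick numerical sketch (the particular matrix is just an illustrative choice, not one satisfying the hypothesis of the question):

```python
import numpy as np

# A symmetric 2x2 matrix whose rows and columns all have unit Euclidean norm.
t = np.pi / 6
A = np.array([[np.cos(t), np.sin(t)],
              [np.sin(t), np.cos(t)]])

print(np.linalg.norm(A, axis=0))        # column norms: both 1
print(np.linalg.norm(A, axis=1))        # row norms: both 1
print(np.linalg.det(A))                 # det = cos(2t) = 0.5, so A is invertible
print(np.allclose(A.T @ A, np.eye(2)))  # False: A is not orthogonal
```

Here $A^{\mathsf T}A$ has off-diagonal entries $2\sin t\cos t = \sin 2t \ne 0$, so the unit-norm conditions by themselves carry no information about the inner products between distinct columns.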


For those who asked: for all $i,j$, $$(A^{-1})_{i,j}(A)_{j,i} = (A^{-1})_{i,j}(A^{T})_{i,j} = \bigl((A^{T})_{i,j}\bigr)^2,$$ where the last equality holds because either $(A^{T})_{i,j}\ne 0$, in which case $(A^{-1})_{i,j} = (A^{T})_{i,j}$ by hypothesis, or $(A^{T})_{i,j} = 0$, in which case both sides vanish. Therefore $$1 = (A^{-1}A)_{i,i} = \sum_j (A^{-1})_{i,j}(A)_{j,i} = \sum_j \bigl((A^{T})_{i,j}\bigr)^2,$$ so every row of $A^{T}$, i.e. every column of $A$, has unit norm. From $AA^{-1}=I$ one gets likewise that every row of $A$ has unit norm.


Best answer:

Your claim is false. Let $$A=\frac{1}{\sqrt{2}}\begin{bmatrix}0&-1&-1\\-1&0&-1\\-1&-1&0\end{bmatrix}$$ Then $A$ is self-adjoint and $A^{-1}=\frac{1}{\sqrt{2}}I_3+A$, so that $A^{-1}=A^{\mathsf{T}}$ except where the entries of the latter vanish.
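The claims above can be verified numerically (a sanity check, not a proof; the exact identity $A^{-1} = \frac{1}{\sqrt2}I_3 + A$ follows from $A^2 = \frac{1}{2}(J+I)$ with $J$ the all-ones matrix):

```python
import numpy as np

# The proposed counterexample from the answer.
A = np.array([[ 0., -1., -1.],
              [-1.,  0., -1.],
              [-1., -1.,  0.]]) / np.sqrt(2)

A_inv = np.linalg.inv(A)

# A^{-1} = (1/sqrt(2)) I_3 + A, so A^{-1} and A^T agree off the diagonal,
# i.e. exactly where the entries of A^T are nonzero (A is symmetric).
assert np.allclose(A_inv, np.eye(3) / np.sqrt(2) + A)
mask = A.T != 0
assert np.allclose(A_inv[mask], A.T[mask])

# Yet A is not orthogonal:
assert not np.allclose(A.T @ A, np.eye(3))

# Its determinant is -1/sqrt(2), in particular not 1.
print(np.linalg.det(A))  # approximately -0.7071
```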

I found this just by asking Mathematica to search for a $3\times3$ counterexample; there's no insight to it. On the other hand, Mathematica doesn't think there are any $3\times3$ counterexamples with determinant $1$. Perhaps you misinterpreted the paper or dropped a condition?