Whenever I encounter the term “orthogonal”, I can usually find some way in which it is at least metaphorically similar to the idea of two orthogonal lines in Euclidean space (orthogonal random variables, for example).
But I cannot see how $A^{-1}=A^T$ captures the idea of “orthogonality”. What is “orthogonal” about a matrix that satisfies this property?
$A^{-1} = A^T \iff A^T A = I$, so let's explore what this latter property means. (Assume that we are working in a real vector space, specifically $\mathbb{R}^n$.) Write
$$A = \begin{pmatrix} \uparrow & \uparrow & \dots&\uparrow \\ v_1 &v_2&\dots&v_n \\ \downarrow&\downarrow&\dots&\downarrow \end{pmatrix} $$ then $$A^TA = \begin{pmatrix}\leftarrow & v_1 & \rightarrow \\ \leftarrow & v_2 & \rightarrow \\ \vdots & \vdots & \vdots \\ \leftarrow & v_n & \rightarrow \end{pmatrix} \begin{pmatrix} \uparrow & \uparrow & \dots&\uparrow \\ v_1 &v_2&\dots&v_n \\ \downarrow&\downarrow&\dots&\downarrow \end{pmatrix} =\begin{pmatrix} v_1\cdot v_1 & v_1\cdot v_2&\dots&v_1 \cdot v_n \\ v_2\cdot v_1 & v_2\cdot v_2&\dots&v_2 \cdot v_n \\ \vdots & \vdots & \ddots & \vdots \\ v_n\cdot v_1 & v_n\cdot v_2&\dots&v_n \cdot v_n \\ \end{pmatrix}. $$ So $A^T A = I$ says exactly that $v_i \cdot v_i = 1$ and $v_i \cdot v_j = 0$ for $i \neq j$, i.e. that the columns $\{v_i\}$ form an orthonormal system. In other words, an orthogonal matrix is precisely one whose columns are pairwise orthogonal unit vectors. Its rows are orthonormal as well: if $A$ is orthogonal then so is $A^T$ (since $A A^T = A A^{-1} = I$), and the rows of $A$ are the columns of $A^T$.
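If you want to see this concretely, here is a quick numerical sanity check (assuming NumPy; a $2\times 2$ rotation matrix is the standard example of an orthogonal matrix):

```python
import numpy as np

# A rotation matrix is a standard example of an orthogonal matrix:
# its columns are perpendicular unit vectors.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A^T A = I: each column has unit length and distinct columns are orthogonal.
print(np.allclose(A.T @ A, np.eye(2)))       # True

# Consequently the inverse is just the transpose.
print(np.allclose(np.linalg.inv(A), A.T))    # True
```

The same check works for any $n \times n$ matrix whose columns you have orthonormalized (e.g. via Gram–Schmidt).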