The Problem. Given an $n\times n$ matrix $A$, I'm looking to find all possible $n\times n$ matrices $D$ such that for all $\mathbf u, \mathbf v \in \mathbb R^n$ we have
$$(A\mathbf u)^T D A\mathbf v = \mathbf u^T D \mathbf v. \tag{1}$$
Ultimately, I'm posting because I'm having trouble making substantial progress here. So I would like to ask: are there any existing methods for determining the possible matrices $D$ for a given $A$, or if not, is there a way to determine whether a given matrix $A$ admits nontrivial solutions $D$?
Motivation. I'm quite interested in the idea of a scalar quantity computable based on two vectors that is invariant under a particular linear transformation $A$.
To take some examples:
- In 2D space, dot products $\mathbf u \cdot \mathbf v = u_1 v_1 + u_2 v_2$ are preserved under rotations.
- The area spanned by the parallelogram formed by the vectors $\mathrm{Area}(\mathbf u, \mathbf v)= u_1 v_2 - u_2 v_1$ is invariant under rotations too.
- In special relativity there exists an inner product of the form $\mathbf u \odot \mathbf v = u_1 v_1 - u_2 v_2$ that is invariant under Lorentz transformations.
I'm generalizing these sorts of products through the following decomposition, which motivates the existence of a matrix $D$ which basically serves as a way to encode the method of computation of the desired invariant scalar quantity. I will illustrate with each of the examples I mentioned above. Let $\phi, \gamma, \beta$ be constants.
\begin{align} \mathbf u \cdot \mathbf v &= u_1 v_1 + u_2 v_2 = \begin{bmatrix} u_1 & u_2 \end{bmatrix} \underbrace{\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}}_{D} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \mathbf u^T D\mathbf v & \text{example 1}\\ & \text{so if } A \text{ is a rotation matrix } \begin{bmatrix} \cos \phi & -\sin \phi \\ \sin \phi & \cos \phi \end{bmatrix}, \\ &\text{then } A \mathbf u \cdot A \mathbf v = \mathbf u \cdot \mathbf v \Leftrightarrow (A\mathbf u)^T D A\mathbf v = \mathbf u^T D \mathbf v \\ \\ \\ \mathrm{Area}(\mathbf u, \mathbf v) &= u_1 v_2 - u_2 v_1 = \begin{bmatrix} u_1 & u_2 \end{bmatrix} \underbrace{\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}}_{D} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \mathbf u^T D\mathbf v & \text{example 2} \\ & \text{so if } A \text{ is a rotation matrix } \begin{bmatrix} \cos \phi & -\sin \phi \\ \sin \phi & \cos \phi \end{bmatrix}, \\ &\text{then } \mathrm{Area}(A\mathbf u, A\mathbf v) = \mathrm{Area}(\mathbf u, \mathbf v) \Leftrightarrow (A\mathbf u)^T D A\mathbf v = \mathbf u^T D \mathbf v \\ \\ \\ \mathbf u \odot \mathbf v &= u_1 v_1 - u_2 v_2 = \begin{bmatrix} u_1 & u_2 \end{bmatrix} \underbrace{\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}}_{D} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \mathbf u^T D\mathbf v & \text{example 3} \\ & \text{so if } A \text{ is a Lorentz transformation } \begin{bmatrix} \gamma & - \beta \gamma \\ - \beta \gamma & \gamma \end{bmatrix}, \\ &\text{then } A \mathbf u \odot A \mathbf v = \mathbf u \odot \mathbf v \Leftrightarrow (A\mathbf u)^T D A\mathbf v = \mathbf u^T D \mathbf v \end{align}
In all cases, the transformation $A$ and the method of computing the scalar product $D$ all satisfy the relationship indicated by equation (1). This is what motivates my interest in equation (1) specifically.
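As a sanity check, all three examples can be verified numerically. Here is a short sketch using NumPy, with arbitrary values of $\phi$ and $\beta$ chosen just for the test:

```python
import numpy as np

def preserves(A, D):
    """True if (A u)^T D (A v) = u^T D v for all u, v, i.e. A^T D A = D."""
    return np.allclose(A.T @ D @ A, D)

phi = 0.7                              # arbitrary rotation angle (assumption)
beta = 0.6                             # arbitrary boost speed, |beta| < 1
gamma = 1.0 / np.sqrt(1.0 - beta**2)

rot = np.array([[np.cos(phi), -np.sin(phi)],
                [np.sin(phi),  np.cos(phi)]])   # rotation matrix
lor = np.array([[gamma, -beta * gamma],
                [-beta * gamma, gamma]])        # Lorentz boost

print(preserves(rot, np.eye(2)))                    # example 1: dot product
print(preserves(rot, np.array([[0., 1.],
                               [-1., 0.]])))        # example 2: signed area
print(preserves(lor, np.diag([1., -1.])))           # example 3: Minkowski form
```

All three checks print `True`, while, say, `preserves(lor, np.eye(2))` is `False`: the ordinary dot product is not Lorentz-invariant.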
Progress and notes so far.
- One way to satisfy equation (1) is for $A$ and $D$ to satisfy $A^T D A = D$ (perhaps this is the only way?).
- If indeed $A^T D A = D$, then the choice $D = I$ works exactly when $A$ is orthogonal, as in example 1. But example 3 shows that $D$ can be diagonal even when $A$ is not orthogonal (a Lorentz boost is not), and example 2 gives a solution that is not even symmetric, so restrictions like diagonality do not capture the full variety of matrices $D$ for a given $A$.
- I've also noticed that if $D$ is a solution to equation (1), then so is $kD$ for any constant $k$ (and, more generally, any sum of solutions is a solution, so the solutions form a linear subspace). Hence if at least one nontrivial matrix $D$ satisfies equation (1), then $D$ is not unique.
- I attempted to gain some intuition for solutions of $A^T D A = D$ through brute force in what I thought would be the easiest case: the $2\times 2$ case. Specifically, if $A=\begin{bmatrix} a & b\\ c & d \end{bmatrix}$ and $D=\begin{bmatrix} w & x\\ y & z \end{bmatrix}$, then $A^T D A = D$ implies $$ \begin{bmatrix} a & c\\ b & d \end{bmatrix}\begin{bmatrix} w & x\\ y & z \end{bmatrix}\begin{bmatrix} a & b\\ c & d \end{bmatrix} =\begin{bmatrix} w & x\\ y & z \end{bmatrix},$$ which can be rewritten as $$\begin{bmatrix} a^{2} & ac & ac & c^{2}\\ ab & ad & bc & cd\\ ab & bc & ad & cd\\ b^{2} & bd & bd & d^{2} \end{bmatrix}\begin{bmatrix} w\\ x\\ y\\ z \end{bmatrix} =\begin{bmatrix} w\\ x\\ y\\ z \end{bmatrix}.$$ This at least shows that for $D$ to be nontrivial, this $4\times 4$ matrix must have an eigenvalue of $1$, though I couldn't make any significant progress with that knowledge. To make matters worse, for a general $n\times n$ matrix $A$ we end up with an $n^2\times n^2$ matrix to analyze, so higher-dimensional cases get complicated very fast. Sadly, little intuition has been gained even in this simplest case; perhaps it's just a very complicated problem no matter how you slice it?
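For what it's worth, the $4\times 4$ system above can be generated mechanically: the matrix acting on the stacked entries $(w, x, y, z)$ of $D$ is the Kronecker product $A^\mathsf T\otimes A^\mathsf T$. A small NumPy sketch, using a rotation matrix with an arbitrary angle as the test case:

```python
import numpy as np

def coefficient_matrix(A):
    """The n^2 x n^2 matrix M with M vec(D) = vec(A^T D A),
    where vec stacks the rows of D (so (w, x, y, z) in the 2x2 case)."""
    return np.kron(A.T, A.T)

phi = 0.3                                  # arbitrary rotation angle (assumption)
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])
M = coefficient_matrix(A)

# Nontrivial D exists iff 1 is an eigenvalue of M; for a rotation it is,
# with multiplicity 2 (matching the span of I and the area form).
print(np.isclose(np.linalg.eigvals(M), 1).sum())   # prints 2
```

For a rotation, the eigenvalues of $A$ are $e^{\pm i\phi}$, so those of $A^\mathsf T\otimes A^\mathsf T$ are $e^{\pm 2i\phi}$ and $1$ (twice), which is why the count comes out to $2$.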
Any further advice or insight would be greatly appreciated!

---
First, if for all $\boldsymbol u,\boldsymbol v$, we have $\boldsymbol u^\mathsf T(A^\mathsf TDA)\boldsymbol v=\boldsymbol u^\mathsf TD\boldsymbol v$, then $A^\mathsf TDA=D$. (You can set $\boldsymbol u=\boldsymbol e_i$ and $\boldsymbol v=\boldsymbol e_j$ to see their $(i,j)$-th elements are equal.)
Given $A\in\mathbb R^{n\times n}$, the matrix $D$ can be computed by the following method. Vectorizing both sides of $A^\mathsf TDA=D$ via the identity $\operatorname{vec}(BXC)=(C^\mathsf T\otimes B)\operatorname{vec}(X)$, we obtain $$ (A^\mathsf T\otimes A^\mathsf T)D^\mathsf V=D^\mathsf V, $$ where $\otimes$ denotes the Kronecker product and $D^\mathsf V$ denotes the vectorization of $D$. You can refer to the Wikipedia page on vectorization for details.
Thus, $D^\mathsf V$ is an eigenvector of $A^\mathsf T\otimes A^\mathsf T$ corresponding to the eigenvalue $1$ (if $1$ is not an eigenvalue, only the trivial solution $D=0$ exists). Solve for $D^\mathsf V$ and reshape it into an $n\times n$ matrix; the desired $D$ is obtained, and the full solution set is exactly the eigenspace for eigenvalue $1$, reshaped.
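A minimal numerical sketch of this recipe, assuming NumPy. It finds the eigenvalue-$1$ eigenspace as the null space of $A^\mathsf T\otimes A^\mathsf T - I$ via an SVD, which is a standard implementation choice rather than the only one:

```python
import numpy as np

def invariant_forms(A, tol=1e-9):
    """Basis of {D : A^T D A = D}: the eigenvalue-1 eigenspace of
    kron(A.T, A.T), found as the null space of kron(A.T, A.T) - I via SVD."""
    n = A.shape[0]
    M = np.kron(A.T, A.T) - np.eye(n * n)
    _, s, Vh = np.linalg.svd(M)
    # rows of Vh whose singular value is (numerically) zero span the null space
    return [v.reshape(n, n) for v in Vh[s < tol]]

phi = 0.3                                  # arbitrary rotation angle (assumption)
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])
basis = invariant_forms(A)
for D in basis:
    assert np.allclose(A.T @ D @ A, D)     # each basis element satisfies (1)
print(len(basis))                          # prints 2
```

For a rotation this recovers a two-dimensional solution space spanned by (multiples of) the identity and the antisymmetric area form, matching examples 1 and 2 of the question.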