Finding a matrix from its product with its transpose


Suppose $A$ is a $3 \times 3$ matrix. If $A A^T = B$ and $A^T A = C$, where $B$ and $C$ are known and $B \neq C$, can I uniquely determine $A$?

$A$ has 9 independent elements. Since $B$ and $C$ are symmetric, they have 6 independent entries each. Thus I have an overdetermined nonlinear system of equations with 9 variables and 12 equations. Does a unique solution exist for this system? How can I prove that it does or doesn't?


There are 3 best solutions below


No. Take $B = C = I$: then $AA^T = A^TA = I$ holds for every orthogonal matrix $A$, and there are infinitely many orthogonal matrices.
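A quick numerical sketch of this counterexample: two different orthogonal matrices (the identity and a rotation about the $z$-axis, with an arbitrarily chosen angle) produce the same $B = C = I$.

```python
import numpy as np

# Two different orthogonal matrices: the identity and a rotation.
A1 = np.eye(3)
theta = 0.7  # any nonzero angle works
A2 = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0,              0,             1]])

# Both give B = A A^T = I and C = A^T A = I, yet A1 != A2.
for A in (A1, A2):
    assert np.allclose(A @ A.T, np.eye(3))
    assert np.allclose(A.T @ A, np.eye(3))
assert not np.allclose(A1, A2)
```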


Assuming that $B$ and $C$ are compatible enough to yield any solutions at all, there are $8$ solutions, except in degenerate cases I'll describe at the end. ($B=C=I$ is one such degenerate case, but having $B=C$ is not actually the problem; the problem is that $I$ has repeated eigenvalues.)

The matrices $B = AA^{\mathsf T}$ and $C = A^{\mathsf T}A$ must have the same eigenvalues. Suppose that the eigenvalues are $\lambda_1, \lambda_2, \lambda_3$, and that $\mathbf u_1,\mathbf u_2, \mathbf u_3$ are the corresponding orthonormal eigenvectors of $B$ while $\mathbf v_1, \mathbf v_2, \mathbf v_3$ are the corresponding orthonormal eigenvectors of $C$. For now, let's assume that all three eigenvalues are distinct.

Note that $BA = AC = A A^{\mathsf T}A$. As a result, $A\mathbf v_i$ for $i=1,2,3$ is a vector with the property that $BA \mathbf v_i = AC\mathbf v_i = \lambda_i A \mathbf v_i$, so $A\mathbf v_i$ is an eigenvector of $B$ with eigenvalue $\lambda_i$. This makes it a multiple of $\mathbf u_i$: $A \mathbf v_i = d_i \mathbf u_i$ for some as-yet-undetermined scalar $d_i$.

Which multiple? Once we choose these three multiples $d_1, d_2, d_3$, we have $AV = UD$, where $V$ is the matrix whose columns are $\mathbf v_1, \mathbf v_2, \mathbf v_3$, $U$ is the matrix whose columns are $\mathbf u_1, \mathbf u_2, \mathbf u_3$, and $D$ is the diagonal matrix with $d_1, d_2,d_3$ on the diagonal. So we get $A = UDV^{\mathsf T}$ (since $V$ is orthogonal).

Then $C = A^{\mathsf T}A = VDU^{\mathsf T}UDV^{\mathsf T} = V D^2 V^{\mathsf T}$: in other words, diagonalizing $C$ yields the matrix $D^2$. This means that $D$ can only be one of eight matrices, given by $d_1 = \pm \sqrt{\lambda_1}$, $d_2 = \pm \sqrt{\lambda_2}$, and $d_3 = \pm \sqrt{\lambda_3}$. It's easy to check that each of the resulting matrices $UDV^{\mathsf T}$ satisfies the equations $A$ is supposed to satisfy.

(In the case where $d_1, d_2, d_3$ are positive, they are the singular values of $A$, and $UDV^{\mathsf T}$ is the singular-value decomposition of $A$.)
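The construction above can be checked numerically: starting from a generic $A$ (which almost surely has distinct singular values), diagonalize $B$ and $C$, form all eight sign choices for $D$, and verify that every candidate reproduces $B$ and $C$ and that the original $A$ is among them. A sketch, assuming numpy's `eigh` (which sorts eigenvalues ascending, so the $i$-th columns of $U$ and $V$ pair up):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # generic: distinct singular values
B, C = A @ A.T, A.T @ A

# Orthonormal eigenvectors of B and C; eigh sorts eigenvalues ascending,
# so column i of U and column i of V belong to the same eigenvalue.
lam, U = np.linalg.eigh(B)
lam_c, V = np.linalg.eigh(C)
assert np.allclose(lam, lam_c)           # B and C share eigenvalues

# All 8 sign choices d_i = +/- sqrt(lambda_i).
candidates = [U @ np.diag(s * np.sqrt(lam)) @ V.T
              for s in product([1, -1], repeat=3)]

# Every candidate reproduces both B and C ...
for X in candidates:
    assert np.allclose(X @ X.T, B) and np.allclose(X.T @ X, C)

# ... and the original A is one of the eight.
assert any(np.allclose(X, A) for X in candidates)
```

The original $A$ appears among the candidates because `eigh` returns each eigenvector only up to sign, and those sign ambiguities are exactly what the eight choices of $D$ absorb.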

We run into cases with infinitely many solutions when not all of the eigenvalues are distinct. In that case, we know much less about $A \mathbf v_i$: for example, if $\lambda_1 = \lambda_2 \ne \lambda_3$, we know that $A \mathbf v_1$ and $A \mathbf v_2$ are both linear combinations of $\mathbf u_1$ and $\mathbf u_2$, but not which ones. We can generate infinitely many solutions by choosing any of infinitely many orthonormal eigenbases for $B$ and $C$, then pretending that this problem doesn't exist and proceeding as we did earlier.

Also, we will get only $4$ solutions in cases where one eigenvalue $\lambda_i$ is $0$. (If more than one eigenvalue is $0$, we're in the infinite-solutions case above.) In this case, choosing $d_i = \sqrt{\lambda_i}$ or $d_i = -\sqrt{\lambda_i}$ is the same: $d_i = 0$ in both cases.


Even with the added requirement $B \neq C$, the answer is still no.

Consider $A =\begin{pmatrix}0 & a & 0 \\ b & 0 & 0 \\ 0 & 0 & c\end{pmatrix}$.

This satisfies your requirements, and yet the only non-zero entries of $AA^T$ and $A^T A$ are $a^2$, $b^2$ and $c^2$, so the signs of $a$, $b$ and $c$ cannot be determined.
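This sign ambiguity is easy to verify numerically, with arbitrarily chosen values $a=2$, $b=3$, $c=5$: flipping all the signs leaves both $B$ and $C$ unchanged, even though $B \neq C$.

```python
import numpy as np

def make_A(a, b, c):
    return np.array([[0, a, 0],
                     [b, 0, 0],
                     [0, 0, c]], dtype=float)

A1 = make_A(2, 3, 5)
A2 = make_A(-2, -3, -5)   # all signs flipped

# B and C are diagonal with entries a^2, b^2, c^2 (in different orders),
# so B != C whenever a^2 != b^2 -- yet both sign choices give the same B, C.
B1, C1 = A1 @ A1.T, A1.T @ A1
B2, C2 = A2 @ A2.T, A2.T @ A2
assert not np.allclose(B1, C1)                        # B != C, as required
assert np.allclose(B1, B2) and np.allclose(C1, C2)    # yet A is not unique
```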

Conjugating by any orthogonal matrix $O$, via $A \to OAO^T$, sends $B \to OBO^T$ and $C \to OCO^T$, so the same sign ambiguity persists; this generates a larger family of such counterexamples.