Solve matrix equation sum


Let $D$ be an $(n \times n)$ matrix. Let $A,B,C$ be $(q \times n)$ matrices with $q<n$. Furthermore, assume $A'B=B'C=A'C=\mathbb{0}$, where $\mathbb{0}$ is the $(n \times n)$ matrix of zeros and $'$ denotes the matrix transpose.

Can the below be expressed in the form $FDF'$ for some matrix $F$ of size $(q \times n)$?

$$ADA'+BDB'+CDC'$$

How would one find such an $F$?

There are 3 answers below.

BEST ANSWER

Let $M = A D A' + B D B' + C D C'$ be a $q \times q$ matrix.

Let $U \Sigma V = A$ be the (slim) singular value decomposition of $A$, where $\Sigma$ ($r \times r$) is diagonal and invertible with $r \leq q$, and $U$ ($q \times r$) and $V$ ($r \times n$) satisfy $U'U = I_r$ and $VV' = I_r$ (orthonormal columns and rows, respectively).

Then we can do:

$$\begin{array}{ll} A D A' + B D B' + C D C' = M \\ A' A D A' + A' B D B' + A' C D C' = A' M & \text{premultiply by } A'\\ A' A D A' = A' M & \text{since } A'B = A'C = 0 \\ A' A D A' A = A' M A & \text{postmultiply both sides by } A \\ V' \Sigma^2 V D V' \Sigma^2 V = V' \Sigma U' M U \Sigma V & \text{substitute the SVD of } A \\ \Sigma V D V' \Sigma = U' M U & \text{cancel what we can} \\ A D A' = UU' M UU' & \text{premultiply by } U \text{, postmultiply by } U' \end{array}$$

This is almost in the form you want, but this is as far as we can go if we don't know the rank of A, since $UU' \neq I$ if $r < q$.
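The identity $ADA' = UU'MUU'$ can be sanity-checked numerically. The construction below is not from the question; it is one hypothetical way to build $A,B,C$ satisfying the orthogonality constraints, by giving them mutually orthogonal column spaces (rank-one matrices $u_i x^T$ with orthonormal $u_i$):

```python
import numpy as np

rng = np.random.default_rng(0)
q, n = 4, 6

# Hypothetical construction: A = u1 a^T, B = u2 b^T, C = u3 c^T with
# orthonormal u_i, which forces A'B = B'C = A'C = 0.
u, _ = np.linalg.qr(rng.standard_normal((q, 3)))
a, b, c = rng.standard_normal((3, n))
A = np.outer(u[:, 0], a)
B = np.outer(u[:, 1], b)
C = np.outer(u[:, 2], c)
D = rng.standard_normal((n, n))
assert np.allclose(A.T @ B, 0) and np.allclose(B.T @ C, 0) and np.allclose(A.T @ C, 0)

M = A @ D @ A.T + B @ D @ B.T + C @ D @ C.T

# Slim SVD of A, truncated to its numerical rank r.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))
U = U[:, :r]

# The identity derived above: A D A' = UU' M UU'.
assert np.allclose(A @ D @ A.T, U @ U.T @ M @ U @ U.T)
```

Here $r = 1 < q$, so $UU' \neq I$ and the check confirms the identity without the full-rank assumption.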

Now, big assumption time. Let's assume $r = q$, i.e. that $A$ has full rank $q$. Then $UU' = I$, so $A D A' = UU' M UU' = M$, and we can take $F = A$. But if $A$ has full rank then $B$ and $C$ must be $0$: since $AA'$ is invertible and matrix multiplication is associative, $B = (AA')^{-1}(AA')B = (AA')^{-1}A(A'B) = 0$, and likewise for $C$. So the full-rank case isn't particularly useful in practice.
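A quick numerical sketch of why full rank forces $B = 0$ (not part of the original answer): each column $x$ of $B$ must satisfy $A'x = 0$, and when $\operatorname{rank}(A) = q$ the map $x \mapsto A'x$ has trivial null space.

```python
import numpy as np

rng = np.random.default_rng(1)
q, n = 3, 5
A = rng.standard_normal((q, n))      # full rank q almost surely
assert np.linalg.matrix_rank(A) == q

# Columns of B would have to lie in the null space of x -> A'x,
# which is {0} because rank(A') = q.
null_dim = q - np.linalg.matrix_rank(A.T)
assert null_dim == 0                 # so A'B = 0 forces B = 0
```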

ANSWER

In the general form you have stated the question, the answer is no. There are a couple of different obstructions (even without the third term). For non-invertible $D$, the sum you mention may have rank strictly larger than that of $D$, whence also larger than that of $FDF'$. As an example take $$ A=\begin{pmatrix} 1 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \quad B=\begin{pmatrix} 0 & 0 & 0 \\ 1 & -1 & 0 \end{pmatrix} \quad\text{and}\quad D=\begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0\end{pmatrix}, $$ for which $ADA'+BDB' = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$, which can never equal $FDF'$ since $\operatorname{rank}(FDF')\leq\operatorname{rank}(D)=1$.
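This rank obstruction is easy to verify numerically (a sanity check, not part of the original answer):

```python
import numpy as np

# The counterexample matrices from the rank obstruction.
A = np.array([[1., 1., 0.],
              [0., 0., 0.]])
B = np.array([[0., 0., 0.],
              [1., -1., 0.]])
D = np.diag([1., 0., 0.])

assert np.allclose(A.T @ B, 0)            # hypothesis A'B = 0 holds
S = A @ D @ A.T + B @ D @ B.T
assert np.allclose(S, np.eye(2))          # the sum is the 2x2 identity

# rank(FDF') <= rank(D) = 1 < 2 = rank(S), so no F can work.
assert np.linalg.matrix_rank(D) == 1
assert np.linalg.matrix_rank(S) == 2
```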

In the case of symmetric matrices there is a constraint coming from matrix signatures. Take the invertible matrix $$ D=\begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1\end{pmatrix}, $$ which has signature $(+,-,-)$, together with $$ A=\begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} \quad\text{and}\quad B=\begin{pmatrix} 0 & 0 & 0 \\ 1 & 0 & 0 \end{pmatrix}, $$ which again satisfy $A'B=0$. By a variant of Sylvester's law of inertia, the number of positive signs in the signature of $FDF'$ is at most the number in $D$, so $FDF'$ can have at most one positive sign. But calculating $ADA'+BDB'$ you find the identity, thus of signature $(+,+)$, a contradiction.
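One concrete instance of this signature obstruction can be checked numerically; the matrices below are chosen here for illustration (rank-one rows supported on different coordinates, so that $A'B=0$):

```python
import numpy as np

A = np.array([[1., 0., 0.],
              [0., 0., 0.]])
B = np.array([[0., 0., 0.],
              [1., 0., 0.]])
D = np.diag([1., -1., -1.])               # signature (+,-,-)

assert np.allclose(A.T @ B, 0)
S = A @ D @ A.T + B @ D @ B.T
assert np.allclose(S, np.eye(2))          # signature (+,+)

# For any F, FDF' has at most one positive eigenvalue (inertia bound).
rng = np.random.default_rng(2)
for _ in range(100):
    F = rng.standard_normal((2, 3))
    ev = np.linalg.eigvalsh(F @ D @ F.T)  # FDF' is symmetric
    assert np.sum(ev > 1e-12) <= 1
```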

In the above I assume that we are working over real vector spaces. In the complex case, if you replace transpose by adjoint (which makes more sense there), the above analysis carries through.

ANSWER

We work over $\mathbb{C}$. I don't see how to use the orthogonality assumptions on $A,B,C$. In any case, we are facing an equation in the unknown $F$: $FDF^T=E$, where $E=ADA'+BDB'+CDC'$. We may assume that $D$ is upper triangular.

If $F$ exists, then necessarily $\operatorname{rank}(E)\leq \operatorname{rank}(D)$, which we assume in the sequel.

Write $F=[U_{q,q}\;\;V_{q,n-q}]$ and $D=\begin{pmatrix}P_{q,q}&Q\\0&S_{n-q,n-q}\end{pmatrix}$, where subscripts indicate block sizes. Then

$(1)$ $UPU^T+UQV^T+VSV^T=E$. This is a system of $q^2$ equations in the $qn$ unknowns $u_{i,j},v_{i,j}$. A priori, even with the rank condition, this does not guarantee that complex solutions exist.
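The block expansion behind $(1)$ can be verified numerically with random blocks (a sanity check, not part of the original answer):

```python
import numpy as np

# With F = [U V] and D block upper-triangular, check that
# F D F^T = U P U^T + U Q V^T + V S V^T.
rng = np.random.default_rng(3)
q, n = 2, 5
U = rng.standard_normal((q, q))
V = rng.standard_normal((q, n - q))
P = rng.standard_normal((q, q))
Q = rng.standard_normal((q, n - q))
S = rng.standard_normal((n - q, n - q))

F = np.hstack([U, V])
D = np.block([[P, Q],
              [np.zeros((n - q, q)), S]])

lhs = F @ D @ F.T
rhs = U @ P @ U.T + U @ Q @ V.T + V @ S @ V.T
assert np.allclose(lhs, rhs)
```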

Assume that $P$ is generic (e.g. randomly chosen); then $P$ is invertible. If we choose $V=0$, the equation reduces to $UPU^T=E$ with $\operatorname{rank}(E)\leq \operatorname{rank}(P)$.

One knows how to solve the equation $UPU^T=P$. cf. my post in

Find $X \in \mathbb{M}_n $ such that $ AX + X^TA = 0 $.

The algebraic set $\{U \;;\; UPU^T=P\}$ has dimension $\lfloor q/2\rfloor$. Then, roughly speaking, the dimension of the image of the map $U\mapsto UPU^T$ is $\approx q^2-\lfloor q/2\rfloor$.

Thus, except for rare values of $E$, the equation $UPU^T=E\neq P$ has no solution. Even if a solution exists, I don't know whether there is a method to compute an exact solution, or whether one has to be content with looking for approximations.

Unfortunately, we must solve $(1)$ in all its generality.