Recovering one factor of a symmetric matrix product


Suppose you observe some matrix $\Sigma$ and you know it is of the form $$\Sigma = H\sigma \sigma^T H^T,$$ where $H\in \mathbb{R}^{D\times d}$, $\sigma\in \mathbb{R}^{d\times d}$, $H^TH=I$, and $P = HH^T$ satisfies $P^2=P$ and $P^T=P$ (i.e. $P$ is an orthogonal projection), but $\sigma$ and $\sigma\sigma^T$ are unknown. We want to estimate $H$. Are there any standard methods for doing so?

If $\sigma\sigma^T=I$, we can just use the SVD. In that case $\Sigma = P=HH^T$, so performing an SVD gives $\Sigma = USV^T$, and assuming the singular values are in increasing order, the first $K=D-d$ entries of the diagonal of $S$ are zero. So if $A=U\sqrt{S}$ has columns $\vec{a}_1,\dotsc, \vec{a}_D$, we take the matrix $$H = [\vec{a}_{K+1}, \vec{a}_{K+2}, \dotsc, \vec{a}_D].$$
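A quick numerical sketch of this special case (variable names like `H_est` are my own; note that NumPy returns singular values in *decreasing* order, so the $d$ nonzero ones come first rather than last):

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 10, 3

# H with orthonormal columns, via reduced QR
H = np.linalg.qr(rng.standard_normal((D, d)))[0]

# In the sigma sigma^T = I case, Sigma is just the projection P = H H^T
Sigma = H @ H.T

# NumPy sorts singular values in decreasing order, so the d nonzero
# ones (all equal to 1 here) come first
U, S, Vt = np.linalg.svd(Sigma)
H_est = U[:, :d]

# H itself is only determined up to a d x d orthogonal factor,
# but the projection H H^T is recovered exactly
print(np.allclose(H_est @ H_est.T, H @ H.T))  # True
```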

I suspect that in the case $\sigma\sigma^T\neq I$, this is too much to ask for without knowing/observing $\sigma$ or $\sigma\sigma^T$. It seems the best we can do is try to recover $B = H\sigma$. If we knew $\sigma \sigma^T$ and it were invertible, then from $B\sigma^T = H\sigma\sigma^T$ we would get $H = B\sigma^T(\sigma\sigma^T)^{-1}$.
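A small sanity check of that identity (assuming we somehow know both $B$ and $\sigma\sigma^T$; the setup and names here are my own):

```python
import numpy as np

rng = np.random.default_rng(1)
D, d = 10, 3

H = np.linalg.qr(rng.standard_normal((D, d)))[0]
sigma = rng.standard_normal((d, d))   # invertible with probability 1

B = H @ sigma                         # suppose we have recovered B = H sigma
S2 = sigma @ sigma.T                  # and we also know sigma sigma^T

# Then H = B sigma^T (sigma sigma^T)^{-1},
# since B sigma^T = H sigma sigma^T
H_rec = B @ sigma.T @ np.linalg.inv(S2)
print(np.allclose(H_rec, H))  # True
```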

This might be a naive question hoping for too much, but I figured it can't hurt to ask for advice. I apologize if this is something basic, and I would love references for these sorts of things in linear algebra/matrix factorization.


BEST ANSWER

As noted in the comments, it suffices to extract the first $d$ columns of $U$ from the SVD $\Sigma = USV^T$ (NumPy returns the singular values in decreasing order, so these are the columns corresponding to the $d$ nonzero singular values). Here's some Python code illustrating this approach.

import numpy as np
import numpy.linalg as la
from numpy.random import randn

# parameters
d = 3
D = 10

# generate example
sigma = randn(d,d)
H = la.qr(randn(D,d))[0]
Sigma = H @ sigma @ sigma.T @ H.T
P = H @ H.T

# apply method: columns of U for the d largest singular values
H_est = la.svd(Sigma)[0][:,:d]

# H is only identified up to a d x d orthogonal factor,
# so check accuracy of the corresponding projection P instead
print(f"Relative error: {la.norm(H_est@H_est.T - P)/la.norm(P):.3g}")